Ever since the dawn of computer-generated imagery, artists and technicians have been trying to create virtual people. Personally, I have an ambivalent relationship with virtual characters that claim to represent reality: even award-winning visual effects teams keep producing results in their Hollywood productions that send shivers down your spine. This effect is called the "Uncanny Valley" - a character that wants to be real but is missing the small details, and so ends up looking "robotic". Over the years, our team has tried its hand at virtual characters again and again - always with the conclusion that abstract characters are clearly the better way to go. Until today: with its "MetaHuman" toolbox, Epic Games has fundamentally changed the entire industry. With a simple "click, click, done" approach, you create virtual humans that are astonishingly close to their real counterparts and may be used in all Unreal productions.
The Uncanny Valley is dead.
The MetaHuman Editor delivers ready-to-use characters, currently with only a small selection of hairstyles and clothing. Because the characters are built from a set of "standard types" that are blended together into the final result, the outputs often look very similar - after all, they all share the "same" DNA. Recreating a real person as a virtual copy is a particularly big challenge: it always requires manual work in a 3D modeling program, and that is a tightrope walk, because a careless edit can quickly destroy the character's predefined deformation targets ("blendshapes").
Breathe life into it
Once the virtual character has been created, it can be integrated into an Unreal project in just a few steps. The character can be animated either from purchased motion databases (which, however, usually looks rather robotic) or by a "real" actor who lends the character their soul and movements through motion capture.
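To make the integration step concrete, here is a minimal sketch in Unreal Engine C++ that spawns a character and plays a pre-produced clip on it. The names `MetahumanCharacterClass` and `IdleTalkAnimation` are hypothetical placeholders for whatever MetaHuman blueprint and animation asset your own project contains; the engine calls themselves (`SpawnActor`, `GetMesh`, `PlayAnimation`) are standard Unreal API.

```cpp
// Minimal sketch: spawn a MetaHuman-based character and play a
// pre-produced animation on its skeletal mesh.
// "MetahumanCharacterClass" and "IdleTalkAnimation" are hypothetical
// assets standing in for your project's own content.

#include "GameFramework/Character.h"
#include "Components/SkeletalMeshComponent.h"
#include "Animation/AnimationAsset.h"
#include "Engine/World.h"

void SpawnAndAnimateMetahuman(UWorld* World,
                              TSubclassOf<ACharacter> MetahumanCharacterClass,
                              UAnimationAsset* IdleTalkAnimation)
{
    if (!World || !MetahumanCharacterClass || !IdleTalkAnimation)
    {
        return;
    }

    // Place the character in front of the (virtual) camera.
    const FVector Location(0.f, 0.f, 100.f);
    const FRotator Rotation = FRotator::ZeroRotator;

    ACharacter* Metahuman =
        World->SpawnActor<ACharacter>(MetahumanCharacterClass, Location, Rotation);
    if (Metahuman)
    {
        // Loop the pre-produced clip on the character's skeletal mesh.
        Metahuman->GetMesh()->PlayAnimation(IdleTalkAnimation, /*bLooping=*/true);
    }
}
```

In a real production you would of course drive the mesh through an animation blueprint (or live motion-capture input) rather than a single looping clip, but the spawn-and-animate pattern stays the same.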
The applications for MetaHumans are manifold, whether they merely populate a scene - for example in architectural visualization - or speak directly into the virtual camera as the centerpiece of attention. In addition to the classic approach of pre-produced animations, the AI toolkit of b.ReX GmbH can connect the character to leading AI providers, for example OpenAI. This enables a direct conversation with the character, as if it were actually alive. Expectations should be tempered a bit here, though: the conversation is more or less limited to a question-and-answer game, which at times still produces somewhat hilarious statements. Training a conversational AI properly requires very large amounts of qualified data, most of which has to be entered manually.
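To give an idea of what such a connector does under the hood, here is a minimal sketch of one question-and-answer round trip against OpenAI's chat completions REST endpoint, written with libcurl. This is explicitly not the b.ReX toolkit - just an illustration of the general request/response pattern; the model name and prompts are assumptions you would replace with your own.

```cpp
// Minimal question-and-answer round trip against OpenAI's chat
// completions endpoint, sketched with libcurl. NOT the b.ReX toolkit -
// only an illustration of the request/response pattern such a
// connector is built on. Expects OPENAI_API_KEY in the environment.

#include <curl/curl.h>
#include <cstdlib>
#include <iostream>
#include <string>

// Collect the raw JSON response into a std::string.
static size_t WriteToString(char* data, size_t size, size_t nmemb, void* userdata)
{
    static_cast<std::string*>(userdata)->append(data, size * nmemb);
    return size * nmemb;
}

int main()
{
    const char* apiKey = std::getenv("OPENAI_API_KEY");
    if (!apiKey) { std::cerr << "OPENAI_API_KEY not set\n"; return 1; }

    // The character's "personality" goes into the system prompt,
    // the visitor's question into the user message.
    const std::string body = R"({
        "model": "gpt-3.5-turbo",
        "messages": [
            {"role": "system", "content": "You are a friendly virtual guide."},
            {"role": "user", "content": "What can I see in this building?"}
        ]
    })";

    CURL* curl = curl_easy_init();
    if (!curl) return 1;

    std::string response;
    curl_slist* headers = nullptr;
    headers = curl_slist_append(headers, "Content-Type: application/json");
    headers = curl_slist_append(headers,
        ("Authorization: Bearer " + std::string(apiKey)).c_str());

    curl_easy_setopt(curl, CURLOPT_URL, "https://api.openai.com/v1/chat/completions");
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, body.c_str());
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, WriteToString);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &response);

    if (curl_easy_perform(curl) == CURLE_OK)
    {
        // The answer text sits in choices[0].message.content of this
        // JSON; a real connector would parse it and feed it into the
        // character's text-to-speech and lip-sync stage.
        std::cout << response << std::endl;
    }

    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    return 0;
}
```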
I am sure Epic Games will continue to advance the MetaHumans over the coming years. Applications in the metaverse increasingly require virtual representations in digital space. The next highlight would clearly be a function that simply generates the virtual character from a photo. AI-based approaches to this already exist from other providers, but their results are still far inferior to Epic's MetaHumans.
Most of the images in this blog post were provided by Epic Games.