For about thirty years, film productions have worked with green backgrounds, so-called green screens (sometimes blue, in which case the term is blue screen). Actors placed in front of these monochrome backgrounds can be keyed out and composited into virtual CGI 3D sets. Whether it's Marvel's Avengers or Harry Potter, the fantastic worlds in which such films are set have been combined with the real camera image in this way for decades. In the past, the movement of the "real" camera also had to be transferred manually to a virtual camera: quite an elaborate task when many objects stand in the foreground of the green screen. Objects with reflections, such as cars with glossy paint, were a particular headache for the VFX artists in charge. Since 2020, this has changed, at least for some productions. The magic keyword is "in-camera VFX". In concrete terms, this means that instead of inserting virtual backgrounds after the fact, they are played back on an LED wall behind the scene and filmed directly by the camera. This shortens, at least in theory, the time required for post-production of the footage.
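To make the classic workflow concrete, here is a minimal, purely illustrative chroma-key sketch in Python with NumPy. The thresholds and the simple "green dominates" rule are assumptions for demonstration; production keyers (for example in Nuke or After Effects) handle spill, soft edges and colour spaces far more carefully.

```python
import numpy as np

def chroma_key(frame: np.ndarray, background: np.ndarray) -> np.ndarray:
    """frame, background: float arrays of shape (H, W, 3) with values in 0..1."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    # Assumed rule: a pixel counts as "green screen" if green clearly
    # dominates both red and blue (threshold values are made up).
    is_screen = (g > 0.5) & (g > r + 0.2) & (g > b + 0.2)
    alpha = np.where(is_screen, 0.0, 1.0)[..., None]
    # Composite: keep the foreground where alpha is 1, background elsewhere.
    return alpha * frame + (1.0 - alpha) * background

# Tiny synthetic example: a green wall with an "actor" patch in the middle.
foreground = np.zeros((4, 4, 3)); foreground[..., 1] = 1.0  # pure green wall
foreground[1:3, 1:3] = [0.8, 0.5, 0.4]                      # skin-toned patch
background = np.full((4, 4, 3), 0.1)                        # dark CGI set
result = chroma_key(foreground, background)
print(result[0, 0], result[1, 1])  # wall pixel replaced, actor pixel kept
```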

The good, the bad, the ugly

Modern, LED-based virtual production has its advantages and disadvantages. The best-known production is Disney's "The Mandalorian", which, according to its own statements, now uses a solution developed in-house rather than Unreal. Many of the operators of such virtual production caves come from Swabia: Pixomondo, a company specializing in visual effects and 3D animation, currently provides the virtual production stages for the CBS series "Star Trek: Discovery". The idea itself is not new: films of the forties and fifties already used rear projection, for example to make it easier to shoot scenes in moving cars.

Advantages

  • Because LED panels are self-luminous, objects placed in the scene, such as cars, are lit absolutely realistically by the background itself; the resulting reflections look deceptively real.
  • A lot of time can be saved in post-production because the effects are created directly in the camera during filming instead of afterwards.
  • Via motion-capture or tracking systems, the camera position is transferred to the virtual backgrounds rendered in real time: this way the parallax, i.e. the depth shift in space, always matches the camera position in the "real" world perfectly (see the sketch after this list).
  • Thanks to this technology, extremely elaborate 3D backgrounds become possible for livestreams for the first time; viewers could, at least in theory, even interact with the backgrounds in real time.
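The following Python/NumPy sketch illustrates the parallax point from the list above. The look_matrix and project_point helpers, the pinhole-camera simplification and all coordinates are invented for illustration; on a real stage, a tracking system feeds the pose into the real-time renderer instead.

```python
import numpy as np

def look_matrix(position, forward, up=np.array([0.0, 1.0, 0.0])):
    """Build a simple world-to-camera (view) matrix from a tracked pose."""
    f = forward / np.linalg.norm(forward)
    r = np.cross(f, up); r /= np.linalg.norm(r)
    u = np.cross(r, f)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = r, u, -f
    view[:3, 3] = -view[:3, :3] @ position
    return view

def project_point(view, point, focal=1.0):
    """Pinhole projection of a 3-D set point into normalized image coordinates."""
    cam = view @ np.append(point, 1.0)
    return focal * cam[:2] / -cam[2]

# Two virtual "set pieces" at different depths behind the LED wall.
near_tree = np.array([1.0, 0.0, -5.0])
far_mountain = np.array([1.0, 0.0, -50.0])

# Tracked camera positions for two consecutive frames (camera dollies right).
for cam_pos in (np.array([0.0, 1.7, 0.0]), np.array([0.5, 1.7, 0.0])):
    view = look_matrix(cam_pos, forward=np.array([0.0, 0.0, -1.0]))
    print(project_point(view, near_tree), project_point(view, far_mountain))
# The near object shifts much more between the two frames than the far one,
# which is exactly the parallax the real-time background has to reproduce.
```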

Disadvantages

  • Focusing sharply on the virtual background is not possible, as the LED pixel grid would otherwise produce so-called moiré effects. The resulting flicker is very distracting in the image, so the LED wall is best suited for backgrounds that remain out of focus (see the sketch after this list).
  • The virtual sets have to be built in 3D before the actual production days, and changing their content in post-production is only possible with considerably more effort.
  • One Achilles' heel is the transition from the studio floor to the LED wall: this "seam" in the image has to be covered creatively, for example with physically existing set components such as furniture, plants or other set pieces.
  • The rental costs for an LED cave are still relatively high, so its use is realistic only for productions with a large budget.
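As a rough illustration of the moiré problem from the first point above, the following Python/NumPy sketch samples a one-dimensional "LED wall" pattern at a slightly mismatched pitch. The pitch values and the sinusoidal wall model are arbitrary assumptions, but the resulting slow beat is the 1-D analogue of the banding that appears when the wall is in sharp focus.

```python
import numpy as np

led_pitch = 4.0     # the wall's brightness pattern repeats every 4 units
sensor_pitch = 4.3  # a sharply focused sensor samples the wall every 4.3 units

# Brightness of the LED wall, modelled as a simple periodic pattern.
def wall(x):
    return 0.5 + 0.5 * np.cos(2 * np.pi * x / led_pitch)

# "Photograph" the wall: one point sample per sensor pixel.
sensor_positions = np.arange(400) * sensor_pitch
captured = wall(sensor_positions)

# The captured image no longer shows the fine LED pattern but a much slower
# beat whose period depends on the pitch mismatch (moiré banding in 1-D).
beat_period = 1.0 / abs(1 / led_pitch - 1 / sensor_pitch)
print(f"expected beat period: about {beat_period:.0f} wall units")
print("captured samples over one beat:", np.round(captured[:14], 2))
```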

Conclusion

The technology is particularly well suited for film productions with many reflective surfaces, such as car commercials. Simpler studio shoots, for example those in which only individual people need to be placed in a virtual environment, will for the foreseeable future still be handled more sensibly with the classic green-screen approach, which can now also be combined with real-time backgrounds from the Unreal Engine.

The footage was kindly provided by Epic Games and the Filmakademie Baden-Württemberg, respectively.