
FAQs
Virtual Production
What are the benefits and drawbacks of greenscreen vs an LED wall when it comes to virtual production? What are common challenges you need to overcome in each?
A big benefit of greenscreen is that it is cheap and very accessible. You can make a basic greenscreen setup at home by painting a wall green or putting up a green sheet, and many home filmmakers do just this. For this reason, greenscreen is considered easier for beginners. Greenscreen may also be considered more refined, simply because it has been around longer. However, with greenscreen, chroma keying, the post-production process of masking out the green, becomes a lengthy process. As for LED filmmaking, there is less focus on post-production and a shift towards pre-production. You get what is called the 'final pixel' in camera, which essentially means that what you see is what you get. This makes the pre-production process much longer. However, LED also allows for reflections and refractions, letting filmmakers capture, for example, a glass of water, which would be extremely difficult with a greenscreen.
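The chroma-keying step mentioned above can be sketched in a few lines. This is an illustrative green-dominance test, not any particular keyer's algorithm; real keyers also handle soft edges and spill suppression, and the `threshold` value here is an assumption.

```python
def is_green(pixel, threshold=1.3):
    """True if the green channel dominates red and blue by `threshold`."""
    r, g, b = pixel
    return g > r * threshold and g > b * threshold

def chroma_key_mask(frame, threshold=1.3):
    """frame: nested lists of (r, g, b) tuples.
    Returns a parallel grid of booleans marking 'screen' pixels to key out."""
    return [[is_green(px, threshold) for px in row] for row in frame]

# Tiny 2x2 example frame: one green-screen pixel, three foreground pixels.
frame = [[(10, 200, 10), (200, 30, 30)],
         [(30, 30, 200), (120, 120, 120)]]
mask = chroma_key_mask(frame)  # [[True, False], [False, False]]
```

Part of why keying is a lengthy process in practice is that a single threshold like this fails on hair, motion blur, and green spill, which is where the manual post-production work comes in.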

What’s the demand for 2D plates compared to creating realistic 3D versions in 3D software? Is one better than the other?
A great benefit of 2D plates is that they are more widely attainable, as it is simply a case of shooting a video on location. Many production companies specialise in shooting and selling these plates, allowing them to be easily purchased online. However, without a 3D environment, one key element missing is the parallax effect. This means you are essentially locked into what has been captured in frame. Furthermore, 3D environments running in game engine software allow for increased customisation. It is possible to change objects within the environment, for example, extending a building, moving a tree or changing shadows in real time. If working with a 2D plate, this can be achieved by exporting frames into an editing software, but it is far more time consuming. A downside to 3D environments is that the software has a steeper learning curve and is not as easy for beginners. There may also be issues with the content itself, such as rendering or performance problems, which would not show up until the content is tested.
Are there places where a creative team (or an individual creative) can go to play, learn and try out VP ideas of all types - to learn what it can do, find its limits, help with pre-production fixes and save time/money figuring it out in an actual project? I.e. build a skillset?
Advanced Media Production works with organisations such as ScreenSkills and Final Pixel Academy, offering entry level and professional courses. We recently hosted an Introduction to Virtual Production course, tailored to those in the production industry looking to upskill in VP. Another great resource comes in the form of societies and groups, both in person and on social media.

How good is the green screen key when used simultaneously with an LED background screen?
Simulating a greenscreen on an LED wall can leave the actors lit with a slight green cast. It can be used in instances where the virtual art department (or VAD) needs to make changes to the content. The process could involve shooting all content on a shot list, and then reshooting with greenscreen, but ultimately a whole scene would not be shot in greenscreen on LED, only a single shot or two.
AI and VP – what changes may this bring to virtual production workflows?
The biggest change comes with the availability of AI-powered asset creation tools, including video AI tools which use generative AI to edit plates. Content creation is thus more easily accessible. In some ways, camera tracking and the rise of markerless camera tracking solutions could be considered AI as well.
What are the limits of photogrammetry with the use of consumer items such as iPads and 360 cameras? (Context: not having a budget for LiDAR or more expensive means)
If you’re filming at a locked-down location or only have a small window to shoot on location, you can shoot all the hero scenes first and then send in a photogrammetry team to capture the location and build an environment from the scans. This can then be edited to make any changes required and finally put on the wall. However, you would not want to do a whole shoot this way. RealityCapture and Polycam are both good, accessible photogrammetry apps which take images of objects and turn them into 3D assets.
In terms of the actual background plates, is there a tendency to lean more toward creating them in Unreal Engine, or does real-time video content get used? If so, what resolution does that video need to be? Also, is using (hi-res) equirectangular 360 video a possibility, as presumably you can also rotate the scene if required?
This differs depending on the production. The question is, is there budget to make a photorealistic environment? Or is it quicker to go on location and shoot some plates? 360 video footage can be used if it is stretched correctly to the wall to avoid the warp of the lens.
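The "stretch" involved is the inverse of the equirectangular projection: each point on the wall corresponds to a viewing direction, which maps back to a position in the 360 frame. A minimal sketch of that mapping, using normalised (u, v) coordinates as an assumed convention:

```python
import math

def equirect_to_direction(u, v):
    """Map normalised equirectangular coords (u, v) in [0, 1] to a
    unit 3D view direction (x, y, z).

    Convention assumed here: u spans longitude -pi..pi, v spans
    latitude pi/2 (top) to -pi/2 (bottom); +z is straight ahead.
    """
    lon = (u - 0.5) * 2.0 * math.pi
    lat = (0.5 - v) * math.pi
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)

# The centre of the frame looks straight ahead along +z.
d = equirect_to_direction(0.5, 0.5)
```

Sampling the video this way per wall pixel, rather than pasting the flat frame directly, is what avoids the characteristic warp; rotating the scene is then just a rotation applied to the direction vectors.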
What is a 2.5D plate?
2.5D is essentially multiple 2D plates or images staggered in 3D space to simulate a parallax effect. For example, a landscape scene where the background image is clouds and sky, the midground is a hill, and the foreground is bushes and flowers. These layers are staggered to simulate a 3D effect, almost in the same way a theatre or stage production might use set design.
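The staggering described above works because layers at different depths shift by different amounts as the camera moves. A toy pinhole-style calculation, where the formula, units and depth values are illustrative assumptions:

```python
def layer_screen_shift(camera_shift, layer_depth, reference_depth=1.0):
    """Apparent sideways shift of a 2D layer for a sideways camera
    move, relative to a reference plane at `reference_depth`.

    Nearer layers (small depth) shift more than distant ones, which
    is what sells the 2.5D parallax effect. Simplified pinhole model;
    depths are in arbitrary consistent units.
    """
    return camera_shift * (reference_depth / layer_depth)

# Camera slides 1 unit sideways: foreground bushes (depth 2) shift
# half a unit, while the distant sky (depth 100) barely moves.
fg = layer_screen_shift(1.0, 2.0)     # 0.5
sky = layer_screen_shift(1.0, 100.0)  # 0.01
```

The near-zero shift of the far layer is why a single flat sky plate works fine at the back, while foreground elements need their own layers.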
What is the average energy consumption of a medium-sized VP stage?
It depends on a lot of factors and a definitive number cannot be determined. The conversation over whether VP is more sustainable is ongoing and would differ greatly from studio to studio, project to project and even manufacturer to manufacturer. Some panels require less power, or perhaps a full wall is not needed for a scene. Lighting setups would also differ across productions, as would compute requirements.

What do you need to know when you hire a space? What needs to be prepared first?
These are the essentials you need to know:
- The dimensions of the space, including the tracking volume size
- The dimensions of the wall
- What tracking solution they are using
- What is included in the technical rider that the studio provides
- The day rate and what is included in this rate
- What type of VP productions the team has done in the past, or their portfolio (some studios may be more familiar with one game engine over another)
How could someone set up a virtual production suite at different price points?
The most cost-effective VP setup is a greenscreen. If looking at mid-range virtual production, this would be a tracked-camera solution, e.g. Vanishing Point or Vive Mars Camtrack, which still uses greenscreen but tracks the camera to create a parallax effect. The highest-end suite would be a full LED stage with a huge volume.
Working with actors in VP: when combining mocap characters with live actors, how far can realism go? Can you use any lens, etc.?
You cannot do both at the same time without the production becoming overly complicated, because the timing is a very tricky process. How far realism can go depends on the capabilities of the software being used and the budget. Any lens can be used, but for zoom lenses it comes down to pixel pitch: the smaller the pixel pitch, the closer you can zoom into the wall without the image looking pixelated. Pixel pitch is the distance between adjacent pixels; the smaller the pitch, the closer together the pixels sit and the finer the image.
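The pitch-versus-zoom trade-off can be made concrete with a simplified thin-lens estimate: a wall pixel of pitch p, shot with focal length f from distance d, projects to roughly p·f/d on the sensor, and once that exceeds the sensor's own pixel size the wall can start to look pixelated or moiré-prone. The formula and the numbers below are a rough geometric sketch, not a vendor specification.

```python
def min_wall_distance_m(pitch_mm, focal_mm, sensor_pixel_um):
    """Estimated minimum camera-to-wall distance (metres) at which a
    single LED pixel projects to no more than one sensor pixel.

    Simplified thin-lens geometry: projected size = pitch * focal / d,
    so d_min = pitch * focal / sensor_pixel. Illustrative model only.
    """
    sensor_pixel_mm = sensor_pixel_um / 1000.0
    return (pitch_mm * focal_mm / sensor_pixel_mm) / 1000.0

# 2.6 mm pitch wall, 50 mm lens, 5 um sensor pixels: stay back ~26 m.
# Halving the pitch to 1.3 mm halves that distance, i.e. a finer
# wall lets the camera (or a zoom) come in much closer.
d_coarse = min_wall_distance_m(2.6, 50.0, 5.0)
d_fine = min_wall_distance_m(1.3, 50.0, 5.0)
```

This is also why zooming in with a longer focal length is treated like moving closer to the wall: doubling the focal length doubles the minimum distance in this model.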