Two for each pointed ear. Six to map the wings. Thirty-five along the painted body. In total, Ariel the puppet boasts forty-seven retroreflective motion capture markers, each one crucial in bringing this character to life. Crafted by Tracked Illusions for their foray into VR gaming, Ariel entered StudioT3D with the aim of progressing experimentation in motion capture performance.
Tracked Illusions is a company specialising in performance capture puppetry. They seek to combine puppetry with AI and digital capture technology, exploring how these tools can enhance character performances within XR narratives.
Puppetry, performance capture, and motion capture are widely used across the film, television, and video game industries; combining them, however, is less common. For the StudioT3D team, this meant the project was also an experiment in refining tracking processes and testing the limits of the technology.
The Brief
Seeking to continue work that began with R&D at Chichester Festival Theatre and Portsmouth University’s Centre for Creative and Immersive Extended Reality, Tracked Illusions contacted Target3D to utilise StudioT3D, an Advanced Media Production Network studio.
Familiar with motion capture technology, the team sought to take puppetry performance a step further – this time, to support the creation of a VR game demo. Leylines, a story inspired by Celtic mythology and rich in lore, with a focus on characterisation in XR narratives, is currently in development by the Tracked Illusions team.
Ariel, the folklore-inspired sprite, would have to be captured as she moved around the volume, supported by puppeteers to create the illusion of a flying fairy. The shoot provided an opportunity to study how physical props can offer a more authentic and immersive experience compared to VFX.
In an exploration of characterisation, the aim was also to record each actor's individual performance: voice and facial expressions captured simultaneously alongside motion capture data. Earlier R&D work experimented with using ChatGPT to train different tones and attitudes, but performance capture would allow them to blend all these elements – voice, expression, and movement – through live actors, merging traditional puppetry with the digital world.
However, there were creative challenges to tackle in delivering this project.
The StudioT3D team had to decide how to actually track Ariel in the space. Minimising occlusion for a puppet roughly half the size of an average human in a 16m by 12m volume, while also streaming into a 3D environment via Unreal Engine, presented an interesting problem.
The Production
Studio 1 features a 32-camera OptiTrack motion capture system, consisting of 6 PrimeX 41 cameras and 26 PrimeX 22 cameras. This combination was crucial given the size of the markers and the expansive volume the actors were performing in.
Given the puppet’s small body, the 12mm markers used for full-body human motion capture were too large to place in such close proximity, increasing the risk of occlusion. The team therefore opted for 4mm markers, typically used for facial and finger tracking.
Initially, the StudioT3D team questioned whether a custom solve would be required to track the puppet effectively. Thanks to OptiTrack’s Motive software, however, this wasn’t an issue: it recognised the puppet as a human skeleton using a baseline 41-markerset. This allowed them to create Ariel’s skeleton within the software – essentially, a scaled-down human solve.
Combined with the incredible range and sub-millimetre accuracy of OptiTrack’s PrimeX 41 and PrimeX 22 series cameras, tracking Ariel’s tiny markers posed no issue.
On the day of the shoot, three actors suited up, were fitted with markers, and were given iPhone head rigs. The performers, Phoebe Hyder, Arina Ii and Kate Rowsell, recorded multiple takes, both with and without audio, experimenting with different ways in which a performance flows. In a previous R&D session, one puppeteer stated,
“It was quite empowering. Traditionally, you never get to speak unless you are puppeteering the head.”
Voice Director Keith Higinbotham assisted the Tracked Illusions team in the collaboration. With experience in video game development, including voice and performance direction on Larian Studios' multi-award-winning RPG Baldur’s Gate 3, Keith contributed to the shoot by guiding the puppeteers and inspiring experimentation to enhance the performance.
Also on set were Puppetry Director and Co-Founder Michael Jean Marain and Unreal Technical Consultant Curtis Mason, overseeing the production alongside the StudioT3D team to ensure the capture was as desired.
It was tiring work as the actors simultaneously performed and moved around the volume to simulate Ariel flying.
Tracked Illusions Puppeteer, Creative Lead & Co-founder Phoebe Hyder explained,
“Performance capture puppetry is not just possible; it truly enhances the performance, letting our character fly, hover, and backflip effortlessly!”
Real-time previsualisation, also known as “previz”, is a standard part of StudioT3D shoots. It is a live feed of the 3D environment with the motion capture character inside it, meaning producers, performance directors and technicians can see how the character will look in the space. This meant the puppeteers could instantly see the results and adjust their performance accordingly.
The Result
The Tracked Illusions team was very satisfied with the quality of the captured data, feeling inspired in their development journey. Production on Leylines continues, now with data to create a 30-second demonstration of how puppetry and performance capture techniques can enhance VR experiences.
However, it isn’t just original content that Tracked Illusions is creating. The shoot was a key component in the team’s continuous efforts to test and refine the technology and processes of performance capture puppetry, allowing them to offer a collaborative creative service that addresses the challenges of traditional CFX/VFX.
We may very well see puppetry and performance capture techniques becoming standard in the creative industries as the processes continue to evolve, opening up possibilities for storytellers and creators alike.
Phoebe continued,
“Our next steps are to investigate the integration of AI training and ChatGPT into creating content using performance capture puppets. Our vision is to bridge new and traditional methods, informing best practices for future developments.”
The Feedback
“Huge thanks to the amazing creative souls at Target3D for providing us the space and collaboration to make this possible.”
- Phoebe Hyder, Puppeteer, Creative Lead & Co-founder, Tracked Illusions
Robots? Puppets? Cameras? We can track it all. Contact StudioT3D to chat about our motion capture, performance capture, volumetric capture and virtual production studios, located in the heart of London: info@studiot3d.com.