The head of broadcast at Mark Roberts Motion Control, Paddy Taylor, reflects on the development of VP workflows, the company’s recent projects in film and broadcast, and his expectations for the next stage of this technology’s development.
Have you been surprised by the phenomenal rise of virtual production over the past few years?
I think it was fairly obvious that it was going to be a kind of watershed moment in the way that content is produced. [From an early stage] it was clear that there were going to be so many advantages – from how photorealistic the results can be in terms of lighting, through to the cost benefits and the ability to control the environment you're shooting in – that when it all came together it was always going to be a significant shift in the industry.
[Contributing to this awareness] was the fact that, both in my personal background and with MRMC, there had been a lot of involvement with customers doing projection-mapping and other virtual production workflows for some time before this new explosion came about. Then, when [VP] really started to make an impact, The Mandalorian was the first project of this kind to use MRMC robotics, and it attracted a huge amount of attention worldwide.
The workflows behind VP were in development for years before they came into widespread use, so it isn't a surprise that it has become as big as it has. The growth has slowed a bit recently, but it will become bigger as the [overall market] develops. From our perspective, a lot of major developments have happened outside companies like MRMC, but I think we have had an important role as an enabler and a provider of solutions to help ease the transition.
From your standpoint, what are the aspects of virtual production that excite you most – both in terms of the creative and commercial opportunities?
Lighting is a great example here – being able to shoot content as if in various locations and at various times. It allows you to create the scenes you need without having to build a whole set with all its associated costs. So, for example, you can limit the number of production teams and talent needed in a physical location by using content you have already created. You can even capture content on site and then handle re-shoots in a hybrid environment. Of course, this comes with a huge environmental bonus as well.
I still think the magic of [shooting physically on location] is something that we shouldn’t lose, but VP does give you the chance from an artistic standpoint to try something again and again without having to reset everything. It also creates a real opportunity for those lower-budget projects to expand [their scope] and utilise multiple locations without having to move production elsewhere.
Please tell us a little bit about the history of your company’s involvement with virtual production.
Going back a few years, The Mandalorian was the first project to use MRMC robotics. We also do a lot of motion control-based visualisation and effects work, which includes productions like House of the Dragon, where the dragons were all virtually controlled.
There have also been a number of major sports broadcast-related projects recently. One was a mixed reality application using our studio bots at the Olympics, which involved the real-world Paris skyline, with the Eiffel Tower in the backdrop, combined with a VR/hybrid reality environment in which Studio B was shooting into that space through a green screen and creating graphics with our [broadcast partner].
The Catalyst Stage for ESPN is another good example. This is primarily a studio environment for the coverage of live sports, but at other times it will be used for different sorts of production, such as the filming of promos.
It’s worth noting that we have several aspects to our company structure that help us work across these different sectors, including a volumetric division, a film and commercial division – which includes all of the really exciting, high-speed robotics used in shows like The Mandalorian – and the broadcast division, which is the department I look after. We have plenty of places where people can try out the systems, while the different parts of the business all feed into and provide advancements for each other.
What are you currently working on in terms of virtual production projects and/or new solutions?
We’ve just had a sports broadcaster purchase a solution at IBC, so it will be exciting to see that in use. I think readers would also be amazed to know how many of the Christmas adverts that they’ll come to love over the next few weeks and months have been shot on our systems. Some of them are animated, some are done in VP, some are chroma key, some are live action, and some are a hybrid. But yes, our team has been working on a significant number of seasonal adverts lately!
How are you viewing the development of related technologies, such as those grouped under the Mixed Reality banner, and what are your predictions for their growth in media applications over the next few years?
We are already seeing some examples of this – AI, VR headsets in games, phones being used during live sports games, etc. AI is already in widespread use, but there is still a lot of growth [potential] here. It’s also the case that making something look good and work well requires a lot of skill – and that takes time.
Festivals and theme parks could start to see more of these technologies being integrated over the next few years, while experience centres – e.g. for live sports and music concerts – can help bring greater accessibility and reduce the huge environmental impact [associated with travel].
[More generally] I would say that the convergence between gaming and film is a natural progression. I can see the line between the two being blurred very quickly in the transition from a film to an interactive game.
How would you like to see virtual production evolve in the next few years? For example, do you think that more standardisation is needed (as we are starting to see happen with camera metadata) and do you have any views on how this should be implemented?
Standardisation is needed and is already starting to happen.
Virtual production is an ideal experience for live events, but this isn’t where [the technology is really] happening right now. [It’s more about focusing on] people watching from home, rather than the person in the stadium.
The really exciting next stage will probably be achieving the perfect hybrid of being able to do something live for an audience locally, as well as for audiences watching live at home.
Read the whole issue here:
Single-page full-screen view (for mobile devices): https://bit.ly/Production360-V1-2
Double-page full-screen view: https://bit.ly/Production-360-V1-2