When Cyber Group Studios created its first augmented reality test in 2007—almost a decade before Pokémon GO brought AR to the masses—CEO Pierre Sissmann says neither retailers nor kids were interested in characters that virtually popped out of t-shirts.
Around the same time, the Parisian studio produced a stereoscopic 3D episode for its preschool series Tatonka. But broadcasters weren’t technically capable of airing the format, and because 3D TVs never made it to the mainstream, the episode never found an audience. The company behind Gigantosaurus (pictured, above) has often been an early adopter of new technology, says Sissmann. While that’s not a bad thing, it has meant that some tech investments—like its 2007 initiatives—have sat on the shelf waiting for the rest of the market to catch up. But for its latest R&D play—real-time animation production—the studio’s timing may be just right.
Kidcos from Disney to Aardman, and Blue Zoo to Cyber Group have been increasingly experimenting with real-time game engines, which make it possible to simulate images on a computer fast enough for a viewer to interact with them. The tech has been used in video gaming for decades (a user’s ability to fluidly walk, run or fly in a game is often driven by a real-time engine), but producers have been slow to adopt the format.
Increased demand for quicker turnarounds and high-quality content, however, has spurred the industry to give the process a second look. And decreasing costs and off-the-shelf software are making it more accessible.
One study from research firm Forrester, commissioned by real-time game engine maker Epic Games, found that 69% of the technology’s current users adopted it because of the time it helps shave off the production process. And more than half of survey respondents (59%) said they planned to adopt it within the next 12 months.
This tech isn’t new. North Carolina-based Epic, a software developer and video game publisher, introduced its off-the-shelf Unreal Engine in the ’90s, and has since made a name for itself developing top-selling console games like Gears of War and Infinity Blade (though it’s now best-known for mega-hit franchise Fortnite). According to Epic, just 35% of its client base operates in the film and TV space, with the majority of Unreal users sitting firmly in the gaming world.
The slower convergence of video game tech and visual effects may be a bit surprising, considering both media often use the same or similar tools to build and animate CG assets. But free-to-download software like Unreal or competitor Unity have recently begun to pop up in animation and VFX work on series, commercials, features and shorts.
And as with most things in film and television lately, the rise in streaming platforms has further altered the technology’s course. SVODs created greater choice for kids and family audiences, which means animation companies need to produce more content faster to feed demand. With real-time rendering, studios don’t have to wait hours or sometimes even days for complex images to be rendered by graphics processing units in studio workstations or large render farms (clusters of networked computers that render an animated or VFX sequence in less time).
Directors, animators, riggers and lighters using game engines also have the ability to make changes (think camera angles, lighting, texture, actor movements or prop placements) to full-resolution shots in real time, much like on a live-action set. This can enable a freer creative and collaborative environment, and save time and money on fixing mistakes or making adjustments in post-production. The quality of real-time rendered images is also getting closer to what traditional renderers like Arnold and V-Ray can deliver, as real-time hardware, software and developer experience improve. The result is more mainstream adoption of the process.
On Disney+ series The Mandalorian (pictured, below), for example, Epic’s Unreal Engine was used to build responsive CG environments that were projected onto giant LED walls to give actors, directors and VFX teams live reference points on set. Beyond helping from a pre-visualization perspective—to scout locations and pick the best shots—the backgrounds were so realistic that many made it into the series as is.
For Cyber Group, the decision to invest in real-time rendering came last year, following successful test runs with the tech in its video games division and in an effort to find ways of telling stories better, faster and more cost-effectively. Once Cyber Group committed to Unreal for animation, the studio’s coding team created a new pipeline adapted to the company’s existing processes and threw some different pieces of technology like motion capture (MOCAP) into the mix. Cyber Group will be opening a real-time animation studio at its 2D- and tech-focused location in Roubaix, France later this year.
So far, the studio is piloting one real-time production, a 40 x 10-minute preschool series entitled Giganto Club. The series won’t be a direct spinoff of Gigantosaurus, since it will feature new characters and settings. The primetime talk show-style series will feature an original MOCAP, CG-animated dino host who engages with the audience through games, songs, news, featured guests and geographical segments. Sissmann expects production to begin in September and run through next February. For distribution, Cyber Group will target a rollout on YouTube and traditional broadcasters on a weekly schedule. Speed to market and frequency of new content delivery necessitated the quicker animation output, says Olivier Lelardoux, Cyber Group’s co-founder and SVP.
A second kids series and a preschool show made with both real-time and more traditional keyframe animation are also in the works.
While Cyber Group won’t reveal how much it’s spending on the studio and new productions, Sissmann says it’s a sizeable investment. But the price tag should be offset by the tech’s savings: if a regular animated series in Europe costs upwards of US$8 million, Cyber Group could cut 35% of the budget on a real-time production. “The savings will mostly be on the animation and post-production,” he says. “The tech is evolving so much, we figured we could explore new ground in creating better animated images in terms of emotion and movement.”
Lelardoux points to MOCAP as a prime example of that evolution. The technology was previously too expensive for most in-studio rendering work, so it was used primarily on features.
“With the previous generation of systems, the price was higher, and the infrastructure was heavier,” says Lelardoux. “The tech has totally changed. It’s lighter, cheaper, more accessible, and we can use it in the real-time process.”
Today, the company’s pipeline for real-time MOCAP (a maximum of two actors simultaneously, including face, full body and hands capture, plus live audio recording) costs around US$113,000, including the hardware and software.

But not everyone is ready to jump on the real-time bandwagon. While using real-time rendering can result in fewer post-production steps, JAM Media managing director Richard Gordon says the technology is still evolving and doesn’t eliminate the need for post-production.
“We did some feasibility testing last year, but for our new [BBC series] Tiara Jones, we felt more comfortable going with a traditional shoot,” says Gordon. “Though the [Unreal] technology is impressive, we felt the level of post that would still be required to achieve the look we wanted counteracted the potential savings.”
What’s more, there’s a talent crunch. As more companies get on board with the process, the industry now has to compete for skilled professionals. According to a 2019 report from labor market analytics firm Burning Glass Technologies, demand for Unreal Engine skills is growing faster than any other segment in real-time 3D, and those jobs pay the highest salary premiums in all of 3D graphics.
To meet the demand, Epic offers Unreal for free as a teaching tool in high schools. The company also offers free curriculum materials through its website, along with a weekly educational livestream every Friday.
Commercial companies can also pay for support from Epic, which includes in-depth project reviews, tips and tricks, and hands-on expertise at its London lab and at animation studios looking to adopt the tech.

There are also concerns around the scope of the technology’s capabilities.
“There’s a big barrier because people think we can only do photorealism,” says Ben Lumsden, Epic’s UK-based business development manager for Unreal Engine. “But we can do everything from MOCAP with puppets, to live-action stop-motion with LED virtual production backgrounds, to green screen virtual worlds and 2D projects like Blue Zoo’s new monochromatic short film Ada.”
Epic’s main competitor Unity, meanwhile, has been improving its own photorealistic capabilities. The company remains differentiated from Epic in its focus on the mobile gaming market, which is how UK stop-motion specialist Aardman got into the game.
“We’ve primarily been using Unity because we were developing a lot of mobile apps, but now we are working with both Unity and Unreal because they each give us different benefits depending on the project,” says Aardman’s executive creative director of interactive, Dan Efergan.
The company is rebuilding the CG trailer it launched in May for its Wallace and Gromit augmented reality game The Big Fix Up in a real-time engine. Aardman plans to build the trailer in Unity, and test parts of it in Unreal. The studio is also in the early stages of an untitled project that will involve building games and animation simultaneously.
“Using a real-time engine for that production process is a no-brainer,” says Efergan. “We want the universe to feel very similar in both, and the cost benefit of having a single pipeline is immense.” Efergan’s confidence in real-time being a necessary output for Aardman is so high, he says, that he would be surprised if the company doesn’t have a real-time pipeline by this time next year. “And I’m hoping it’s before that,” he adds.
For Cyber Group, Sissmann says he hopes to hire live-action directors to work alongside animation directors, and the company’s real-time studio could potentially create jobs for nearly 200 artists. The results have been a long time coming for the company.
“Back in 2007, we were very frustrated [at tech delays],” he says. “But now the tools are converging, and it is finally happening.”