The technology powering the latest video games has advanced rapidly, and it’s not just games developers that have taken notice.
The film, automotive, and architecture industries are just a few that have been integrating it into their workflows.
The perception of software such as Unreal Engine and Unity being solely for games development is shifting.
This is partially a result of changing licensing agreements that now put these tools in the hands of more people than ever before. Use of this industry-grade software is completely free for Unity creators with revenue under $100,000 in a calendar year or Unreal users whose products earn under $1 million over their lifetime.
Inevitably, this has resulted in individuals and companies outside of games development reappraising the software. Moves to improve access to these tools are part of a deliberate strategy by both Epic Games and Unity: not just to encourage more companies to invest in and use their software, but to expand the use cases for these creative tools beyond the gaming industry.
Both engine developers have added new features relevant to other businesses and in some cases are even supporting companies through grants and funding for projects using their technology. Epic Games recently announced its MegaGrants initiative would help fund the movie Gilgamesh, the latest feature-length production from Argentinian animation studio Hookup Animation, created in Unreal Engine. Meanwhile, Unity’s Unity for Humanity initiative funds impact-driven, real-time 3D projects built in the engine, such as Clean A/R.
As a result of growing toolsets, greater experimentation, and proactive work from engine developers, this software is increasingly becoming a cornerstone of workflows for companies in a variety of industries, from film to architecture to the automotive industry. The increased reliance on digital technology and the growing importance of artificial intelligence are only likely to increase the importance of these engines in the years to come.
These new use cases are making product visualisation and project development more efficient than ever, while opening up possibilities in certain industries that wouldn’t exist without games engines revolutionising traditional ways of working.
Big Screen Secrets
Take, for example, film and TV production, where the real-time rendering offered by the likes of Unreal has changed pre-production and pre-visualisation processes, and contributed to the growing sector of virtual production.
Virtual production is a term used to describe a variety of creative techniques that use the real-time nature of engines like Unreal to either help the production teams plan the direction of a scene before filmmaking begins, or to integrate visual effects into filmmaking in real time. Game engines can be used to render a set virtually to plan shots out in a way that helps both to speed up filming and to plan complex and effects-heavy scenes.
It can also be used to render large environments that may be projected onto LED screens in order to visualise the environment in real time while filming. In this latter situation, a virtual background can be rendered in-engine that moves in time with the scene and the camera – a technique used in Disney’s hit Star Wars series The Mandalorian. While the use of artificial sets and pre-visualisation isn’t new, the capabilities of engines like Unreal and Unity allow for visual effects studios to render more complex and artificial environments in real time, as opposed to later in post-production.
We spoke with Simon Jones, Director of Unreal Engine Enterprise, at our games frenzy in November.
Stargate Studios is one such company taking advantage of Unreal Engine to produce real-time, in-camera visual effects for on-set shooting and pre-visualisation. The company has worked in special effects on major movie and TV productions since 1989, when traditional paintings on film were the norm.
“The tools have changed radically, but the basic design philosophy of composite imaging has remained remarkably consistent,” said Sam Nicholson, founder and CEO of Stargate Studios. “The biggest technological change in the past few years is based on very fast GPU-based rendering and Unreal Engine democratising the software for real-time VFX work. Much of what used to take weeks or months to complete can now be accomplished in real time.”
A synergy between the needs of studios working in production and engine developers introducing new features based on industry needs also broadens the range of what is possible through virtual production. One example is nDisplay, which has become a central part of Stargate’s ThruView process for off-axis projection, correcting the perspective of a virtual scene to match a moving camera.
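Stargate’s ThruView implementation isn’t public, but the off-axis (asymmetric-frustum) projection it relies on is standard graphics math. As a rough illustration only, the sketch below, written in plain Python under simplifying assumptions (a flat screen lying in the z = 0 plane, a tracked eye at positive z looking down the -z axis), shows how the view frustum skews as the camera moves so that the projected scene stays perspective-correct for the viewer:

```python
def off_axis_projection(eye, screen_min, screen_max, near, far):
    """Build an OpenGL-style asymmetric-frustum projection matrix for a
    tracked eye in front of a flat screen lying in the z = 0 plane.

    eye        -- (x, y, z) viewer position, z > 0, looking down -z
    screen_min -- (x, y) of the screen's lower-left corner
    screen_max -- (x, y) of the screen's upper-right corner
    """
    ex, ey, ez = eye
    # Scale the screen edges (relative to the eye) onto the near plane.
    l = (screen_min[0] - ex) * near / ez
    r = (screen_max[0] - ex) * near / ez
    b = (screen_min[1] - ey) * near / ez
    t = (screen_max[1] - ey) * near / ez
    # Standard asymmetric (off-axis) frustum matrix, row-major.
    return [
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ]

# An eye centred on a 2x2 screen gives a symmetric frustum (zero skew term)...
centred = off_axis_projection((0.0, 0.0, 2.0), (-1.0, -1.0), (1.0, 1.0), 0.1, 100.0)
# ...while an off-centre eye skews the frustum to keep perspective correct.
shifted = off_axis_projection((0.5, 0.0, 2.0), (-1.0, -1.0), (1.0, 1.0), 0.1, 100.0)
print(centred[0][2], shifted[0][2])  # skew term: 0.0 centred, -0.5 shifted
```

In a production setup the screen (or LED wall) can be arbitrarily oriented, so the matrix is rebuilt every frame from camera-tracking data using the wall’s corner positions; the flat-screen case above is the simplest instance of that general formulation.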
“Having access to an efficient real-time tool is essential – particularly in the area of virtual blocking, pre-visualisation and VFX layout,” Nicholson continues. “Unreal Engine has also replaced some of our traditional software for rendering. Depending on the complexity of the finished effect, it provides an alternative approach which can greatly accelerate delivery of finished VFX and the entire production.”
Designing the Future
Unity has also aggressively broadened the capabilities of its engine, releasing new products and features to support companies in other industries. In architecture, the engine and tools such as Unity Reflect are used by companies like SHoP Architects. Unity Reflect enables creators in architecture, engineering and construction to transfer building information modeling (BIM) data into Unity. SHoP used this to create 3D renders while planning the 9 DeKalb residential tower in Brooklyn, New York.
Other examples include Unity MARS, which assists VR and AR workflows in building software responsive to physical space, and Unity Simulation, which helps test the spatial awareness of self-driving cars. Unity Forma, meanwhile, helps companies create interactive 3D marketing experiences.
According to Edward Martin, director of Verticals Product Management at Unity Technologies: “The products that we have launched over the last few years are a response to the new industries using Unity and real-time 3D, and Unity is impacting the ways companies work throughout the product development lifecycle.”
These use cases often involve the visualisation and product development side of the business, with Audi using the software to develop the e-Tron GT – the company’s first car designed without any physical prototypes.
In the animation industry, there is growing recognition of the capability of game engines to enhance current workflows and play a growing role in the future of animation. Short films such as WiNDUP and Disney’s Baymax Dreams were produced within Unity; to varying extents, each handled modeling, layout, animation, lighting, VFX and rendering all within the engine.
Martin adds: “WiNDUP uses out-of-the-box Unity capabilities, built on a highly typical animation pipeline, where only the shaders are custom and everything else is out of the box. Along with Baymax Dreams, it showcases what is possible with real-time animation production today, not in some customised, dream vision of the future.”
One firm interested in how games engines can expand its capabilities as an independent animation studio is E.D. Films, which has used both Unreal Engine and Unity to varying degrees on projects such as Three Trees and Giant Bear. Alongside this, the team has developed publicly available tools such as Scene Track, which makes it easier to export animations created within Unity into other software for external use.
Admittedly, the use of these engines in these mediums has not been entirely smooth. Despite their long-standing potential for animation, early attempts to take advantage of the software stumbled before the engines were tuned to the needs of the industry. Stylised visuals were difficult to render as effectively in games engines as with traditional animation methods. As a result, the technology was initially most effective in the early stages of production, as a way to pre-visualise and develop a shot. But this has changed.
“The companies have become more open to change within their engines to support the specific requests and demands of the animation industry,” says E.D. Films CEO Emily Paige. “One of the issues we had at first is that some of the capabilities of the engines were well-suited to realism but not the artistic approach taken towards animation.”
Of course, one advantage of using game engines and their automation features is the cost saving compared to expensive traditional methods. On paper this is appealing, but there are practical issues.
“Some of the value prospects are outweighed by the cost of getting a senior, experienced AAA animator in to work on it,” says Paige. “Especially on the smaller side and indie animation, the salaries to bring in an experienced animator can come to over half or even all of the budget on such a production. This is something we’ve addressed by hiring and training animators in-house in our workflows with the engine.”
Thanks to ongoing updates, what was impossible within these engines just a few years ago can now be completed easily and effectively. Attempts to recreate the introduction of Return to Hairy Hill in Unreal Engine for an early animation test in 2018 floundered due to the lack of the necessary tools needed to recreate the look of the film. But if such a production started today, the results would be very different.
“Over time, these engines have evolved with updates to blueprints and shaders, while other aspects of the in-engine workflow have been changed to be more intuitive and friendly if you were coming from other software,” Paige says. “If we worked on that Return to Hairy Hill render now and ran into many of the issues we came across at the time, it would be much easier to find solutions to them and make it work.”
Both the companies behind these engines and those using their technology in other industries agree there is boundless potential for games engines to change how businesses in many sectors approach development and production.
“Companies and individuals are still in the infancy of what’s possible when applying real-time 3D to their industry,” Unity’s Martin concluded. “We feel that the entire ecosystem of real-time 3D experiences can benefit innovation and further development into areas we never imagined.”
These products are only likely to become more important in the realm of virtual production and the film and TV industries.
“Unreal Engine is undoubtedly becoming more capable very quickly,” pointed out Stargate’s Nicholson. “Combined with the new capabilities of Unreal Engine 5, this community will develop new applications which go well beyond what we are able to accomplish today.”
Technology as it exists today already allows for CG sets, virtual locations and synthetic actors, decentralising production so that teams can work remotely and assets can exist in multiple places at once. This decentralisation could grow further still.
“Greater photorealism and modeling detail in Unreal Engine 5 and beyond will gradually replace ‘real world’ photography of sets, locations and actors,” predicts Nicholson. “This will not be an overnight transition, but we are already seeing – or not seeing – invisible set extensions, virtual stunts, avatar extras, photoreal humans and beyond.”
In animation, this growing embrace of gaming engines is already happening. According to Paige, game engines have “grown in visibility and viability [within the animation industry]” on an international level.
“I often see studios in places like Europe and South Africa and India working on these products. Not only that, we’re beginning to see the use of these engines discussed as part of the pipeline for animation projects from an early stage.”
As much as the public, and even the games industry itself, can narrow the potential of these products to strictly gaming applications, businesses from architecture firms to animation studios are already disproving this preconception with real-world applications of the technology.
As the potential for real-time technology grows, with the support of engine creators themselves, these use cases will likely only further expand in the future.