At the Siggraph 2023 conference, Unity announced the launch of its Unity Wētā Tools division for film, real-time 3D animation, and game production.
This division aims to provide artists with accessible and trusted solutions for collaborative 2D and 3D content creation. At Siggraph, Unity Wētā Tools will showcase the latest advancements that empower creators to build captivating digital worlds, breathe life into hyper-realistic characters, and elevate film production techniques.
At Siggraph 2023, Unity Wētā Tools will deliver a presentation reaffirming Unity’s commitment to pushing the boundaries of digital production and fostering innovation and creativity.
Unity bought the tools division of Peter Jackson’s Wētā Digital in 2021 for $1.6 billion, and now the company is ready to showcase these tools as the high end of its product line, said Timoni West, vice president of product for Unity Wētā Tools, in an interview with GamesBeat. Now we’re seeing the outcome of that deal across Unity’s full product line, and the upcoming products from the acquisition will be demoed at Siggraph.
“We’ve been really just doing the work over the last year and a half, trying to figure out what best we can take from that acquisition. The tools are very successful. They’re used on hundreds of different movies and films today,” West said. “But figuring out what we can extract and start to really productize — from something already used by a lot of professionals — for our next service is what this is. The idea is to take this technology and figure out how it can be used for other studios, for games, for any kind of interactive content, is Unity’s bread and butter. The thesis here is to take what is used for high-end art and figure out how to make it accessible for everyone.”
Presenters include Allan Poore and Natalya Tatarchuk from Unity, Steve May from Pixar, and Joe Letteri of WētāFX. They will demonstrate how Unity Wētā Tools are currently utilized in award-winning films. And they will showcase how these tools accelerate creative workflows, facilitate seamless implementation of creative changes, and enhance the quality of final shots.
Poore, senior vice president of Unity Wētā Tools, expressed Unity’s dedication to research and development.
“We are excited to showcase the work we are doing with high-end 3D artist tools, and we are looking to bring new innovations to help creators be successful, unleashing creativity for 3D artists,” said Poore, in a statement.
Some of the tools will be used to make both games and films, West said. And they could also blur the lines.
“Like with Netflix’s Black Mirror: Bandersnatch [interactive film], I think there is a real opportunity there for filmmakers to experiment,” West said. “Unity has been deeply interested in AI since our first AI team started in 2018. We apply machine learning to help everyone make better experiences more quickly.”
West noted that game makers care deeply about “time to pixel,” or how fast a frame can be rendered, while filmmakers can spend millions of dollars on render farms just to make sure the pixels are perfect. Game developers care about real-time performance, while filmmakers care about making it look real. Still, the tools are converging.
Modern content is complex to create. CG worlds need to be built to match artistic direction, whether that’s lifelike or fully stylized. Live production methods increasingly leverage CG during live capture, broadening the scope of art departments on set. Both elements then need to be combined into a single image, often as part of a linear or waterfall process.
Unity Wētā Tools will demonstrate its latest character tools, enabling creators to match creative briefs in any design language for film, animation, and games. Two stunning examples, the real-time digital human Samir and the stylized creature Nova, will showcase exceptional biomechanics as part of their performances.
Collaborative tools like SyncSketch will also be highlighted, facilitating smoother collaboration and providing real-time and asynchronous feedback for faster iteration. Additionally, the introduction of Unity Wētā Tools’ Deep Compositing and Eddy aims to redefine post-production workflows, allowing creators to achieve unprecedented depth and detail in their productions.
Unity’s focus on time-saving tooling is aimed at accelerating content creation while reducing repetitive tasks, like making real-time wigs for digital humans. And Unity Wētā Tools for environment artists includes SpeedTree, which can be combined with real-time engines such as the Unity engine to create rich vegetation in real-time scenes during film and game production.
“The advances in technology have been pretty significant in the last decade, as far as I’m sure you’ve seen, like the hair in Disney’s Tangled film,” West said. “We have our own tech from Wētā Digital called Wig. We’re going to be releasing this as a product, a standalone plugin.”
Another upcoming tool is Deep Comp, short for deep compositing. Compositing is the process of combining all elements of a shot (live action and computer-generated) into a final image. Deep compositing turns the typical 2D compositing process into a 3D one: instead of just RGB, each pixel carries a depth channel storing spatial information.
Depth maps are used to take advantage of this data, applying the relative distances between objects (live action and CG) in a scene. Objects are therefore composited more accurately in 3D space, leading to better quality creative via a number of workflow benefits. The result can be scenes where the developers can magically de-age an actor, West said.
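The idea described above can be illustrated with a minimal sketch. This is not Unity’s or Wētā’s actual implementation (real pipelines use deep image formats such as OpenEXR’s deep data); the function name and sample layout here are hypothetical. Each pixel holds a list of depth-tagged color samples, and live-action and CG elements merge correctly by sorting samples by depth and applying the “over” operator front to back:

```python
# Illustrative sketch of deep compositing (hypothetical data layout, not the
# actual Unity Wētā Tools API). Each pixel stores a list of
# (depth, (r, g, b, a)) samples instead of a single flat RGBA value.

def composite_deep_pixel(samples):
    """Merge one pixel's depth samples front to back with the 'over' operator."""
    r = g = b = a = 0.0
    # Sort by depth so nearer samples are composited first and occlude
    # farther ones -- this is what lets live-action and CG elements from
    # different sources interleave correctly without rotoscoping.
    for _, (sr, sg, sb, sa) in sorted(samples, key=lambda s: s[0]):
        r += (1.0 - a) * sr * sa
        g += (1.0 - a) * sg * sa
        b += (1.0 - a) * sb * sa
        a += (1.0 - a) * sa
    return (r, g, b, a)

# A live-action car sample (depth 2.0, opaque) and a CG explosion sample
# (depth 5.0, semi-transparent) for the same pixel: the car occludes the
# explosion automatically because it is nearer.
car = (2.0, (0.2, 0.2, 0.8, 1.0))
explosion = (5.0, (1.0, 0.5, 0.0, 0.5))
pixel = composite_deep_pixel([explosion, car])  # car wins at this pixel
```

Because depth ordering is resolved per sample at merge time, swapping in a re-rendered element or moving it in depth only re-runs this merge step, rather than forcing a full re-render of the scene.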
Wētā Tools’ Deep Compositing nodes give compositors greater creative control of CG objects, resulting in more accurate shots with compelling effects that can be edited without re-rendering the entire scene, reducing costs. Scenes are compiled faster than with traditional methods because shots can be assembled as objects are made. Compositors can focus on creative decisions and isolate objects for editing or adding visual effects using non-destructive workflows, whilst applying complex effects that previously weren’t possible.
There are thousands of compositors working in film and game effects to make the final frame look as beautiful as it does; their job is pulling it all together.
“Deep compositing allows you to add in a third dimension to compositing. This allows you to create volumes that you can then use as cutouts if you’re creating a scene,” West said. “So for example, if you have a real car in a scene, you need to have an explosion around it, you can define the dimensionality of the car, though you don’t have to do the painstaking work or frame by frame going in and trying to fix each single frame as the camera moves around the park.”
And Eddy is a fluid simulation tool that lets compositors apply volumetric effects without dependencies on other artists in the production pipeline. This gives compositors more creative control of a shot with instant visual feedback, facilitating creative adjustments so shots are a closer match to creative direction. Eddy is compatible with Deep Compositing.
“Eddy allows you to do things that are not conventional 2D effects, like fire or wind or water. You give it three dimensionality to make it look more realistic,” West said. “This also means that you can push that VFX pipe further down the chain. That’s been used in multiple films already. So, all of these features are tried and true. They are just now being released to a wider public, and we’re starting that process over the next two quarters.”
West noted that some of the tools have been used in movies like the Oscar-nominated The Banshees of Inisherin to make landscape views seem less populated, fitting the film’s period setting.
“It’s rare to find a movie now that doesn’t have so much CGI work,” West said.
“Our focus has always been on helping filmmakers, and part of that process is extending what is possible in visual effects,” said Letteri, WētāFX senior VFX supervisor and five-time Academy Award winner, in a statement. “Deep compositing, Wig, and the integrated solver framework for Loki are all examples of how we took our experience with some of the largest and most complex projects and embedded the knowledge we acquired into the tools themselves.”
Natalya Tatarchuk, vice president of Unity Wētā Tools, said in a statement, “We’re thrilled to showcase groundbreaking work in character creation, environment building, rendering, and compositing. We’re working hard to make these powerful solutions available to everybody, accelerating creative workflows, enabling seamless implementation of creative changes, and delivering blockbuster-quality final shots.”
The sessions will provide insights into the creative challenges faced while building characters for Netflix’s The Sea Beast using Unity Wētā Tools’ Ziva VFX.
Unity is keeping a close eye on generative AI tech to help with game creation.
Across the industry, West said, “I would expect to see something that’s kind of an end-to-end solution, or a film or a game crafted based on that, within the next one to two years.”