Oddworld’s ‘Real’ Reel World

Ellen Wolff talks with Oddworld Inhabitants' Lorne Lanning as he considers the realtime CG technologies that will change gaming along with movies and TV.

Oddworld Inhabitants again pushes the boundaries of CG with its forthcoming game, Stranger. All images unless otherwise noted © Oddworld Inhabitants.

Attendees at SIGGRAPH 2004 saw among the festival's selections a game cinematic from Stranger, the Oddworld Inhabitants videogame that's slated for release next spring by Electronic Arts. The clip features an exotic bounty hunter on a mission that makes the classic "wild west" genre look tame.

Oddworld has been favored by SIGGRAPH juries before, earning notice for cinematics from an award-winning game catalog that includes Abe's Oddysee, Abe's Exoddus and Munch's Oddysee. What SIGGRAPH audiences are glimpsing of Stranger may look like classic Oddworld: offbeat characters presented in high-quality Maya animation, with the distinctive look that company president and creative director Lorne Lanning once described as "Muppets meets The X-Files."

But this year's Stranger cinematic signals the start of a sea change in the way that the 10-year-old Oddworld is creating CGI. Many of the clip's richly textured elements were actually created as realtime game elements, unlike the pre-rendered approach that has typified most game cinematic production. Using a proprietary technology dubbed The Strange Engine, Oddworld is pursuing a strategy that Lanning thinks will make the process of creating CGI more like live-action production than keyframe animation.

What's significant about Lanning's view is not simply that realtime animation technology is enabling people in game production to work more cost-effectively by reducing the amount of time-intensive software rendering. Lanning thinks that realtime and A.I.-driven tools will have an influence beyond games, changing how CGI will be produced for movies and television as well.

The Backstory

"We've been looking at the realtime capabilities in the engine that we've built for Stranger," says Lanning. "We first wrote this for the Xbox (Microsoft published Oddworld's Munch's Oddysee). So it's still capped by 64MB of RAM and the processing power that's in the Xbox, which is basically an under-$200 commercial technology. But taking that into account, we're still looking to get performances out of a realtime engine, which is less like the way computer animation directors get performances out of CG characters and a little closer to the way live-action directors get performances on a set."

"This is the big boundary that is starting to crumble. We could never think of CG as a live-action set before. In conventional CG production, a director could never say, 'Cut! Don't WALK over there, RUN!' A live-action director can say, 'Run over there and on the way be who you are.' The actor knows how to navigate his world, so if he's running uphill, for example, he'll be running a little slower."

Even in earlier Abe titles, Oddworld was creating cutting-edge 3D work.

The Role of Artificial Intelligence

In the world Lanning envisions, CG characters will possess navigational skills as well. The proliferation of A.I.-driven technology is key to Lanning's view of realtime CG production. "In realtime, you're thinking of characters more as autonomous actors. You think, 'What is this character's personality? What are his abilities?' That gives you a walk cycle and a run cycle. Then you ask, 'What can he do in this world?' He can throw punches and climb ropes and jump off buildings. So then you start building the artificial intelligence, which we call the character motion code. We also build all the navigation information about the world that the character inhabits. We're not thinking that this is a guy who shows up in a particular shot; this is a guy who LIVES in that world."
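To make that idea concrete, here is a minimal sketch of what "character motion code" might look like: an actor that carries its own abilities and defers navigation to the world it inhabits. All class and method names are hypothetical illustrations, not Oddworld's actual engine code.

```python
# Hypothetical sketch of AI-driven "character motion code": a character
# carries its own abilities and navigation logic, so a director issues
# goals rather than keyframes. Names are illustrative, not Oddworld's API.

class World:
    """Holds navigation data so actors can navigate on their own."""
    def grade_between(self, a, b):
        return 0.2  # stub: look up terrain slope between two points

    def move(self, character, target, speed):
        character.position = target
        print(f"{character.name} arrives at {target} moving {speed:.1f} u/s")


class Character:
    def __init__(self, name, walk_speed, run_speed):
        self.name = name
        self.walk_speed = walk_speed   # units per second on flat ground
        self.run_speed = run_speed
        self.position = (0.0, 0.0)

    def slope_factor(self, world, target):
        """Actors 'know' their world: running uphill slows them down."""
        grade = world.grade_between(self.position, target)  # 0.2 = 20% uphill
        return max(0.5, 1.0 - grade)  # crude slowdown, clamped

    def run_to(self, world, target):
        speed = self.run_speed * self.slope_factor(world, target)
        world.move(self, target, speed)  # pathfinding lives in the world


stranger = Character("Stranger", walk_speed=2.0, run_speed=6.0)
world = World()
stranger.run_to(world, (10.0, 4.0))  # "Run over there"; the actor handles the rest
```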

Oddworld uses a linear performance editor to execute this approach. "It's an editing interface that we created that interfaces to Maya and a game engine at the same time. It allows us to control the performance of the characters through a very simple interface that enables us to tell a character: 'Run over there,' and the camera will truck over the necessary frames. It's like an Avid interface, but instead of moving clips of video around, we're moving modules of A.I. commands around. Watching it play back is like watching an edited sequence."

"After you see a sequence, you might say, 'Oh, no, we need to have that character arrive a second sooner.' You can have the system 're-perform' that sequence in just a few seconds. The actors re-hit their marks. You're now polishing iterations by watching the entire sequence play back in realtime."
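One way to picture such a performance editor, as a rough sketch: a sequence is an ordered list of A.I. command modules, and "re-performing" simply replays them after an edit. The names below are invented for illustration, not the actual Oddworld tool.

```python
# Minimal sketch of a "linear performance editor": a timeline of AI command
# modules instead of video clips. Re-performing a sequence just replays the
# commands, so an edit (arrive a second sooner) takes seconds, not a re-render.
# All names are hypothetical.

from dataclasses import dataclass

@dataclass
class Command:
    actor: str
    action: str       # e.g. "run_to", "throw_punch", "climb_rope"
    duration: float   # seconds the module occupies on the timeline

def perform(sequence):
    """Play the sequence back by executing each AI module in order."""
    t = 0.0
    for cmd in sequence:
        print(f"t={t:5.1f}s  {cmd.actor}: {cmd.action} ({cmd.duration}s)")
        t += cmd.duration

sequence = [
    Command("Stranger", "run_to(saloon)", 4.0),
    Command("Stranger", "draw_crossbow", 1.0),
    Command("Outlaw",   "flee(alley)",   3.0),
]

perform(sequence)               # watch the edited sequence play back
sequence[0].duration -= 1.0     # "have that character arrive a second sooner"
perform(sequence)               # re-perform: the actors re-hit their marks
```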

Unreal Capabilities

Lanning acknowledges that CGI rendered in realtime still has limitations. "Where realtime production is still challenged is that it doesn't do anti-aliasing that well. No one has really nailed that down. So instead, it's like the early days of computer graphics, when they didn't have reasonable anti-aliasing algorithms: on The Last Starfighter (1984) they rendered at 8K. That's kind of where the realtime world is. But there are images that are playing at greater-than-HD quality in realtime."
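The workaround Lanning describes, rendering far above the target resolution and averaging down, is classic supersampling. A toy NumPy sketch of the idea (not any particular renderer's implementation):

```python
# Supersampling in miniature: render at N x the target resolution, then
# average each NxN block down to one pixel. This is the brute-force
# anti-aliasing alluded to above (e.g. rendering at 8K for a lower-res
# final frame), not a realtime technique.

import numpy as np

def downsample(image, factor):
    """Average factor x factor blocks of an (H, W) grayscale image."""
    h, w = image.shape
    return image.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Toy "render" at 4x resolution: a hard diagonal edge full of jaggies.
hi_res = np.fromfunction(lambda y, x: (x > y).astype(float), (32, 32))
lo_res = downsample(hi_res, 4)  # 8x8 result with smoothed edge values

print(lo_res)  # edge pixels land between 0 and 1: the aliasing is averaged out
```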

Unreal Engine 3 is being utilized on such big games as Star Wars: Knights of the Old Republic. © LucasArts Ent.

A technology that Lanning sees as emblematic of realtime production's future is Epic Games' Unreal Engine 3, a tool for creating realtime interactive worlds for the newest generation of consoles and PCs. Unreal turned heads at E3 this year, and Epic partnered with Alias and others to present Unreal classes at SIGGRAPH, too. The Unreal 3 engine has already been embraced by the creators of the next Star Wars and Men of Valor games.

Lanning believes Unreal is, "hands-down, the best realtime rendering we've ever seen. It looks like you're walking through the streets of old Venice. What it's doing is normal mapping that allows a 7,000-polygon character to actually look almost identical to a 5 million-polygon character. The thing that the hardware manufacturers in the videogame business don't quite understand yet is that mapping and memory are more important than polygons; they think that polygons are more important than texture mapping, and they're wrong. The people in this medium who are artistic understand that mapping can create the illusion of billions more polygons."
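Normal mapping, in outline, shades each pixel with a normal fetched from a texture instead of the geometry's own normal, so flat polygons light up as if they were finely sculpted. A schematic Python version of the shading step, assuming a single texel and light (real engines do this per-fragment on the GPU, in tangent space):

```python
# Normal mapping in outline: a low-poly surface is lit per pixel using a
# normal read from a texture rather than the geometry's own normal, so a
# few thousand polygons can shade like millions. Schematic only.

import numpy as np

def decode_normal(rgb):
    """Normal maps store unit vectors remapped to [0, 255] per channel."""
    n = np.array(rgb, dtype=float) / 255.0 * 2.0 - 1.0
    return n / np.linalg.norm(n)

def lambert(normal, light_dir):
    """Diffuse intensity: clamped dot product of normal and light."""
    l = np.array(light_dir, dtype=float)
    l = l / np.linalg.norm(l)
    return max(0.0, float(np.dot(normal, l)))

flat_normal = np.array([0.0, 0.0, 1.0])  # geometric normal of a flat face
bumpy_texel = (180, 130, 240)            # one sample from a normal-map texture
light = (0.3, 0.4, 0.85)

# Same flat geometry, two very different shading results:
print(f"flat geometry: {lambert(flat_normal, light):.3f}")
print(f"normal-mapped: {lambert(decode_normal(bumpy_texel), light):.3f}")
```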

The Efficiency Factor

Lanning thinks it's just a matter of time before technology such as this is used to generate linear as well as interactive game CG, for reasons of efficiency if nothing else. "The world of pre-rendered cinematics in videogames will keep getting smaller, while the world of cinematics created with realtime game engines will get bigger. Eventually, in games we'll probably be looking at virtually all in-engine work, whether it's a cinematic that you can interact with or an interactive situation that feels much more cinematic."

"In the past, our games had, let's say, 24 minutes of computer animation. Well, it is too expensive today to do that pre-rendered. So we're only using pre-rendered CG in areas where realtime can't achieve what we want, primarily in the emotional arc of the hero character. In Stranger, we broke that down to about six different clips: the opening, the ending and a few in between. We thought, 'What if the game engine was capable of delivering parts of the story that we used to do in pre-rendered?'"

While Stranger is still a work-in-progress, and the answer to this question is still evolving, Lanning is pleased with what has been achieved thus far with realtime techniques. "We used two completely different sets of assets: ones that we would use for pre-rendered, with greater polygon counts, greater facial expression capability, more bones in the body, all that. And another set of assets used for realtime that didn't have as many bones, not as much facial capability and not as many high-resolution texture maps. We're trying to run 50 of those characters in a world at one time. So we said, 'Let's try some experiments. On some of the ones that we want pre-rendered, let's bring in the set that was built for the game.' It's all Maya format, so we could bring that in and we wouldn't have to build a second set of assets, like we've always done in the past. That worked out amazingly well. So we thought, 'Let's bring the characters in and do the same thing.' We were able to think of the shots in a very pre-rendered way, but we brought in the realtime assets that were built for the game and it worked just fine."

The Bigger Picture

In the wider world of linear CG production, the impetus might not yet be there to embrace more realtime strategies. The time-intensive process of software rendering remains feasible for many studios, especially those that are able to throw greater numbers of inexpensive processors at number-crunching challenges. Lanning observes, "The pre-rendered film world has not been very concerned about efficiency. You can tell that by the speed of Maya and RenderMan: if something renders in a reasonable amount of time, then it's fine."

Oddworld Inhabitants president and creative director Lorne Lanning.

Yet hardware rendering technology has advanced tremendously, with companies like NVIDIA and ATI leading the way. Those advancements have been utilized by the game industry, which is in the battle for efficiency every day. That has produced approaches to computer graphics different from the traditional pre-rendered world's, like using A.I. so that characters know how to navigate their worlds on their own instead of having to be keyframed for each shot. "We see that technology evolving in interesting ways."

The Machinima Movement

One of the most interesting developments Lanning cites is Machinima, which he characterizes as a kind of virtual filmmaking environment. "Machinima is not so much a technology as a genre of game engines producing linear footage. I'm sure that, just like with virtual reality, there will be people claiming that they own it. However, no one does."

To illustrate how Machinima might apply to his company, Lanning offers, "Let's say Oddworld wanted to do a 90-minute, direct-to-DVD movie. If we wanted to do it with pre-rendered CG, we'd probably be looking at a $30 million budget, even with very aggressive economics. If instead we went the Machinima route, and ported it to the PC and took advantage of 2GB of RAM for texture mapping instead of 64MB, then we could do 90 minutes for $6 million. And what would come out of that would be far more epic than anyone would expect. We could generate enough quality for HD."
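The per-minute arithmetic behind that comparison, using the figures Lanning cites:

```python
# Cost per finished minute for a 90-minute piece, per Lanning's figures.
minutes = 90
pre_rendered = 30_000_000 / minutes   # ~$333,000 per minute
machinima    =  6_000_000 / minutes   # ~$67,000 per minute
print(f"pre-rendered: ${pre_rendered:,.0f}/min  Machinima: ${machinima:,.0f}/min")
```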

"This also lends itself to producing a series. If the first 90-minute piece using Machinima cost us $6 million, the second one becomes a serial. It's like having the sets built for a TV show. And the sound cues are in the sampler for the audio guys. We know what this show is, and now we're running episode after episode. For the first one you're paying to build all the databases. The second one is derivative. In 10 years, it will be the same database, except it will be realtime, and it will be used for film." (More information about Machinima can be found at www.Machinima.org.)

The Creative Possibilities

Lanning thinks that "what's interesting about the Machinima approach is that once those databases are in that world, a potential director can sit with someone who's interfacing the system and prove if they've got it or not. And they can prove it in a day, not in a month or three months. If a director has it in him to know where to put the camera and how to time the action and build tension, he can do it with the databases that exist."

With increased detail and Machinima technology, the potential of gaming and cinema is unlimited.

Lanning, who previously worked at Rhythm & Hues before launching Oddworld with R&H executive Sherry McKenna, thinks that the impact of working more interactively will be profound. "To be a computer animation director takes a lot of understanding. Typically, in the history of CG, being a director of the medium has taken precedence over just being a great director, because the medium is so challenging. There's so much you have to know in order to direct it well."

CG directors also have been constrained thus far by the high costs involved in rendering and re-rendering, when necessary, in conventional CG production. "I've seen directors say, 'I just want the character to lean over for another second so that we feel the tension!' And the producer replies: 'You want another second? That's another week. Make your choice: do you want another second on this scene, or do you want that bathroom scene?'"

Lanning hopes that the development of realtime animation tools will help liberate artists to think more about story and character and less about technology. "I'm excited about getting into what I feel is the new emerging virtual film culture. The more we head to the future, the more we'll be thinking about integrating realtime and pre-rendered assets. We'll have the tools to get performances out of characters fluidly in realtime. Of course, in the world of linear entertainment, no one cares how you've made your imagery; it had just better be great!"

Ellen Wolff is a Southern California-based writer whose articles have appeared in publications such as Daily Variety, Millimeter, Animation Magazine, Video Systems and the Website CreativePlanet.com. Her areas of special interest are computer animation and digital visual effects.
