Thomas J. McLean takes a peek under the hood of the hot new cinematics for Halo 3, Mass Effect, Hellgate: London and Tabula Rasa.
As videogames have become more cinematic, the need for movie-quality visuals has grown, particularly in the cut scenes and short sequences between gameplay known as cinematics.
Producing cinematics is not dissimilar from 3D animation and vfx work for film and TV, though the specifics of the game industry and the needs of particular games have a tremendous impact on how these short spots are made.
Here, VFXWorld takes a look at the very different circumstances and techniques used in the creation of cinematics for a few of the current top games.
Of course, no game has made a bigger splash this fall than Halo 3. The previous games helped establish the Xbox as a major gaming platform and raised the bar on cinematics. The third entry in the series had to deliver big time, and that required some special work on the cinematics side.
C.J. Cowan, cinematics director for Bungie Studios, says the entire cinematic pipeline was rebuilt for Halo 3. "In Halo 2, we had to write a script many pages long for each cinematic calling specific animations on characters at certain times, creating and destroying objects and effects, calling out start times for dialog and music... basically everything that happened in the scene," Cowan adds.
Most of that information already existed in the Maya files, and the data that wasn't there could easily be added. "Our tools engineers created an export process out of Maya that basically auto-generated the script our engine needed to run the cinematic," he says. "Because we didn't have to hand author it anymore, it allowed us to iterate much more quickly and make changes without worrying about how it would affect the timing of the script."
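Bungie hasn't published its exporter, but the idea Cowan describes can be sketched in a few lines of Maya's Python API: walk the keyframed objects in the scene and emit a timed event list the engine can play back. The output format below is invented for illustration; only the general approach comes from Cowan's description.

```python
# A minimal sketch of auto-generating a cinematic script from a Maya scene.
# Uses Maya's standard Python API (maya.cmds); the JSON event format is
# hypothetical -- Bungie's actual exporter and engine format are not public.
import json
import maya.cmds as cmds

def export_cinematic_script(path):
    events = []
    # Every keyframed transform in the scene becomes a timed event,
    # so artists never hand-author start and end times in a script.
    for node in cmds.ls(type="transform"):
        key_times = cmds.keyframe(node, query=True, timeChange=True)
        if not key_times:
            continue
        events.append({
            "object": node,
            "start_frame": min(key_times),
            "end_frame": max(key_times),
        })
    events.sort(key=lambda e: e["start_frame"])
    with open(path, "w") as f:
        json.dump({"events": events}, f, indent=2)

export_cinematic_script("cinematic_010.json")
```

Because the script is regenerated on every export, a timing change made in Maya never has to be reconciled by hand, which is the iteration win Cowan describes.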
The game ended up with 30 cinematics running about 50 minutes. Cowan says the cinematics team at Bungie did all the layout and rough animation, sending animation houses a Maya file with all the camera work, timing and blocking done. "This worked particularly well because we had over 40 animators all over the country (and overseas), and they were able to open a new scene and have very few questions about where I wanted them to take it."
Bungie, meanwhile, used three houses to animate the scenes: DamnFX in Montreal; Zoic Studios, a vfx house that handles such TV shows as Battlestar Galactica; and Rare Studios, a fellow Microsoft Game Studio that lent some animators to the project at key points.
Cowan says that while the houses' job was essentially to polish the animation in the layouts, they all had creative input as well. "All of the animators had quite a bit of latitude in the 'acting' of our characters, which worked out well because the individual animators were really free to concentrate on subtle details and character bits that really brought our cinematics to life."
The houses returned their work as Maya files to Bungie, which handled lighting and effects in-house and incorporated the results into the game engine.
Cowan says internal layout took place over two years, while the animation was done in a tight five-month timeframe.
One of the challenges facing developers of RPGs is how to make the often-extensive conversations between characters look cinematic.
For the Xbox 360 game Mass Effect, developer BioWare went back to study the basics of cinema and applied what they learned through new tools that let them automate many reactions.
"One of the achievements of cinematic design was creating systems that gave us a good head start in creating content, such as the digital acting and the camera work," says Ken Thain, lead cinematics designer. "We had cinematography tools that allowed us to add depth of field to camera shots automatically, and that would allow facial expressions on the digital character just by choosing an emotion."
While Mass Effect was built on top of Unreal Engine 3, which BioWare licensed from Epic, enough customization and proprietary tools were added to make the engine essentially BioWare's own.
The tools made it possible to place the camera for a close-up or an establishing shot "just by assigning a number to a line," he says. "We were able to generate a lot of high quality content in a short amount of time."
That was important with more than 20,000 lines of dialog written and recorded for the game. Still, it was also important to have the cinematics match the story and emotions of the characters within it. Thus, the tools also allowed for an easy override of the auto-generated material.
"If I'd created a digital acting performance where I just felt that line or that moment needed a bit more attention, I could jump in and modify it and up the dramatic level of he performance," he says.
Between 2,500 and 5,000 of the game's 20,000 lines required some extra attention, Thain offers. "We'd go in and tweak the performance a little bit, either through a facial expression or a body gesture or camera move to bring out the cinematic elements."
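BioWare's tools are proprietary, but the workflow Thain describes (tag each dialog line with a shot number and an emotion, let the system stage it automatically, then override by hand where a moment needs more attention) reduces to a simple data-driven pattern. Everything in the sketch below, from the shot codes to the field names, is invented for illustration.

```python
# Hypothetical sketch of data-driven staging for dialog lines, in the spirit
# of the system Thain describes. Shot codes and emotion tags are invented.
from dataclasses import dataclass, field

SHOT_CODES = {
    1: "close_up",
    2: "medium",
    3: "establishing",
}

@dataclass
class DialogLine:
    speaker: str
    text: str
    shot_code: int = 2            # "assigning a number to a line"
    emotion: str = "neutral"      # drives the automatic facial expression
    overrides: dict = field(default_factory=dict)  # hand-tuned tweaks

def stage_line(line: DialogLine) -> dict:
    # Auto-generated staging: camera from the shot code, face from the emotion.
    staging = {
        "camera": SHOT_CODES[line.shot_code],
        "facial_expression": line.emotion,
        "depth_of_field": line.shot_code == 1,  # shallow focus on close-ups
    }
    # A designer can override any auto-generated value for a key moment.
    staging.update(line.overrides)
    return staging

line = DialogLine("Shepard", "We fight or we die.", shot_code=1,
                  emotion="determined",
                  overrides={"camera": "slow_push_in"})
print(stage_line(line))
```

With this shape of data, 20,000 lines can be staged automatically while the few thousand that matter most get the hand-authored override Thain describes.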
Thain says one of the goals of the game was to elevate the cinematic element for an RPG. "The history of cinema within games is rocky at best," he continues. "There's been some real glowing examples and some titles that have failed to meet that goal. But we really sat down and said, 'OK, what is it about cinema that keeps people interested? What's the grammar of film that has people understand the message and the stories?'"
After studying films by the likes of John Ford and Alfred Hitchcock, they introduced more film techniques into the game, including the use of depth of field, motion blur and trying to match the types of lenses cinematographers would use for similar shots. "It was really about raising the bar for cinema within the game to a level that the average person playing, the casual gamer, would relate to it a lot stronger," says Thain.
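The optics being imitated here are standard. For a given focal length, f-stop and focus distance, the band that stays sharp follows from the hyperfocal distance, which is why a long portrait lens throws the background out of focus while a wide establishing lens keeps everything crisp. The function below is textbook thin-lens arithmetic offered purely as illustration, not BioWare's code.

```python
# Standard depth-of-field arithmetic, illustrating the film grammar the
# Mass Effect team was emulating; textbook optics, not BioWare's code.
def depth_of_field(focal_mm, f_stop, subject_mm, coc_mm=0.03):
    # Hyperfocal distance: focus here and everything to infinity is sharp.
    hyperfocal = focal_mm ** 2 / (f_stop * coc_mm) + focal_mm
    near = hyperfocal * subject_mm / (hyperfocal + (subject_mm - focal_mm))
    if subject_mm >= hyperfocal:
        return near, float("inf")
    far = hyperfocal * subject_mm / (hyperfocal - (subject_mm - focal_mm))
    return near, far

# An 85mm lens at f/2 focused 3m away keeps only ~15cm around the subject
# sharp -- the shallow, cinematic look of a close-up.
near, far = depth_of_field(focal_mm=85, f_stop=2.0, subject_mm=3000)
print(f"in focus from {near / 1000:.2f}m to {far / 1000:.2f}m")
```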
Planning all this out required close collaboration with the game's developers and writers -- a task made easier by their all being in-house at BioWare's offices in Edmonton, Alberta.
"If I needed to talk to a writer he was just down the hall," Thain says. "That level of accessibility within the team was really strong."
From a production standpoint, Kevin Margo, CG supervisor for Blur Studios, says the PC game Hellgate: London was an unusual project because of the number of assets required and the lengthy timeline.
"It was this three-year undertaking," he says. "It was always this project that was kind of running and at various times it was at full swing and at other times on the back burner. Usually, it's three, four months, do it, done and see ya."
Margo says Flagship Studios was a new company when it contacted Blur to create a game trailer that could be shown at E3 to generate buzz for the products it would eventually release. The game itself was still largely undefined, and Blur had an unusually free hand in creating concepts, characters and the look of things.
"They were very open to our concept guys doing sketches and paintings of characters and environments and us saying, 'What do you think of this?' And they'd be like: 'Cool! We won't be modeling that character for the game for another six months, but you guys go ahead, you make it for the cinematic you're doing and we like it enough that we'll just mimic what you guys do when it comes time for us to make the in-game model,'" Margo says.
Work on the project came in cycles, with each cycle having a more typical production timeline of three to four months. Blur had a core of eight to 10 artists on the project, augmenting them as needed. Margo estimates that easily 50 to 60 artists had a hand in the project.
Flagship provided no storyboards, so it was up to Blur whether to develop its own storyboards or work out scenes in the layout phase, which is easier for conversation scenes. Margo says one artist did the entire layout to keep it consistent, before sending scenes off to an animation supervisor who made sure everything got properly rigged and animated.
Margo says they used a stacked pipeline that handles modeling, rigging and animation simultaneously. For example, a model that's maybe 90% done will be handed off to a rigger, who creates a rough setup so the animator can get started. As each element is refined, the updates are passed on to the others until the process is complete.
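The handoff logic of such a stacked pipeline can be sketched simply: each asset tracks per-stage progress, and a downstream stage gets the green light once the upstream stage crosses a threshold rather than waiting for 100%. The stage names and the 90% threshold come from Margo's example; everything else below is an invented illustration.

```python
# Hypothetical sketch of "stacked" scheduling: downstream stages begin once
# the upstream stage is merely good enough, then re-sync as updates land.
STAGES = ["modeling", "rigging", "animation"]
START_THRESHOLD = 0.9  # Margo's example: a model ~90% done goes to rigging

class Asset:
    def __init__(self, name):
        self.name = name
        self.progress = {stage: 0.0 for stage in STAGES}

    def can_start(self, stage):
        i = STAGES.index(stage)
        # The first stage can always start; each later stage waits only
        # until the stage before it is roughly complete.
        return i == 0 or self.progress[STAGES[i - 1]] >= START_THRESHOLD

    def update(self, stage, progress):
        self.progress[stage] = progress
        # In a real pipeline this would notify downstream artists that a
        # refreshed version of the upstream work is available to pull in.
        for later in STAGES[STAGES.index(stage) + 1:]:
            if self.can_start(later):
                print(f"{self.name}: {later} can proceed ({stage} at {progress:.0%})")

hero = Asset("templar_model")
hero.update("modeling", 0.90)  # rigging gets a rough setup to start from
hero.update("rigging", 0.95)   # animation begins against the rough rig
```

The trade-off is churn: downstream artists absorb upstream revisions as they land, in exchange for never sitting idle waiting on a finished asset.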
On Hellgate: London, they used 3ds Max for all the character modeling, environment modeling, rigging and animation. Brazil was the renderer and compositing was done in Fusion.
The project ended up requiring a large number of assets. Counting background characters, Blur created 60 to 70 characters and 10 high-resolution environments for what ended up being between eight and 10 minutes of cinematics.
As the project progressed, Flagship's requests became increasingly specific. "They had all their characters fleshed out, they knew exactly what they wanted and they were trying to wrap up the game."
Cinematics also have to match up with how the game itself looks, which can sometimes be a creative challenge. "It's kind of hard when they say, 'It has to look like this,' but we're capable of doing more," Margo suggests.
While Hellgate: London offered a long schedule and creative collaboration, Blur's work on another PC game, Tabula Rasa, was much more typical. Margo says they were given scripts and storyboards, with a production schedule of three to four months.
"Having stuff that they know they want is in a sense better because you can do it and it's done and there's no question about it," Margo explains.
Blur created 21 shots for a running time of just under two minutes. The challenge for Blur on this project was the number of assets required for those shots, and making sure each was completed on time and moved along to the next stage of the process as soon as possible.
On the tech side, Tabula Rasa was a transitional project for the company as it experimented with upgrading its rendering software.
With Brazil 1 beginning to show its age, Margo says Brazil 2 was in pre-release at the time, with many bugs and features still being worked out. He used mental ray for much of the environment work and the two versions of Brazil for different character needs. For example, Brazil 2 was used for all the characters without hair because, at the time, it couldn't yet handle hair effects; characters with hair were rendered in Brazil 1.
"That was the biggest challenge," he says, "trying to juggle these three renderers and make everything cohesive looking."
Thomas J. McLean is a freelance journalist whose articles have appeared in Variety, Below the Line, Animation Magazine and Publishers Weekly. He writes a comic book blog for Variety.com called Bags and Boards, and is the author of Mutant Cinema: The X-Men Trilogy from Comics to Screen, forthcoming from Sequart.com Books.