Peter Plantec looks at how realtime technology has advanced the digital entertainment industry.
Back in the early days, creating 3D scenes was a tough gig. 3D programs like POV-Ray didn't even have user interfaces. You'd write little scene-description scripts and hope for the best. After five to seven hours of rendering on your desktop, you'd get to see what your 640x480 image looked like. Typically it consisted of mirrored balls on an infinite plane and a marble column or two: nice, but often not what you had in mind. Weeks might be invested in getting a single frame right. Later came GUIs, easing the task of building, texturing and animating geometry. Yet you still had to work with confusing static wireframe views with no back-face culling or shading. The workflow was awkward and point selection was pretty much hit or miss. But progress from there has been continuous and rapid.
In the early '80s, it took the world's fastest multi-million-dollar supercomputers many days to render relatively short 3D sequences. Remember The Last Starfighter, that little gem released in 1984? It was the first feature film to use photorealistic CGI spaceships instead of practical models. To give you an idea of how far from realtime we were at that time: Digital Prods. (DP), the company doing the work, had to employ a double-headed DEC VAX 782 computer just to feed its massive Cray X-MP supercomputer. The Cray could not be used directly because of connectivity and software issues. DP was also running a separate farm of Evans & Sutherland PS300s for the digitizing and modeling, and IMI 500s in front of the VAX 782s to set up the motion sequences. It took more than six months to render the film's 36,000 animation frames, and the lease on the Cray alone was more than $250,000 per month! The situation hardly foreshadowed the coming digital invasion of Hollywood. On top of all that, the render quality of what we can generate in realtime today can actually be better than what they were able to achieve with all that heavy equipment and time. Dear readers, this was just over two decades ago.
It was the release of SGI's cross-platform OpenGL API in 1992, and its various extensions, that set the standard for computer graphics, truly inspiring the first serious realtime video card engines. My first OpenGL card was massive: it took up two full-length slots, ran very hot and cost more than $5,000 for the card alone. It was also mighty slow (5 to 12 fps) by today's standards. But OpenGL has evolved and remains the realtime rendering (RTR) standard of the day.
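For readers who never touched that first generation of the API, here's a rough idea of what immediate-mode OpenGL drawing looked like: a minimal, illustrative C++ sketch of my own, assuming the common GLUT toolkit for windowing, not code from any production pipeline.

```cpp
// Minimal fixed-function OpenGL sketch: the kind of immediate-mode drawing
// the first OpenGL cards accelerated. Assumes GLUT is installed for windowing.
#include <GL/glut.h>

void display() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glBegin(GL_TRIANGLES);                 // immediate mode: one call per vertex
    glColor3f(1.0f, 0.0f, 0.0f); glVertex3f(-0.5f, -0.5f, 0.0f);
    glColor3f(0.0f, 1.0f, 0.0f); glVertex3f( 0.5f, -0.5f, 0.0f);
    glColor3f(0.0f, 0.0f, 1.0f); glVertex3f( 0.0f,  0.5f, 0.0f);
    glEnd();
    glutSwapBuffers();                     // present the finished frame
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutCreateWindow("Hello, realtime");
    glutDisplayFunc(display);
    glutMainLoop();                        // redraws on demand
    return 0;
}
```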
From the very first, achieving RTR has been a hot hardware/software race. The two feed each other like spooning canaries, pushing the edge all the way. At first there was a flurry of activity, with several different companies designing competing realtime shaded-model manipulation and animation strategies. They sort of worked, too. That was remarkable considering the sloth-pace of computers in the early '90s. Special video processing chips and memory were needed to accelerate video and cache textures, but there were no standards yet. Back then, 2 to 4 megs of video memory was extravagant, and painfully expensive. But, again, progress was steady and rapid.
Ever cheaper and bigger video memory fed the cycle, along with fast proprietary 3D engine chips from ATI, 3Dlabs, NVIDIA and E&S. After OpenGL was well established came Microsoft's Direct3D, and the two now compete head to head. OpenGL, as its name implies, is an open standard, overseen by the OpenGL Architecture Review Board and frequently updated. D3D is a proprietary standard owned by Microsoft that is only updated about once a year. D3D started out very different from OpenGL, but over time it seems to be getting more and more like it. And since the Mac and Linux are also very big in realtime graphics, OpenGL is more widely supported.
Recently, dirt-cheap open-architecture desktop supercomputers (any 4GHz desktop PC with 2GB of RAM and a hot OpenGL card qualifies) have spawned a separate realtime graphics industry, with several players stoking hot competition and spinning out an ascending spiral of speed and resolution. All this pretty much spelled the demise of SGI's proprietary MIPS-based machines as the core platform of the vfx industry. Today, with cheap gigabytes of RAM, terabytes of high-speed disk storage and massive render farms both local and distant, all things seem possible. Though we are still in the shallow end of the exponential curve, realtime visual feedback is exerting a profound influence upon our industry, vectoring us in new directions as you read.
Realtime Tools Today
3D tools are evolving rapidly, so fast that it's hard to keep up with what is now possible. According to Paul Debevec, developer of image-based modeling and lighting, "Realtime graphics has clearly advanced enormously in the last few years. A watershed event was at SIGGRAPH 2002 in San Antonio, when ATI released the Radeon 9700, the first graphics card to simultaneously support floating-point computation, images and programmable shading instructions."
This combination of capabilities has made it possible to perform any operation on the graphics card (now called the GPU) that used to be run on the standard microprocessor. And while the clock speeds of GPUs are not as fast as those of CPUs, they are designed to be inherently parallel, and most graphics algorithms can be parallelized, many trivially. Scientific computing applications may soon run on graphics cards; people have already written dynamics engines and fluid simulation algorithms on GPUs.
The increased numerical precision available in both ATI and NVIDIA graphics cards has abruptly elevated GPUs from the mathematical sophistication of a Commodore 64 to the full single-precision power of a Pentium processor. It's amazing what video game developers were able to achieve using 8-bit arithmetic tricks, but now that the mathematical canvas is open, we are just beginning to see what is possible. Surfaces now reflect light the way real materials do, rooms will soon be bathed in diffusely reflected light, and people will have more and more of the subtle movements, textures and shading that make human performances compelling and real.
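To make that shift concrete, here's a small illustrative sketch (mine, not Debevec's) of the kind of per-pixel lighting math that floating-point, programmable GPUs made routine. It's C++ with an embedded GLSL fragment shader, and it assumes the GLEW extension loader and an already-created OpenGL 2.0 context; treat it as a sketch rather than a recipe.

```cpp
// Illustrative sketch: compiling a tiny per-pixel Lambert shader. The shader
// body runs once for every pixel the surface covers, in parallel on the GPU,
// using full floating-point math.
#include <GL/glew.h>   // assumed extension loader; call glewInit() after creating a GL context

static const char* kFragSrc =
    "varying vec3 normal;                          // supplied by a vertex shader\n"
    "void main() {\n"
    "    vec3 N = normalize(normal);\n"
    "    vec3 L = normalize(vec3(0.3, 0.8, 0.5));  // a fixed light direction\n"
    "    float diffuse = max(dot(N, L), 0.0);      // Lambert term, computed in float\n"
    "    gl_FragColor = vec4(vec3(diffuse), 1.0);\n"
    "}\n";

// Compile the fragment shader; requires a valid OpenGL 2.0+ context to be bound.
GLuint buildFragmentShader() {
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &kFragSrc, NULL);
    glCompileShader(shader);   // the driver compiles this for the GPU's parallel units
    return shader;
}
```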
Finding the Razor's Edge
Vfx is a pressure cooker of evolving artistry and technology. In order to find and stay at the very cutting edge of what's happening now, you need to create tools that nobody else has. Places such as PDI, Pixar and Rhythm & Hues have long focused on proprietary tools to maintain their edge. Other houses develop proprietary plug-ins for the big standard suites, refining their capabilities.
Now small, fast, passionate and hungry companies are starting to give these mega studios a run for their money, literally. If you look at the credits of the latest vfx films, you'll notice that effects work is being spread out among more and more companies, many of them small ones and not necessarily in the U.S.
The key to success is the ability to create in-house software that does new things and/or does old things better than anybody else can. One of these highly competitive areas is fluid dynamics. Ocean and storm scenes are very expensive. If you can create believable oceans, flood effects and atmospheres (as in The Day After Tomorrow's storm cell), you're going to save the production company time and money. But it has to be more than good; it has to be indistinguishable from reality.
Scanline
Last year, there was a compelling fluid dynamics seminar at fmx in Stuttgart, Germany, presented by Stephan Trojansky, one of the pioneers of GPU simulation programming at Scanline Production GmbH in Bavaria.
Scanline clearly has a handle on the artistic side as well as the technical.
Trojansky has been working on a film titled Megalodon, about a huge and very nasty prehistoric shark, with all of the CG oceanic footage done with RTR technology. The results achieve incredible photorealism.
According to Trojansky, "For our special needs at Scanline, fast realtime graphics have become an essential workflow element. Since we have a strong in-house research and development department, we are continuously developing all kinds of particle, fluid and hydrofluid dynamics software.
"These software tools have to handle incredible amounts of data, which the effects artists want to have visualized so they can see what is happening. For these purposes, Scanline has developed realtime rendering interfaces for all of its modules. For example, our in-house particle system can play millions of particles in realtime, and our fire-and-smoke simulation engine can show volumetrics with transparencies and shadowing in the viewport. The hydrofluid simulations are rendered in realtime, with complete shading, directly into the artist's viewport via OpenGL. In many shows these high-quality hardware renderings have been used directly for previs on feature films. In that way artists and directors have benefited from a more interactive workflow that provides faster feedback cycles..."
What about using GPU power to get all this fantastic performance? He confessed: "Yes, we're looking at the NVIDIA Quadro FX 4000 SDI board for use in our simulations. NVIDIA gives our programmers direct access to its extremely fast GPU engine, which can handle simulations much faster than a Pentium can. The important thing is that realtime is now being used to display immensely complex simulations of all kinds. Where it used to take hours to compute a single frame, we're now outputting up to 15-24 fps of accurate visual reference."
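Scanline's tools are proprietary, of course, but the general pattern Trojansky describes (simulate, then stream the results straight into the artist's viewport every frame) can be illustrated with a deliberately tiny CPU-side toy. The sketch below is mine, not Scanline's; it assumes GLUT for windowing and leaves the camera at OpenGL's defaults.

```cpp
// Toy illustration (not Scanline code): advance a simple particle simulation
// each frame and draw the current state immediately via OpenGL, so the artist
// sees the simulation evolve live in the viewport.
#include <GL/glut.h>
#include <cstdlib>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };
static std::vector<Particle> particles(50000);

static void step(float dt) {
    for (Particle& p : particles) {            // each particle updates independently
        p.vy -= 2.0f * dt;                     // toy gravity
        p.x += p.vx * dt;  p.y += p.vy * dt;  p.z += p.vz * dt;
        if (p.y < -0.9f) { p.y = -0.9f; p.vy *= -0.5f; }   // crude floor bounce
    }
}

static void display() {
    step(1.0f / 24.0f);                        // advance at the playback rate
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_POINTS);                        // stream current positions to the GPU
    for (const Particle& p : particles) glVertex3f(p.x, p.y, p.z);
    glEnd();
    glutSwapBuffers();
    glutPostRedisplay();                       // keep redrawing: a live viewport
}

int main(int argc, char** argv) {
    for (Particle& p : particles) {            // burst of particles from the origin
        p.x = p.y = p.z = 0.0f;
        p.vx = (std::rand() % 200 - 100) / 100.0f;
        p.vy = (std::rand() % 200) / 100.0f;
        p.vz = 0.0f;                           // keep it flat; default camera, no projection set
    }
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("Live simulation viewport");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```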
The Collaborative Pipeline
A hot new trend, fueled by photorealistic realtime rendering, is the collaborative 3D workspace, where two or more people can work together creating 3D content via the Internet. With regard to 3D entertainment, a very hot new 3D collaborative workspace is slated for release early this summer. It's called trueSpace7, from the innovative Caligari Corp. in Mountain View, California. Version 7 is an entirely new animal. Caligari's underlying engine is highly sophisticated, with a message-passing kernel that supports the next generation of massively parallel, multi-core processing hardware. I suspect we're all going to be learning the fine points of working together in one space in short order. trueSpace creator Roman Ormandy explains the importance of realtime to his development efforts: "trueSpace7 is designed to provide a first-of-a-kind collaborative 3D platform for rich human interaction, and a 3D semantic repository on the web." (Note: Semantic repositories are a standard means of naming data elements so that the structure of files and objects can be communicated between online computer systems.)
"Without realtime photorealistic rendering, rich eye-to-eye interaction simply would not be possible. Human visual perception has an uncanny ability to use subtle visual cues such as motion parallax, lighting changes, etc., to construct aspects of a situation. The eye of a purposeful, motivated human artist wants to see in this way. We worked very hard to ensure that trueSpace7's realtime rendering delivers what the discerning human eye needs in order to be artistically effective."
Clearly such advanced tools will be making the world a much smaller place and will be contributing to the decentralization of our industry.
ZBrush 2.0: An Awesome Tool Made Possible By RTR
One favorite artistic tool is ZBrush, a unique kind of canvas that lets you paint in 2D, 2.5D and 3D. I think it is an exceptional example of how RTR has opened new doors for artists. Artists, generally uncomfortable with highly technical, text-oriented interfaces, love ZBrush because it's nearly all visual. With it you can do simple 2D texture painting much as you would in Corel Painter IX. Then you get into the 2.5D brushes with lighting effects. Nice, but for me it's the 3D brush sculpting that rattles my doors. Using virtual material and sculpting tools with your Wacom tablet, you can very quickly sculpt material in great detail, in realtime. The control you get with ZBrush 2.0 allows talented sculptors to transfer their skill directly to the digital realm. You saw the face image by Antropus (Krishnamurti Martins Costa) done in ZBrush 2.0. Now meet Meats Meier, a very talented sculptor at the Gnomon Workshop (when he's not working on films) who actually started out sculpting digitally. He often works back and forth between ZBrush and Maya, sculpting in the former and animating in the latter. Meier loves ZBrush, but he wants more. As for realtime, he says, with a bit of a grin: "Realtime is the holy grail for digital artists. We are only passing the time until we get it..." But, in fact, he uses realtime daily to produce his visual magic.
Building Games in Realtime
Games are by their nature realtime entities, but I'm talking about the tools used to build and manipulate all that highly efficient geometry. In gaming, three of the key players are 3ds max, Maya and LightWave. All three feature realtime scene lighting in low-res shaded RTR so artists can see what they're building. They also have the ability to port their animated assets directly to the game platform of choice for testing.
The process varies, but in general it goes like this: The preproduction team is given a premise, then noodles and develops the game storyline. Full-motion video (FMV) sequences are laid out for various locations in the game to enhance the story. Usually a storyboard artist does some previs on paper, developing characters, sets and action suggestions. This material is given to the 3D team, which creates the 3D sets, characters and textures that will bring the game to life. Here is where the big impact of realtime previs comes in. As the artists work, they can sculpt new characters much like clay (ZBrush is becoming ever more important here) that can be ported to 3ds max (e.g., at 3DO), where they are rigged and animated, all with the help of various types of realtime shading, from wireframe to fully baked textures with lighting. Once models are established, the FMVs are animated and tested. They can be run through on a workstation, then quickly ported to the Xbox or Nintendo platform to test the look and feel the player will get. All this is done in realtime.
Senior artist Michael Spaw of The Orphanage explains: "We use Max's realtime workspace to see and refine what the player sees. We're then able to port our animated assets directly to the appropriate game platforms, say the Nintendo 64 or Xbox, so we can see exactly what the player is going to see. This saves countless hours and produces a better overall product."
Game developers are serious power users who require extraordinary number-crunching capability, and their appetite seems to be increasing at an exponential rate. The goal seems to be to make realtime games as on-the-fly cinematic as possible. So despite all the low-poly tricks used to achieve good-looking realtime on game platforms, the poly count is going way up as you read. Fortunately, evolving game boxes are becoming mini supercomputers; who'da thought?
Spaw adds: "The amount of power we require is a moving target. It's hard to nail down because the more silicon horses they give us, it seems, the sooner we bog them down. Where we used to work with five or 10 thousand polygons, we now work with hundreds of thousands, and we need to do that in realtime, nicely shaded. Super-fast game platforms and cheap, powerful computers with multiple gigabytes of fast RAM have changed the entire landscape of our industry."
Michael Johnson at R!OT, Santa Monica, is a young old hand at 3D animation in games, commercials and film. He's both a talented artist and a techno-geek, and he says realtime has definitely improved his artistic expression: "Realtime feedback allows me as an artist to get instant visual confirmation of each step in the process. Before, I would tweak something, then wait for a render or for the computer to update, then tweak it again, wait... tweak... wait... you get the picture. Now it's much more interactive, having the feel of an artistic experience. RTR has definitely improved my ability to express myself artistically. It's refreshing, really."
Clearly the future of games is all about creating a psychologically immersive experience. That means more artistry, detail, color, reflections, shadows, glints, volumetrics and action in realtime are on the way.
Realtime in Film
Clearly Hollywood has become addicted to and dependent upon vfx. It wants ever more intense effects on one hand and ever more invisible ones on the other. The seamless integration of vfx and live action is key, and realtime tools have become indispensable.
Director Alex Proyas of The Crow and I, Robot recalls his favorite realtime tool in shooting I, Robot: "I used a tool we called Encodacam. It allowed me to see my virtual sets when I was shooting the greenscreen material with the actors. This was invaluable."
Interestingly, Joe Lewis, owner of General Lift, which developed Encodacam, suggests that many people on set find digital equipment and procedures plain annoying and distracting; the virtual approach clearly lacks the charm of more traditional methods. "I like to take the ATM approach to technology on the set," Lewis says. "Ordinarily, when the motion control people show up, everybody on the set gets a bit depressed because it's such a pain. So we put motion encoders on all the cameras and made the technology as invisible as possible. People on set adjust to it very quickly." In short order, everyone on the I, Robot set got comfortable with Encodacam and it soon became indispensable.
Do actors have an aversion to working on the greenscreen set? "Generally, people lose a few IQ points when they get in front of the greenscreen. They have to act and direct without visual references. That requires special skills that not everybody has. Encodacam changes all that by compositing 3D virtual set components, rendered in realtime, with the live-action elements as they are happening. The equipment is portable, so we took it on location and overlaid the 3D set of Future Chicago over parts of Vancouver in realtime.
"We build the 3D virtual environment and put the actors, DP and director smack in the middle of it. The director shoots with a real camera that's hooked up to a virtual camera via the encoders. It all comes up composited on screen. We render the virtual components at low resolution, at 24 fps, using NVIDIA Quadro FX cards. This allows us to track at a full 24 frames per second. The new, faster NVIDIA Quadro FX 4000 cards afford us much higher resolution on screen, and their system gives our engineers direct access to the NVIDIA realtime engine."
I've been playing around with the NVIDIA Quadro FX 4000 video card and it's impressive. My private benchmarking results are a little scary. This card is capable of delivering extraordinarily beautiful vfx overlays on live action at relatively high frame rates. It can also deliver smooth, antialiased, nicely shaded realtime 3D virtual sets synched to the hands-on Panaflex.
Artists in the trenches have much to say about using realtime tools as well. Even though Pixar officially denies using them, at the recent VES Awards VFXWorld got the winning Incredibles artists to admit: "The characters [on upcoming projects] will be uninhibited, more flexible; there's more realtime, no limits."
Henson
Jim Henson was a puppeteer who parlayed that into an entertainment dynasty. Although he's gone, his traditions linger at the Creature Shop in London. Wisely, Henson engineers were among the very first to hook traditional puppeteers up to technology, no strings attached. They were also the first to use realtime digital puppeteering. Working with PDI back in the late '80s, using realtime animated digital puppets, they produced a TV spectacular starring Waldo C. Graphic for The Jim Henson Hour, mixing real and computer graphic images so that traditional Muppets and their computer graphic counterparts could perform scenes together. It was a first.
Mike Pryor, head of digital production at Jim Henson's Creature Shop, where they've developed the HDPS (Henson Digital Performance System) realtime animation system, approaches digital animation with a reverence for tradition. "Realtime technology has become a priority for the work we do at Jim Henson's Creature Shop. We now use the waldos (mechanical puppeteer devices used to control traditional Henson animatronic creatures) that have been a part of our work for over 20 years to control CG characters in a digital world. We perform these characters in realtime and have the ability to incorporate true subtlety and nuance into a character's behavior, as we have done with puppetry for half a century. I'd even say our system yields a spontaneous performance.
"As we use it, realtime motion capture truly allows the puppeteer or live actor to transfer personality to the character they're animating. The best part is that our performances become organic: the puppeteers interact with a director just like an actor on a stage, and the character's delivery is immediately altered or adjusted to accommodate a change in script, blocking or the other performances in the scene."
And has HDPS helped Henson stay on top of their game during these very competitive times?
"Brian Levant (the director) came to us to animate a bobblehead of Satchel Paige featured in the film Are We There Yet? The head is an important character, so he needed a great performance. Brian was able to direct our talented puppeteers (Bruce Lanoil and Dave Barclay) just as he would human actors on a set, trying different ideas until he got the performance just the way he wanted it. In fact, we were able to deliver 35-40 shots in only three days! As a result, Brian walked away with the exact 3D-animated performance he wanted, in the can, in that time. We would not have been able to do that without our proprietary realtime animation system and the exceptional human performing element that has been a constant here at the company for 50 years."
Well, if that doesn't say it all. All the software and fast rendering hardware in the world cannot substitute for human talent, persistence and vision. When you get right down to it, realtime rendering only gives us the clay we need to sculpt. It still takes that spark of human brilliance to make it all come to light.
Peter Plantec is a best-selling author, animator and virtual human designer. He wrote The Caligari trueSpace2 Bible, the first 3D animation book specifically written for artists. He lives in the high country near Aspen, Colorado. Peter's latest book, Virtual Humans, is a five-star selection at Amazon after many reviews. You can visit his personal Website.