J. Paul Peszko dives into the extraordinary advances in CG water on display this summer in Poseidon and Pirates of the Caribbean: Dead Man's Chest.
... But not a drop to drink. Why not? Because it's all CG!
When you go to the theater this summer, you may wish that you had brought a life vest. But there is no need for panic. The oceans you see on the big screen in Poseidon and Pirates of the Caribbean: Dead Man's Chest, no matter how real they appear, are still CG simulations, albeit spectacular ones. We have come a long way not only from the days of the plastic model ship in a tank, but even from a few short years ago, when we were nearly deluged by CG water in The Perfect Storm. We have Industrial Light & Magic and Scanline to thank for the latest advancements. Both are at the forefront of creating cutting-edge fluid simulation systems.
Going with the Flowline
Scanline is a Munich-based visual effects company that has developed fluid simulation software called Flowline. It collaborated on Poseidon with London-based vfx house MPC (The Moving Picture Co.), which licensed Flowline and integrated it into its own pipeline.
Stephan Trojansky, Scanline's head of R&D, points out the limitations that fluid simulation techniques have traditionally had. First, the simulation detail didn't match real-world properties, especially on large-scale scenes, preventing correct fine-scale motion. Second, the rendering techniques didn't provide the correct detail and were really cheats, using volumetric particles or polygons rather than physically correct refractive droplets, sheets and volumes of water. Third, and sometimes even more important, there has always been the struggle over how to control dynamic simulations so that they fit the director's needs while still keeping the natural behavior and look of water and fire.
According to Poseidon's overall visual effects supervisor, Boyd Shermis, the creative and technical challenges they faced forced them to make advancements over then-existing fluid emulation technology. "The semantics of emulation vs. simulation notwithstanding, we needed to be able to do things with water or fire, oil, smoke and dust on a scale that hadn't been approached before. And there were no proven pipelines to make it run up and render in the way we were expecting it to work. For their part, ILM was being asked to simulate full, volumetric (3D) water on a scale that allowed a 1,200-foot ship to interact with it. [In other words], the ship (with a record number of polygons) had to be hit by, then roll over, down into, through and up out of the water, and then roll back down into it again. And that was just in one scene."
"I needed to have the bubbles and/or foam and/or spray be truly born from the volume, not added as an element on top of the volume. I needed to see the true interaction of the ship displacing and motivating motion within the volume of water, and to scatter light within the volume and the bubbles/foam/spray. These are all things that might have been done before, but at nowhere near the scale or complexity that we required, both in terms of area coverage and in number of shots. So ILM was asked to rewrite their water code on our behalf to accommodate all these things. [Then] there was the issue of time. Running these kinds of simulations has been notoriously compute-intensive, and they were literally taking weeks to run up. ILM, in association with Stanford's computer science department, managed to break the frames down into smaller bites, or tiles, in order to parallelize the computing effort over several processors (between 8 and 16) to speed up the process of simulation run-ups. This was a tremendous leap forward for them."
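That tiling strategy is easy to picture in code. Below is a minimal, illustrative sketch (not ILM's actual code, which is proprietary) of the idea: one frame's simulation grid is split into tiles with a one-cell halo, each tile is handed to a separate worker process for a solver step, and the tile interiors are stitched back together.

```python
# Sketch only: split one frame's grid into tiles, relax each tile in a
# separate process, then stitch the interiors back together. Real fluid
# solvers exchange halo data every sub-step; this shows just the decomposition.

import numpy as np
from concurrent.futures import ProcessPoolExecutor

HALO = 1  # one ghost cell is enough for the 4-neighbour stencil below

def relax_tile(args):
    """Run one Jacobi smoothing step on a tile that includes its halo."""
    tile, y0, x0 = args
    out = tile.copy()
    out[1:-1, 1:-1] = 0.25 * (tile[:-2, 1:-1] + tile[2:, 1:-1] +
                              tile[1:-1, :-2] + tile[1:-1, 2:])
    # return only the interior; the halo rows/cols are owned by neighbours
    return y0, x0, out[HALO:-HALO, HALO:-HALO]

def step_parallel(field, tiles_y=2, tiles_x=4, workers=8):
    """One simulation step, parallelized over tiles_y x tiles_x tiles."""
    padded = np.pad(field, HALO, mode="edge")
    ny, nx = field.shape
    jobs = []
    for ty in range(tiles_y):
        for tx in range(tiles_x):
            y0, y1 = ty * ny // tiles_y, (ty + 1) * ny // tiles_y
            x0, x1 = tx * nx // tiles_x, (tx + 1) * nx // tiles_x
            # slice the tile plus its halo out of the padded field
            jobs.append((padded[y0:y1 + 2 * HALO, x0:x1 + 2 * HALO], y0, x0))
    result = np.empty_like(field)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for y0, x0, interior in pool.map(relax_tile, jobs):
            result[y0:y0 + interior.shape[0], x0:x0 + interior.shape[1]] = interior
    return result

if __name__ == "__main__":
    frame = np.random.rand(512, 1024)   # stand-in for one frame's density grid
    frame = step_parallel(frame)        # 8 tiles -> 8 workers, like 8-16 CPUs
    print(frame.shape)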
In terms of the MPC and Scanline efforts, there were a couple of areas that forced them to advance the Flowline software. "Flowline is a very unique and capable piece of liquid simulation software," Shermis continues. "Scanline has been using it well for several years, obviously. And while it has had the ability to do water, fire, smoke or other viscous fluids, it has never been called upon to do all of those at once, in a single simulation, all confined in a very complex, hard-surface environment. It has never been asked to scale up to 4K for run-up and rendering. And it has never been asked to do it in a massive Linux render farm. When MPC researched and ultimately licensed Flowline, they realized that it was mental ray-dependent, and MPC themselves had to switch their entire rendering pipeline to mental images' mental ray to accommodate the use of Flowline. Having now stretched the capabilities of Flowline as we have, I think that, within the industry in general, there will be a greater inclination toward using viable 3D solutions where once only practical, filmed elements would have been considered. 3D water and fire of this caliber will become the expected norm in the vfx toolset."
Just how has Flowline managed to overcome these limitations, especially on an oceanic movie as large as Poseidon? "In traditional approaches, the volumetric particles used for large-scale water effects each visually represented tens of thousands of droplets," continues Trojansky. "But in the moment of a collision, those would all bounce together and immediately destroy the illusion of realism, since all of them would have the same new motion at once. With Flowline, our artists can now create simulations and renderings where the question is no longer whether or not they look realistic, but rather what the artistic behavior should be and how the water should be directed. That's where Flowline reveals its full power, since for any point in time and space, our artists can control any property of the look and motion of the water or fire while still maintaining realistic behavior."
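One way to picture that kind of per-point control (a generic illustration only; Flowline's internal mechanism is not public) is as a blend between the solver's own velocity field and an artist-supplied target, weighted by a painted mask, so the water is nudged where the director wants it and left purely physical everywhere else.

```python
# Hedged illustration of "directing" a simulation: the artist supplies a
# target velocity and a spatial weight mask, and each step the solver's
# velocity is nudged toward the target only where the mask says so.

import numpy as np

def directed_velocity(sim_vel, target_vel, weight, strength=0.2):
    """Blend solver velocity toward an artist target, per grid cell.

    sim_vel, target_vel : (ny, nx, 2) arrays of 2D velocities
    weight              : (ny, nx) artist-painted mask in [0, 1]
    strength            : how hard the direction is applied per step
    """
    w = (strength * weight)[..., None]          # broadcast over components
    return (1.0 - w) * sim_vel + w * target_vel

# toy usage: push the middle of the grid to the right, leave edges physical
ny, nx = 64, 64
sim = np.random.randn(ny, nx, 2) * 0.1          # stand-in for a solver result
target = np.zeros((ny, nx, 2)); target[..., 0] = 1.0
mask = np.zeros((ny, nx)); mask[24:40, 24:40] = 1.0
sim = directed_velocity(sim, target, mask)
print(sim[32, 32], sim[0, 0])                   # directed cell vs. untouched cell
```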
Trojansky and key Scanline staff played a pivotal role technically directing simulation shots during a two-month placement at MPC. How did their collaboration work out on Poseidon? According to Steve Moncur, CG supervisor at MPC, it went very well. "The scale of the action and types of shots we were asked to produce on Poseidon pushed the boundaries of what Flowline can achieve right to the edge, but we still kept finding new ways to get it to solve these problems."
A colossal rogue wave capsizes a huge luxury liner in the middle of the North Atlantic Ocean; how was Flowline able to solve all that? Moncur explains: "The basics are that you split up your scene and sim into the basic components that will be in the shot (geometry, characters, fluid, spray, fire) and create a basic sim of each fluid element against the set and character animation geometry. Then you refine each of the different sims' reactions to and against each other. You continue doing this, increasing solver accuracy and sim detail levels, until you achieve the desired look. This is a great oversimplification of the process, as it can get quite complex and very involved with the sort of interactions and math that are involved in these types of effects."
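Stripped of all the physics, that outer loop might look something like the sketch below (a schematic illustration only, not MPC's pipeline code): each element is simulated against the scene geometry and against the other elements' latest results, at progressively higher detail, until the shot is approved.

```python
# Schematic reading of the workflow Moncur describes: simulate each fluid
# element against the set/character geometry and the other elements' latest
# results, stepping up detail until the look is signed off.

def simulate(element, scene_geometry, coupled_results, detail):
    """Placeholder for one fluid solve of `element` at a given detail level."""
    return {"element": element,
            "against": scene_geometry,
            "coupled_with": [r["element"] for r in coupled_results],
            "detail": detail}

def refine_shot(elements, scene_geometry, detail_levels, approved):
    results = {}
    for detail in detail_levels:                      # e.g. coarse -> final
        for element in elements:                      # fluid, spray, fire, ...
            others = [r for name, r in results.items() if name != element]
            results[element] = simulate(element, scene_geometry, others, detail)
        if approved(results):                         # supervisor sign-off
            break
    return results

shot = refine_shot(
    elements=["water", "spray", "fire"],
    scene_geometry="capsized_liner_set",
    detail_levels=["preview", "medium", "final"],
    approved=lambda r: False,                         # keep refining in this toy run
)
print(shot["water"])
```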
By using Flowline, a producer now has many options with respect to fluid simulation. "Flowline is not limited to one specific type of water, either," Trojansky notes. "It can handle small-scale effects like water pouring into a glass, characters interacting with water and creating splashes, floods rushing through streets, waves breaking on the beach or tsunami waves taking out oil rigs. Small-scale flames or big explosion balls are also possible, as well as multiple phases of fluids, like the mixture and interaction of water and oil. The correct interaction of multiple media like water, oil and fire, especially, creates completely new possibilities. For example, oil poured into water will rise up and float on the surface. It can be ignited, and the pressure waves from the explosion will push the oil stream and create ripples on the water, which in turn creates different-looking flames. Now, for the first time in visual effects, we can provide directors the full creative freedom to design shots with the physically correct interaction, simulation and rendering of all kinds of fluids."
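The oil-on-water example follows directly from the density contrast between the two fluids. The toy calculation below (generic physics, not Flowline internals) shows the per-cell buoyant acceleration a multiphase solver has to reproduce: cells containing more oil are lighter than the surrounding water and get pushed upward.

```python
# Generic physics sketch: buoyancy from the oil/water density difference.

import numpy as np

RHO_WATER, RHO_OIL, G = 1000.0, 900.0, 9.81   # kg/m^3, kg/m^3, m/s^2

def buoyant_accel(oil_fraction):
    """Vertical acceleration per cell from the oil/water density contrast.

    oil_fraction : array in [0, 1], the local volume fraction of oil.
    Positive values push the mixed cell upward (oil is lighter than water).
    """
    rho_mix = oil_fraction * RHO_OIL + (1.0 - oil_fraction) * RHO_WATER
    return G * (RHO_WATER - rho_mix) / rho_mix

cells = np.array([0.0, 0.25, 1.0])            # pure water, mixed, pure oil
print(buoyant_accel(cells))                   # 0 for water, largest for pure oil
```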
The research team at Scanline, led by Trojansky and now hard at work on Flowline3, had to develop a solution that covered the full pipeline, from the design of the simulation for shots all the way through to the final renderings. "To enable this, we didn't limit ourselves to a single approach to computational fluid dynamics," states Trojansky. "The roots of Scanline's R&D go back to 1989, and over the years lots of approaches have been generated. Today we can draw on a wide range of data structures and solvers that are all seamlessly integrated into Flowline. In fact, artists won't recognize which solver types, data structures or rendering algorithms are used for their shots, since Flowline handles most of those questions internally."
One thing that CG artists will recognize: no exporting. "The workflow for our artists is quite straightforward," states Trojansky. "Flowline runs directly in Maya or 3ds Max, depending on the artist's choice, which allows interactive design of the simulation along with modifications of the environment or interacting characters. At no point is an export step necessary. The artists can define the sources of water, spray, fuel and bubbles, as well as heat sources to ignite fires or explosions. Extensive controls allow you to modify the timing of simulations and adjust properties at any point in space. All kinds of Rigid Bodies or forces can interact with the simulation. In general, the workflow tends toward realtime rather than overnight simulations. Our philosophy is to provide the artists interactive simulation feedback directly in their Maya or 3ds Max viewport, but for heavy simulation work, they can also easily use network simulations. Once the simulation is done, Flowline works seamlessly with mental ray or V-Ray to create final production images. Shading artists can access any property of surfaces or fire and create fully refractive and reflective renderings with caustics and GI, as well as different types of atmospheric effects, at any render resolution. Our philosophy is to be able to run Flowline on any workstation without having to invest in expensive hardware, which also creates an efficient and flexible workflow..."
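To make that workflow concrete, here is a purely hypothetical, plain-Python sketch of the kind of shot description the paragraph implies: emitters for water, spray and fuel, heat sources, rigid bodies, and a preview-versus-farm switch. None of these names come from Flowline's actual interface; they exist only to illustrate how such a setup might be organized.

```python
# Hypothetical shot description; not Flowline's API, just an organizational sketch.

from dataclasses import dataclass, field, replace

@dataclass
class Emitter:
    kind: str                 # "water", "spray", "fuel", "bubbles"
    position: tuple
    rate: float               # emission rate in units per second
    start: float = 0.0        # timing control: when this source switches on
    end: float = 1e9

@dataclass
class ShotSetup:
    emitters: list = field(default_factory=list)
    heat_sources: list = field(default_factory=list)   # ignite fuel -> fire
    rigid_bodies: list = field(default_factory=list)   # props that push the fluid
    resolution: int = 64                                # low for viewport preview
    on_farm: bool = False                               # heavy sims go to the network

ballroom = ShotSetup(
    emitters=[Emitter("water", (0, 12, 0), rate=500.0, start=24.0),
              Emitter("spray", (0, 12, 0), rate=50.0, start=24.0)],
    heat_sources=[("galley_fire", (8, 2, -3))],
    rigid_bodies=["chandelier", "grand_piano"],
    resolution=64,            # interactive preview in the artist's viewport
)
final = replace(ballroom, resolution=512, on_farm=True)  # same shot, sent to the farm
print(final)
```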
ILM & Stanford Advance Hybrid Fluid Sim
Halfway around the world, another team of researchers has been developing its own fluid simulation process for use on large fluid-motion projects such as Poseidon and Dead Man's Chest, this summer's sequel to Pirates of the Caribbean. There are several differences between the two systems. Unlike Flowline, this particular system doesn't have a name. It is also the result of a collaboration between researchers at Stanford University's computer science department and ILM, rather than the effort of a single company. Finally, it is not a stand-alone system that can be integrated with other 3D software such as Maya or 3ds Max.
Ron Fedkiw, a professor of computer science at Stanford who splits his time between his research lab in Palo Alto and ILM in San Francisco, explains: "We put together a hybrid thing here through ILM's proprietary Zeno pipeline. We haven't named it anything. It's got three pieces, basically. There's the PhysBAM engine at Stanford (University), which is like a core math engine. It stands for Physics Based Modeling. There's the Zeno interface, which is like an equivalent to Maya that the artists use. And we connected those two together with an engine that was created by the R&D group here at ILM."
A little more than a year ago, Fedkiw and his researchers still had not figured out a way to improve upon the methods used to simulate oceanic effects in Perfect Storm. Then came the breakthrough. "It used to be that you had two choices for fluid action," suggests Fedkiw. "One was the way they did Perfect Storm, where you could use a lot of computers or multi-processors, but you had to use inferior algorithms. The algorithms themselves weren't the favorite algorithms. The other way of doing it, the way I had done it in the past, was to use the best possible algorithms, but you could only use a single processor. So, for years it has been this battle. Use one processor and these really nifty algorithms to give you really good results, or use a whole farm of processors where the algorithms are much more crude. And in the end, they're pretty much even. The big breakthrough this year is that we figured out a way to take those nifty algorithms, the best possible ones, and actually get them onto multiple processors. So what has changed is that we can run with like 20 or 30 processors using the real standout algorithms that up to now could only run on one."
Cliff Plumer, Lucasfilm CTO, likes the time-saving efficiency of this new method. "The process that Ron was talking about, which we had for Poseidon, was much more integrated compared to what we had done in the past on things like Perfect Storm, where there were a lot more layers and elements that went into those shots, from the early fluid dynamics stuff that we did years ago, to particles, to even live-action elements. So now we're able to create more interaction between the fluids, or in this case the water, and the ship than the way we did in the past."
All those layers and elements first had to be created and then composited together, but not so with Poseidon, as Fedkiw notes: "The whole ship is CG. A big chunk of the ocean is CG, and we do it all with one integrated simulation as opposed to layering all the elements..."
Because the entire system is self-contained in-house, the artists at ILM had access to the source for Poseidon and Dead Man's Chest. "We can do whatever we want with it," states Fedkiw. "We're not handcuffed like you would be with something like a Maya. We're able to create an environment for the artists where they can actually set up multiple processors. So, it's just like doing something in, say, Maya, except that it's a little more customized. I can set up a fluid shot, pick a domain, pick a chunk of the ocean to simulate, bring in your ship and different elements and bodies, lifeboats or whatever you want to interact with the simulation, and ours places all that and then picks a number of processors to run it on."
"Plus there are more controls on the simulation now," notes Plumer. "Getting back to something like Perfect Storm, your basic run on a simulation back then could take days. It would be a tough process to integrate, because whatever the result of the simulation was, if it worked, great! If it didn't, you were back to the drawing board. So there was a lot more trial and error back then. But now we've built more controls into the system, so we can get a quicker response and integrate it much quicker."
Fedkiw agrees. "Using this (Zeno) interface, you can introduce particle controls, Soft Body and Rigid Body controls, and all kinds of things into the fluid itself. You can run lower-resolution simulations first, see how they look, and then upgrade them and run them overnight afterward."
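That coarse-first, fine-later pattern is simple to sketch. The snippet below is an assumed illustration of the workflow Fedkiw and Plumer describe, not Zeno's actual interface: the same shot is solved at a preview resolution for quick feedback, then rerun at full resolution with the domain split across a chosen number of processors.

```python
# Assumed workflow sketch (not Zeno): preview at low resolution, then rerun
# the same setup at high resolution across several worker processes.

from multiprocessing import Pool

def solve_slab(args):
    """Placeholder for solving one horizontal slab of the simulation domain."""
    slab_index, resolution = args
    return f"slab {slab_index} solved at {resolution}^3"

def run_shot(resolution, processors):
    """Split the domain into one slab per processor and solve them in parallel."""
    jobs = [(i, resolution) for i in range(processors)]
    with Pool(processors) as pool:
        return pool.map(solve_slab, jobs)

if __name__ == "__main__":
    preview = run_shot(resolution=64, processors=4)     # minutes: check the look
    print(preview)
    # looks right? upgrade it and let it cook overnight, e.g.
    # final = run_shot(resolution=512, processors=24)
```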
A simulation that they can now set up and run overnight would, as little as a year ago, have taken nine or 10 months to complete on a single processor. Fedkiw says that the best way to describe the Zeno interface is to imagine "Maya on steroids."
Through the Zeno interface, Fedkiw hopes to develop fluids to the point where many more artists can use them. "Poseidon was a large show with lots and lots of water shots all the way through it, so we had to train a whole lot of people to use it. I think on Perfect Storm, there were one or possibly two people who could use the software. Even on Terminator 3, we had three or four people who could run a fluid shot. But now there are lots of people who can do that. That's because of the way the Zeno interface works. We're able to customize it."
Mohen Leo, ILM's associate visual effects supervisor on Poseidon, maintains there are challenges as well as benefits associated with simulation. "There is, however, an inherent limitation to approximating nature by purely artistic means. The complexity of something as simple as a water splash or a campfire under close scrutiny surpasses not only what even the most skilled artist can create technically; it frequently also surpasses their imagination. Using tools that have little in common with actual physics, artists have to rely on matching reference. Generating a natural effect that looks real but isn't closely based on reference by these means is extremely challenging."
"In reality, when shooting practical effects, some of the most interesting and successful elements are the results of unpredictable behavior, small unforeseen details and happy accidents. Advances in research as well as technology have now brought us to the threshold of approaching computer simulations in a similar way. Rather than micro-directing and hand-shaping the final result of a crude simulation to resemble natural complexity, the artist can now focus more on tweaking and adjusting the setup and starting conditions of the simulation, similar to a special effects technician setting up a dump tank or pyrotechnics, while the computer takes on the task of creating realism and complexity."
"Similar to practical effects, the artist can choose to set up his simulation close to the action depicted in the shot, or exaggerate and cheat to get a more exciting effect. Furthermore, unlike with practical effects, results are fully repeatable, and the ability to nudge simulations in the desired direction and bend the laws of physics where necessary provides greater artistic control and flexibility."
According to Fedkiw, Poseidon marked the first time there was total CG integration of the spray, foam and bubbles effect. "Some of the underwater shots, both the bubble generation and the rendering of them, are some of the nicest things in the film, in my opinion. I've never seen underwater shots that look like that."
Concludes ILM visual effects supervisor Kim Libreri: "On Poseidon, our simulations revealed so much detail that breaking waves would emerge from the body of the water. This phenomenon alone saved us months of work that would have been needed to fake in breaking waves over the top of rendered simulations. When two waves collided, splashes would be automatically jettisoned from the collision area, and vortices would form in the body of the water that in turn would generate hundreds of thousands of bubbles under the surface. Each of these would traditionally have had to be hand-animated and painstakingly placed into each of the shots. Even though arriving at such a high-fidelity fluid solver took many years of work at both ILM and Stanford, the results more than justify the investment. We can now turn large-scale water shots around in a matter of weeks (and in some cases days), compared to the months it would have taken with a more traditional approach. This frees our artists up to spend time making beautiful shots instead of fighting with technology."
J. Paul Peszko is a freelance writer and screenwriter living in Los Angeles. He writes various features and reviews as well as short fiction. He has a feature comedy in development and has just completed his second novel. When he isn't writing, he teaches communications courses.