HALON taps Fabric Engine’s Creation Platform to develop a virtual production environment capable of retargeting motion capture data in real time.
Time is money. It is an adage that studios entrenched in the complex world of 3D and live-action production understand all too well. Keeping large crews on set, or teams of artists cranking away on 3D characters and scenes, takes a lot of money, and those costs just continue to climb.
The concept of previsualization, or previs, grew out of this pressure to keep production costs under control. As the amount of digital data being used in production increased, the need to see it earlier in the creative process became critical. Using 3D and motion capture technologies to create rough versions of sets and sequences enabled meaningful creative choices to be made earlier in the production process.
Today, these technologies are being used to enable “virtual production,” which allows directors to see digital and live assets together at the time of shooting, and to make decisions about lighting, camera positions and shot construction while changes can still be made easily.
At this year’s SIGGRAPH, one solution grabbed a lot of attention in the area of previs and virtual production, and strangely, it wasn’t a commercial product at all. Rather, it was a marriage of technologies and ideas: OptiTrack’s Insight Virtual Camera System, the vision of previs studio HALON Entertainment, and an application development platform (in beta) from Fabric Engine.
Setting the Stage
This is where a bit of history might help. Three years ago, in 2009, Los Angeles-based previs company HALON Entertainment developed a handheld camera to help them capture camera passes on OptiTrack motion capture (mocap) data.
OptiTrack caught wind of the camera and productized it, calling it the Insight Virtual Camera System, or Insight VCS. Together, the camera and OptiTrack’s mocap software formed what is now called the “virtual camera previsualization pipeline,” which has been used on a number of high-profile projects, including Prometheus, Snow White and the Huntsman, Battleship, and John Carter.
For years, the Insight VCS offered HALON a solution for testing camera passes, framing shots and setting up scenes before actual production began. However, the camera had some limitations.
“As a previs company, there are many technology barriers we constantly try to break down to create a previs process where technology gets out of the way of the filmmaking process,” says Daniel Gregoire, owner and previsualization director at HALON Entertainment. “One of these was figuring out how to make the virtual camera process more like the on-set experience directors, DPs and the like are familiar with.”
Currently, HALON’s virtual camera pipeline follows the same methodology James Cameron developed on Avatar: record the necessary performances, boil it all down into a setup, and then film it with the virtual camera. This is effective but not very flexible. Inevitably, a filmmaker will change their mind and request changes to blocking, timing and a host of other things. Those changes nearly always require the file to go back to an artist to be manipulated off the stage, which kills momentum on stage and frustrates the filmmaker.
With an eye to making this process far more efficient, the HALON team started speaking with a young company from Montreal founded by a handful of Autodesk and Softimage veterans well-versed in production and software engineering: Fabric Engine.
The Fabric Engine team had just announced a new application platform, called Creation Platform. Still in beta, “Creation” gives studios a framework built with Python and Qt on top of what the company calls the “Fabric Core,” a multi-threading engine that taps into the power of modern CPUs and GPUs. Fabric Engine provides a range of tools and modules built on this framework -- covering areas like animation and simulation -- all of which are open to be changed or replaced according to a customer’s needs.
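To make that architecture concrete, the sketch below shows the general shape of such an application: a small Python/Qt program whose per-frame work is fanned out across a pool of CPU threads. This is a minimal illustration only; every class and function name here is invented for the example and is not Creation Platform’s actual API.

```python
# A minimal sketch only: the general shape of a Python/Qt application
# sitting on top of a multithreaded compute core. Every name here is
# invented for illustration; none of it is Creation Platform's real API.
import math
from concurrent.futures import ThreadPoolExecutor

from PySide2.QtCore import QTimer
from PySide2.QtWidgets import QApplication, QLabel


class FrameEngine:
    """Stand-in for a compute core that fans per-frame work across threads.

    A real engine would run native, GIL-free code here; the thread pool
    just illustrates the division of labor between UI and core.
    """

    def __init__(self, workers=8):
        self.pool = ThreadPoolExecutor(max_workers=workers)

    def evaluate(self, frame, num_objects=470):
        # Pretend every object in the set needs some per-frame math.
        def deform(i):
            return math.sin(frame * 0.1 + i)

        return sum(self.pool.map(deform, range(num_objects)))


def main():
    app = QApplication([])
    engine = FrameEngine()
    label = QLabel("starting...")
    label.show()

    frame = 0

    def tick():
        nonlocal frame
        frame += 1
        label.setText(f"frame {frame}: checksum {engine.evaluate(frame):.3f}")

    timer = QTimer()
    timer.timeout.connect(tick)
    timer.start(33)  # roughly 30 updates per second
    app.exec_()


if __name__ == "__main__":
    main()
```

The point is the division of labor: Python and Qt own the interface and the application logic, while the heavy per-frame evaluation belongs to the core.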
“Over the years we've squeezed every ounce of performance and visual acuity out of our long-time tool set. But in the last six months we've seen a convergence of technology that finally brings the things we've been coveting from game engines -- real-time performance and beautiful images -- together into accessible reality,” continues Gregoire. “The team at Fabric Engine gets that. Unlike the demands of shipping a single product, which is what most game engines are built for, the Creation Platform is built for the demands of continuous production. The team there has demonstrated a plethora of understanding where we need to go as an industry by focusing on true cloud-based computing, real time heavy data set manipulation, interoperability and strong extensibility.”
Back to the Future
The weekend before this year’s SIGGRAPH conference and exhibition, the teams from HALON Entertainment and Fabric Engine sat down to discuss the virtual camera system and tackle some of the new features HALON needed.
“We agreed to meet on the Saturday before SIGGRAPH on set, and I gave them the details of what we needed,” said Justin Denton, previsualization & VFX supervisor at HALON. “Not only did the guys from Fabric Engine develop a completely custom application in less than a day; they also made it completely live on the vcam monitor and allowed everything in the environment to be manipulated while streaming live mocap data. This kind of real-time visualization and interactivity is key to understanding how changes affect your scene.”
So what exactly did Creation Platform do for HALON? Using its built-in extension framework, the Fabric Engine team hooked Creation Platform into OptiTrack’s Ethernet mocap stream. That mocap data was then retargeted in real time onto the 3D characters in the scene, complete with support for blendable IK targets. In the SIGGRAPH demo, the 3D set also contained 470 fully textured objects, all rendered in real time by Creation Platform at 30 frames per second in the viewport of the Insight VCS, allowing directors to get a visual grasp of the characters and the scene on the fly.
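As a rough illustration of that data path (and not OptiTrack’s or Fabric Engine’s actual code), the sketch below reads joint rotations from a UDP mocap stream and retargets them onto a character rig, blending each joint between the streamed pose and an IK-solved target by a per-joint weight. The packet layout, port number and rig structure are all invented for the example; OptiTrack’s real stream carries far richer data.

```python
# Illustrative sketch only: pull joint rotations from an Ethernet mocap
# stream and retarget them onto a character, blending each joint between
# the streamed pose and an IK-solved pose. The packet layout, port and
# rig here are invented stand-ins, not OptiTrack's actual protocol.
import socket
import struct

# One frame: joint id (uint16) + quaternion (4 floats) per joint.
JOINT_FMT = "<H4f"
JOINT_SIZE = struct.calcsize(JOINT_FMT)


def read_frame(sock):
    """Block until one mocap frame arrives; return {joint_id: quaternion}."""
    data, _addr = sock.recvfrom(65535)
    pose = {}
    for off in range(0, (len(data) // JOINT_SIZE) * JOINT_SIZE, JOINT_SIZE):
        jid, x, y, z, w = struct.unpack_from(JOINT_FMT, data, off)
        pose[jid] = (x, y, z, w)
    return pose


def blend(q_mocap, q_ik, weight):
    """Crude linear quaternion blend; a real solver would use slerp."""
    mixed = [m * (1.0 - weight) + k * weight for m, k in zip(q_mocap, q_ik)]
    norm = sum(c * c for c in mixed) ** 0.5 or 1.0
    return tuple(c / norm for c in mixed)


def retarget(pose, rig, ik_targets):
    """Apply streamed rotations to the rig, blending toward any IK targets."""
    for jid, q_mocap in pose.items():
        target = ik_targets.get(jid)  # (quaternion, weight) or None
        if target:
            q_ik, weight = target
            rig[jid] = blend(q_mocap, q_ik, weight)
        else:
            rig[jid] = q_mocap


if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", 9001))  # port chosen for the example
    rig = {}
    ik_targets = {5: ((0.0, 0.0, 0.0, 1.0), 0.75)}  # e.g. pin a hand joint
    while True:
        retarget(read_frame(sock), rig, ik_targets)
```

The per-joint weight is what makes the IK “blendable”: at 0 the joint follows the performer exactly, at 1 it snaps to the solved target, and anything in between mixes the two.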
The second step in the process was to enable the director to interact with the scene. Creation Platform was used to tap into the stick controllers on the virtual camera to enable the director to “lasso” an object in the scene, move it or rotate it, and essentially edit the 3D scene in real time.
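A drastically simplified version of that interaction loop might look like the following, with the two stick axes mapped onto the selected object’s translation and rotation. The input handling and scene classes here are hypothetical stand-ins, not the demo’s actual code.

```python
# Illustrative sketch only: map the virtual camera's stick controllers to
# scene edits -- move and rotate the currently "lassoed" object. All of
# the classes and parameter names here are invented for the example.
from dataclasses import dataclass, field


@dataclass
class SceneObject:
    name: str
    position: list = field(default_factory=lambda: [0.0, 0.0, 0.0])
    yaw: float = 0.0


def edit_scene(selected, left_stick, right_stick, dt,
               move_speed=2.0, turn_speed=90.0):
    """Apply one frame of stick input to the selected object.

    left_stick and right_stick are (x, y) pairs in [-1, 1].
    """
    if selected is None:
        return
    # Left stick pans the object in the ground plane.
    selected.position[0] += left_stick[0] * move_speed * dt
    selected.position[2] += left_stick[1] * move_speed * dt
    # Right stick's x axis spins it about the vertical axis.
    selected.yaw = (selected.yaw + right_stick[0] * turn_speed * dt) % 360.0


if __name__ == "__main__":
    prop = SceneObject("crate_01")  # the "lassoed" object
    for _ in range(30):             # one second of input at 30 fps
        edit_scene(prop, (1.0, 0.0), (0.5, 0.0), dt=1 / 30)
    print(prop)
```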
The entire application took just six hours to build. By the following Tuesday, OptiTrack was not only demoing the virtual camera at SIGGRAPH, it was also showing how a huge 3D scene and its characters could be manipulated on the fly by the previs director.
“It’s important to note that this was not only a rapid prototype of a tool that Fabric created,” Gregoire said. “Using the Creation Platform, the Fabric team helped us create a solid, production-based interactive demo for our Insight VCS -- one that could completely change our workflow and help us provide better options for our clients. Developing something similar but not even as good would take us weeks or months.”
Virtual Production of Production Tools
So what does any of this mean? Challenges arise every day in production studios. In the past, studios have struggled to find workarounds, either waiting for technology providers to solve problems with upgrades or developing costly in-house solutions themselves.
But at this year’s SIGGRAPH, HALON, NaturalPoint and Fabric Engine presented a new option, one that seems to provide more flexibility to studios in the midst of production.
More importantly, people have begun thinking about virtual production in completely new ways. If virtual production is about enabling studios to get to and through production faster, then certainly the rapid creation of high-performance, internal tools is part of that virtual production equation.
While it’s easy to predict this won’t be the last time we’ll see a solution like this grace the show floor at SIGGRAPH, it will be even more interesting to see how development tools such as Creation Platform are adopted in real-world production environments. Virtual production is already here. But how will we use it?