Autodesk's SVP Marc Petit talks to Bill Desowitz exclusively about the strategy behind his company's recent acquisitions.
VFXWorld gets an exclusive Q&A with Marc Petit, SVP of Autodesk Media & Ent., about the strategy behind this week's acquisitions of Realviz and Kynogon. With Realviz, headquartered in Sophia Antipolis, France, Autodesk adds image-based content creation software to its arsenal. And with Paris-based Kynogon, Autodesk additionally gets the Kynapse AI middleware.
Bill Desowitz: You're closing two acquisitions on the same day; why is Autodesk so focused on acquisitions lately?
Marc Petit: Well, the timing is kind of a coincidence. We have a very deliberate and highly selective approach to acquisitions. Believe me, we get a lot of solicitations! We like talented teams, we like top-notch technologies that complement our own core technology, and we like products that complement our portfolio because our customers are asking for more integrated solutions. This was the case in the past for the Unreal/Character Studio, Colorfront/Lustre and Skymatter/Mudbox acquisitions, and it is once again the case for both Kynogon and Realviz. These acquisitions bring Autodesk highly talented teams, best-of-breed technologies applicable across multiple markets and great entrepreneurs with strong reputations in the industry.
BD: Curious that it's two French companies.
MP: Merely happenstance, and certainly nothing to do with the fact that I'm French myself! Seriously, France has a history of producing strong innovators and there are many very interesting companies and projects over there, but this is also true of many other countries.
BD: In terms of Realviz, I take it the main draw was Stitcher, ImageModeler and Movimento?
MP: Yes, these are great products. Realviz was created almost 10 years ago as a technology transfer from the French research institute INRIA. It's been built from day one upon a very advanced technology foundation and has led the field of computer vision and image-based modeling ever since. We view this technology and set of products as fundamental as they simplify the transition from 2D to 3D, from the real to the virtual.
Many processes, from entertainment and architecture to graphics and digital photography, require tight integration of the virtual and the real. With this technology, it's easy to extract a lot of 3D information from stills and image sequences, and that information can be useful to our existing customers in many ways -- from extracting measurements from a couple of pictures and building full 3D digital models of cities from satellite imagery, to automatically recovering a 3D camera move or capturing motion from a movie shot. Realviz's products are important, mature technologies, and we believe that now is the right time to integrate them more deeply into our pipeline solutions. That is why we are joining forces.
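[Sidebar: To illustrate the kind of image-based measurement Petit describes, here is a minimal, hypothetical Python sketch of linear triangulation -- recovering a 3D point from its position in two calibrated photographs. It is the textbook method, not Realviz's actual algorithm, and it assumes the two camera projection matrices are already known.]

import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of a single 3D point.

    P1, P2 : 3x4 camera projection matrices (assumed to be known/calibrated).
    x1, x2 : (u, v) pixel coordinates of the same point in each photograph.
    Returns the point in world coordinates.
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous 3D point is the right singular vector with the smallest
    # singular value, i.e. the (approximate) null vector of A.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

Triangulating two or more such points yields real-world distances between them, which is how a facade dimension could be pulled from a couple of photos.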
BD: Talk more about the importance of bridging the gap between 2D and 3D, which was so vital, for instance, to the aesthetic of Speed Racer.
MP: Exactly -- we are seeing some fundamental changes in the way movies, television programs and games are produced. Being able to gather 3D data from 2D material can help at many stages of the production process. We see it in pre-visualization, where it's easy to build 3D environments or virtual sets by stitching pictures together, or by deriving 3D models very quickly from set photos using image-based modeling. Games now require a lot of facial animation. Animators can rough out facial animations using a webcam and optical motion capture. Adding assist cameras on location to capture scenes from multiple points of view is a good insurance policy for post-production. Of course, you can extract camera moves to enrich the post-production process, but you can also rebuild sets in 3D from these multiple points of view. Multiple cameras enable optical motion capture of actors in non-intrusive ways without the need for suits or dedicated stages. The performance of an actor can then be modified or augmented during the post-production process using regular 3D tools such as Maya or Flame. Moreover, combining all of these technologies with spatial image-based lighting allows for highly realistic integration of CG elements with live action. We believe that weaving these technologies and products more tightly into our existing portfolio should lead to some interesting new capabilities and will provide for a more efficient production environment.
For customers in many industries that Autodesk serves, it is about being able to factor the "real" into the virtual. So, for example, with ImageModeler, architects working on renovation projects can take a few shots and extract measurements and even 3D models from photos of existing building facades to inform their designs. Realviz technologies can also be used to assemble or stitch together multiple images to create immersive environments with 3D views. Photographers use this technique when they need to shoot confined interior environments or very large panoramas. These tools are also a good entrée into 3D for digital artists and photographers accustomed to working in 2D.
BD: So will you be integrating Realviz technologies into other Autodesk products?
MP: We anticipate that certain elements will appeal as much to 3ds Max and Maya customers as to Revit and AutoCAD customers, so, yes, most definitely.
BD: And, with Kynogon's Kynapse, you now have another middleware solution to offer Autodesk customers.
MP: This is very exciting for our games customers. Since we announced this acquisition back at the Game Developers Conference in February, we've had many inquiries from top game studios because they see the value of integrating these core middleware building blocks with their own technology. How a character evolves and interacts with its environment is a critical part of gameplay across many genres, so there is a lot of interest around our solutions.
We entered the games middleware market over a year ago with the introduction of HumanIK. HumanIK is a real-time full-body IK system that helps breathe life into digital characters -- from the athletes in EA's sports lineup to Altaïr, the horses and other characters in Ubisoft's Assassin's Creed.
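[Sidebar: The problem a full-body IK system solves can be shown in miniature. The hypothetical Python sketch below is a textbook two-bone IK solver in 2D, using the law of cosines to bend a limb so its tip reaches a target; it illustrates the idea only and is not HumanIK's solver.]

import math

def two_bone_ik(target_x, target_y, l1, l2):
    """Return (shoulder, elbow) angles in radians so a two-link arm reaches the target.

    l1, l2 : upper-arm and forearm lengths.
    The target distance is clamped to the reachable range so the solver never fails.
    """
    dist = math.hypot(target_x, target_y)
    dist = max(abs(l1 - l2), min(l1 + l2, dist))  # clamp to the reachable annulus
    # Law of cosines gives the elbow bend.
    cos_elbow = (dist ** 2 - l1 ** 2 - l2 ** 2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: direction to the target minus the offset introduced by the bend.
    shoulder = math.atan2(target_y, target_x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow)
    )
    return shoulder, elbow

A runtime system performs solves like this every frame for the whole skeleton, with joint limits and retargeting layered on top.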
One of the things we have observed is the importance of tightly integrating AI and animation to produce believable characters that can interact realistically with their environments. Kynapse is a very sophisticated AI solution; one of the things it does is bring spatial awareness to characters so they can find their way around the large-scale, dynamic environments that exist in current games. This is essential for gameplay. Path-finding and line-of-sight analysis used to be relatively easy, but with the scale of current game levels and destructible worlds, using an advanced off-the-shelf solution like Kynapse helps developers focus on what makes their game unique. That's why it has been so successful.
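[Sidebar: Path-finding itself is a classic problem. The hypothetical Python sketch below is a minimal grid-based A* search -- the textbook version of the query an AI middleware answers at far larger scale on dynamic 3D worlds; it is not Kynogon's implementation.]

import heapq

def astar(grid, start, goal):
    """Find a path on a 2D grid where 0 = walkable and 1 = blocked.

    grid : list of lists of ints; start, goal : (row, col) tuples.
    Returns the path as a list of cells, or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])  # Manhattan distance
    open_set = [(heuristic(start, goal), 0, start, [start])]
    seen = set()
    while open_set:
        _, cost, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0 and nxt not in seen:
                heapq.heappush(open_set, (cost + 1 + heuristic(nxt, goal), cost + 1, nxt, path + [nxt]))
    return None

On real levels the search runs over a navigation mesh rather than a grid, and that mesh has to be kept up to date as geometry is destroyed or moved, which is where off-the-shelf middleware earns its keep.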
BD: Are there synergies between asset creation and middleware or are developers looking at these as very different problems?
MP: People are used to considering middleware and asset creation as two different processes with different characteristics, but we started to demonstrate how you can bridge the two. Our customers got very excited and are now beginning to tell us how this can streamline their workflows. We will be enabling the authoring of assets within 3ds Max and Maya that contain more than geometry, UVs, animations and textures. These assets can be read directly by the game engine and are "middleware-ready," helping ensure that assets comply with the engine requirements and perform well within the engine. This will be invaluable to game developers as they tap more distributed production resources and outsourcing companies, and may at times struggle with the quality of the assets they receive.
Integrating middleware into the art-creation process allows developers to review their art, without leaving 3ds Max or Maya, in a context similar to how it will appear in the game engine. For example, if the AI guides the character to turn 180 degrees, you can preview the result right inside of Maya and start catching more gameplay red flags early in the process. Our middleware solutions will provide total fidelity between 3ds Max and Maya and the game engine that implements them, so you'll have a "What You See Is What You Play" solution!
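[Sidebar: As a hypothetical flavor of that kind of in-Maya preview, the snippet below uses Maya's Python module (maya.cmds) to key a 180-degree turn on an assumed character root control over 24 frames; the control name and frame range are invented, and this is not Autodesk's actual middleware integration.]

import maya.cmds as cmds

def preview_turn(root="char_root_ctrl", start=1, end=24, degrees=180.0):
    """Key a yaw rotation over a frame range so an animator can scrub the turn."""
    cmds.setKeyframe(root, attribute="rotateY", time=start, value=0.0)
    cmds.setKeyframe(root, attribute="rotateY", time=end, value=degrees)

# Run inside Maya on a rig whose root control is named "char_root_ctrl".
preview_turn()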
BD: And will we see more integration of Kynogon technology into Autodesk products?
MP: Definitely. We have plans to use these technologies across a wide spectrum of industries… Cities all over the world are starting to be modeled in 3D, and the Kynapse technology offers some very relevant potential for those applications. The path-finding and AI technology can be used to simulate crowds inside buildings or stadiums, or to model and simulate car traffic in digital models of cities. As the teams get to know each other, I'm certain we will find a lot of unexpected uses for both Realviz and Kynogon technologies, not only inside Autodesk's Media & Ent. group but also inside our AEC, Manufacturing, Plant and Geospatial groups.
BD: I take it you will also continue developing and marketing standalone versions of Realviz tools and Kynogon's Kynapse?
MP: For sure -- we'll continue to sell the Stitcher product line, ImageModeler and Movimento. These products are all great complements to Autodesk's existing product portfolio. We will also continue to support Kynogon's Kynapse as a standalone middleware product.
BD: Will the employees and development teams of these companies remain in France as Autodesk employees?
MP: Yes. The Realviz team will remain in the south of France, and we'll be happy to visit them once in a while. The Kynogon team is split between Montreal and Paris; the Montreal team will move to Autodesk Media & Ent.'s headquarters in Montreal, and the Paris team just moved into bigger offices in the heart of Paris.
Bill Desowitz is editor of VFXWorld.