'Building Interactive Worlds in 3D:' Advanced Virtual Camera Development for 'The Diner' Project — Part 2

VFXWorld presents the last excerpt from Building Interactive Worlds in 3D by Jean-Marc Gauthier. This month concludes a look at using the virtual camera for The Diner project.

All images from Building Interactive Worlds in 3D: Virtual Sets and Pre-Visualization for Games, Film & the Web by Jean-Marc Gauthier. Reprinted with permission.

Step 2: Cameras and Ways to Navigate Inside the Virtual World

1. Designing a new virtual camera called the Dialogue camera. The Dialogue camera uses a stabilization system similar to the one described in Chapter 6. Once the camera is attached to a character, the stabilization system keeps the camera stable around its target. We can detach an orbital camera from a character, its center of rotation, by increasing the distance between the character and the camera. At a given distance, the camera moves out of the character's range; it breaks the Keep at Constant Distance behavior and becomes a free camera. The viewer can therefore detach the camera from a character simply by pulling the camera away. We chose pulling the camera away rather than pushing it into the character, which seemed a counterintuitive way to move a camera. We also allowed the camera to slide when it gets close to the character.
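
In the project these rules are assembled from building blocks rather than typed as code. Purely as a rough sketch, the Python below captures the same logic under assumed names and values (OrbitalCamera, keep_distance, break_distance, and min_distance are ours, not from the project files): the camera orbits its target at a roughly constant distance, slides along a minimum radius when pushed too close, and becomes a free camera when pulled past a break radius.

```python
import math

class OrbitalCamera:
    """Minimal sketch of the attach/detach rule described above (names are illustrative)."""

    def __init__(self, keep_distance=4.0, break_distance=7.0, min_distance=1.5):
        self.keep_distance = keep_distance    # radius of the Keep at Constant Distance constraint
        self.break_distance = break_distance  # pulling past this radius detaches the camera
        self.min_distance = min_distance      # closer than this, the camera slides instead of entering the character
        self.target = None                    # the character the camera currently orbits
        self.position = (0.0, 0.0, 0.0)

    def attach(self, character_position):
        self.target = character_position

    def update(self, desired_position):
        """desired_position is where the viewer is trying to pull the camera this frame."""
        if self.target is None:
            self.position = desired_position  # free camera: follow the viewer directly
            return
        offset = [desired_position[i] - self.target[i] for i in range(3)]
        dist = math.sqrt(sum(c * c for c in offset)) or 1e-6
        if dist > self.break_distance:
            # Pulled out of the character's range: break the constraint, become a free camera.
            self.target = None
            self.position = desired_position
            return
        # Clamp the orbit radius: the camera stays near its constant distance but
        # slides along a minimum-radius sphere when pushed too close to the character.
        radius = max(min(dist, self.keep_distance), self.min_distance)
        scale = radius / dist
        self.position = tuple(self.target[i] + offset[i] * scale for i in range(3))
```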

The Dialogue camera is triggered by sound coming from the viewer or from the space of the installation. Sound input, a great way to replace the keyboard and the mouse when interacting with a computer, is discussed in detail in Chapter 8. The volume of sound coming from the sound sensor, a microphone, is tested; above a given threshold, the Test building block triggers a message telling the camera to change its target. This is the starting point of a sequence of adjustments in space that takes the camera from one character to the other. The following diagrams illustrate the architecture of the behaviors created for the Dialogue camera.

The volume of sound coming from the microphone is tested. Above a given threshold, the Test building block triggers a message telling the camera to change its target.

The Dialogue camera receives a message from the sound sensor and activates two strings of behavior.

The Dialogue camera behavior receives a message from the sound sensor and activates two strings of behaviors. The string on top controls the translation and orientation components of the camera's motion in space. The string at the bottom picks up a new target, the other character.
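
In the scene this sequence is wired from a Test building block and message links rather than written as code. Purely as a sketch, the Python below mirrors the two strands under assumed names and values; the sound threshold, the head-height offset, and the easing rate are illustrative, not taken from the project files.

```python
SOUND_THRESHOLD = 0.6   # assumed normalized volume level (0 to 1)

def lerp(a, b, t):
    """Blend two 3D points; used to ease the camera instead of cutting instantly."""
    return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

class DialogueCamera:
    def __init__(self, characters):
        self.characters = characters           # e.g. {"A": (x, y, z), "B": (x, y, z)}
        self.current = "A"
        self.position = (0.0, 2.0, -5.0)
        self.look_at = characters["A"]

    def on_sound(self, volume):
        # The Test step: above the threshold, send the switch-target message.
        if volume > SOUND_THRESHOLD:
            self.switch_target()

    def switch_target(self):
        # Bottom string of the behavior: pick the other character as the new target.
        self.current = "B" if self.current == "A" else "A"

    def update(self, dt):
        # Top string of the behavior: ease translation and orientation toward the
        # active character over several frames instead of cutting instantly.
        target = self.characters[self.current]
        goal = (target[0], target[1] + 1.6, target[2] - 4.0)  # offset roughly at head height
        t = min(1.0, 2.0 * dt)                                # assumed easing rate
        self.position = lerp(self.position, goal, t)
        self.look_at = lerp(self.look_at, target, t)

# Usage sketch: a loud enough sound starts the move toward the other character.
camera = DialogueCamera({"A": (0.0, 0.0, 0.0), "B": (3.0, 0.0, 0.0)})
camera.on_sound(0.8)
camera.update(1 / 30)
```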

Character viewed from the Dialogue camera in orbital mode.

The camera design can respond to many situations that may arise randomly during a viewer's interaction with the scene. The following list covers various responses to situations involving the virtual camera (a brief code sketch follows the list):

  • The camera can collide with characters without pushing the characters.

  • The camera can check the distance between the character and itself and then choose a character according to his or her proximity to the camera.

  • When looking at a character, the camera moves until it targets the character's head. After being attached to a character, the camera can still turn around the character. The camera can be detached from the character and attracted again after two seconds if it remains close to the character.
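
As a rough illustration only (collision response is omitted), the Python sketch below shows how the proximity test, the head targeting, and the two-second re-attachment rule from the list might look in code. The radius, head height, and names are assumptions rather than values from the project.

```python
import math

NEAR_RADIUS = 3.0       # assumed "close to the character" radius
REATTACH_DELAY = 2.0    # seconds the camera must stay close before re-attaching

def nearest_character(camera_pos, characters):
    """Pick the character closest to the camera (second bullet above)."""
    return min(characters, key=lambda name: math.dist(camera_pos, characters[name]))

def head_target(character_pos, head_height=1.6):
    """Aim point for the camera: the character's head rather than its pivot (third bullet)."""
    return (character_pos[0], character_pos[1] + head_height, character_pos[2])

class AttachmentTimer:
    """Re-attach the camera once it has stayed near a character for two seconds."""

    def __init__(self):
        self.time_near = 0.0

    def update(self, dt, camera_pos, character_pos):
        if math.dist(camera_pos, character_pos) < NEAR_RADIUS:
            self.time_near += dt
        else:
            self.time_near = 0.0
        return self.time_near >= REATTACH_DELAY   # True means: attach the camera again
```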

2. Navigation. As mentioned earlier in this chapter, this installation uses a microphone built into the floor as a sensor. The audience can guide the camera and the characters by clapping their hands, tapping their feet or snapping their fingers.

The repetition of the same sound increases the duration of the input; for example, a repeated sound will trigger a longer backward motion, similar to holding down a keyboard key.
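
This accumulation can be sketched in a few lines. The hold time per sound and the class name below are assumptions for illustration; the installation builds the equivalent logic from building blocks.

```python
class RepeatedSoundInput:
    """Treat repeated sounds like a held-down key: each hit extends the input."""

    def __init__(self, hold_per_hit=0.5):
        self.hold_per_hit = hold_per_hit   # assumed seconds of motion added per detected sound
        self.time_left = 0.0

    def on_sound(self):
        self.time_left += self.hold_per_hit

    def update(self, dt):
        """Return True while the input is still 'held', i.e. the backward motion continues."""
        if self.time_left > 0.0:
            self.time_left = max(0.0, self.time_left - dt)
            return True
        return False

# Usage: while update(dt) returns True, keep applying the backward motion,
# exactly as if a keyboard key were being held down.
```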

Lighting interactive characters inside the scene.

Step 3: The Viewer's Experience

In addition to the sound interface, the project is displayed on multiple screens fed by multiple cameras. The multiple-screen technique allows several screens or video projectors to receive video signals coming from multiple virtual cameras embedded in the virtual world.
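
For illustration only, the Python sketch below shows one way to think about this routing: each virtual camera renders its own view, and a routing table sends each view to its own screen or projector. The class and variable names are assumptions; the actual installation builds this inside the authoring tool, not in code.

```python
class VirtualCamera:
    def __init__(self, name):
        self.name = name

    def render(self, scene):
        # Stand-in for the real renderer: return a frame tagged with the camera name.
        return f"frame from {self.name} of {scene}"

class Display:
    def __init__(self, output_id):
        self.output_id = output_id     # a screen or a video projector

    def show(self, frame):
        print(f"output {self.output_id}: {frame}")

def render_all(scene, routing):
    """routing maps each virtual camera to the display that receives its signal."""
    for camera, display in routing.items():
        display.show(camera.render(scene))

# Example routing: two embedded cameras, each feeding its own projector.
routing = {
    VirtualCamera("Dialogue camera"): Display(1),
    VirtualCamera("wide shot"): Display(2),
}
render_all("The Diner", routing)
```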

The final design was rebuilt from scratch following the lessons from the previous phases. The virtual camera seems to perform differently according to the file size of the animated characters. Please open the copy of the diner on your CD-ROM and go to the Schematic window. You will find the variables for the virtual camera grouped in a control panel so you can tune the rendering speed of the scene without interfering with the camera's behaviors.
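
The separation between tuning variables and behaviors can also be sketched in code: the grouped values live in one place and the behaviors only read from them. The names and values below are purely illustrative and do not come from the CD-ROM file.

```python
from dataclasses import dataclass

@dataclass
class CameraTuning:
    """Grouped tuning values, analogous to the control panel in the Schematic window."""
    frame_rate_cap: int = 30        # illustrative values, not from the project file
    texture_detail: float = 0.75
    draw_distance: float = 50.0

# Behaviors read from the shared tuning object; adjusting it changes rendering
# speed without editing the camera's behavior logic itself.
tuning = CameraTuning(frame_rate_cap=24, draw_distance=35.0)
print(tuning)
```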

Find more turnkey tutorials that detail all the steps required to build simulations and interactions in Building Interactive Worlds in 3D: Virtual Sets and Pre-Visualization for Games, Film & the Web by Jean-Marc Gauthier (Focal Press, 2005; 422 pages with illustrations; ISBN 0-240-80622-0; $49.95). Check back to VFXWorld frequently to read new excerpts.

Jean-Marc Gauthier teaches at New York University in the graduate studies department of Interactive Telecommunications and is the author of Interactive 3D Actors and Their Worlds (Morgan Kaufmann Publishers, 2000). He is also a consultant at www.tinkering.net and an award-winning 3D artist.
