Computer graphics have remade moviemaking, enabling the creation of virtual worlds, animations, images, and special effects that would otherwise have been difficult or impossible to achieve.
But until recently, the work of adding CG to live shots has had to take place after a movie has been filmed.
In these cases, directors generally must wait weeks to figure out which shots work best with the CG they want to use, said Marc Petit, senior vice president at Autodesk, a vendor of 3D design, engineering, and entertainment software.
They then must begin the costly and time-consuming process of reshooting scenes that should have been done differently.
Now, though, virtual moviemaking is starting to solve this problem.
Intro to virtual moviemaking
Virtual moviemaking — a type of augmented reality — superimposes computer-generated images, animation, and special effects over camera-captured scenes in real-time, said Jon Peddie, president of Jon Peddie Research, a market-analysis firm.
This lets directors see the way a scene will look with CG while they're shooting it.
In some cases, they arrange monitors around a scene so that actors can see the special effects that will be used and perform accordingly.
Directors can also try out different effects during filming and change the lighting, camera angles, or other aspects of the way a scene is shot in conjunction with the graphics.
All this is difficult or impractical when the director must add the CG after finishing the live shooting.
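At its core, the superimposition the article describes is a compositing step: the CG layer, with an alpha matte, is blended over each camera frame as it arrives. The following is a minimal sketch of that operation, assuming frames are NumPy arrays; the function and array names are illustrative, not taken from any production system.

```python
import numpy as np

def composite(live_frame, cg_rgb, cg_alpha):
    """Alpha-blend a CG layer over a camera-captured frame."""
    alpha = cg_alpha[..., None]  # broadcast the matte across color channels
    return (alpha * cg_rgb + (1.0 - alpha) * live_frame).astype(live_frame.dtype)

# A tiny 2x2 "frame": CG fully covers the left column, live plate shows on the right.
live = np.zeros((2, 2, 3), dtype=np.float32)   # black live-action plate
cg = np.ones((2, 2, 3), dtype=np.float32)      # white CG element
alpha = np.array([[1.0, 0.0], [1.0, 0.0]], dtype=np.float32)
out = composite(live, cg, alpha)
```

Running this per frame at camera rate is what "real time" means in practice; production systems perform the same blend on graphics hardware.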
Virtual moviemaking has already been deployed in big-budget movies such as James Cameron's Avatar, the first to use the technique extensively, as well as Steven Spielberg's Tintin and Robert Zemeckis' A Christmas Carol.
Moviemaking takes a virtual turn
Directors use standard CG techniques to create the graphics they plan to use in scenes.
While shooting scenes with actors, they use multiple cameras to capture high-resolution imagery of the performers from different perspectives. This lets the directors choose the angles they like best during the editing process.
Products such as Autodesk MotionBuilder receive and integrate live-scene and CG data captured by camera, motion-capture, facial-expression-capture, animation, and other equipment so that directors and others can see them together.
The systems generate views of a movie through a peripheral designed to look like a traditional camera viewfinder.
Various tools let users manipulate the resulting blended scenes, changing elements such as lighting and viewing perspective.
The Gamecaster GCS3 peripheral communicates with the master system via a USB port and renders the computed scene on its display. The device has sensors that identify the way users rotate or otherwise move the device, allowing directors to see scenes from multiple angles and perspectives.
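The idea of driving a virtual camera from a peripheral's rotation sensors can be sketched as follows: yaw and pitch readings are mapped to a view-direction vector that the renderer then uses. This is a hypothetical illustration of the geometry involved; the function name, angle conventions, and axis layout are assumptions, not the GCS3's actual interface.

```python
import math

def view_direction(yaw_deg, pitch_deg):
    """Map sensor yaw/pitch (degrees) to a unit view-direction vector."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),   # x: left/right
            math.sin(pitch),                   # y: up/down
            math.cos(pitch) * math.cos(yaw))   # z: forward

# Pointing straight ahead versus panning 90 degrees to the right.
ahead = view_direction(0, 0)
right = view_direction(90, 0)
```

As the director rotates the device, the master system re-renders the blended scene along the new direction, which is what makes the peripheral feel like a conventional camera.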
Virtual-moviemaking systems have changed the film-production process dramatically because, thanks to software and graphics-processing hardware improvements, they can produce useful low-resolution graphics that directors can view in real time, noted Rob Powers, vice president of 3D development at NewTek, a vendor of portable live-production and 3D-animation systems.
Once directors decide exactly how they want a scene to look, their crews can generate high-resolution versions of the selected graphics and create the final movie.
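The preview-versus-final tradeoff Powers describes comes down to pixel budget: a low-resolution proxy carries a small fraction of the pixels of a delivery-quality frame, which is what makes interactive on-set feedback feasible. The specific resolutions below are assumptions chosen for illustration.

```python
PREVIEW = (640, 360)    # hypothetical real-time on-set proxy resolution
FINAL = (4096, 2160)    # hypothetical delivery-quality render resolution

def pixel_count(res):
    width, height = res
    return width * height

# Per-frame pixel ratio between the final render and the on-set proxy.
ratio = pixel_count(FINAL) / pixel_count(PREVIEW)
```

At these figures the proxy carries roughly one-fortieth the pixels per frame, so the same hardware that would need hours per final frame can redraw the proxy many times per second.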
Down the road
Currently, Petit said, a basic system typically costs from $50,000 to $90,000.
However, prices are dropping, making the systems accessible to more studios.
And some hobbyists are working on lower-cost, lower-quality versions using $150 Microsoft Kinect motion-sensing systems and tablet computers, Petit explained.
He noted that virtual moviemaking's rise could create problems, as well as benefits.
For example, he said, using virtual moviemaking to create various effects could eliminate or reduce the use of lighting, set design, and other craftsmen, which could generate union resistance.
Nonetheless, he anticipated that virtual moviemaking will have a profound effect not only on cinema but also on fields such as television and video games.
Computer magazine, the IEEE Computer Society's flagship publication, covers all aspects of computer science, computer engineering, computing technology, and applications.