NVIDIA Untethers Virtual Camera Technology in SIGGRAPH Demo

NVIDIA technology uncouples traditional virtual camera capabilities from the limitations of stage environments, allowing filmmakers to interactively visualize CG assets composited into any live-action set.

Imagine a filmmaker, tablet in hand, walking through a live-action set and recording actors as they perform, while simultaneously seeing fluid updates of computer-generated elements within the shot.

Imagine no more. At SIGGRAPH this week, NVIDIA is demonstrating virtual camera technology that does just that.

Virtual camera technology is powerful stuff, but until recently it was confined primarily to stage environments equipped with tracking markers and motion-capture cameras.

NVIDIA's technology uncouples the virtual camera from the controlled environment of a set and lets directors frame scenes from any angle. By putting the camera quite literally in filmmakers' hands, it gives them the freedom to tell stories in new ways, combining computer-generated imagery with a live-action set.

To make virtual production more accessible, NVIDIA has worked with Google to harness the robot-vision capabilities of the NVIDIA Tegra K1-powered Project Tango tablet, along with the graphics computing power of NVIDIA GPUs.

The technology works by transferring the position of the “camera” (in the case of the SIGGRAPH demo, the Project Tango tablet) to a digital content creation application for rendering.

Computer-generated elements are rendered on NVIDIA Quadro GPUs and streamed back to the tablet using NVIDIA GRID technology; the Tegra K1 then composites the live action with the computer-generated content.
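NVIDIA hasn't published code for the demo, but the round trip described above can be made concrete with a short sketch. Everything here (`tablet`, `render_server`, and their methods) is a hypothetical stand-in for the tracking, streaming, and rendering components, not an actual API; only the per-pixel compositing step is standard alpha blending:

```python
import numpy as np

def composite_frame(live_rgb: np.ndarray, cg_rgba: np.ndarray) -> np.ndarray:
    """Alpha-composite a rendered CG frame (RGBA, floats in 0-1) over a
    live camera frame (RGB, floats in 0-1) -- the on-device mixing step."""
    alpha = cg_rgba[..., 3:4]                    # per-pixel CG coverage
    return cg_rgba[..., :3] * alpha + live_rgb * (1.0 - alpha)

def virtual_camera_loop(tablet, render_server):
    """One round trip per video frame (hypothetical interfaces):
    1. read the tablet's tracked pose (position + orientation),
    2. send it to the remote renderer, which frames the CG scene from
       that pose and streams the finished frame back,
    3. blend the CG frame over the live image for on-set preview."""
    while tablet.is_recording():
        pose = tablet.get_pose()                 # e.g. a 4x4 camera-to-world matrix
        cg_rgba = render_server.render(pose)     # remote GPU render, streamed back
        live_rgb = tablet.get_camera_frame()
        tablet.display(composite_frame(live_rgb, cg_rgba))

# Standalone check of the compositing step with synthetic frames:
live = np.random.rand(720, 1280, 3)
cg = np.random.rand(720, 1280, 4)
assert composite_frame(live, cg).shape == (720, 1280, 3)
```

The key design point the demo illustrates is the split of work: the tablet only tracks its own pose and blends frames, while the heavy rendering happens on remote GPUs, which is what frees the camera from a tracking-equipped stage.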

So whether on-set in a controlled stage environment, or off-set in real-world, live-action shooting scenarios, filmmakers can get the shot right the first time.

Source: NVIDIA

Formerly Editor-in-Chief of Animation World Network, Jennifer Wolfe has worked in the Media & Entertainment industry as a writer and PR professional since 2003.