SIGGRAPH announces Real-Time Live!, a fast-paced, 90-minute show of cutting-edge, aesthetically stimulating real-time work at the 2013 Computer Animation Festival.
Chicago, IL –
Real-Time Live! is the premier showcase for the latest trends and techniques for pushing the boundaries of interactive visuals. As part of the Computer Animation Festival, an international jury selects submissions from a diverse array of industries to create a fast-paced, 90-minute show of cutting-edge, aesthetically stimulating real-time work.
"Celebrating its 5th anniversary, Real-Time Live! 2013 is a can't-miss, one-night-only event at SIGGRAPH," said Abe Wiley, Real-Time Live! Chair, from AMD. "This year's show is as diverse as ever, and attendees will have their minds blown by live, interactive demonstrations of the latest advancements in photorealistic facial animation, medical visualization, crowd simulation, hair rendering, massive destruction, real-time cinematics, and more!"
Each live presentation lasts less than 10 minutes and is presented by the artists and engineers who produced the work.
SIGGRAPH 2013 Real-Time Live! Highlights:
Digital Ira: High-Resolution Facial Performance Playback
Graham Fyffe, USC Institute for Creative Technologies; Jorge Jimenez, Activision, Inc.; Oleg Alexander, Jay Busch, Paul Graham, Borom Tunwattanapong, Koki Nagano, Ryosuke Ichikari, Paul Debevec, Andrew Jones, USC Institute for Creative Technologies; Javier von der Pahlen; Etienne Danvoye; Bernardo Antoniazzi; Michael Eheler; Zbynek Kysela, Activision, Inc.; Curtis Beeson; Steve Burke; Mark Daly, NVIDIA Corporation
Video performance capture drives a facial blendshape model made from high-resolution facial scans that are compressed and realistically rendered in a reproducible game-ready pipeline designed for current-generation PCs and next-generation consoles.
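The blendshape model described above combines a neutral scan with weighted per-expression offsets. The following is a minimal sketch of that idea only, not Digital Ira's actual pipeline; the vertex data, shape names, and weights are invented for illustration:

```python
# Minimal blendshape sketch: the final mesh is the neutral pose plus a
# weighted sum of per-expression vertex offsets (deltas). Production
# systems use thousands of vertices and dozens of scanned shapes; this
# toy uses two vertices and one shape. All data here is invented.

def blend(neutral, deltas, weights):
    """Blend vertex positions: v = neutral + sum_i w_i * delta_i."""
    result = [list(v) for v in neutral]
    for shape, w in zip(deltas, weights):
        for vi, d in enumerate(shape):
            for axis in range(3):
                result[vi][axis] += w * d[axis]
    return result

# Neutral pose: two vertices.
neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
# One blendshape target's offsets (a hypothetical "smile"): moves vertex 1 up.
smile_deltas = [[(0.0, 0.0, 0.0), (0.0, 0.5, 0.0)]]

print(blend(neutral, smile_deltas, [0.5]))  # → [[0.0, 0.0, 0.0], [1.0, 0.25, 0.0]]
```

In a performance-capture pipeline, the per-frame weights would come from the solved video performance rather than being set by hand as here.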
Butterfly Effect
Renaldas Zioma, Unity Technologies
Butterfly Effect is a real-time-rendered short developed in a collaboration among Unity Technologies, Passion Pictures, and NVIDIA. During development, many traditional offline CG techniques were adopted for real-time rendering: physically based shading, Catmull-Clark subdivision, texture-space diffusion for subsurface scattering, pyroclastic noise-based volumetric effects, etc.
Unreal Engine 4 Infiltrator Demonstration
Dana Cowley, Brian Karis, Epic Games, Inc.
Infiltrator, Epic Games’ Unreal Engine 4 real-time demonstration, presents high-end rendering features including physically based materials and lighting, full-scene HDR reflections, advanced GPU particle simulation, adaptive detail with artist-programmable tessellation and displacement, dynamically lit particles that emit and receive light, and thousands of dynamic lights with tiled deferred shading.
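Tiled deferred shading, one of the features listed, divides the screen into tiles and culls the scene's lights per tile, so each pixel shades against only the lights that can actually reach it. The sketch below illustrates the culling step in 2D only; it is not Epic's implementation, and the screen size, tile size, and light data are invented:

```python
import math

# Toy tiled light culling: divide the screen into tiles and keep, per tile,
# only the lights whose radius of influence overlaps that tile. Pixels in a
# tile then shade against this short list instead of every light in the scene.

TILE = 16  # tile edge in pixels (invented)

def cull_lights(screen_w, screen_h, lights):
    """lights: list of (x, y, radius). Returns {tile_coord: [light indices]}."""
    tiles = {}
    for ty in range(0, screen_h, TILE):
        for tx in range(0, screen_w, TILE):
            kept = []
            for i, (lx, ly, r) in enumerate(lights):
                # Closest point on the tile rectangle to the light center.
                cx = min(max(lx, tx), tx + TILE)
                cy = min(max(ly, ty), ty + TILE)
                if math.hypot(lx - cx, ly - cy) <= r:
                    kept.append(i)
            tiles[(tx // TILE, ty // TILE)] = kept
    return tiles

lights = [(8, 8, 10), (100, 100, 5)]
tiles = cull_lights(64, 64, lights)
print(tiles[(0, 0)])  # → [0]: only the first light reaches the top-left tile
```

This per-tile culling is what lets a deferred renderer handle thousands of dynamic lights: the expensive shading loop runs over a handful of lights per tile rather than the full light list.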
Square
Thomas Mann, Still; Daniel Szymanski; Andreas Rose, Framefield GmbH; Wolf Budgenhagen, Still
Square is a Demoscene project that harnesses the power of Mandelbox fractals and real-time raymarching to create a series of stunningly beautiful, ever-changing procedural environments. The cinematic experience is set to music by Wright and Bastard, but the true power of the techniques is revealed by hooking a Leap Motion controller up to the custom Tool2 software, allowing camera control and dynamic interaction with the fractals.
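Raymarching of the kind Square uses steps a ray forward by the value of a distance field until it lands on a surface (sphere tracing). The sketch below substitutes a simple sphere distance function for the project's Mandelbox distance estimator, purely to keep the example short; all constants are invented:

```python
import math

# Sphere tracing (raymarching) sketch: advance a ray by the distance
# field's value at the current point; that step is always safe because
# nothing in the scene is closer than that distance. Square marches a
# Mandelbox distance estimator; this toy uses a plain sphere SDF instead.

def sphere_sdf(p, center=(0.0, 0.0, 5.0), radius=1.0):
    """Signed distance from point p to a sphere (invented scene)."""
    return math.dist(p, center) - radius

def raymarch(origin, direction, sdf, max_steps=128, eps=1e-4, max_dist=100.0):
    """Return the distance along the ray to the surface, or None on a miss."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < eps:
            return t   # close enough: hit
        t += d         # safe step forward
        if t > max_dist:
            break
    return None        # miss

hit = raymarch((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), sphere_sdf)
print(hit)  # → 4.0: the ray hits the unit sphere centered at z = 5
```

Swapping the sphere SDF for a fractal distance estimator is all it takes to march arbitrarily intricate procedural geometry, which is why the technique is a Demoscene staple.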
Source: SIGGRAPH 2013