Written by Eric Post
Preston Smith began with SIGGRAPH as a volunteer 15 years ago and is the current Chair of the Emerging Technologies section. Preston is a network administrator. When he graduated from college, he thought seriously about getting involved in CG. The more he watched what the kids in this industry were doing with technology, the more interested he became in supporting that technology. Today, Preston is the Linux administrator at the Laureate Institute for Brain Research, where he supports the fMRI, or functional Magnetic Resonance Imaging, machine.
At Laureate, the doctors and staff study brain disorders such as eating disorders. The new idea is that the MRI runs while the patient is performing certain tasks, hence the "functional" portion of the test. This allows the researchers to see how normal or not-so-normal specific portions of the brain react when given tasks that target those locations. So Preston has a keen eye for emerging technology that has the potential to benefit people.
Preston is quite impressed with the openness of the inventors. Just as Linux is open for all to use and improve, the Emerging Technologies participants have openly discussed their work. For Preston, being Chair of Emerging Technologies at SIGGRAPH is like being a kid in a candy store.
This year, Preston limited the selections to those that were either entirely new or a substantial breakthrough in existing or cutting-edge technology.
The www.siggraph.org website has an attendees section with a link to Emerging Technologies. There you will find a trailer on YouTube about the participants and a brief description of each of the 22 exhibits that were selected.
For those still into animatronics, Lanny Smoot and Katie Bassette have a gimbaled, electromagnetically controlled glass eye. It is about 2 feet in diameter and can move faster than the human eye. It is not just for movies: Lanny and Katie envision this technology being used to make a prosthetic eye.
Acroban the Humanoid, designed by Olivier Ly and Pierre-Yves Oudeyer, is the first robot that can demonstrate playful interaction with children. Because it is specifically designed to engage children, it could be a great way to work at home and still have long periods of concentration at the computer. One might even find it of interest to hook up a MoCap device.
Gesture World Technology by Kiyoshi Hoshino, Motomasa Tomida, and Takanobu Tanimoto allows people to control computers and devices by gesture alone. The team hopes to see this technology in virtual games and remote controllers. Perhaps, if the keyboard is a limitation on the future of CG, this tool will help with the programming, compositing, modeling, and animation of the future.
Similarly, the team at the University of Tokyo came up with an In-air Typing Interface, which is targeted at mobile devices and provides vibration feedback. The hope is to replace the keyboard and reduce the need for physical space. Perhaps a heads-up display for texting? Again, this may be just the technology CG artists need to work more efficiently at the computer, without being AT the computer.
Another hands-free device comes from the team at National Taiwan University. Their Beyond the Surface device not only recognizes tabletop touch commands but, using IR technology, also recognizes movements above the surface. Architects might enjoy working with their buildings in 3D space.
Watching high-speed action movement of a CG character on screen is more and more common with the great MoCap tools available. So what about high-speed hand and finger movement? The team at the University of Tsukuba invented the Gesture-World Technology that can track finger movement and reduce occlusion problems. Possible applications include gesture-based computer operations and remote control, along with the idea of digitally archiving the hand and finger movements of artists. All, of course, without the need to attach sensors.
Subtle facial expression in movie making can make or break a character’s impact on an audience. Remember Jack Elam’s wild eye? Ever try to recreate that in a CG character rig? The team at the University of Southern California, Institute for Creative Technologies has a Head-Mounted Photometric Stereo for Performance Capture device that can be worn by the actor. This device can record minute and subtle facial movements, especially around the eyes and mouth, which are among the most difficult to rig.
Advances in biomedicine and robotics continue at SIGGRAPH 2010. This seems to be the year of remote capture and remote control.