NVIDIA shows the first interactive demo of an Isaac Lab-trained robot in NVIDIA Project Holodeck at SIGGRAPH 2017.
SIGGRAPH 2017 -- LOS ANGELES -- Playing dominoes with a simulator-trained robot. Computers can beat humans at chess and categorize images with superhuman accuracy. Now NVIDIA wants to show how to create machines smart enough to safely interact with us in our daily lives.
NVIDIA is presenting the first hands-on demo of Isaac, a robot trained with the NVIDIA Isaac Lab robot simulator, to show how simulation -- and virtual reality -- can help robots learn the much more nuanced task of interacting with people.
Skills like pouring a cup of coffee, caring for the elderly, performing surgery, or playing a game of dominoes are key to putting robotics to work in the lives of the world’s more than 7 billion people.
They’re skills too subtle to be taught by programmers banging out line after line of code. The demo at SIGGRAPH shows how AI can achieve this.
The NVIDIA demo lets you get hands-on with two technologies NVIDIA announced earlier this year at its GPU Technology Conference:
NVIDIA Isaac is an AI-enabled robot that has been trained using a powerful simulation environment called the Isaac Lab.
Project Holodeck -- a collaborative and physically accurate virtual reality environment -- lets humans enter a simulation and interact with robots the same way they will in real life.
Attendees will be able to see how these two technologies work together by interacting with Isaac in two ways: by going head-to-head with Isaac in the physical world on the show floor, and by strapping on a VR headset to enter a simulation via Project Holodeck.
Deep learning and computer vision have been combined to teach a robot to sense and respond to human presence, to identify the state of play in the game, to understand the game’s legal moves, and to determine which tile to select and how to place it.
The key: a pair of neural networks that help Isaac not only understand the game, but understand how to put that understanding to work when interacting with humans.
Using classification methods, the first neural network identifies the state of play from captured images of the domino tiles and determines the legal moves available in the game.
That data is then passed to a second neural network, which uses reinforcement learning to determine which tile to select and how to place it.
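The two-stage pipeline can be sketched in a few lines of Python. This is purely illustrative: the function names, data structures, and the hand-written scoring rule are assumptions standing in for NVIDIA's actual trained networks (a classifier for perception, an RL policy for move selection).

```python
# Hypothetical sketch of the two-stage pipeline described above.
# Names and data structures are illustrative, not NVIDIA's actual API.

def perceive_board(observation):
    """Stage 1 (stand-in for the classification network):
    map a captured observation to the legal moves available."""
    # In the real system a neural network classifies each domino tile
    # from camera images; here the observation is already a dict of
    # open board ends and (left, right) pip pairs in the robot's hand.
    board_ends = observation["board_ends"]   # open ends of the domino line
    hand = observation["hand"]               # tiles the robot holds
    # A tile is legal if either of its halves matches an open end.
    legal_moves = [
        (tile, end)
        for tile in hand
        for end in board_ends
        if end in tile
    ]
    return legal_moves

def choose_move(legal_moves, value_fn):
    """Stage 2 (stand-in for the RL policy network):
    score each legal (tile, end) pair and pick the best."""
    if not legal_moves:
        return None  # no legal move: the robot would draw or pass
    return max(legal_moves, key=value_fn)

# Toy usage: robot holds (3, 5) and (2, 2); the line's open ends are 3 and 6.
observation = {"board_ends": [3, 6], "hand": [(3, 5), (2, 2)]}
moves = perceive_board(observation)
# A learned value function would come from reinforcement learning;
# this placeholder simply prefers playing high-pip tiles.
best = choose_move(moves, value_fn=lambda move: sum(move[0]))
print(best)  # the (3, 5) tile placed on the open 3 end
```

In the real system both stages are learned from data rather than hand-coded, but the hand-off is the same: perception produces a discrete set of legal moves, and the policy ranks them.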
Once the robot is trained in the Isaac Lab environment, that knowledge can be deployed and transferred between the physical and virtual realms.
By developing and training robots in a simulated world and then working with those robots in a virtual reality environment like Project Holodeck, researchers can deploy them to the real world in a way that is safer, faster, and more cost-effective.
Learn more about NVIDIA Isaac here.
Beyond robots, Holodeck is an ideal platform for game developers, content creators and designers wanting to collaborate, visualize and interact with large, complex photoreal models in a collaborative VR space.
NVIDIA Project Holodeck is a highly realistic, multi-user VR environment that makes it easy for developers to import and interact with high-quality models, including iterating on and tuning a robot and its test methodology.
In addition to Isaac, NVIDIA is featuring the first hands-on demo at SIGGRAPH of the Koenigsegg Regera supercar in Holodeck. The Koenigsegg virtual car model is represented by more than 50 million polygons.
Trade show attendees will be able to change the color of the model, apply a clipping sphere to view hidden parts, and virtually explode the model to visualize complex assemblies.
The experience is highly collaborative, with participants in different physical rooms seeing and talking to each other in a shared virtual space.
The Holodeck early access beta program will be available to the public starting September 2017. Sign up here for updates on Project Holodeck.