Henry Turner delves into how JPL is using visual effects to advance science in the area of space exploration.
In Pasadena, at the Jet Propulsion Laboratory (JPL), there exists a true League of Extraordinary Gentlemen: teams of researchers and artists whose work is so fascinating that it expands the mind with ideas of the potential and possibilities of the Universe. Among the many aspects of the research conducted at JPL are computer-generated images of spacecraft and space, created to explain the nature of the missions and to serve as predictive tools in determining how missions will be planned and conducted. And much of the same technology used by motion picture visual effects artists is used, and in some cases was invented, by the artists and scientists at JPL.
A Brief History
Computer-generated imagery is nothing new at JPL. In 1977, Bob Holzman established the JPL CG lab, called DIAL (Digital Imaging Animation Lab). Holzman's colleague Ivan Sutherland brought in Jim Blinn, then a graduate student, who was ultimately responsible for many computer imaging innovations. Sutherland is quoted as once saying, "There are about a dozen great computer graphics people. Jim Blinn is six of them." In his days at JPL, Blinn created simulated flybys for the Voyager and Pioneer missions, and Galileo flybys of Jupiter, Saturn and their moons.
Today, DIAL is run by scientists and artists no less great than Blinn, and happy to be traveling the path he blazed. Among these researchers and artists are Dr. Eric De Jong, research director of DIAL; Dr. Vince Realmuto, supervisor of the Visualization and Scientific Animation group; Rob Baldwin, vfx artist at DIAL; and Zareh Gorjian, chief animator in Dr. Realmuto's group.
"This is actually the home of image processing," says Dr. De Jong. "In the early days JPL was literally crashing cameras into the moon to take pictures of it: the Ranger series. They would literally fire rockets to the moon, and on the way toward crashing send back signals." At first transmitted on AM radio waves, the eventual switch to FM allowed the images to be cleared of excessive visual noise. "So the first frame buffer was built here. The whole idea was to take these images and digitize them, and then from that you could actually remove some of the noise and get cleaner images."
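The digitize-then-clean idea De Jong describes can be sketched with a simple frame-averaging pass. This is a generic noise-reduction illustration, not JPL's actual pipeline; the array sizes and noise levels are invented for the demo:

```python
import numpy as np

def average_frames(frames):
    """Reduce random sensor noise by averaging repeated digitized exposures.

    Averaging N independent noisy frames of the same scene lowers the
    noise standard deviation by roughly a factor of sqrt(N).
    """
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    return stack.mean(axis=0)

# Demo: a flat "scene" of brightness 100 with Gaussian noise, sigma = 10.
rng = np.random.default_rng(0)
scene = np.full((64, 64), 100.0)
frames = [scene + rng.normal(0.0, 10.0, scene.shape) for _ in range(25)]
clean = average_frames(frames)

# Residual noise should be near 10 / sqrt(25) = 2 brightness units.
residual = np.std(clean - scene)
```

The same principle is why digitizing mattered: once frames are numbers rather than analog signal, simple statistics can separate a stable scene from random noise.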
In addition to this refinement, Dr. De Jong offers that other technologies were created to improve the images. Using radar, elevation maps were created. "In radar you can tell how high something is, and how far away it is in terms of the height. And if you drape an image over that, then you can put that in the computer and have a synthetic camera fly anywhere around it." Dr. De Jong also points out that the digital cameras used on space missions were invented by Caltech, a partner of JPL, for the Mariner series of missions. The cameras are built at JPL, in a basement lab kept free of vibration so the delicate components can be assembled. "JPL is the planetary mission center, and so for missions to Mars and Venus and so forth, most of the image processing is done here," Dr. De Jong adds.
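De Jong's drape-and-fly idea, laying an image over a radar elevation map and viewing it through a synthetic camera, can be sketched with a toy pinhole projection. The camera model, grid size and numbers below are invented for illustration and are not JPL's renderer:

```python
import numpy as np

def drape_and_project(height, texture, cam_pos, f=500.0):
    """Drape a texture over an elevation grid and project the surface
    points through a simple pinhole camera looking straight down.

    height, texture: 2-D arrays of the same shape (elevation, brightness).
    cam_pos: (x, y, z) synthetic-camera position above the terrain.
    Returns image-plane coordinates and the color each point carries.
    """
    rows, cols = height.shape
    y, x = np.mgrid[0:rows, 0:cols].astype(np.float64)
    # Each grid cell becomes a 3-D point carrying its texture value.
    pts = np.stack([x, y, height], axis=-1).reshape(-1, 3) - np.asarray(cam_pos)
    # Perspective divide: points farther below the camera project closer in.
    uv = f * pts[:, :2] / -pts[:, 2:3]
    return uv, texture.reshape(-1)

# Tiny 3x3 terrain with a 10 m bump in the middle, seen from 100 m up.
h = np.zeros((3, 3)); h[1, 1] = 10.0
tex = np.arange(9, dtype=np.float64).reshape(3, 3)
uv, colors = drape_and_project(h, tex, cam_pos=(1.0, 1.0, 100.0))
```

Moving `cam_pos` frame by frame is all a flyby animation needs at this level: the draped surface stays fixed while the synthetic camera travels around it.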
Baldwin also points out that texture mapping was invented on a device created by Blinn: Theres a lot of history, as far as CG goes, right here.
Digital cameras, improved methods of transmission, and tools such as texture mapping have allowed the public to see rendered images of deep-space exploration for the first time.
There is a famous acronym at JPL: WYSIWYG, or "what you see is what you get." WYSIWYG helps explain the critical need for visual aids in service to the planetary missions. As Dr. De Jong points out, "When you are going to launch a spacecraft to another planet, and you're using animation to help you understand what's going on, that's when you definitely want a WYSIWYG approach." It is 17 years since Dr. De Jong wrote his first proposal for the Cassini mission to Saturn, and the probe took seven years to reach the planet after its launch. Hence, computer graphics were used to explain the mission, its planning and its proposed outcome, to the 17-odd international groups associated with the mission.
Dr. De Jong is quick to point out that many of the techniques used to visually simulate space imagery, and to gather images in space, are first used for earth science. His colleague, Dr. Realmuto, is a volcanologist, a specialist in the study of volcanoes. In a fantastic presentation of time-lapse images taken of volcanoes from the air and by satellite, Dr. Realmuto showed this writer a number of graphics and animations revealing volcanic activity. Dr. Realmuto describes volcanoes as very large engines, filled with chemistry, solid rock and gases, and dynamic in that they push on the ground around them, creating interesting temperature features and hot cracks. In the graphics he presented, a landmass surrounding a volcano is seen to pulsate. These images of animated topography are photographed by satellite, and from myriad images certain frames are culled to create a time-lapse. Then, computer-generated material is used to fill in the time-lapse gaps, giving, in a few minutes, a record of the volcano over a period of years. Dr. Realmuto suggests that deciding which time-lapse shots to use is critical in creating an accurate record of the volcanic activity: "It takes a lot of experience to know how to pick time intervals." Other animated sequences show, in startling colors, the lava flow down a volcano toward a shoreline. These images also use time-lapse and computer-generated material in the gaps. "We morphed the shape of the flows over time, interpolating our animation with the data," Dr. Realmuto says.
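Realmuto's gap-filling step, generating frames between sparse satellite keyframes, can be illustrated with a minimal linear blend. A real morph also warps the flow outline over time; this crossfade, with invented arrays, only shows the interpolation idea:

```python
import numpy as np

def interpolate_frames(key_a, key_b, n_inbetweens):
    """Generate in-between frames by linearly blending two keyframe images.

    A toy stand-in for the morphing/interpolation used to fill gaps
    between satellite observations taken years apart.
    """
    a = np.asarray(key_a, dtype=np.float64)
    b = np.asarray(key_b, dtype=np.float64)
    # Blend weights from 0 to 1, including both keyframes themselves.
    ts = np.linspace(0.0, 1.0, n_inbetweens + 2)
    return [(1.0 - t) * a + t * b for t in ts]

# Two 4x4 "observations" three in-betweens apart: values fade 0 -> 8.
frames = interpolate_frames(np.zeros((4, 4)), np.full((4, 4), 8.0), 3)
```

Choosing which real observations serve as keyframes is the judgment call Realmuto describes; the interpolation itself is mechanical once the keyframes are picked.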
The earth-bound use of the equipment gives an idea of how it will perform in space. As Dr. De Jong insists, "When it comes to planetary science we first need to use our knowledge of the earth to predict what we are going to see. And we actually transport our knowledge based on earth geology and earth climate systems, and take that with us when we send our robots out to other planets."
Computer graphic pre-planning is also used in setting the controls of the spacecraft for landing. Dr. De Jong explains: "Light may take eight to 11 minutes to travel from Mars, so what a craft is doing during landing is determined by the preprogramming. Of course, if it's landing at Titan, now you're talking about 90 minutes of light-travel time. You cross your fingers."
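The figures De Jong quotes follow directly from distance divided by the speed of light; the Earth-Mars distances below are round approximations for the sake of the arithmetic:

```python
# One-way light-travel time from distance and the speed of light.
C_KM_PER_S = 299_792.458  # speed of light in km/s

def light_time_minutes(distance_km):
    """One-way signal delay in minutes over the given distance."""
    return distance_km / C_KM_PER_S / 60.0

# Earth-Mars separation varies from roughly 55 million km at close
# approach to roughly 400 million km with the Sun between the planets.
near = light_time_minutes(55e6)    # about 3.1 minutes
typical = light_time_minutes(150e6)  # about 8.3 minutes
far = light_time_minutes(400e6)    # about 22.2 minutes
```

A round trip doubles these delays, which is why a landing sequence must run entirely on preprogrammed logic: no command from Earth can arrive in time to correct it.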
LightWave, The Tool of Choice
Gorjian points out that creating CG animation for JPL is done on a fast timeframe comparable to television. When an image comes down from Cassini or one of the Rovers, the scientists are anxious to get a press conference going and share what's happening with the public. Hence there is very little time to use the real data before the finished animation is presented to the media by the scientists. "The tool of choice for us is LightWave," Gorjian says, "and I don't think it's a coincidence that if you look at the Sci Fi Channel on Fridays, all four of their shows, Battlestar Galactica, Stargate, Atlantis and Andromeda, use LightWave. It's a tool that's so well suited for a fast-paced environment with quick turnarounds. We are doing spacecraft animations, with rigid bodies; we're not doing character animation. And that's where LightWave shines."
Baldwin says that the images are created for another important reason: "The Saturn orbit insertion animations were done for a celebration. This mission had been going on for close to 20 years. A lot of people devoted large chunks of their career, if not their entire career, to the mission." Large presentations are given at the JPL Space Flight Center and simulcast at university auditoriums, to celebrate the mission for the participating scientists and their families. Of course, the international media are also on hand to capture the moment, the images, the science, the excitement and tension. Baldwin says, "It makes for good television: people spending huge chunks of their lives, billions of dollars in some cases, on something that is inherently risky."
This writer was shown a presentation of animated sequences of Cassini's orbit insertion, its orbit and the launching of the Huygens probe to Titan, as well as numerous other images and sequences. The Cassini sequences were short, but each presented a significant detail of the mission. One sequence shows Cassini, after seven years of interplanetary travel, moving up to a fully synthetic Saturn based on data from Hubble, Cassini and even the Voyager probes. Fine detail is a highlight of the animation. Moons float in space. The colorization and texture of Saturn, with its rings and ring shadows, are majestic in conveying silence and immensity. The animated Cassini spacecraft is based on actual telemetry data from Cassini itself. Gold foil on Cassini reflects light from Saturn. Details of the engines are based on test footage from the White Sands Missile Range; the ceramic exhaust funnels glow red and quickly cool. The animation shows shades and variations of the planet's atmosphere, with detail of the sunlight diffusing and tapering to darkness along the planet's rim. In Saturn's case, light bounces off the rings, throwing patterns of light and shadow across the planet's surface.
Another animation shows Cassini soaring across the rings. Baldwin explains the presence of a graph on the side of the image that records the number of particle hits on the spacecraft. "Even though the area it was going through was fairly free of particles, comparable to two particles of smoke in a large room, it actually did hit quite a few." As Cassini moves, it turns slightly, using its radar dish antenna as a shield against the particles. Baldwin, who as a vfx artist has worked on numerous television shows and feature films, is proud of the star field background. "It's completely accurate, using the Tycho-2 star catalogue. I've done quite a few space animations at quite a few different companies, and I think we've got the best star fields, and the motion blur is perfect."
Certain images of the surface of Mars are in digital 3D: the spectator wears 3D glasses and sees incredibly detailed surface textures. An abrasion tool mounted on a Rover grinds away the varnished outer layer of Martian rock so a spectrum of the soil beneath can be taken. Other animations show theoretical views of craters filled with water, following the hypothesis that Mars was once covered with bodies of water.
A great 3D-animated sequence using real images from the Hubble Space Telescope shows one of the fragments of a comet hitting Jupiter and carbonizing the top of the atmosphere, the carbonized area as big as the planet Earth. Another shows a gaseous geyser 300 kilometers high.
The Mars Reconnaissance Orbiter, to be launched in August, will take high-resolution stereo images of Mars.
Space is Endless
We know now that JPL shares technologies with vfx, and Hollywood itself is getting in on the act. "You are starting to see people like James Cameron, who are very interested in the space program," Baldwin states, "and he is going to be a co-investigator on one of the cameras for the next Martian rover."
Dr. De Jong concludes, "Our connection with visual effects has a long history. We're trying to make sure we make the best use of it. In the future, visual effects will give people the chance to visit a planet before anyone actually lands on the planet. And this will be a precursor for a long time to come. And we certainly use the best of what Hollywood has to offer to help us tell the story of what it's like to go into space."
Henry Turner is a writer and award-winning filmmaker whose Lovecraft-inspired horror feature, Wilbur Whateley, won top awards at the Chicago International Film Festival. His writing on film has appeared in the Los Angeles Times, L'Écran Fantastique, Variety and many other publications. A longtime film festival executive, he has programmed for the Slamdance Film Festival and currently heads FilmTraffick L.A.