Bill Desowitz gets the lowdown on crowd duplication in Moneyball from Edwin Rivera of Rhythm & Hues.
Moneyball, the critically acclaimed biopic starring Brad Pitt as iconoclastic Oakland A's general manager Billy Beane, was tailor-made for Edwin Rivera, visual effects supervisor at Rhythm & Hues. He most enjoys movies where the effects blend into the backgrounds and don't take attention away from the foreground action, which was perfect because Moneyball was all about virtually filling the seats of the Oakland Coliseum.
In fact, Moneyball offered some new digital twists. Rather than shooting in the traditional way, with 300 extras filling one section and then being moved to the next, Rivera devised a more agile solution suited to the Coliseum's bowl shape, which necessitated shooting from many different angles.
"So what we did was," Rivera explains, "we shot 50 different people at four different angles: 90 degrees, 66 degrees, 45 degrees and 22 degrees. That gave us coverage for every person. We shot them in pairs and isolated each person on a 1K card. We also shot them with seven different emotions at the exact same time on a blue plate for 20 seconds: sitting bored, the Wave, sitting down again, seated clapping, home run, standing and clapping, sitting back down and booing. Let's say, for example, frames 1,000-1,500: we knew that every single person from every single angle would be sitting down bored. And from frames 2,250-2,750 they would be jumping up and screaming as if it were a home run.
"We shot on HD, which worked out really well. That allowed us to have the same color, and we could import them all using our Infernos, so we didn't have to get them scanned or processed. And the quality held up better than we hoped. By default we would render the whole stadium and they would cull the people that they needed. And there were a couple of shots where we realized that if we used people for only a third of a shot, we could easily get away with it because they were all shot at the correct angle."
Rivera underscores that their system was very agile and flexible. It had to be, because if you look at real footage of a baseball stadium, the crowd is never doing the same thing at the same time. "So we could go in and say, 'I want [so many] of the people sitting down looking bored, [so many] clapping, [so many] standing and clapping and [so many] booing,'" Rivera adds. "And then we'd go and render it and randomly take those people and populate it and say, 'Let's lose that person, and lose that person.' And the turnaround was really quick. The other nice thing is that since we had tracked it all, and they were CG people, the compositors never had to track one frame. Typically, that's the biggest nightmare on any crowd duplication show. We got parallax, which you never get with traditional plates. And there were no crazy camera moves. It was rooted more in a traditional way of filming it."
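The populate-and-cull step Rivera describes can be sketched in outline. This is a hypothetical illustration, not Rhythm & Hues' actual pipeline code: only the two frame ranges Rivera cites (1,000-1,500 for sitting bored and 2,250-2,750 for the home run) come from the article, and every other range, name and quota below is invented for the example.

```python
import random

ANGLES = [90, 66, 45, 22]   # the four shooting angles, per the article
PERFORMERS = range(50)      # 50 different people were filmed

# Every performer, at every angle, performed the same emotion in the same
# frame window. Only the first and fifth ranges are from the article;
# the rest are placeholders.
EMOTION_FRAMES = {
    "sitting_bored":     (1000, 1500),  # cited by Rivera
    "wave":              (1510, 1750),  # hypothetical
    "sitting_down":      (1760, 1990),  # hypothetical
    "seated_clapping":   (2000, 2240),  # hypothetical
    "home_run":          (2250, 2750),  # cited by Rivera
    "standing_clapping": (2760, 3000),  # hypothetical
    "booing":            (3010, 3250),  # hypothetical
}

def populate(seats, quotas):
    """Randomly assign a performer, angle and emotion clip to each seat.

    quotas maps emotion -> desired head count, mirroring the director's
    "[so many] bored, [so many] clapping" requests. Culling a person is
    just deleting their seat entry afterwards.
    """
    pool = [e for e, n in quotas.items() for _ in range(n)]
    random.shuffle(pool)
    crowd = {}
    for seat, emotion in zip(seats, pool):
        crowd[seat] = {
            "performer": random.choice(list(PERFORMERS)),
            "angle": random.choice(ANGLES),  # in practice, picked by camera
            "frame_range": EMOTION_FRAMES[emotion],
        }
    return crowd
```

Because every clip is already synchronized to the shared frame windows, swapping a seat's emotion is just a change of frame range, which is what made the turnaround so quick.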
They could also cheat people even if the angle dictated that they were looking, say, into the outfield: they could take 50% of the people shot at 90 degrees who look like they're watching the action happening right by camera. Rhythm & Hues used its in-house tracking software, rendered in Houdini's Mantra and composited in Icy, its in-house compositing software.
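With only four filmed angles covering a continuous bowl, each seat presumably gets the clip whose shooting angle is closest to that seat's angle to camera. A minimal sketch of that snapping step, with the function name and angle convention assumed for illustration:

```python
def nearest_shot_angle(view_angle, shot_angles=(90, 66, 45, 22)):
    """Snap a seat's angle to camera (in degrees) to the closest of the
    four angles the extras were actually filmed at."""
    return min(shot_angles, key=lambda a: abs(a - view_angle))
```

The "cheat" Rivera mentions is then just relaxing this rule, letting some seats reuse the 90-degree material even when geometry says they should face elsewhere.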
The overall significance of this system is that it allowed helmer Bennett Miller to direct the crowd: a little more cheering here; more people standing up; make it a little sparser toward the outfield and denser toward home plate. They had total control.
However, there were lighting challenges, as you might expect with Oscar-winning cinematographer Wally Pfister (Inception) shooting Moneyball, so Rhythm & Hues had to come up with a lighting system as well.
"He lit the stadium much differently than you would normally see a stadium," Rivera admits. "Usually at night you're lit pretty much everywhere, but he lit it much more stylistically, where one side was completely lit and the other side was completely dark. So on any given shot that could change based upon where the camera was or based upon how he changed his lighting. We not only had to create people that would work for the different angles, but we also had to create lighting for every angle. So, using Houdini, we wrote a shader that, based upon the light distance and intensity, created a matte offset for the 2.5D people.
"Let's say that we had a huge bank of lights at 45 degrees to the screen right of the character. The matte offset would then scoot the matte down and to the left ever so slightly so we had a rim matte that the compositors would then use to keep the original lighting or brighten up the 2.5D people."
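The matte-offset trick can be sketched as a small function: shift the character's matte away from the light, so that subtracting the shifted matte from the original leaves a thin rim on the lit edge for the compositors to grade. This is a rough sketch, not the actual Houdini shader; the inverse-square falloff and all names are assumptions.

```python
import math

def rim_matte_offset(light_angle_deg, intensity, distance, scale=2.0):
    """Return the (dx, dy) pixel offset for a 2.5D person's matte.

    Screen coordinates: +x is screen right, +y is up. A light at 45
    degrees up and to screen right therefore scoots the matte down and
    to the left, as Rivera describes. Falloff is a simple inverse-square
    assumption scaled by intensity.
    """
    falloff = intensity / (distance ** 2)
    dx = -math.cos(math.radians(light_angle_deg)) * scale * falloff
    dy = -math.sin(math.radians(light_angle_deg)) * scale * falloff
    return dx, dy
```

The difference between the original matte and this offset copy is the rim matte the compositors would use to keep or brighten the lit edge of each 2.5D person.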
This also worked for daytime shots, allowing them to "play ball" in as believable a manner as possible.
Bill Desowitz is former senior editor of AWN and editor of VFXWorld. He has a new blog, Immersed in Movies (www.billdesowitz.com), and is currently writing a book about the evolution of James Bond from Connery to Craig, scheduled for publication next year, which is the 50th anniversary of the franchise.