Bluescreen vs Greenscreen - How to choose
Just a few days ago we held a press conference with some 120 reporters from all over the world attending. Whenever we show footage or before-and-after shot breakdowns, one question inevitably comes up: what's the difference between bluescreen and greenscreen (apart from the obvious, that is)? Or rather: why and how do you choose green over blue?
The first and most obvious reason is the foreground object. Since the screen color is the color the compositing software removes, it's not a good idea to shoot, say, Kermit in front of a greenscreen. Or the Na'vi from Avatar, were they real, in front of a bluescreen.
In practice, though, that's a lesser reason today.
First, those clear-cut cases are rare. Usually we're tasked with extending live-action sets or creating fully digital environments, in which dozens of people, sometimes cars, plants, etc. appear in front of the chroma screen. And while we always work closely with the costume, set dressing and props departments to determine "illegal colors" early on, you can never entirely exclude every hint of blue or green.
Second, it's a scheduling and budgeting question, even on high-budget movies. For example, on "2012" we were shooting on 17 different stages and exterior sets in Vancouver. In each of those locations, sets were constantly constructed, shot and torn down again to make room for the next set. The bluescreen material we used cost several hundred thousand dollars, and it would have cost a multiple of that if we hadn't re-used most of it on different sets. Try to do that with both bluescreen and greenscreen material and you run into enormous scheduling problems, since the usage of these sets depends on a multitude of factors that usually change during the shoot: actor availability, construction delays, design changes, set-dressing turnaround times, weather rescheduling, or delays in shooting other sets.
Third, and this is the good news: the (fairly) new generation of keyers, e.g. Ultimatte, Primatte or Keylight, are not just chroma keyers. They don't rely solely on differences in chrominance (color) but also on luminance (brightness) and other cool mathematical stuff that I can't hope to understand. And that makes it possible, as in this test we did for our current movie, to put a green hedge in front of a greenscreen without problems.
In this case, the luminance of the greens in the hedge and in the screen differs enough to yield a good key. This is the green channel only:
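To make the hedge-in-front-of-greenscreen idea concrete, here is a minimal toy sketch of a keyer that weighs luminance as well as chrominance. All pixel values and thresholds are made up for illustration; this is not the actual math inside Ultimatte, Primatte or Keylight.

```python
# Toy key: a pixel is removed only if BOTH its brightness-normalized color
# (chroma) and its luminance match the backing. A dark green hedge shares
# the screen's chroma but not its luminance, so it survives the key.

def luma(r, g, b):
    # Rec. 709 luma weights: green dominates, one reason the green
    # channel matters so much for keying.
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def chroma(r, g, b):
    # Brightness-normalized color: the same hue at half the brightness
    # yields (almost) the same chroma triple.
    s = (r + g + b) or 1
    return (r / s, g / s, b / s)

def matte(pixel, backing, chroma_tol=0.1, luma_tol=40.0):
    """Return 1.0 (keep as foreground) or 0.0 (key out); hard-edged for clarity."""
    chroma_dist = sum(abs(p - b) for p, b in zip(chroma(*pixel), chroma(*backing)))
    luma_dist = abs(luma(*pixel) - luma(*backing))
    # Keyed out only when color AND brightness both match the backing:
    return 0.0 if chroma_dist < chroma_tol and luma_dist < luma_tol else 1.0

backing = (40, 180, 50)               # bright greenscreen (8-bit RGB)
print(matte((45, 175, 55), backing))  # screen pixel: keyed out -> 0.0
print(matte((20, 90, 25), backing))   # dark hedge, same hue: kept -> 1.0
```

A pure chroma keyer would remove the hedge pixel too, since its normalized color is identical to the screen's; the luminance term is what saves it.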
So the more important reason nowadays, as digital cameras increasingly take over from film, is the sensitivity and processing of the color channels of the camera you're shooting with. I say sensitivity and processing (sampling) because these are two distinctly different factors: one affects the noise level, the other the actual resolution of that channel. Both are important for good keying results.
The green channel is the cleanest channel in most digital cameras today. Green contributes the most luminance of the three (red, green and blue) channels, and thus the sensors deliver the least noise in that channel. The processing is three-fold: Bayer-pattern filtering (which occurs in single-sensor CMOS/CCD cameras, but not 3-CCD cameras), DSP (digital signal processing), and the processing in the actual recording format.
A Bayer pattern means the sensor has a color-filter arrangement on its pixel array that records twice as many green samples as red or blue. So the actual sampling resolution of the green channel is double that of the other channels.
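The two-to-one green sampling is easy to verify by just counting photosites in the repeating 2x2 RGGB tile. A small sketch (the 1920x1080 sensor size is an assumption for illustration):

```python
# Count photosites per color for an RGGB Bayer mosaic: each 2x2 tile
# holds one red, one blue, and two green filters, so green ends up with
# half of all samples and red/blue a quarter each.

def bayer_counts(width, height):
    """Tally photosites per color for an RGGB mosaic of the given size."""
    pattern = [["R", "G"],
               ["G", "B"]]
    counts = {"R": 0, "G": 0, "B": 0}
    for y in range(height):
        for x in range(width):
            counts[pattern[y % 2][x % 2]] += 1
    return counts

# A hypothetical 1920x1080 single-sensor camera:
print(bayer_counts(1920, 1080))
# green gets twice as many samples as red or blue
```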
Digital signal processing, usually referred to as the "matrix", can (and should) be turned off in some cameras. In others it cannot be completely disabled, but at least the feature most detrimental to keying, image sharpening, can be turned off.
The recording format is another factor. If you (still) have the luxury of recording 100% uncompressed, which is currently only possible by recording "raw" data or 4:4:4 via dual HD-SDI directly to a hard-disk array or hard-disk recorder like the Codex or S2, you get the best possible recording quality. The next best option is Sony's HDCAM SR format, at 880 Mb/s (approximately 2:1 compression) or 440 Mb/s (approximately 4:1 compression). If you're really adventurous and shooting 24 fps on a Canon 5D still camera, it records in the H.264 codec (also used on Blu-ray discs) to internal CF storage, which adds a lot of compression (about 40:1 in my tests) on top of the DSP artifacts. While it delivers a picture that is still amazing, it poses quite some challenges for keying. Below is an example of an exterior greenscreen I recently shot with the 5D Mark II.
Here, the red channel shows interesting artifacts, most likely stemming from edge-sharpening algorithms inside the camera's DSP in combination with H.264 compression:
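The compression ratios above can be sanity-checked with back-of-the-envelope arithmetic. This sketch counts active pixels only (no blanking, audio or metadata), so the ratios land a bit below the commonly quoted figures; the 10-bit depth and 24 fps are assumptions for illustration.

```python
# Rough data rates for the recording options discussed above.

def uncompressed_mbps(width, height, bits_per_sample, samples_per_pixel, fps):
    """Raw video data rate in megabits per second (1 Mb = 10**6 bits)."""
    return width * height * samples_per_pixel * bits_per_sample * fps / 1e6

# 10-bit 4:4:4 RGB at 1080p24, as on a dual-link HD-SDI "uncompressed" path:
raw = uncompressed_mbps(1920, 1080, 10, 3, 24)
print(f"uncompressed 4:4:4: {raw:.0f} Mb/s")
print(f"vs. 880 Mb/s: {raw / 880:.1f}:1, vs. 440 Mb/s: {raw / 440:.1f}:1")
```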
Unfortunately, greenscreen has a few disadvantages. Due to green's high luminance, green spill (i.e. light bounced and reflected off the greenscreen) is quite a bit stronger than with bluescreen. Color correction also tends to be easier on bluescreen footage than on greenscreen footage. If your subjects have blond hair, for instance, it easily turns reddish after the green spill is removed in post.
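The reddish-hair artifact falls straight out of one classic despill heuristic: clamping green to the average of red and blue. A minimal sketch with made-up 8-bit pixel values (real despill tools offer more nuanced controls):

```python
# Simple average-based despill: green may never exceed the red/blue mean.
# This removes the green cast bounced off the screen, but on warm colors
# like blond hair it leaves red dominant, hence the reddish hue shift.

def despill_average(r, g, b):
    """Limit green to the red/blue average (one common despill heuristic)."""
    return r, min(g, (r + b) / 2), b

blond_with_spill = (210, 190, 120)   # blond hair picking up green bounce
print(despill_average(*blond_with_spill))
# green is clamped to (210 + 120) / 2 = 165, shifting the hue toward red
```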
So, in conclusion: I would never go with the "flavor of the day". I've had people tell me "I thought nobody uses bluescreen anymore" or "I've only been shooting greenscreen for the last six years". Well, that doesn't necessarily make it right. All the factors I talked about play out differently on every production.