Why Are Images From Space Probes Always in Black and White?


Straight Dope

Why are images from our space program always in grayscale instead of color? I know NASA needs to extract data from those images, and I also know the cameras aren’t $9.99 specials from the corner drugstore. But couldn’t NASA just stick a plain old color digital camera on board and send it to Mars along with the rest of the equipment?
— Buster Blocker, Bettendorf, Iowa


They’ve thought about it, actually. But the truth is, we’re probably better off the way things are.

To find out about space cameras, we got in touch with Noam Izenberg, a planetary scientist working on the MESSENGER probe, which is now circling Mercury taking pictures. He told us there are basically two reasons space photography is mostly in black and white. The first, as you rightly suppose, is that grayscale images are often more useful for research.

At heart, most digital cameras, cheap Walmart models no less than the custom-built jobs on space probes, are monochrome, or more accurately panchromatic. Each pixel-sized receptor in a digital camera's sensor is basically a light bucket; unmodified, the receptors' combined output is simply a grayscale image generated from all the light in the visible spectrum, and sometimes beyond.

To create a color image, each pixel in a typical earthbound camera has a filter in front of it that passes red, green, or blue light, and the camera's electronics add up the results to create the image we see, much as a color TV does. In effect, filtering dumbs down each panchromatic pixel so that it registers only a fraction of the light it's capable of seeing. Granted, the human eye works in roughly the same way. The fact remains that in an earthbound camera, some information is lost.
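For the curious, here's a minimal sketch of that per-pixel filtering: a standard Bayer-style mosaic, written in Python with an invented toy scene. The pattern and array sizes are illustrative, not taken from any particular camera.

```python
import numpy as np

# A hypothetical "true" scene with full color information at every pixel:
# rows x columns x (R, G, B) channels.
scene = np.random.rand(4, 4, 3)

# Behind a Bayer-style filter mosaic (assumed RGGB here), each sensor pixel
# records only ONE of the three channels; the rest of the light is thrown away.
mosaic = np.zeros((4, 4))
mosaic[0::2, 0::2] = scene[0::2, 0::2, 0]  # red-filtered pixels
mosaic[0::2, 1::2] = scene[0::2, 1::2, 1]  # green-filtered pixels
mosaic[1::2, 0::2] = scene[1::2, 0::2, 1]  # green-filtered pixels
mosaic[1::2, 1::2] = scene[1::2, 1::2, 2]  # blue-filtered pixels

# The camera's electronics then interpolate ("demosaic") neighboring values to
# rebuild a full-color image, which is exactly where information gets lost.
```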


Space cameras are configured differently. They’re designed to measure not just all visible light but also the infrared and ultraviolet light past each end of the visible spectrum. Filtering is used primarily to make scientifically interesting details stand out. “Most common planetary camera designs have filter wheels that rotate different light filters in front of the sensor,” Izenberg says. “These filters aren’t selected to produce ‘realistic’ color that the human eye would see, but rather to collect light in wavelengths characteristic of different types of rocks and minerals,” to help identify them.

True-color images — that is, photos showing color as a human viewer would perceive it — can be approximated by combining exposures shot through different visible-light filters in certain proportions, essentially mimicking what an earthbound camera does. However, besides not inherently being of major scientific value, true-color photos are a bitch to produce: all the variously filtered images must be separately recorded, stored, and transmitted back to Earth, where they're assembled into the final product. An 11-filter color snapshot really puts the squeeze on onboard storage and eats up significant transmission time.
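As a rough illustration (not MESSENGER's actual processing pipeline), combining filtered exposures into an approximate true-color picture amounts to weighting and stacking grayscale frames. The filter wavelengths and weights below are invented for the sketch:

```python
import numpy as np

# Hypothetical grayscale exposures shot through three visible-light filters,
# each a 2-D array of brightness values (sizes and data are placeholders).
red_frame   = np.random.rand(512, 512)   # e.g. a filter near 630 nm
green_frame = np.random.rand(512, 512)   # e.g. a filter near 550 nm
blue_frame  = np.random.rand(512, 512)   # e.g. a filter near 480 nm

# Assumed calibration weights chosen so the blend roughly matches what a
# human eye would see; real missions derive these from instrument calibration.
weights = (1.0, 0.9, 1.1)

# Stack the weighted frames into the R, G, and B channels of one image.
true_color = np.dstack([
    weights[0] * red_frame,
    weights[1] * green_frame,
    weights[2] * blue_frame,
])
true_color = np.clip(true_color, 0.0, 1.0)  # keep values in displayable range
```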

Given limited opportunities, time, and bandwidth, a better use of resources often is a false-color image — for example, an infrared photo of rocks revealing their mineral composition. At other times, when the goal is to study the shape of the surface, measuring craters and mountains and looking for telltale signs of tectonic shifts or ancient volcanoes, scientists want black-and-white images at maximum resolution so they can spot fine detail.

Terrific, you say. But don’t scientists realize the PR value of a vivid color photo?

They realize it all right. But that brings up the second reason most NASA images aren’t in color. The dirty little secret of space exploration is that a lot of the solar system, and for that matter the cosmos, is pretty drab. “The moon is 500 shades of gray and black with tiny spatterings of greenish and orangish glass,” Izenberg says. “Mars is red-dun and butterscotch with white ice at the poles. Jupiter and glorious Saturn are white/yellowish/brown/reddish. Hubble’s starscapes are white or faintly colored unless you can see in the infrared and ultraviolet.”

As for Mercury, Izenberg’s bailiwick, NASA has posted on its website detailed color photos showing vast swaths of the planet’s surface. If the accompanying text didn’t tell you they were true-color, you’d never know.

False-color images are often a lot more interesting. The colors aren’t faked, exactly; rather, they’re produced by amplifying modest variations in the visible spectrum and adding in infrared and ultraviolet. Some of the less successful examples look like a Hare Krishna tract, but done skillfully the result can be striking. The spectacular full-color nebula images from the Hubble Space Telescope were all produced by black-and-white sensors with color filters.
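A false-color composite works much the same way, except the channels needn't correspond to colors the eye can see. Here's a sketch of one common approach, stretching faint variations and mapping infrared and ultraviolet frames onto display colors; the data and channel assignments are made up for illustration:

```python
import numpy as np

def stretch(band):
    """Linearly rescale a band so its faint variations span the full 0-1 range."""
    lo, hi = band.min(), band.max()
    return (band - lo) / (hi - lo + 1e-9)

# Hypothetical filtered frames: one infrared, one visible, one ultraviolet.
infrared    = np.random.rand(512, 512)
visible     = np.random.rand(512, 512)
ultraviolet = np.random.rand(512, 512)

# Assign wavelengths the eye can't see to colors it can: infrared shows up as
# red, visible light as green, ultraviolet as blue. The contrast stretch is what
# makes modest differences in surface composition pop out.
false_color = np.dstack([stretch(infrared), stretch(visible), stretch(ultraviolet)])
```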

For what it’s worth, some colleagues of Izenberg’s a few years ago floated the idea of doing as you suggest — putting an off-the-shelf digital camera on a probe in addition to the more expensive models. The idea didn’t get off the ground, as it were, partly out of concerns the camera wouldn’t survive the extreme temperatures of space. But chances are the raw results wouldn’t have been all that impressive anyway. Experience suggests a good space photo needs a little . . . eh, don’t call it show biz. Call it art.

— CECIL ADAMS

Send questions to Cecil via straightdope.com or write him c/o Chicago Reader, 350 N. Orleans, Chicago 60654. Subscribe to the Straight Dope podcast at the iTunes Store.
