How Do Space Pictures Get So Pretty?
Photoshop, of course.
NASA released a stunning set of photographs from its newly renovated Hubble Space Telescope on Wednesday. In 2005, Daniel Engber explained the process by which the Hubble's gray-scale photographs become dramatic colorized images. The original article is reprinted below.
A picture taken by the Spitzer Space Telescope was released on Monday; the image, which depicts the birth of 100,000 stars in a far-away gas cloud, shows a splotchy shape in light red, set against a background of speckled blue-white stars and olive mist. How do these photographs get to be so pretty?
Teams of specialists on the ground gussy them up for public consumption. Here's how it works: Telescopes like the Spitzer and the Hubble take black-and-white pictures using different filters to capture particular wavelengths of light. (The image released this week is a composite created from four shots of the same thing.) Then these pictures are sent back to Earth via the Deep Space Network, a set of large antennas positioned around the world.
For the Hubble telescope, the image files can be up to 70MB in size, with a resolution of 16.7 megapixels. Data is downloaded from the telescope at a speed comparable to that of a good Internet connection.
Once the images are on the ground, scientists can look at them in the FITS ("Flexible Image Transport System") file format, a standard protocol used among astronomers. For analysis, most scientists use the data in this form—as gray-scale images representing light at different wavelengths.
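One notable thing about FITS is that each file opens with a human-readable header: 80-character ASCII records ("cards") padded out to 2,880-byte blocks, ending with a card that reads END. The sketch below parses a few cards from a tiny hand-built header; it is a simplified illustration (real FITS readers handle strings, continuation cards, and more), and the header values are hypothetical.

```python
# Minimal sketch of reading FITS header "cards" -- a simplified illustration,
# not a full parser. Each card is 80 ASCII characters; the header is padded
# to a multiple of 2880 bytes and ends with an "END" card.

def parse_fits_header(raw: bytes) -> dict:
    """Collect keyword/value pairs from a FITS header block."""
    cards = {}
    for i in range(0, len(raw), 80):
        card = raw[i:i + 80].decode("ascii")
        keyword = card[:8].strip()
        if keyword == "END":
            break
        if card[8:10] == "= ":  # value-indicator columns per the FITS standard
            value = card[10:].split("/")[0].strip()  # drop any inline comment
            cards[keyword] = value
    return cards

# A tiny hand-built header describing a 16-bit, two-axis image (hypothetical).
header = b"".join(c.encode("ascii").ljust(80) for c in [
    "SIMPLE  =                    T",
    "BITPIX  =                   16",
    "NAXIS   =                    2",
    "NAXIS1  =                 4096",
    "NAXIS2  =                 4096",
    "END",
]).ljust(2880)

info = parse_fits_header(header)
print(info["BITPIX"])  # 16 bits per pixel, i.e. 65,536 gray levels
```

The BITPIX card is what tells a reader how many gray levels the raw data can hold—which is exactly the range that gets squeezed down in the next step.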
To create an image suitable for public viewing, the scientists send the FITS files over to a public outreach team. Specialists on the team—who tend to be astronomers with graduate degrees and a passion for graphics and photography—begin the process of converting the information into the images sent out in press releases.
First, they put the image into a file format appropriate for media. That means the data from the FITS files, which show a range of about 65,000 shades of gray, must be squeezed into a standard JPEG or TIFF file, with only 256 shades. This process is counterintuitively called "stretching" the data and must be done carefully to preserve important features and enhance details in the finished product.
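The squeeze from roughly 65,000 gray levels down to 256 can be sketched as a rescaling of pixel values. The version below is a plain linear mapping with adjustable black and white points; real pipelines often use logarithmic or other nonlinear stretches so faint detail isn't crushed to black, and the pixel values here are made up for illustration.

```python
# Sketch of "stretching" 16-bit pixel values (0-65535) into 8-bit (0-255).
# A straight linear rescale with clipping; real pipelines often apply
# nonlinear curves to keep faint features visible.

def stretch_linear(pixels, lo=0, hi=65535):
    """Map raw values in [lo, hi] onto display values in [0, 255],
    clipping anything outside the chosen range."""
    span = hi - lo
    return [round(255 * (min(max(p, lo), hi) - lo) / span) for p in pixels]

raw = [0, 1000, 32768, 65535]          # hypothetical raw pixel values
print(stretch_linear(raw))             # full-range stretch
print(stretch_linear(raw, lo=1000, hi=30000))  # narrower window: more contrast
```

Choosing the lo/hi window is where the care comes in: a narrow window brightens dim structure but saturates the highlights, which is why the stretch must be tuned per image.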
Then each gray-scale image is assigned a color. In reality, each shot already represents a color—the wavelength of light captured by the filter when that picture was taken. But in some cases the images represent colors that we wouldn't be able to see. (The Spitzer, for example, registers the infrared spectrum.) To create a composite image that has the full range of colors seen by the human eye, an astronomer picks one image and makes it red, picks another and makes it blue, and completes the set by coloring a third image green. When he overlays the three images, one on top of the other, they produce a full-color picture. (Televisions and computer monitors create color in the same way.)
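The overlay step amounts to zipping three aligned gray-scale exposures together so that each pixel's red, green, and blue values come from the three filters. Here's a minimal sketch with tiny made-up 2-by-2 exposures (flattened to lists); the names and pixel values are illustrative, not real telescope data.

```python
# Sketch of compositing three aligned gray-scale exposures into one
# full-color image: one exposure feeds the red channel, one the green,
# one the blue. Pixel values are 0-255.

def composite_rgb(red_img, green_img, blue_img):
    """Zip three same-sized gray-scale images into one list of (R, G, B) pixels."""
    return list(zip(red_img, green_img, blue_img))

# Three tiny 2x2 "exposures", flattened to lists (hypothetical values).
long_wave  = [255, 10, 40, 0]   # assigned to red
mid_wave   = [30, 200, 60, 0]   # assigned to green
short_wave = [5, 5, 220, 0]     # assigned to blue

print(composite_rgb(long_wave, mid_wave, short_wave))
# [(255, 30, 5), (10, 200, 5), (40, 60, 220), (0, 0, 0)]
```

A pixel bright in only the long-wavelength exposure comes out red, one bright in all three comes out white—the same additive mixing a monitor does.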
Sometimes the team assigns new colors even when the original pictures were taken in the visible spectrum. An object that would in real life comprise several indistinguishable shades of red might be represented to the public as the composite of three pictures in red, green, and blue. As a general rule, professional "visualizers" try to assign red to the image showing the longest wavelengths of light and blue to the one showing the shortest. (This parallels the relationship among the colors in the visible spectrum.)
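That red-to-longest rule can be sketched as a simple sort: order the exposures by filter wavelength and hand out channels from red down to blue. The filter names below are real narrowband filters often used in astronomy (with their approximate wavelengths in nanometers), but their use here is just illustrative.

```python
# Sketch of the channel-assignment convention: the exposure taken through
# the longest-wavelength filter gets red, the shortest gets blue.

def assign_channels(exposures):
    """exposures: dict mapping filter name -> wavelength in nanometers.
    Returns a dict mapping 'red'/'green'/'blue' to filter names."""
    ordered = sorted(exposures, key=exposures.get, reverse=True)
    return dict(zip(["red", "green", "blue"], ordered))

# Approximate wavelengths for three common narrowband filters.
shots = {"H-alpha": 656, "[O III]": 501, "[S II]": 672}
print(assign_channels(shots))
# {'red': '[S II]', 'green': 'H-alpha', 'blue': '[O III]'}
```

Note that this preserves only the *order* of the wavelengths, not their true hues—all three of these filters sit in the red-to-green part of the visible spectrum, which is exactly the "indistinguishable shades of red" situation described above.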
Finally, the colorized images are cropped, rotated to the most dramatic orientation, and cleaned of instrument errors and other unsightly blemishes. Most of this work is done in Photoshop, using a freely downloadable plug-in that allows users to convert from the FITS format. (The original telescope images are also available, so you can create your own color gas cloud picture at home.)