We’ve all seen pretty pictures of outer space, with swirling patterns and bright stars against a dark background. With the ease of taking color photos on an iPhone, you might think that advanced space telescopes can automatically produce color photos as well.
However, no digital camera, from your phone to the James Webb Space Telescope, can actually see in color. Digital cameras capture images as a series of ones and zeros, measuring the amount of light hitting their sensors. Each pixel has a colored filter over it (either red, green, or blue), which only allows specific wavelengths of light to pass through. These filters are arranged in a specific pattern (typically a four-pixel repeating square known as the Bayer pattern), which lets the camera’s computing hardware merge the captured data into a full-color image. Some digital cameras instead distribute the colored filters across three individual sensors, and their data can likewise be combined into a full-color image. Telescope cameras, by contrast, have to capture images with one filter at a time, requiring experts to later combine them into a composite image.
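If you like to tinker, here is a minimal Python sketch of that idea, using a made-up 4x4 sensor readout and assuming the common RGGB Bayer layout; a real camera’s demosaicing step interpolates far more carefully than this shortcut does.

```python
import numpy as np

# A pretend 4x4 sensor readout: every pixel records only how much light got
# through its own red, green, or blue filter (RGGB layout assumed here).
raw = np.array([[10, 20, 11, 21],
                [30, 40, 31, 41],
                [12, 22, 13, 23],
                [32, 42, 33, 43]], dtype=float)

# Pull out the pixels that sat under each color filter.
red   = raw[0::2, 0::2]                          # red-filtered sites
green = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2  # average the two green sites
blue  = raw[1::2, 1::2]                          # blue-filtered sites

# Blow each sparse grid back up to full size (a crude stand-in for real
# demosaicing) and stack the three layers into one full-color image.
upsample = lambda channel: np.repeat(np.repeat(channel, 2, axis=0), 2, axis=1)
color = np.dstack([upsample(red), upsample(green), upsample(blue)])
print(color.shape)  # (4, 4, 3): one color image from a single filtered sensor
```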
In our smartphones, that merging happens incredibly fast, but telescopes are complicated scientific behemoths, and it takes a bit more effort to get the stunning results we know and love. Plus, when we’re looking at the cosmos, astronomers use wavelengths of light that our eyes can’t even see (such as infrared and X-rays), so those also need to be represented with colors from the visible rainbow. There are lots of decisions to be made about how to colorize space images, which raises the question: who is making these images, and how do they make them?
For the spectacular results we’ve been seeing from JWST, processing scientific data into beautiful color images is actually a full-time job. Science visualization experts at the Space Telescope Science Institute in Baltimore stack images together and combine observations from different instruments on the telescope. They also remove artifacts, which are things in the image that aren’t real, but rather just results of the telescope equipment and how digital data is processed. These could be streaks from stray cosmic rays, oversaturation of the brightest stars, or noise from the detector itself.
Black and white to color
Before they even think about color, these specialists need to balance out the dark and light values in the image. Scientific cameras are designed to record a wide range of brightnesses beyond what our eyes can perceive. This means that the raw images from telescopes often appear very dark to our eyes, and the image has to be brightened before any detail is visible.
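Here is a minimal sketch of that brightening step, with a random array standing in for a raw frame; the arcsinh curve used below is one common choice of nonlinear stretch, not the specific recipe any particular telescope team follows.

```python
import numpy as np

# Stand-in for a raw telescope frame: a handful of very bright pixels and a
# sea of faint ones, so a straight display would look almost entirely black.
raw = np.random.exponential(scale=50.0, size=(512, 512))

def stretch(img, softening=10.0):
    """Arcsinh stretch, then rescale to 0-1 so faint detail becomes visible."""
    s = np.arcsinh(img / softening)
    return (s - s.min()) / (s.max() - s.min())

display = stretch(raw)  # ready to view as a grayscale picture
```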
Once they have black and white images where the details are visible, they start adding color. “Different telescopes have filters that are made to be sensitive to only certain wavelengths of light, and the colorful space images we see are combinations of separate exposures taken in these different filters,” explains Katya Gozman, a scientist at the University of Michigan. The process is similar to the earlier description of a phone camera’s filters. “We can assign each filter to a different color channel—red, green or blue, the primary colors of visible light. When placed on top of each other, we get the impressive color image that we’re used to seeing in the media,” she says.
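A minimal sketch of what Gozman describes, with random arrays standing in for three exposures (the filter names and image sizes here are invented for illustration, not a specific observing program):

```python
import numpy as np

# Three separate exposures of the same object, each taken through a
# different filter (stand-in data; longest wavelength listed first).
long_wave   = np.random.rand(256, 256)
medium_wave = np.random.rand(256, 256)
short_wave  = np.random.rand(256, 256)

# Assign each filter to a color channel and stack: longest wavelength to
# red, middle to green, shortest to blue.
composite = np.dstack([long_wave, medium_wave, short_wave])
print(composite.shape)  # (256, 256, 3): a full-color composite image
```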
The final result, of course, also depends on the type of data the image specialists have to work with initially. The team often selects different colors to emphasize the fact that NIRCam and MIRI—two of Webb’s infrared cameras—are observing different wavelengths (near-infrared and mid-infrared, respectively), and therefore different physical structures. For instance, in the Cassiopeia A supernova remnant, JWST’s observations revealed a bubble of material giving off a specific wavelength of light, shown as green in the MIRI image and thus nicknamed the “Green Monster.” Without this visualization, astronomers might not have noticed such an interesting feature, one that provides insight into how giant stars die—and after some investigation, they figured out the Green Monster is a region of debris disturbed by the huge blast from the supernova explosion.
Invisible to visible
In general, image specialists try to represent things as realistically as possible. For example, if a telescope is observing in visible light, wavelengths can directly correspond to colors we’re used to seeing. But for those parts of the spectrum invisible to our eyes, they have to make choices about which visible colors to use. This is where it becomes a bit of an art: choosing colors based not only on scientific accuracy, but also on what looks best. For JWST and Hubble, the typical practice is to use blue for the shortest wavelengths, green for those in between, and red for the longest wavelengths. If there are more than three different filters to choose from (as is often the case with JWST, especially when using more than one of its high-tech instruments), sometimes they’ll add in purple, teal, and orange for other wavelengths in between the red, green, and blue.
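A minimal sketch of that ordering idea, with invented filter wavelengths and random frames standing in for the exposures: the shortest wavelength is tinted blue, the longest red, and everything in between gets an intermediate hue before the layers are added together.

```python
import colorsys
import numpy as np

# Four exposures keyed by (invented) filter wavelengths in microns.
exposures = {0.9: np.random.rand(128, 128),
             2.0: np.random.rand(128, 128),
             3.3: np.random.rand(128, 128),
             4.4: np.random.rand(128, 128)}

wavelengths = sorted(exposures)
composite = np.zeros((128, 128, 3))
for wl, frame in exposures.items():
    # Position of this filter between the shortest (0) and longest (1) wavelength.
    frac = (wl - wavelengths[0]) / (wavelengths[-1] - wavelengths[0])
    # Hue runs from blue (hue 2/3) for the shortest down to red (hue 0) for the longest.
    hue = (1.0 - frac) * (2.0 / 3.0)
    composite += frame[..., None] * colorsys.hsv_to_rgb(hue, 1.0, 1.0)

composite /= composite.max()  # keep the stacked layers within display range
```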
Color images are far more than a pretty picture, though—they’re actually quite useful for science. The human brain is excellent at picking up patterns in color, such as understanding a map with color-coded subway lines or recognizing that “a red light is stop, green is go,” says Mark Popinchalk, a scientist at the American Museum of Natural History. “These are daily examples where societal information is presented and processed quickly through color. Scientists want to use the same tool,” he adds. “But instead of societal information, it’s scientific. If X-rays are red, and ultraviolet is blue, we can very quickly interpret energetic light beyond what humans are capable of.” The result is a visual representation of an intense amount of data, much more than can be processed with the naked eye, or in black and white alone.
For instance, Gozman explains how pictures have been useful in identifying where different physical processes are occurring in an object, like detecting where star formation is happening in a galaxy or where different elements are found around a nebula. Images in colors beyond what the eye can see have even revealed dark matter around galaxies, as in the Bullet Cluster.
Another particularly recent and interesting example of image coloration is the case of Neptune. The dark blue photo of the icy world from the Voyager mission doesn’t actually show its true color as it would appear to our eyes—instead, it’s more similar to the pale color of Uranus. “In the 80s, astronomers actually stretched and modified the images of Neptune to enhance the contrast in some of its fainter features, giving it that deep blue shade which made it look very different from Uranus,” Gozman explains. “While astronomers knew this, the public did not. This is a good example of how reprocessing the same data in different ways can result in completely different representations.”
Analyzing images has always been a significant part of astronomy, allowing us to see the universe beyond the limitations of our human eyes. You can even try it yourself—JWST data is available to the public from NASA, and they even host an astrophotography challenge open to anyone. So, when you come across a stunning space image, perhaps you can view it as a wonderful combination of science and art.