The most common question we get asked about Hubble images is, “Are the colors real?” What folks usually mean is, are these what the images would look like if we could travel in the Starship Enterprise to these cosmic landscapes? Or, maybe more practically, if we could look through the Hubble Space Telescope? That would be incredibly awesome, but even the astronauts who serviced Hubble couldn’t look through it. The glib answer is yes and no. The more realistic answer is mostly no, and that applies to both questions.
The short explanation is that Hubble’s cameras work very differently from our eyes. In fact, our eyes are pretty crummy for looking at the night sky. They’re fine for seeing the Moon, planets, the brightest stars, and the Milky Way – as long as you’re in a location with a nice dark, clear sky (such as the desert Southwest, away from city lights). Our eyes evolved to see everyday (emphasis on day) landscapes. They fail us when we want to see the really faint deep-sky stuff.
Clever folks a few hundred years ago came to realize that telescopes allow us to gather more light and create an amplified image that we can see better. Telescopes also magnify what we see so we think of them as bringing distant scenes closer, which turned out to be incredibly useful for sailors, birders and others. But for astronomers, the most important feature is the amplification of faint light sources.
Later on, other clever people developed photography, which allows us to record what we see – not only scenes similar to what we can see with our eyes directly, but also images collected by telescopes and other optical instruments like microscopes. Photographs have a great advantage over our eyes because they build up an image over time. The longer the exposure, the more light can be recorded. You may have seen long-exposure photographs of night scenes with long streaks of car lights, or photographs of the night sky with streaks of stars wheeling around a central point. We don’t see the world like this because our eyes refresh the view in a fraction of a second. That means we can only collect as much light as our eyes let in over that time, about 1/30 second. (By the way, that happens to be about the same rate as the separate frames of movies and video. Our brains string together a series of individual still images and fool us into thinking we’re seeing a continuous moving picture.)
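To see why exposure time matters so much, here is a minimal back-of-the-envelope sketch in Python. The photon rate and the 30-minute exposure are made-up numbers, and it assumes an idealized detector whose signal grows linearly with time:

```python
# Idealized comparison of the light collected by the eye in one "refresh"
# versus a long telescope exposure. Assumes a steady source and a detector
# whose signal grows linearly with time (ignores noise and saturation).

EYE_INTEGRATION_S = 1 / 30    # rough integration time of human vision
CAMERA_EXPOSURE_S = 30 * 60   # a 30-minute exposure (hypothetical choice)

def collected_light(photon_rate, exposure_s):
    """Total photons recorded from a source emitting photon_rate per second."""
    return photon_rate * exposure_s

rate = 100.0  # photons per second reaching the detector (made-up number)
eye = collected_light(rate, EYE_INTEGRATION_S)
camera = collected_light(rate, CAMERA_EXPOSURE_S)
print(f"eye: {eye:.1f} photons, camera: {camera:.0f} photons, "
      f"ratio: {camera / eye:.0f}x")
```

Under these toy assumptions, the 30-minute exposure collects 54,000 times the light the eye gathers in its 1/30-second glimpse. Real detectors add noise and eventually saturate, but the linear build-up is the key idea.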
When astronomers make photographs with a telescope, the power to amplify the light is tremendous. Long-exposure photographs with large telescopes can record light millions of times fainter than what our eyes can detect. Not only that, but our eyes are not so good at seeing the colors of faint light, even though most people are able to see vivid colors in the daytime. Eyes have two kinds of light-sensitive cells: cone cells see color but require more light to work, while rod cells detect fainter light but don’t respond to color.
Another factor is the range of colors we can see. Most of us can see a whole range of colors. Think about a rainbow with the familiar colors: red, orange, yellow, green, blue, purple (we’ll ignore the distinction between indigo and violet in the old “Roy G Biv” mnemonic for the time being). It turns out this is only a tiny fraction of the flavors of light that exist around us and throughout the wider universe. Many people know that ultraviolet radiation causes skin to burn. That’s another form of light, with more energy and bluer than the bluest light we can see. On the other end of the energy spectrum is infrared light, lower energy and redder than the reddest light we can see. Our skin responds to infrared light too, with the perception of heat, even though we can’t see it. But we can build instruments that can record infrared light. We use such cameras to see heat leaking out of houses so we can insulate them better and save energy, for example.
Hubble includes a camera that records infrared light along with other cameras that record mostly visible light and near ultraviolet. Instruments on the upcoming James Webb Space Telescope will be sensitive primarily to infrared light, opening up broad new areas of studying the universe.
There’s one more difference between what we can see and the images we see from space, maybe a bit more technical. We see color because our eyes separate white light using three types of color-sensitive cone cells, most responsive to roughly red, green, and blue light, and our brains assemble color from this information. Most color technology people have invented also works this way. To reproduce a color scene we mimic human eyes by building cameras that separate the light into red, green and blue signals, and reconstruct those into a picture we can see on a screen or in print. We also use the same process to produce color images from Hubble and other telescopes. But astronomers’ cameras include filters that can sample colors other than the usual red, green, and blue.
We normally translate whatever colors the camera sampled into the three standard colors red, green, and blue (technically called “additive primary” colors), which produces the broadest range of visible colors in the final image. If the filters used match our color perception, then the resulting image is close to what we would see directly. The bright, familiar planets in our Solar System – Mars, Jupiter, and Saturn – are good examples of this. They are bright enough to view well through a telescope, so we can compare the real-life view to the photographs.
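To make the three-channel reconstruction concrete, here is a minimal Python sketch. The tiny 2×2 “images” and their brightness values are invented for illustration; real pipelines work on enormous arrays and include calibration steps this omits entirely:

```python
# A minimal sketch of how three separate filter exposures become one color
# image: each grayscale exposure is assigned to one additive primary channel.
# The 2x2 "images" and brightness values below are made up for illustration.

def compose_rgb(red_channel, green_channel, blue_channel):
    """Zip three grayscale images (lists of rows of 0-255 values)
    into one image of (R, G, B) pixel tuples."""
    return [
        [(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(red_channel, green_channel, blue_channel)
    ]

# Three exposures of the same scene through red, green and blue filters:
red   = [[200, 10], [0, 90]]
green = [[ 50, 10], [0, 90]]
blue  = [[ 20, 10], [0, 90]]

image = compose_rgb(red, green, blue)
print(image[0][0])  # top-left pixel: bright in red, dim in green and blue
```

The top-left pixel ends up reddish because the red-filter exposure is much brighter there; pixels with equal values in all three channels come out gray, just as equal red, green, and blue signals look neutral to our eyes.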
We can extend the color model to use images from any part of the light spectrum, from any available cameras and filters. Here’s an example that extends our view of stars into invisible light: ultraviolet and infrared.
All stars produce light in a wide range of colors. Their visible colors depend almost entirely on their temperature: cooler stars produce more red light and so appear red, while hotter stars produce more blue light and therefore appear blue. The Sun is somewhere in the middle and looks white (except near sunrise and sunset, when the light has to pass through more air, which scatters away the blue light and makes the Sun appear to turn red). But to our eyes, the color is rather pale or unsaturated.
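The link between temperature and color can be made concrete with Wien’s displacement law, which gives the wavelength at which a blackbody of a given temperature emits most strongly. Stars are only approximately blackbodies, and the temperatures below are typical textbook values rather than measurements of particular stars:

```python
# A rough sketch of why hotter stars look bluer: Wien's displacement law
# gives the wavelength at which a blackbody at temperature T emits most
# strongly. Stars are approximately blackbodies; the temperatures here
# are typical textbook values, not measurements of specific stars.

WIEN_B = 2.898e-3  # Wien's displacement constant, metre-kelvins

def peak_wavelength_nm(temperature_k):
    """Peak emission wavelength (nanometres) of a blackbody at T kelvin."""
    return WIEN_B / temperature_k * 1e9

for name, temp_k in [("cool red star", 3500),
                     ("the Sun", 5800),
                     ("hot blue star", 15000)]:
    print(f"{name} ({temp_k} K): peak near {peak_wavelength_nm(temp_k):.0f} nm")
```

The cool star’s peak falls in the infrared and the hot star’s in the ultraviolet – the same invisible regions discussed above – while a star like the Sun emits generously across the whole visible band, which is part of why its light looks white rather than any one vivid color.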
Other images are very different because the filters don’t match the visual response. Many of the most familiar Hubble images were made using “narrow-band” filters. These sample the emission of certain specific elements present in glowing gas clouds or nebulae. These filters transmit only a very precise color with a small range of wavelengths. The most common examples sample the light of hydrogen, oxygen, and sulfur, glowing red, blue-green, and red, respectively. Note that both hydrogen and sulfur glow predominantly red in these gas clouds, but the different elements may be excited to glow in different parts of the cloud, so the images will look different through the various filters. We then assemble the exposures into a color image using the standard red, green, and blue primaries rather than reproducing the colors of the incoming light.
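One widely used convention for this reassignment – often called the “Hubble palette,” though it is only one of several – maps the three narrow-band filters to the display primaries in wavelength order. A minimal Python sketch; the emission-line wavelengths are real, while the channel assignment is a display convention, not the lines’ true colors:

```python
# A minimal sketch of reassigning narrow-band exposures to display primaries.
# Wavelengths are the actual emission lines (rounded to the nanometre); the
# mapping is the common "Hubble palette" convention, one of several in use.

EMISSION_LINES_NM = {
    "S II": 672,     # sulfur: truly glows deep red
    "H-alpha": 656,  # hydrogen: truly glows red
    "O III": 501,    # oxygen: truly glows blue-green
}

# Display channel assigned to each filter's exposure -- NOT the line's real
# color: both sulfur and hydrogen actually glow red in these nebulae.
HUBBLE_PALETTE = {"S II": "red", "H-alpha": "green", "O III": "blue"}

for line, wavelength in EMISSION_LINES_NM.items():
    print(f"{line} ({wavelength} nm) -> displayed as {HUBBLE_PALETTE[line]}")
```

Because hydrogen and sulfur both glow red, displaying them as literally red would blend them together; spreading the three filters across red, green, and blue lets the eye distinguish where each element dominates in the cloud.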
Even though color happens in different ways in our eyes and in astronomical cameras, Hubble and other observatories produce real photographs with light that was produced far away and in most cases a very long time ago. The subjects of the photographs are real places, though very different from our everyday, earth-bound experience. The pictures and the colors are quite real, even though different from what we would see with our limited vision.