Hey everyone! My friend Debra Christofferson asked me a great question about color and images of space.
Where does the color in these photos come from?
Are they caused by the way our cameras pick up the light? Is there color in space?
Actually, there is color in space. To answer the question, we first need to cover what "color" is. What the human eye perceives as color is merely the excitation of certain cells in our retina in response to different wavelengths of light. So what we perceive as "red" is just light at ~640nm wavelength, "green" is just light at ~560nm, and "blue" is just light at ~475nm.
There are two kinds of cells in our eyes: rods and cones. There are three kinds of cones, with each responding to a particular color most strongly (red, green and blue). Our brains combine these signals together to show us the full spectrum of color. Rods do not discriminate between different colors (though they're most sensitive in the green portion of the spectrum). Our brains perceive light picked up by our rods as shades of gray, no matter what part of the spectrum the light came from.
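To make the wavelength-to-color idea concrete, here's a tiny sketch that maps a wavelength to the color name a human would perceive. The band edges are rough, illustrative values only, not precise physiological boundaries.

```python
# Rough mapping from wavelength (in nanometers) to the color name a
# human would perceive. Band edges are approximate, for illustration.
def perceived_color(wavelength_nm):
    if 450 <= wavelength_nm < 495:
        return "blue"
    elif 495 <= wavelength_nm < 570:
        return "green"
    elif 620 <= wavelength_nm <= 750:
        return "red"
    else:
        return "other"

print(perceived_color(475))  # blue
print(perceived_color(560))  # green
print(perceived_color(640))  # red
```

The real visible spectrum is continuous, of course; our brains infer all the in-between hues (yellow, orange, cyan) from the relative responses of the three cone types.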
Our cones are used in daylight conditions, when there are enough photons striking them to cause excitation: it's at this point that we are able to perceive color.
In dark conditions, there aren't enough photons coming into our eyes to activate the cones. The more sensitive rods take over, so we can still see -- but what we see is just black and white.
When you look through a telescope, you are gathering far more of the scarce photons arriving from distant interstellar and intergalactic objects than your pupil could collect on its own, which is what lets you see them at all. But even very large amateur telescopes still can't deliver enough light to fire our cones -- only our rods are activated, and so what we see in the eyepiece is just black and white.
But that doesn't mean that the objects we are observing aren't "colorful". Most objects out there are emitting light across several parts of the spectrum, and if your cones were sensitive enough to detect it, you'd be able to see their color with your own eyes. In fact, with some brighter objects in large telescopes, some observers can do just that: for example, detecting traces of blue and green in M42 visually.
CCD cameras are able to record full-color imagery because CCDs don't work the same way that our retinas do. The cells in our retinas only "fire" if enough photons strike them within a brief period of time. A CCD pixel, however, can "count" the photons that strike it over enormous periods of time -- hours, even. This additive nature lets a CCD pick up features that even our rods miss visually in a big telescope.
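Here's a toy model of that difference (an illustrative sketch, not real sensor physics): a faint source only rarely delivers a photon in any short interval, so a rod that needs several near-simultaneous hits never fires -- but a pixel that just keeps adding every photon to a running count ends up with a strong signal.

```python
import random

# Toy model of CCD integration. Each short interval, a faint source
# delivers a photon to a pixel with low probability. A CCD simply
# accumulates every hit over the whole exposure.
random.seed(42)  # fixed seed so the example is repeatable

photon_prob = 0.01   # chance of a photon arriving in any one interval
intervals = 100_000  # a "long exposure" made of many short intervals

count = sum(1 for _ in range(intervals) if random.random() < photon_prob)
print(count)  # roughly photon_prob * intervals, i.e. about 1000 photons
```

Any single interval almost certainly contains no photon, which is why the eye sees nothing -- but the accumulated count over the full exposure is unmistakable.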
The colored filters used in many imaging rigs are optional - they're used with a monochrome camera to isolate each color (red, green, blue) in turn and record a grayscale image of it. The three images are then colorized and stacked to create the final full-color image.
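The stacking step is really just per-pixel bookkeeping. Here's a minimal sketch with made-up 2x2 frames, where each grayscale frame (values 0-255) becomes one channel of the final color pixel:

```python
# Hypothetical 2x2 grayscale frames shot through red, green, and blue
# filters (pixel values 0-255). These numbers are invented for the demo.
red   = [[255,   0], [ 20, 200]]
green = [[  0, 255], [ 20, 200]]
blue  = [[  0,   0], [200,  20]]

# Combine the three frames per-pixel into (R, G, B) color tuples.
color = [
    [(red[y][x], green[y][x], blue[y][x]) for x in range(2)]
    for y in range(2)
]
print(color[0][0])  # (255, 0, 0) -- a pure red pixel
```

Real processing adds calibration, alignment, and stretching on top, but the core idea is exactly this: three grayscale measurements per pixel become one color value.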
A one-shot color CCD does the same thing, except the filters are permanently bonded to the face of the sensor in a mosaic (usually the Bayer pattern): a quarter of the pixels sit behind a red filter, a quarter behind a blue filter, and half behind a green filter, since our eyes are most sensitive to green.
CCDs, like our rods, can only perceive light in grayscale: it takes filters to generate component values for different parts of the visible spectrum, which are then combined into a full-color image.
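To see how a one-shot color sensor recovers full color from its filter mosaic, here's the crudest possible "demosaic" over a single Bayer (RGGB) tile -- real cameras interpolate neighboring tiles instead, and these pixel values are invented for the demo:

```python
# One 2x2 tile of a Bayer (RGGB) mosaic: each sensor pixel sits behind
# a single-color filter and records only a grayscale count.
mosaic = [
    [10, 22],   # R  G
    [22, 30],   # G  B
]

# Crudest possible demosaic: collapse the tile into one color pixel,
# averaging the two green samples.
r = mosaic[0][0]
g = (mosaic[0][1] + mosaic[1][0]) // 2
b = mosaic[1][1]
print((r, g, b))  # (10, 22, 30)
```

The trade-off versus a filter wheel is resolution for convenience: a monochrome camera measures every color at every pixel (in three exposures), while a one-shot color camera measures one color per pixel and interpolates the rest.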
Thanks very much Debra for the awesome question!
If you have a question about space or astronomy, please send it to me and I will do my best to answer it for you.