This is how Adobe and NASA photoshop the stars
Even the wonders of the Universe need a touch-up every now and again. Although we marvel at pictures sent back to Earth from equipment like the Hubble Space Telescope or the Curiosity rover on Mars, there's a lot of work done on these images before they're published. As this blog post from Adobe explains, sometimes it's mostly just a case of stitching pictures to create a panorama (as with transmissions from Curiosity), but other times the process is more complicated.

Robert Hurt, an astronomer and Photoshop expert who works at Caltech’s Infrared Processing and Analysis Center, says that when processing images of galaxies and nebulas, his job is to turn data into something visible. "I basically take raw grayscale data from different parts of the infrared spectrum, and then remap them into visible colors — typically with red, green, and blue Photoshop layers — to create images that are accurately representative of the infrared colors that human eyes cannot see," says Hurt. "I think of it as a visual translation process." Below you can see a GIF of various images of the Orion Nebula being pulled together, with the colors representing temperature (blue is hotter, green and red are cooler):
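The channel-remapping idea Hurt describes can be sketched in a few lines of NumPy. This is an illustrative toy, not NASA's actual pipeline: the band names, wavelengths, and the convention of assigning shorter wavelengths to bluer channels are assumptions for the example.

```python
import numpy as np

# Made-up grayscale exposures from three infrared bands, values in [0, 1].
band_short = np.random.rand(64, 64)  # stands in for e.g. a 3.6-micron image
band_mid = np.random.rand(64, 64)    # e.g. a 4.5-micron image
band_long = np.random.rand(64, 64)   # e.g. an 8.0-micron image

def remap_to_rgb(short_band, mid_band, long_band):
    """Stack three grayscale bands into one RGB image, mapping the
    longest wavelength to red and the shortest to blue."""
    rgb = np.stack([long_band, mid_band, short_band], axis=-1)  # R, G, B
    return np.clip(rgb, 0.0, 1.0)

image = remap_to_rgb(band_short, band_mid, band_long)
print(image.shape)  # three grayscale layers become one (H, W, 3) color image
```

In Photoshop terms, each input array plays the role of one of the red, green, and blue layers Hurt mentions; the stack is the flattened composite.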


The Orion Nebula, M42. (NASA/JPL-Caltech/UCLA)

Hurt says the images he helps produce not only have to be visually appealing, but also true to the scientific data. "The optics of the camera can create artifacts that to a naive viewer might look like something real from the universe," he says. "But these are things we want to clean out of the image, because we don’t want people to think there’s some weird planet thing floating out there when there isn’t."

The end result is usually a massive "multi-gigabyte file" that contains layers of information from different telescopes. In the set of pictures below, for example, the final published image is on the far right, showing M104, better known as the Sombrero Galaxy. However, it's actually a composite of the image on the far left (from the Spitzer Space Telescope, which records infrared light) and the image in the middle (from the Hubble Space Telescope).

Three images of the Sombrero Galaxy, M104. (Infrared: NASA/JPL-Caltech/R. Kennicutt and the SINGS Team)

"My general workflow for this is to first take the original observational data from the telescope, which is kind of an HDR representation of the sky," says Hurt. "Sometimes I’ll bring in Hubble’s visible-light photos of the same astronomical region, too, and layer the Spitzer data on top of those to create images highlighting the interesting contrasts between different parts of the spectrum that the general public can enjoy and understand."
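The "HDR representation" Hurt mentions spans a far wider brightness range than a screen can show, so some tone curve has to compress it first. A common choice in astronomical image processing is an arcsinh stretch; the sketch below assumes that technique and made-up data, and isn't necessarily the curve Hurt uses.

```python
import numpy as np

def asinh_stretch(data, scale=10.0):
    """Compress high-dynamic-range data into [0, 1] with an arcsinh
    curve, which keeps faint structure visible while taming bright
    cores (near zero it behaves linearly; for large values, logarithmically)."""
    stretched = np.arcsinh(scale * data) / np.arcsinh(scale)
    return np.clip(stretched, 0.0, 1.0)

# Made-up pixel values spanning three orders of magnitude
hdr = np.array([0.001, 0.01, 0.1, 1.0])
print(asinh_stretch(hdr))
```

After a stretch like this, each telescope's data can be layered and blended as an ordinary 8-bit-per-channel image.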
