Thanks to editing tools like Photoshop, it can be hard to tell what's real on the web. A number of forensic tools are available to those wishing to investigate suspect images, but likely none is as easy to use as Izitru, the service introduced on Monday by Dartmouth professor Hany Farid. Izitru, pronounced a bit like "is it true," lets you upload a photo and automatically get a computer's analysis of whether it may have been modified. The results are simple: the service generally reports either that an image appears to be an original, meaning it came straight from a camera, or that it may have been modified, meaning it has been saved again since coming out of the camera.
That means photos flagged as potentially modified span a wide range of possibilities. Izitru might flag not only complex edits but also photos that have simply been saved a second time, with no actual changes made to them. Izitru also refrains from speaking definitively about modifications: images are, at most, flagged as potential modifications until they're reviewed by a human expert, which Izitru says may happen if an image receives enough community attention. Even with those caveats, it's a simple tool that can let you know when something in a photo may have been tampered with.
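Izitru doesn't disclose its exact forensic tests, but one classic signal that re-save detectors look for is double JPEG compression: quantizing coefficient values once leaves a smooth, evenly filled histogram, while quantizing them a second time with a different step size leaves periodic gaps. The sketch below is a toy illustration of that idea, not Izitru's actual method; the step sizes and the uniform spread of "coefficients" are illustrative assumptions.

```python
def quantize(values, step):
    """JPEG-style quantization: snap each value to the nearest multiple of step."""
    return [round(v / step) * step for v in values]

# Illustrative "DCT coefficients": a uniform spread of integers (an assumption;
# real image coefficients aren't uniform, but the effect is the same).
coeffs = list(range(-100, 101))

once = quantize(coeffs, 2)                # saved a single time (step 2)
twice = quantize(quantize(coeffs, 3), 2)  # saved with step 3, then re-saved with step 2

# A single save fills every multiple of the step...
print(sorted(set(once))[:6])              # -> [-100, -98, -96, -94, -92, -90]
# ...but a re-save leaves periodic holes in the histogram (e.g. 2 and 10
# never occur), a telltale sign the file is not a camera original.
print(2 in set(twice), 10 in set(twice))  # -> False False
```

In practice a detector examines the histograms of real DCT coefficients from the JPEG file itself, but the underlying statistical fingerprint is the same as in this toy version.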
Izitru detects that this image of Jeff Goldblum's body with another person's face edited onto it may have been modified. Below the image it explains: "Our forensic tests suggest this file has been re-saved since initial capture. Because this file is not a camera original, it is possible that it was modified."