This is a direct result of an incident last summer in which two photos Reuters published had been doctored in a way that changed their meaning, so that they no longer accurately portrayed what had been shot.
I am pleased to announce today that we are working with Adobe and Canon to create a solution that enables photo editors to view an audit trail of changes to a digital image, which is permanently embedded in the photograph, ensuring the accuracy of the image.
We are still working through the details. We hope this will become a new standard for Reuters, and I believe it should be the new industry standard.
It is important to say that we sought this technical solution, not because we don’t trust our photographers – far from it. I am incredibly proud of the amazing and dangerous work our photographers and journalists do. They all too often risk their lives to get the photograph that tells the true story of a conflict or captures the horror of war. The threat of injury or death is a daily hazard for many.
No, we sought a technical solution so that we had total and full transparency of our work. It’s what we stand for. It’s what we’ve always stood for. And we hope that it will provide reassurance to editors and consumers of our services.
Clearly, there needs to be a more detailed description of this, which I hope will be forthcoming. But for now I'm not sure how this will "ensure the accuracy of the image," per se, because it will not programmatically prevent the publication -- even by Reuters, apparently -- of an image whose audit trail is not inspected. And in the online world images are published by the provider -- Reuters, AP, AFP, etc. -- directly on client sites, not by the Yahoos of the world.
There is also the matter of what weight this embedded information might add: will the audit trail data merely report what Photoshop tools had been used -- crop, autolevels, clone -- or will it include thumbnails of the before and after, which strikes me as the only way to know if a legal tool had been used legally? If the latter, this could add considerable weight, which is anathema in the online world, even in the era of broadband.
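The weight question can be made concrete. Here is a minimal sketch, assuming a purely hypothetical trail format (compressed JSON for the tool log, raw bytes for any thumbnails) -- the actual Adobe/Canon scheme has not been described, and every name below is an invention for illustration:

```python
import json
import zlib
from dataclasses import dataclass, field

# Hypothetical audit-trail record -- NOT the real Adobe/Canon format,
# which has not been published; this exists only to size the payload.
@dataclass
class EditRecord:
    tool: str                  # e.g. "crop", "autolevels", "clone"
    params: dict = field(default_factory=dict)
    before_thumb: bytes = b""  # optional before/after thumbnails
    after_thumb: bytes = b""

def payload_size(records):
    """Bytes a trail would add if embedded as compressed JSON plus thumbnails."""
    meta = json.dumps([{"tool": r.tool, "params": r.params} for r in records])
    thumbs = sum(len(r.before_thumb) + len(r.after_thumb) for r in records)
    return len(zlib.compress(meta.encode())) + thumbs
```

Under these assumptions, a tools-only log compresses to well under a kilobyte, while a single edit carrying two modest 20 KB thumbnails adds 40 KB on its own -- which is the difference between negligible and noticeable at wire-service volumes.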
Reuters has some pretty specific rules about what can be done to a news photo. The short answer is, very little. It can be cropped to remove extraneous scenery (zeroing in on the action) but not to alter the appearance of the scene (like cutting out your boyfriend from a picture taken in happier times). An editor cannot make an overcast day look like a gathering storm, add a hockey puck or copy and paste smoke clouds from a bombing site.
And herein lies the problem: unless specific tools must be left in the chest -- thus making it possible to automatically stop an image in its tracks on the production line -- computer-assisted auditing may provide an editor with nothing more helpful than her keen observational and forensic skills. Each of the two embarrassing incidents of last August was detected by scrutinizing amateurs, not by digital analysis. This suggests to me that it is time that is needed most, not something new for the toolbelt.
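If the forbidden operations were enumerable, the production-line gate would be trivial to automate; everything else would have to stay a human call. A sketch of what such a machine decision might look like, with the tool names and trail shape purely invented:

```python
# Hypothetical gate, assuming the trail is a machine-readable list of
# operations. The tool names here are assumptions, not Reuters policy.
FORBIDDEN = {"clone", "paste", "healing_brush"}

def machine_verdict(trail):
    """Machine decision: block only on a forbidden tool in the trail.

    Anything subtler -- a crop that changes the story, levels that turn
    overcast into storm -- stays a human decision; the gate must not guess.
    """
    used = {op["tool"] for op in trail}
    return "block" if used & FORBIDDEN else "refer_to_editor"
```

The point of the example is its limit: a legal crop used illegally sails straight through, which is why a tools-only trail cannot replace the editor's eye.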
And also unmentioned is how Reuters would vet the photographs from amateurs it is now soliciting with Yahoo. These will contain no information beyond the usual EXIF stuff, if that. Reuters intended to make these available for use in news stories, and Reuters Media President Chris Ahearn has raised the stakes to about as high as they can go by saying, "What if everybody in the world were my stringers?"
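Even the "usual EXIF stuff" is optional in a JPEG, so the first machine check on an amateur submission would be whether any metadata is present at all. A minimal sketch, using only the standard JPEG marker layout (an APP1 segment tagged "Exif"):

```python
def has_exif(jpeg_bytes: bytes) -> bool:
    """Return True if a JPEG byte stream carries an EXIF APP1 segment.

    Walks the marker segments after the SOI marker (0xFFD8) and looks
    for an APP1 segment (0xFFE1) whose payload begins with "Exif".
    """
    if not jpeg_bytes.startswith(b"\xff\xd8"):
        return False  # not a JPEG at all
    i = 2
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            break  # malformed stream; give up
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:  # start of scan: no more metadata segments
            break
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        if marker == 0xE1 and jpeg_bytes[i + 4:i + 8] == b"Exif":
            return True
        i += 2 + length  # skip marker bytes plus segment payload
    return False
```

A check like this only tells you that metadata exists, not that it is true: EXIF fields are written by the camera and are trivially editable, which is precisely why they carry no evidentiary weight.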
I did some programming when I was at Reuters and discovered a basic principle: Figure out which are machine decisions and which are human decisions. Shield humans from the machine decisions and never let the machine make a human one.
This Reuters initiative is well-intentioned and it may help. But my fear is that it will provide Reuters with nothing more than a false sense of security, and if anything goes wrong it won't be Canon or Adobe -- or Glocer -- who gets to fall on his sword.