What’s all the fuss about? Photographers and artists have been fudging, faking, manipulating, and tinkering with photographs since the invention of photography.
Take Camille Silvy (1834-1910), for example. His image titled The Streets of London was made in 1859, just two decades after the invention of photography and 131 years before Adobe Photoshop 1.0 was released, by combining four separate negatives.
While it was a lot more hassle, and required far more skill, than writing a prompt for a generative AI engine, the final image is every bit as fudged and fake as anything conjured up in DALL·E 3. However, Camille Silvy wasn’t entering his photomontage into a ‘Photographer of the Year’ competition; if he had, he might have got into a spot of bother.
More than a century and a half after Silvy’s meddling, AI and AI imagery are the hot topic du jour, and nowhere more so than in the world of photo competitions. There are plenty of stories of AI-generated images fooling judges and winning awards, and it’s easy to see how, as some of the results can be utterly convincing.
Perhaps the most notorious story emerged when German artist Boris Eldagsen went public with the fact that he had won a Sony World Photography Award with an AI-generated image. He pulled the wool over various eyes, and made a bit of a scene about it at the awards night.
It became a hot topic for a while and gave Eldagsen a degree of notoriety, but the incident also raised some valuable discussion points. Many photo competitions have since updated their terms and conditions and re-evaluated where AI sits within their remit and, more importantly, where to draw the line.
Photo competitions react to AI
For most competitions, the line sits somewhere around the acceptable use of AI tools within image editing software such as Adobe Photoshop. It’s generally acceptable to modify and edit your own photographs using AI tools, typically with the caveat that you declare this during the application process.
Many competitions will ask to see raw files before making a final decision. I’m a former creative director of Landscape Photographer of the Year, and we would do this to ensure that images hadn’t been manipulated too much.
This sentiment was echoed by Dan Calder of Close-up Photographer of the Year, who said, “Close-up Photographer of the Year is first and foremost a photography competition, so we won’t be accepting any images made with generative AI. During the latter rounds of judging, we will ask for raw files, and each entrant will confirm a declaration that no generative AI has been used”.
On the subject of image editing using AI tools, Calder adds, “Post-processing becomes a little trickier to adjudicate with the introduction of AI tools to Photoshop, etc. We are now open to creative post-processing techniques as long as we are informed about what has been done and that natural behavior is not misrepresented. We will continue to draw the line at AI-generated objects, animals, flora, landscape motifs that are added to the original picture”.
On a final note, Calder concludes that “over time, if photographers wish to start incorporating AI-generated elements into their pictures, then we will follow their lead and look to create a new category for them”.
Hugo Donnithorne-Tait, Awards Director of the British Photography Awards, has some interesting thoughts too, and says, “Like any new technology, getting angry at any disruption it causes is like shouting at the rainclouds. I’m convinced AI will be a watershed moment for so many industries and the great mega-trend of our generation. That said, there are important things to protect from it. Image integrity is one of those things, especially if you run a purist photography award as we do.”
“We built a secondary upload function so those who are considered for shortlisting can upload an original file or raw format file side by side with their image. While this used only to be requested when we suspected doctoring of imagery, we are going to implement it for all imagery due to AI. This will help us police the imagery and maintain the integrity of our shortlist. In the future, hopefully, a marriage of camera advances and blockchain technology can help us verify individual instances, creating an immutable catalogue of when moments were captured. Samsung has led the way with this to some extent. Their new smartphone (the Samsung Galaxy S24 Ultra) does have AI image editing built in, but it also has a function that ‘tags’ doctored images in the metadata. This is the kind of considered advancement we need: simple systems that let us see what is what”.
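For the technically curious, here is a rough idea of how that kind of metadata check might work in practice. The short Python sketch below reads an image’s metadata using the widely used exiftool utility (assumed to be installed) and flags any fields that hint at AI editing; the keywords it searches for are purely illustrative, since manufacturers and editors each record this information in their own, often proprietary, fields.

```python
# A minimal sketch, not any vendor's actual implementation: scan an image's
# metadata for fields that hint at AI editing. Assumes the exiftool
# command-line utility is installed; the keywords below are illustrative only.
import json
import subprocess
import sys

def find_ai_edit_tags(image_path: str) -> dict:
    """Return metadata fields whose name or value suggests AI editing."""
    # exiftool -j prints all metadata as a JSON array, one object per file
    result = subprocess.run(
        ["exiftool", "-j", image_path],
        capture_output=True, text=True, check=True,
    )
    metadata = json.loads(result.stdout)[0]
    # Hypothetical keywords; real vendors record this in their own fields
    keywords = ("generative", "ai-generated", "edited with ai")
    return {
        key: value
        for key, value in metadata.items()
        if any(k in key.lower() for k in keywords)
        or any(k in str(value).lower() for k in keywords)
    }

if __name__ == "__main__":
    flagged = find_ai_edit_tags(sys.argv[1])
    print(flagged if flagged else "No AI-related metadata fields found")
```

Even a simple check like this illustrates the appeal of Donnithorne-Tait’s point: if editing tools reliably write such tags, verifying an entry becomes a routine, automatable step rather than detective work.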
Where truth and integrity are paramount
For the World Press Association (WPA), and for photojournalism in general, the stakes around integrity and authenticity are even higher. I asked Andrew Davies from the WPA about its stance on AI, and he told me, “As a starting point, we do not view AI-generated images as photographs. They can simulate the look of a photograph, but by definition, it’s not the same. A photograph captures light on a sensor or film; it is a record of a physical moment. So, by definition, this is different from an image made with generative AI. Therefore, as a photography competition, we do not allow AI-generated images”.
Davies goes on to explain how the WPA approaches the prevalence of AI. “We use a multilayered approach to spot fake or manipulated images: we verify the professional status of all entrants and the facts behind each story, the images are judged by industry professionals, and the original files are examined by two independent digital analysts.”
He adds, “I think the bigger issue is around trust in general. Scenes can be staged, the framing of images can be deceptive, photo manipulation is almost as old as photography itself, and digital editing software has been around for decades. Our contest is outside the 24/7 media cycle, and we do not have to work at the speed of social media. We can take time to check each finalist image carefully”.
For Davies, the bigger question is how people can continue to trust press and documentary photography in general, outside the realm of photo competitions. He explains that there are some technological solutions for tracking the origin of images, but these can’t tell you about the trustworthiness of the source. The WPA co-created a set of principles, compatible with those followed by many news organizations, that individuals and organizations can adopt or adapt.
It’s clear from my conversations that there’s currently a lot of fuss, debate and anxiety around the evolving nature of photography competitions in this new(ish) era of AI. Having spoken with a number of competition organizers, I found the consensus to be that while AI tools present new creative possibilities, safeguarding the integrity and authenticity of images remains crucial.
From established competitions like the British Photography Awards to revered institutions like the World Press Association, measures are being implemented to ensure transparency and uphold the essence of photography. As AI continues to shape the future of image-making and photography, truth and integrity will remain the guiding principles for creators and organizers alike, but, really, this has always been the case.
At the end of the day, a purely AI-generated image is not a photograph and therefore has no place in a photography competition; a photograph that’s been enhanced using AI tools might, but exactly what is acceptable depends on the competition and its core values, just as it’s always been.