
Photography and Civic Engagement

The Persistent Demand for Misinformation and Fake Images

By Holly Stuart Hughes


Holly Stuart Hughes is an independent editor, writer, and grant consultant. Since 2022, she has worked with CENTER on The Democratic Lens: Photography and Civic Engagement, interviewing humanities scholars on photographic history. The former editor-in-chief of PDN (Photo District News), she has organized panels and lectured on artists’ rights and the business of photography around the U.S. and served as a portfolio reviewer at several photo festivals. A graduate of Yale, she has written on photography and media for Time.com, The Telegraph, Multichannel News, Taschen Books, American Photographic Artists, Magnum Photos, Carlton Publishing, and Blouin ArtInfo Media.

CONTENT WARNING: VIEWER DISCRETION ADVISED

An image of the May 22, 2018 launch of the SpaceX Falcon 9 rocket. The photo was shared on social media with claims that space lasers sparked the Maui wildfires of 2023. Copyright SpaceX/ZUMA/Picture alliance

"For propaganda to succeed, it must correspond to a need for propaganda on the individual’s part…. There is not just a wicked propagandist at work who sets up means to ensnare the innocent citizen. Rather, there is a citizen who craves propaganda from the bottom of his being and a propagandist who responds to this craving.” – Joseph Ellul, Propaganda: The Formation of Men’s Minds (1962)

On December 14, 2012, Shannon Hicks, staff photographer at the Newtown Bee in Newtown, Connecticut, heard over the paper’s police scanner a report of shots fired inside Sandy Hook Elementary School. When she arrived outside the school, where a gunman had killed 20 students and six adults, Hicks photographed state police hurrying a line of frightened first graders through the parking lot. The next day, newspapers across the country published Hicks’s photo. To many of us, the image illustrated the trauma of the school shooting. To others, however, the photo was part of an elaborate hoax, a “false flag” event staged by government officials as a pretext for stricter gun control. To insist that the massacre never happened, Sandy Hook hoaxers also had to insist Hicks’s photo was a fraud. In social media posts, blog articles, and YouTube videos, hoaxers claimed Hicks had made her photo during an evacuation drill, or at a different location, or using paid actors, or by coaching six-year-old students to cry.

Leonard Pozner, whose son, Noah, died at Sandy Hook, fought the hoaxers in court for years. In 2022, a New York Times reporter asked him if releasing his son’s autopsy photos could have ended the conspiracy theories. Pozner said a photo would not make a difference. “Hoaxers will have more things to deny, absolutists will have more things to say — and people who are traumatized by mass shootings will be more traumatized,” he said. 

The mistrust and hyperpartisanship that Pozner described—our inability to agree on what evidence to trust or whose facts to believe—have worried the public and policy makers. Four years after the Sandy Hook massacre, worry turned into action. Mobilized by the spread of politically motivated disinformation during the 2016 Presidential election campaign, government agencies and think tanks began funding research into conspiracy theories, falsehoods, and the threats they pose for democracy. Misinformation studies are now “firmly entrenched in disciplines of sociology, computer science, communication, medicine,” as a recent article in HKS Misinformation Review noted. The burgeoning field has also drawn many critics, who argue that most research has been too narrowly focused on the role of social media, algorithms, and other technologies. Critics reject the assumption that misinformation is a new, unprecedented contagion; this narrative, they argue, ignores that propaganda, demagoguery, and fear-mongering have always been a part of American culture. “Positing a current crisis of fragmented ‘truth’ due to technologically enabled polarization presumes that, prior to the advent of social platforms, the public agreed upon ‘facts’ and ‘knowledge,’” scholars Rachel Kuo and Alice Marwick wrote in a frequently cited article.

Connecticut State Police lead children from the Sandy Hook Elementary School in Newtown, Conn., Friday, Dec. 14, 2012. © AP Photo/Newtown Bee, Shannon Hicks

“Hoaxers will have more things to deny, absolutists will have more things to say — and people who are traumatized by mass shootings will be more traumatized” – Leonard Pozner

They recommend that misinformation researchers learn from other disciplines, including social and cultural history, to understand how propaganda, myths, and fallacies have been used in the past, and the contexts in which they flourished. The disciplines of cultural studies and critical visual studies can also illuminate how we interpret and make meaning from the texts and images—accurate, false, authentic, or manipulated—that we read and see. These perspectives help us look past the latest technologies for making misinformation, and instead focus on a more entrenched problem: why belief in false information persists.

The cover of a book by Ida B. Wells, detailing lynching in the United States South, published 1892. Project Gutenberg.

The humanities scholars interviewed for The Democratic Lens observed that, since the invention of photography, photos have meant radically different things to different people, depending on their biases, identity, experience, and ideology. Shawn Michelle Smith, for example, noted that photos of lynchings were often sold as souvenir postcards to members of the gathered mob. They shared the images proudly with friends and family, or mailed them to prominent Black families “as a form of terrorism and intimidation,” Smith said. The same images were also printed by anti-lynching crusader Ida B. Wells and the NAACP “to condemn white lawlessness and barbarity.” Art historian Leslie Ureña said that Lewis Hine’s portraits of new immigrants arriving at Ellis Island were used by aid societies devoted to settling new arrivals in the U.S. They were also published by anti-immigration activists, who described the subjects in Hine’s photos as undesirable and racially inferior. Erina Duganne has written about how Black photographers in the 1960s coped with having their artful, personal images used by white editors and curators “to say very particular, one-dimensional things about the African American experience.” To rally public support for the Great Society, the Johnson Administration mounted a public photo exhibition, “Portrait of Poverty.” The show’s 500 images, though shot for a variety of purposes and contexts, were selected for display on the assumption that “if a person looked impoverished, they must be impoverished,” Duganne said. As each of these examples demonstrates, a photo is not a fixed, objective representation of a single truth. Together they show, as Duganne notes, “that photography is, in fact, quite slippery and contingent on context.”

"Crowd of people gathered in street to watch the lynching of Jesse Washington, several men in tree appear to be securing chain or rope, Waco, Texas." May 15, 1916. Visual Materials from the NAACP, Library of Congress. The image was made into a postcard, and later printed with the article "The Waco Horror" in The Crisis, the magazine of the NAACP. 
© Lewis Hine. The Miriam and Ira D. Wallach Division of Art, Prints and Photographs: Photography Collection, The New York Public Library. “Climbing into America, immigrants at Ellis Island,” The New York Public Library Digital Collections, 1905.
https://digitalcollections.nypl.org/items/510d47d9-a96e-a3d9-e040-e00a18064a99
© Louis Draper. Virginia Museum of Fine Arts. "John Henry." ca. 1960s. Though taken on the Lower East Side of Manhattan, Louis Draper's "John Henry" image was used as the cover of the "Harlem" issue of Camera Magazine, July 1966. 

“Photographic meaning is dependent on who's viewing a photograph, to what purposes, when, and in what cultural context,” Smith said. Viewers are not passive or gullible recipients of images, text or other communication. At a conference on scientific disinformation and COVID conspiracies, media scholar Herman Wasserman reminded attendees, “Media users don’t merely receive misinformation, but shape it, curate it and share it.” With Wasserman’s reminder in mind, the problem of misinformation becomes a matter of demand, not supply.

“Photographic meaning is dependent on who's viewing a photograph, to what purposes, when, and in what cultural context" – Shawn Michelle Smith


Observations about the hunger for misinformation underpin several recent arguments that “deepfake” videos and fake images produced with generative Artificial Intelligence (AI) may have only a marginal effect on political misinformation, at least in the short term. Midjourney, DALL-E, and other AI applications—which have been trained on the copyrighted photos of professional photographers—promise an easy way to make deceptively plausible images of events that never happened. AI fakes may soon be harder to detect than images made using airbrushing, Photoshop, and other tools previously used to make false images. When AI images of orphans huddled under a tent in Gaza or of the Pentagon on fire grab headlines, the accompanying stories speculate that one day, no one will know which images to believe. Some analysts say that epistemic crisis has already arrived, as Leonard Pozner observed: People believe the images they want to believe. Philosopher Joshua Habgood-Coote is critical of the “epistemic apocalypse” discourse, in part because it distracts us from the social and political problems behind deepfakes, which represent a breakdown of social norms around truth-telling. As a result, “Rather than thinking about media reform or institutional political change, we end up thinking about how best to detect deepfakes.”

A fake photo of an explosion at the Pentagon, produced using generative AI, was shared on X (formerly Twitter) in May 2023. 

Habgood-Coote and others aren’t suggesting that AI images could never become a problem. A flood of counterfeit images could undermine the knowledge we gain from photography, and worsen our epistemic problems. And yet: If AI is such a great tool for creating misinformation, why hasn’t a flood of fakes materialized since the technology debuted in 2016? One theory is that the demand for misinformation is already being met by simpler, cruder means. The AI-generated fakes that have made the news are vastly outnumbered by fakes that are less sophisticated than, for example, the 2004 Photoshopped montage that placed a young John Kerry next to reviled anti-war activist Jane Fonda. (That photo was also decried at the time as a harbinger of the end of photography’s credibility.) People who want misinformation don’t care if it is believable. “Misinformation is, in many cases, a fundamentally low-tech product,” economist Tyler Cowen observed. The most common technique, in fact, is to grab an old photo and mislabel it. Climate change deniers used a 2018 photo of a SpaceX launch to “prove” that space lasers had ignited the 2023 Maui wildfires. As the story of Shannon Hicks’s photo shows, if people disagree with a photo, they label it a fake.

People who want misinformation don’t care if it is believable.

“Demand for misinformation is relatively easy to meet because the particular content of misinformation is less important than the broad narrative it supports,” wrote Felix M. Simon of the Oxford Internet Institute and his co-authors. “Experts on misinformation view partisanship and identity as key determinants of misinformation belief and sharing.”

A fake clipping produced in 2004, during the Presidential campaign of Sen. John Kerry. The composite image was made from a photo of John Kerry, taken in 1971 by photographer Ken Light (left), and a photo of Jane Fonda, taken in 1972 by photographer Owen Franken (right).

Misinformation didn’t cause our partisan rivalries; it’s a product of our partisan rivalries. The falsehoods and conspiratorial beliefs that have persisted throughout U.S. history—about racial hierarchies, perceived threats from immigrants and people viewed as outsiders, the shadowy influence of government or corporate entities—reflect anxieties over the loss of political, economic, or cultural advantages. Legislative action or self-regulation on the part of social media companies might help constrain the supply of misinformation, but political problems can’t be solved with technological tinkering alone. And we can’t strengthen our democracy against the threats posed by misinformation while ignoring our democracy’s problems.


REFERENCES AND RELATED READINGS:

• Bernstein, J. (2021). "Bad News: Selling the Story of Disinformation." Harper's. https://harpers.org/archive/2021/09/bad-news-selling-the-story-of-disinformation/

• Camargo, C. Q., & Simon, F. M. (2022). "Mis- and disinformation studies are too big to fail: Six suggestions for the field’s future." Harvard Kennedy School (HKS) Misinformation Review. https://misinforeview.hks.harvard.edu/article/mis-and-disinformation-studies-are-too-big-to-fail-six-suggestions-for-the-fields-future/

• Cowen, T. (2023). "Too Much Misinformation? The Issue is Demand, Not Supply." Bloomberg, October 3, 2023. https://www.bloomberg.com/opinion/articles/2023-10-03/campaign-2024-will-ai-generated-misinformation-be-a-big-problem?embedded-checkout=true

• Habgood-Coote, J. (2023). "Deepfakes and the epistemic apocalypse." Synthese, 201, 103. https://rdcu.be/dzEdt

• Kuo, R., & Marwick, A. (2021). "Critical disinformation studies: History, power, and politics." Harvard Kennedy School (HKS) Misinformation Review. https://misinforeview.hks.harvard.edu/article/critical-disinformation-studies-history-power-and-politics/

• Lenoir, T., & Anderson, C. (2023). "Introduction Essay: What Comes After Disinformation Studies." Center for Information, Technology, & Public Life (CITAP), University of North Carolina at Chapel Hill. https://citap.pubpub.org/pub/oijfl3sv

• Wasserman, H. "Cultural factors are behind the disinformation pandemic: Why this matters." The Conversation. https://theconversation.com/cultural-factors-are-behind-disinformation-pandemic-why-this-matters-141884

• Williamson, E. (2022). "From Sandy Hook to Uvalde, the Violent Images Never Seen." The New York Times, May 30, 2022. https://www.nytimes.com/2022/05/30/us/politics/photos-uvalde.html