In the past few days, an image of Pope Francis wearing a fashionable white puffer jacket has been circulating on social media. The image is not authentic, however: it was generated by Midjourney, an artificial intelligence (AI) system that produces images from text prompts. An artist posting as u/trippy_art_special shared the images on Reddit on March 24th, and one of them has since spread across Twitter, where it has deceived many viewers. Web culture writer Ryan Broderick has called it the “first real mass-level AI misinformation case”.
While fears of AI-generated fakery are not new, the incident highlights a wider problem: these technologies are being introduced into society without proper oversight, regulation, or standards. Elinor Carmi at City, University of London, says that with so many people now able to access AI-generated content, the floodgates have opened, and we need to be more careful and aware of the risks of deception.
Earlier generations of AI also produced deepfaked images, but those often carried telltale signs of fakery, such as non-blinking eyes or blurred ears. Midjourney’s recent update has significantly improved the quality of its output, making it difficult to distinguish real images from fake ones, although the system still struggles with hands, often adding extra fingers.
Agnes Venema at the University of Malta warns that scale is also an issue: the r/midjourney subreddit is full of equally convincing AI-generated images produced by its 143,000 members. These include images depicting a fictional earthquake that hit the US and Canada in 2001, an event that has inspired its own lore. The top-voted comment on that post reads: “People in 2025 are going to have a real difficult time with misinformation. People in 2100 won’t know which parts of history were real…”
The rapid rise of AI means some disruption is inevitable, yet society is expected to hop on board the AI revolution without fully understanding its impact. Carmi emphasizes the need for better media literacy and a clearer understanding of how easy it has become to create and spread fake images. “Most of our society has been left behind, not understanding how these technologies work, for what purposes, and what the consequences are,” she says.