
Facts About Fiction: How not to fall prey to an AI-generated image for real

It is very easy to make an AI-generated image, but it is also easy to spot such images through careful observation and critical analysis.

AI-generated image of US President Joe Biden and former president Barack Obama. (Source: Jon Cooper/ Twitter)

Recently, an image of US President Joe Biden and former president Barack Obama dressed in Barbie-inspired outfits was widely shared on the internet. Twitter users were divided in their opinions, and before it came to light that the image was not real, many had fallen prey to the AI-generated creation.

On July 24, Democrat Jon Cooper, who raised $1 million for Obama’s campaigns and also served as national finance chairman for the Draft Biden 2016 super PAC, shared the AI-generated picture on his Twitter profile. The image went viral and also found its way to other social networking platforms, with people claiming it to be a real image of Biden and Obama.

On the same day, a few hours later, Cooper clarified that the image was not real.

But what if he or fact-checkers had not issued a comment or published an article stating the truth? How would an ordinary person scrolling their feed work out, before any clarification was issued, that the image was AI-generated?

Here are the tools that we recommend to cross-check the image:

1. ‘Optic AI or Not’

This tool detects whether an image was made using Stable Diffusion, MidJourney, Dall-E, a GAN or a face generator. All of these can produce realistic images, most of them from text prompts.

2. Maybe’s AI-image-detector

This app is a proof-of-concept demonstration of using a ViT (Vision Transformer) model to predict whether an artistic image was generated using AI.
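Under the hood, such a detector is simply an image classifier that returns labels with confidence scores. A minimal sketch of how a ViT-based detector could be queried with the Hugging Face transformers library; the model id `umm-maybe/AI-image-detector`, the label names and the filename are assumptions, so substitute whichever detector you actually use:

```python
# Sketch: classifying an image as AI-generated or not with a ViT-style
# image classifier. Model id, labels and filename are assumptions.
from typing import Dict, List


def summarize(predictions: List[Dict]) -> str:
    """Pick the highest-scoring label from a list of
    {"label": ..., "score": ...} dicts, the shape returned by a
    transformers image-classification pipeline."""
    top = max(predictions, key=lambda p: p["score"])
    return f'{top["label"]} ({top["score"]:.0%} confidence)'


if __name__ == "__main__":
    # Requires: pip install transformers pillow torch
    from transformers import pipeline

    detector = pipeline(
        "image-classification",
        model="umm-maybe/AI-image-detector",  # assumed model id
    )
    print(summarize(detector("suspect_image.jpg")))
```

The pure `summarize` helper is separated from the pipeline call so the interpretation step can be reused with any classifier that returns the same label/score format.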


One thing about the AI tools that generate images is that they do make mistakes, even though they are evolving rapidly. Currently, programmes like MidJourney, Dall-E and DeepAI have glitches, especially with images of people. For example, the image of the Pope in a white puffer jacket that went viral in March showed only four fingers on the Pope's hand.

Other things to look for in these images are:

  • Discrepancies when it comes to body proportions
  • Ultra-smooth skin, which also includes smooth hair and flawless teeth
  • Background of images: Objects in the background can appear deformed, and AI programmes sometimes clone people or objects and use them twice. The background of an AI image can also be blurred

Apart from this, a Google or Yandex reverse image search is always advised to check the source of the image; if that doesn't work either, careful observation is key. It is also advisable to check the social media accounts of the people in question: in this case, the verified handles of Biden and Obama. Had it really been them, they would have shared the image on their profiles or at least retweeted it.
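When the suspect image is already hosted somewhere, a reverse image search can be started by handing its URL to the search engine. A minimal sketch that builds such search links; the endpoints and query-parameter names reflect how Google and Yandex currently accept an image URL and may change, and the example image URL is a placeholder:

```python
# Sketch: building reverse-image-search URLs for a hosted image.
# Endpoint paths and parameter names are assumptions based on the
# engines' current behaviour.
from urllib.parse import urlencode


def reverse_search_urls(image_url: str) -> dict:
    """Return ready-to-open reverse-image-search URLs for an image URL."""
    return {
        "google": "https://www.google.com/searchbyimage?"
                  + urlencode({"image_url": image_url}),
        "yandex": "https://yandex.com/images/search?"
                  + urlencode({"rpt": "imageview", "url": image_url}),
    }


if __name__ == "__main__":
    urls = reverse_search_urls("https://example.com/suspect.jpg")
    for engine, url in urls.items():
        print(engine, url)
        # import webbrowser; webbrowser.open(url)  # to open in a browser
```

Percent-encoding the image URL with `urlencode` keeps characters like `/` and `:` from breaking the query string.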


It is very easy to make an AI-generated image, but it is also easy to spot such images through careful observation and critical analysis.

First published on: 12-08-2023 at 09:51 IST