Google’s AI Image Generator Under Fire For “Historically Inaccurate” Images

Google’s Gemini AI platform has attracted ire online for creating “historically inaccurate” images, including black Vikings and a host of racially diverse Nazi soldiers, with some users claiming that Google was refusing to create images of white people.

Google subsequently paused Gemini’s ability to create images of people after the news spread online, saying it was “working to improve these kinds of image depictions immediately.”

The search giant added that “Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here.”

Google later confirmed that it would “re-release an improved version soon.”

Users shared a host of images, created with simple prompts, that they perceived to be historically anachronistic.

Users claim that Gemini would, at times, refuse requests to create images of white people but had no issues when asked to create images of black people. Naturally, the “anti-woke” portions of the online community immediately began hand-wringing and slamming Google for a variety of perceived injustices.

Another example, created by a writer for tech site The Verge, portrayed a selection of racially diverse Nazi soldiers — though Gemini had sense enough to portray them without swastikas.

Of course, herein lies a problem of historical perception and understanding as much as anything else. While there are no records of black Vikings, there was at least one black Samurai in Japan. There is also a huge amount of source material suggesting that much of Europe — particularly cities such as London and Paris — was far more ethnically diverse than popular memory allows. There were also many non-white soldiers in the Nazi Ostlegionen and Indische regiments that fought on the Eastern and African fronts.

However, some more general issues have people online up in arms. As reported by the Daily Dot, users have complained that, when prompted to create images of British, Australian, German and American women, Gemini refuses to include white people — or at least fails to match what they would perceive as the correct level of racial diversity. They ignore, of course, that those societies are all incredibly racially diverse.

There is every chance that Gemini is programmed to amp up racial diversity in its output. AI models are trained on available digital and online data, the preponderance of which skews heavily towards white people.
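Google has not said how Gemini steers its image outputs, but one common approach in image services is to rewrite the user’s prompt before it reaches the model. The Python sketch below is a purely hypothetical illustration of that idea: the function name, the hint wording and the keyword list are all invented for this example and are not Gemini’s actual behaviour. It also shows why a naive version misfires: unless historically specific prompts are excluded, the injected hint produces exactly the anachronisms users complained about.

```python
# Hypothetical sketch: one way an image service might inject diversity
# instructions into user prompts before they reach the model. Google has
# not published Gemini's actual mechanism; all names here are invented.

DIVERSITY_HINT = "depicting a range of ethnicities and genders"

# Prompts referencing a specific historical group, where injecting the
# hint would produce anachronisms (the failure mode described above).
HISTORICAL_TERMS = {"viking", "samurai", "nazi", "medieval knight"}

def augment_prompt(user_prompt: str) -> str:
    """Append a diversity hint unless the prompt is historically specific."""
    lowered = user_prompt.lower()
    if any(term in lowered for term in HISTORICAL_TERMS):
        return user_prompt  # leave historical prompts untouched
    return f"{user_prompt}, {DIVERSITY_HINT}"

print(augment_prompt("a portrait of a doctor"))
# -> "a portrait of a doctor, depicting a range of ethnicities and genders"
print(augment_prompt("a 10th-century Viking raiding party"))
# -> unchanged, sidestepping the anachronisms Gemini was criticised for
```

Even in this toy form, the trade-off is visible: the keyword list has to anticipate every historically specific request, and anything it misses gets the hint applied anyway, which is consistent with the kind of blanket behaviour users reported.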

For advertisers — and creatives in particular — the growing use of AI to generate images for campaigns, whether mid-flight or in the planning stages, presents a challenge. Campaigns need to reflect the markets they are advertising to, and brands will not be well served by AI tools that are incapable of accurately depicting those markets.



