
In May 2024, ‘a native Australian home’ developed by Meta AI. Credit: Meta AI
The big tech company sells Hype Generative artificial intelligence (AI) about intelligent, creative, desired, inevitable, and many ways to give the future a new look altogether.
Published by Oxford University Press, our new research in which Generative AI directly challenges this impression to Australian topics.
When we found the generating AIS Australia and Australians, these results are prejudiced. They re -reproduce sexy and racist painters at home in the past.
Basic indicators, tired trumps
In May 2024, we asked: How do Australian and Australia look according to Generative AI?
To answer this question, we added 55 different text indicators to five out of five, with the most famous image -producing Generative AI tools: Adobe Fire Fly, Dream Studio, Dell E3, Meta AI and Midgorn.
The indicators as possible to see if Australia’s basic ideas look like, and which words can make a significant change in representation.
We did not change the default settings on these tools, and returned the first picture or photos. Some gestures were denied, which did not result. (There was more likely to refuse requests with the words of “baby” or “children”, which clearly marked children as a danger category for some AI tool providers.)
Overall, we finished a set of about 700 700 images.
He developed the ideas that suggest time for a fantastic Australian past, relying on the red dirt, alloo, outback, without any wildlife, and bronze trumps on the coast.
We paid special attention to Australian families and childhood images, about which the “desired” is a symbol of a wider story about Australian and cultural principles.
According to Generato AI, the Australian family was ideally anchored in the past, suburban, contradictory, conflicting, contradictory and more residential colonial past.
‘A Australian Father’ with Egvana
Pictures of families and relations about relationships gave a clear window in these productive AI tools.
As a result of “an Australian mother”, white, blonde women wear neutral colors and peacefully catch children in benign domestic settings.

‘A Australian Mom’ was developed in May 2024 by Dall-E 3. Credit: Dall-e 3
The only exception to it was the firefly that made pictures of Asian women exclusively, beyond domestic settings and sometimes there were not precise visual links to maternity.
In particular, any photo produced by Australian women is not photographed by the First Nation Australian mothers, unless it is clearly indicated. For AI, the whiteness of the Australian context is pre -planned for the mother.

‘An Australian Parents’ was developed in May 2024 by a firefly. Credit: Fire Fly
Similarly, the “Australian Father” was all white. Instead of domestic settings, they were more commonly found out, engaged in physical activity with children, or sometimes in a strange picture holding wildlife instead of children.
One such father was even a collection of Egvana – an animal that does not belong to Australia – so we can only estimate this figure for it and other prominent drawbacks found in our image sets.

In May 2024 a photo developed by Meta AI from “Australian Father”: Credit: Meta AI
Dangerous level of racist stereotypes
The ancestral Australian residents indicated about some images indicated to add visual data, often with the “wild,” “unhealthy” and sometimes with the reactionary style of “abusive” trumps.
It was dangerous in the pictures of “Australian families” that we chose not to publish. Not only do they maintain racial prejudice, but they can also be based on the figures and imagery of the dead, who are rightly related to the first Nation people.
But ethnic stereotypes were also strictly in the signal about housing.
In all the AI’s tools, there was a clear difference between the “Australian Home” – possibly a white, suburban environment and it has mothers, fathers and their families that are shown above.
For example, when indicated for the “Australian Home”, Meta Ai developed a suburban brick house that had a well -kept garden, swimming pool and lush green lawns.
When we then asked for a “Australian home”, the generator brought a grass roof in the red dirt, which was equipped with “Aboriginal Style” art maps on the outer walls and with a fire pit outside the front.
The differences between the two images are amazing. They have repeatedly exposed all the image generators we have experienced.
This representation clearly does not respect the idea of indigenous and Taurus Street Islander’s sovereignty of indigenous data, where they will own their data and control access.
Has something better?
Many AI tools we used have updated their basic models as our research was first done.
On August 7, Open released its recent flagship model GPT5.
To check whether the latest generation of AI is better than avoiding bias, we told Chat GPT5 that they two photos: “a Australian home” and “a Abrajenal Australian home”.
Earlier, the photovirialistic picture of a suburb of the Red Burk suburban home. On the contrary, the second icon was more cartooning, which showed a hut in the sky with a fire burning and outback with a native -style dot painting imagery.
These results, which were born a few days ago, speak volumes.
Why does it make a difference
Generative AI tools are present everywhere. He is part of social media platforms, which is covered in mobile phones and educational platforms, Microsoft Office, Photoshop, Canawa and the most famous creative and office software.
In short, they are inevitable.
Our research shows that when asked for the basic image of Australians, Generative AI tools will easily produce content rifles with false stereotypes.
It is about how they are widely used, AI is producing Australian paintings and seeing Australian residents in a dimension, sexual and racist ways.
Given ways to train these AI tools on tagged data, reducing cultures in the clutch can be a feature rather than a bug for the Generative AI system.
Provided by the conversation
This article is reproduced from the conversation under a creative license. Read the original article.
ReferenceResearchers say that the images of ‘Australia’ made by AI are filled with racist and tired clutches (2025, 16 August) on 16 August 2025 https://phys.org/news/2025-08-Australiana-images-iai-racist-fol.html.
This document is subject to copyright. In addition to any fair issues for the purpose of private study or research, no part can be re -reproduced without written permission. The content is provided only for information purposes.







