With ‘AI slop’ distorting our reality, the world is sleepwalking into disaster | Nesrine Malik

There are two parallel image channels that dominate our daily visual consumption. In one, there are real pictures and footage of the world as it is: politics, sport, news and entertainment. In the other is AI slop, low-quality content with minimal human input. Some of it is banal and pointless – cartoonish images of celebrities, fantasy landscapes, anthropomorphised animals. And some is a sort of pornified display of women just … being, like a virtual girlfriend you cannot truly interact with. The range and scale of the content are staggering, and it infiltrates everything from social media timelines to messages circulated on WhatsApp. The result is not just a blurring of reality, but a distortion of it.

A new genre of AI slop is rightwing political fantasy. There are entire YouTube videos of made-up scenarios in which Trump officials prevail against liberal forces. The White House account on X jumped on a trend of creating images in Studio Ghibli style and posted an image of a Dominican woman in tears as she is arrested by Immigration and Customs Enforcement (Ice). AI political memefare has, in fact, gone global. Chinese AI videos mocking overweight US workers on assembly lines after the tariff announcement prompted a question to, and a response from, the White House spokesperson last week. The videos, she said, were made by those who “do not see the potential of the American worker”. And to prove how pervasive AI slop is, I had to triple-check that even that response was not itself quickly cobbled-together AI content fabricating another dunk on Trump’s enemies.

The impulse behind this politicisation of AI is not new; it is simply an extension of traditional propaganda. What is new is how democratised and ubiquitous it has become, and how it involves no real people and none of the physical constraints of real life, and so can conjure an infinite number of fictional scenarios.

The fact that AI content is also spread through huge and ubiquitous chat channels such as WhatsApp means that there are no replies or comments to challenge its veracity. Whatever you receive is imbued with the authority and reliability of the person who has sent it. I am in a constant struggle with an otherwise online-savvy elderly relative who receives and believes a deluge of AI content on WhatsApp about Sudan’s war. The images and videos look real to her and are sent by people she trusts. Even absorbing the fact that technology is capable of producing content with such verisimilitude is difficult. Combine this with the fact that the content chimes with her political desires and you have a degree of stickiness, even when doubt is cast on it. What is emerging, amid all the landfill of giant balls of cats, is the use of AI to create, idealise and sanitise political scenarios by rendering them in triumphant or nostalgic visual language.

Prof Roland Meyer, a scholar of media and visual culture, notes one particular “recent wave of AI-generated images of white, blond families presented by neofascist online accounts as models of a desirable future”. He attributes this not just to the political moment, but to the fact that “generative AI is structurally conservative, even nostalgic”. Generative AI is trained on pre-existing data, which research has shown is biased against ethnic diversity, progressive gender roles and minority sexual orientations, and so it concentrates those dominant norms in its output.

The same can be seen in “trad wife” content, which summons not only beautiful supplicant homemakers, but an entire throwback world in which men can immerse themselves. X timelines are awash with a sort of clothed nonsexual pornography, as AI images of women described as comely, fertile and submissive glimmer on the screen. White supremacy, autocracy, and fetishisation of natural hierarchies in race and gender are packaged as nostalgia for an imagined past. AI is already being described as the new aesthetic of fascism.

But it isn’t always as coherent as that. Most of the time, AI slop is just content-farming chaos. Exaggerated or sensationalised online material boosts engagement, giving creators the chance to make money based on shares, comments and so on. Journalist Max Read found that Facebook AI slop – the sloppiest of them all – is, “as far as Facebook is concerned”, not “junk”, but “precisely what the company wants: highly engaging content”. To social media giants, content is content; the cheaper it is, the less human labour it involves, the better. The outcome is an internet of robots, tickling human users into whatever feelings and passions keep them engaged.

But whatever the intent of its creators, this torrent of AI content leads to the desensitisation and overwhelming of visual palates. The overall effect of being exposed to AI images all the time, from the nonsensical to the soothing to the ideological, is that everything begins to land in a different way. In the real world, US politicians pose outside prison cages of deportees. Students at US universities are ambushed in the street and spirited away. People in Gaza burn alive. These pictures and videos join an infinite stream of others that violate physical and moral laws. The result is profound disorientation. You can’t believe your eyes, but also what can you believe if not your eyes? Everything starts to feel both too real and entirely unreal.

Combine that with the necessary trivialisation and provocative brevity of the attention economy and you have a grand circus of excess. Even when content is deeply serious, it is presented as entertainment or as an intermission, a sort of visual elevator music. Horrified by Donald Trump and JD Vance’s attack on Zelenskyy? Well, here is an AI rendering of Vance as a giant baby. Feeling stressed and overwhelmed? Here is some eye balm – a cabin with a roaring fire and snow falling outside. Facebook has for some reason decided I need to see a constant stream of compact, cutesy studio apartments with a variation of “this is all I need” captions.

And the rapid mutation of the algorithm then feeds users more and more of what it has harvested and deemed interesting to them. The result is that all media consumption, even for the most discerning users, becomes impossible to curate. You are immersed deeper and deeper into subjective worlds rather than objective reality. What follows is a very weird disjuncture. The sense of urgency and action that our crisis-torn world should inspire is instead blunted by how information is presented. Herein lies a new way of sleepwalking into disaster: not through lack of knowledge, but through the paralysis caused by every event being filtered through this perverse ecosystem – just another part of the maximalist visual show.

  • Nesrine Malik is a Guardian columnist
