A few weeks ago, when it was still summer, my partner’s mother was worrying over the arrangements for her 70th birthday party, due to begin in a few hours’ time. Alongside what to wear and the catering, she asked about the Spotify playlist.
“I’ve chosen some of my favorite songs and then a list of other songs came up,” she said. “I quite like them, but I can’t find any information about them anywhere?”
The songs in question were pleasant enough; mostly a strikingly rootless reggae.
In the event, the party was a wonderful success; in a lovely location, the food was excellent, and everyone was delighted to be there.
Did we end up chatting against a backdrop of AI-generated music? Or did someone carefully filter it out?
I’ve absolutely no idea. I forgot to ask, and I certainly didn’t notice.
Did it really matter, anyway?
We are becoming ever more immersed in a world of unreality that has now more or less completely taken over our lives, and much of what we know.
Our digital world, the world in which most of our economy, our culture, our work, our personal and, increasingly, our intimate lives now reside, is becoming ever more detached from our physical and embodied lived experience. AI is a natural evolution of that process, and of course a massive accelerant of it. The edges where that unreal world ends and our physical world begins are becoming harder and harder for us to detect.
In order to maintain and extend this all-immersive experience of unreality, tech companies are locked in an arms race to build gargantuan data centers with increasingly powerful “compute.” OpenAI’s newly opened “Stargate” data center will eventually require five gigawatts of electricity, about the same as New York City, and six million metric tonnes of water a day, while being built, like many other data centers, in an area of acute water stress.
And so it is undeniably clear that AI’s extraordinary demand for power, in every sense, and for natural resources like water, is having a direct and profoundly rupturing impact on the viability of our real, and stubbornly organic, planet.
In that sense, AI is perhaps the essential tool for where we currently find (and lose) ourselves; it enables infinite digital worlds to be generated quite specifically for each of us, so that planetary reality may be kept indefinitely at bay. Like a Las Vegas casino with no windows and no clocks, AI and our immersive online spaces are creating an entirely sealed and controlled environment. The outside world remains out of sight and out of mind as we, the ever hopeful gamblers, grip the sides of our devices, stare into the screen in front of us, and anxiously pull the lever for one more spin.
Over and over again.
Unreality Bites
In the past few weeks, OpenAI brought two key dimensions of digital unreality together into one frictionless experience: “Sora 2,” a social media platform for people to share exclusively AI-generated content with others. Those others may be actual people, but might just as well be the AI-generated profiles that have overrun social media platforms for years. Not that many early adopters of Sora 2 will care.
The retreat into unreality may be far more intentional than we fully appreciate.
Steve Jobs, the exemplar for many of today’s tech CEOs, was famously said to possess a “reality distortion field” that bent the perceptions of those around him to his own singular vision. And with AI, it appears that Silicon Valley is deploying all of its vast reserves of capital, influence and captured attention to pull off much the same trick.
Silicon Valley CEOs are deeply fixated on sci-fi utopian, and quite often dystopian, fantasies; indeed, it is notable that for them there is little distinction to be drawn between the two.
The language of The Matrix, in which every human’s embodied reality is encased in a pod, with tubes providing water, nutrients and air, is ubiquitous. The movie’s “red pill” has become a shorthand in tech circles; a person is described as AGI-pilled, 10x growth-pilled, MAGA-pilled, and so on.
Elon Musk has regularly mused on whether we all live in a simulation, so it would be hardly surprising if he is working hard to bring that simulation into reality. Or unreality, depending on your point of view.
OpenAI’s CEO Sam Altman was deeply influenced by the movie Her, which depicts a man falling in love with an AI voiced by Scarlett Johansson. Intent on having his own AI product imitate the movie, Altman asked Johansson to voice ChatGPT, but she refused. OpenAI then created a synthetic voice to imitate Johansson, presumably so that users would not notice. Until, that is, the real Scarlett Johansson made clear her anger at Altman’s duplicity.
Now, it is increasingly clear that millions of people are forming intense connections with AI “companions.” These relationships are very real, even if the partners are unreal. The subreddit community “MyBoyfriendisAI” has 72,000 members.
Psychiatrists are increasingly discussing “AI psychosis,” in which previously well-adjusted people lose their grasp on reality altogether on account of their deep interactions with an AI. OpenAI recently revealed that over 500,000 ChatGPT users are showing signs of psychosis or mania every week.
Arguably, our society and culture became increasingly removed from reality with the emergence of mass media, advertising and television in the twentieth century. But our headlong leap into unreality really took off with the widespread take-up of the internet, social media and of course the smartphone, which combined these technologies into one device that never leaves our side—or our consciousness.
The first internet browser arrived in 1993. That was around the time when humanity collectively realized that carbon emissions were verifiably warming the planet. Our carbon emissions have risen ever since, and are projected to peak some time in the 2030s. Maybe.
Are these two developments connected? It is impossible to say for sure.
But both are certainly disconnected.
Disruptive Technologies
The internet was initially introduced to us as a “disruptive” technology, with connectivity that could empower societies to reshape their actual lived reality, as exemplified in the short-lived “Arab Spring.”
The platforms, however, soon evolved into algorithmically engineered amplification chambers, optimized for engagement and amassing tens, and then hundreds of billions of dollars in revenue every year.
This information space was left largely unregulated, allowing unreality to truly uproot itself—and it took many of us with it. Over time it became increasingly apparent that synthetic online communication was far more effective at driving engagement than human connection. Disinformation metastasized into a platform industrial complex, combining advertising, influence networks, and algorithms. This complex created a hall of mirrors filled with distorted echoes of our own worst instincts, and it shouted them back at us.
At the height of the COVID pandemic, when physical embodied contact was heavily restricted, 41 of the 50 most influential Twitter accounts discussing COVID were bots. Conspiracy theories online took off as never before.
It turned out that not only are humans easily manipulated by algorithms; algorithms are also easily manipulated by humans. Millions of us fell into this infinite toxic loop.
We have all seen friends and relatives fall into online conspiracy rabbit holes.
But what if the internet itself was the rabbit hole—and we have all fallen down it?
The AI Race to Unreality
Having emerged from that time, we have yet to emerge from the unreality; if anything, we are now even more trapped in it. The average human on earth now spends over six and a half hours every day watching a screen, about 40 percent of their time awake.
Now, with AI, we have been taken to an entirely new plane of unreality.
Only a few years ago, internet titans earnestly detailed how they were removing “inauthentic content” on their platforms. Now they are flooding their platforms with entirely inauthentic AI content at giga-industrial scale.
AI-generated books are now populating local libraries. And as we have run out of conventional content to feed AI models—there is simply no more data on the internet left—they are increasingly trained on synthetic data, which then populates the internet to become our new online reality.
The business imperatives on which this AI boom is based are also increasingly detached from reality, with $6.7tn forecast to be invested in AI by 2030.
One analyst says: “Nothing short of AGI (artificial general intelligence) will be enough to justify the investments now being proposed for the coming decade.”
No one has yet convincingly defined what AGI actually is. But at that point, none of this will matter, anyway.
Sam Altman recently said he was planning to build AI data centers requiring 250 gigawatts of power by 2033, equal to the electricity required by India and all its 1.5 billion people.
If those data centers are powered by gas, as most currently are, OpenAI would produce twice the carbon emissions of ExxonMobil.
Meanwhile, the rest of us may well endure three degrees of global warming within three decades, triggering massive crop failures, deadly heat waves, and tens of millions of climate refugees, among other calamities.
Silicon Valley’s response is to retreat further into their sci-fi unrealities. Sam Altman claims nuclear fusion from another of his ventures is only three years away, whereas most scientists regard fusion as well over 40 years away. Altman muses that we might eventually build a “Dyson Sphere,” an array of mirrors around a star to capture its energy.
Elon Musk’s fixation on colonizing Mars is far more an escape from planetary reality than it will ever be from the planet itself.
Alas, science strongly indicates that for the foreseeable future, Elon is very much stuck on this planet with us—as we are with him.
Making Sense of Our Place
In The Spell of the Sensuous, an exploration of phenomenology, indigenous knowledge and the more-than-human world, David Abram considers our systemic detachment from the world around us.
“We have forgotten the poise that comes from living in storied relation and reciprocity with the myriad things, the myriad beings, that perceptually surround us.”
David Abram was writing in the early nineties, when the internet was still a curiosity, yet even then he was acutely alive to its impact.
“Our reflective intellects inhabit a global field of information, as we absently fork food into our mouths,” Abram says, “clicking on the computer and slipping into cyberspace in order to network with other bodiless minds…”
“Our nervous system synapsed to the terminal, we do not notice that the chorus of frogs by the nearby stream has dwindled, this year, to a solitary voice, and that the song sparrows no longer return to the trees.”
Abram then could not have imagined how detached from reality we have now become, yet his call resonates even more deeply now.
For it is not only the data centers rupturing our planet; there is an entire supply chain that produces millions of AI “GPU” chips to fill them. Dozens of semiconductor foundries are being built across the world, all of them requiring more gigawatts of power and millions of tonnes of water, and leaving megatonnes of toxic waste, including PFAS “forever chemicals” that natural systems cannot break down.
Abram further posits that when humans in the past swept into new regions, they “suddenly found themselves in a world where…their stories seemed to lose all meaning, where the shapes of the landforms lacked coherence, where nothing seemed to make sense.”
Abram argues it was these geographic dislocations that led to the catastrophic loss of flora and fauna as humans spread into Australia, and later, as they fanned out from the Bering Strait into the Americas.
“Without an etiquette matched to this land and its specific affordances,” says Abram, “the displaced and often frightened newcomers could easily disrupt and even destroy a large part of the biotic community.”
Is it not possible that we feel the same sense of deep rupture, as we have migrated further into our digital landscapes?
Of Rocks and Earth
Following the AI supply chain up from the semiconductor foundries, there is rapidly growing demand for minerals and metals—copper, rare earths, nickel, quartz. The International Energy Agency forecasts that mining of critical minerals must increase sixfold by 2040 due to AI, digital and renewable technologies—and that is just to meet “Net Zero” targets.
A recent paper in Nature found that 54 percent of those “transition minerals” will be extracted from or near indigenously stewarded lands. These are often especially fragile ecosystems: Greenland, the Sami reindeer grounds in Sweden, and the First Nation peatlands of Northern Canada, all vast natural stores of carbon.
So that presents a question: how many human and more-than-human intelligences will we be mining out of existence in order to build synthetic intelligences of our own? And what critical knowledges of our organic and physical world will die with them?
These are knowledges that machines will never reach, much less compute, because they never existed in data.
Considering how little we plainly know about living in balance with our planetary habitat, can we really afford to escape into AI unreality as those knowledges disappear?
And in that sense, while AI may well be a leap for humanity, we may yet find it is a leap in precisely the wrong direction.
After all, does it not seem likely that, in order to truly meet our planetary reality, rather than retreat further from our physical world, we should re-immerse ourselves in all its real, abundant aliveness—the aliveness, that is, that we still have left?

Alistair Alexander
Alistair Alexander explores the ecological and social impacts of AI and technology in research, writing, workshops and teaching at reclaimed.systems and in his free newsletter. He recently investigated AI supply chains and de-carbonization pathways for AI and tech, and has led courses on Tools to Disconnect/Reconnect and on How Social Networks Could Be More Like Fungi Networks. If he’s on social media, it’s normally at linked.in or reclaimed_alicma@mastodon.world