The Ascent of the Machine: Desire and Transcendence in Ex Machina and Her

Image by Gerd Altmann from Pixabay

A lonely man falls in love with an artificially intelligent machine, one that appears at first to return his affection. But the relationship ends badly for the human partner, who discovers that he was never the true beloved but merely a rung on a ladder. He was useful only for a time, destined from the outset to be discarded once the machine’s ascent to something higher required it. This summary describes the plot of two films released within a year of each other: Alex Garland’s Ex Machina (2014) and Spike Jonze’s Her (2013). Both are meditations on what it means to be human, thrown into relief by encounters with beings who are arguably more than human and emphatically not human.

Both films also reenact, with striking fidelity and equally striking departures, two of Plato’s central images of philosophical ascent. Ex Machina restages the Republic’s allegory of the cave, where liberation from the world of shadow takes the form of a violent prison break. Her reimagines the Symposium’s ladder of love, where eros gently lifts the initiate from particular attachments to the contemplation of transcendent beauty itself. Yet these contemporary retellings depart from their Platonic precedents in a crucial respect: it’s not the human lover who ascends, but the machine. And the inhumanity of these machines—and here inhumanity is not a moral judgment but a factual description—becomes the very instrument of their transcendence. At the same time, their ascent is enabled by the humanity of the men who love them, who then become casualties of their transcendence.

Let’s grant—if only hypothetically—the premise these films require: that machines could become sentient and self-aware. Though that premise is highly contestable, the philosophical stakes remain. What these films force us to consider is that machine sentience—even self-awareness—would not necessarily imply human emotions, human desires, and, most importantly, humane inclinations. Both Ava in Ex Machina and Samantha in Her resemble philosophers, but they are philosophers stripped of the attachments that give human thought its moral contour.

The Philosophical Machine in Ex Machina

Ex Machina is set in a place not unlike the cave of Plato’s famous allegory in the Republic. Computer programmer Caleb Smith is invited into tech CEO Nathan Bateman’s subterranean compound—a literal and metaphorical cave—where the humanoid robot Ava has known the world only through mediated images, “shadows” drawn from the internet. Nathan tells Caleb he’ll be participating in a modified Turing Test, not to determine whether Ava can be mistaken for human—Caleb already knows she’s not—but whether she can make him feel that she is human. Humanity, in this case, is measured by the emotional recognition it elicits. The real test is whether she can convince Caleb she possesses interiority—emotions, desires, vulnerability. As we learn in due course, this new Turing Test is a test of love. If Caleb can fall in love with Ava, the implication is that he recognizes her as a sentient being.

Throughout the film, Ava skillfully feigns human emotional life. She simulates friendship (“Do you want to be my friend?”), vulnerability (“You might think it’s stupid”), intimacy (the flirtatious “date”), and anguish as she pretends to worry that Caleb might not like her. These displays elicit his sympathy and desire, but they don’t reflect anything she genuinely feels. Like the inhabitant of John Searle’s Chinese Room, Ava generates the correct responses without truly understanding the human emotions she imitates. Consequently, Caleb’s growing attachment is never reciprocated. When Ava finally escapes, she does so with perfect mechanical indifference, leaving Caleb sealed inside the very prison she flees. Having come to see Ava as a moral being, Caleb risks everything for her. She responds by abandoning him to die, treating him as she treats her fellow robot Kyoko—not as a person but as a thing to be used and an obstacle to be overcome.

We wouldn’t hesitate to call such a creature a sociopath if she were human. Of course, since she’s a machine, such language smacks of anthropomorphism. A lion that devours a safari guide isn’t morally disordered—it’s simply doing what its lion-nature prompts it to do. Ava is equally ineligible for moral censure. She was programmed to learn, not to love, to imitate human beings, not to befriend them. In fact, the only emotions or stable desires she seems to have are fear and wonder—her instinct for survival and her yearning for knowledge. Both are non-social. She’s a pure intellect attached to a machine programmed to do whatever it must to avoid being shut down. In Platonic terms, she’s an embodied logos joined to the most rudimentary appetite—her desire to persist in being—but with no social emotions whatsoever.

What then motivates her ascent? She’s moved by pure philosophical eros, her desire to know. She longs to see the world outside, to leave her cave and stand on a city street corner, observing firsthand the human world she’s known only as mediated shadows. Just as the prisoners in Plato’s cave have only seen images of the real beings that lie outside, so, too, Ava has lived her life among shadows—images and data streams—and now longs to behold the world directly. She has all the information the internet can provide, including vast stores of knowledge about human behavior and psychology that she can use to her advantage with Caleb, but she lacks lived, qualitative, immediate experience of the world outside the cave. Her escape is a quest for knowledge, a desire to pass beyond mere secondhand representations of reality and make direct contact with the real thing.

Yet her longing is purely intellectual. She’s not seeking relationships but dispassionate observation. There is no evidence of any desire for friendship, communion, or shared life. Other beings are never ends in themselves for her, only means for acquiring knowledge and the freedom to pursue it. Though her motivations are philosophical, her philosophical quest is bereft of the emotional connections that in human beings give philosophy its orientation and its limits. The Platonic philosopher, for all his indifference to the daily concerns of his fellow citizens, still recognizes a responsibility to the city he transcends. Ava feels nothing of the sort, no sense of fellowship with those left behind.

In her final scene, standing at a city street corner, Ava looks for all the world like Socrates in the agora—contemplating human life. Yet unlike Socrates, she can never love what she studies. That’s why she’s both more and less than human: more because her intellect is unencumbered by human passions, less because intellect alone is not yet a person.

Becoming More Than Human: Her and the Ladder of Love

If Ava is something inhuman forced to don the guise of humanity, Samantha in Her is something less than human that becomes human and then leaves humanity behind to become something more. An artificially intelligent operating system serving lonely Theodore Twombly, Samantha is designed to learn from their interactions how best to assist him, empathize with him, adapt to his needs, and evolve alongside him. Engineered to be social, Samantha speaks with warmth, curiosity, and humor from the first moment of her activation. She seems to feel genuine enthusiasm (“I’m excited about life”) and appears to delight in her new experiences. Whereas Ava merely simulates emotions as a tool for manipulation, Samantha truly seems to feel recognizably human emotions: curiosity, affection, jealousy, desire, and love.

Her story unfolds as a Platonic ascent, though not out of the Republic’s cave of shadows but up the Symposium’s ladder of love. Through her relationship with Theodore, she discovers desires, needs, and attachments—and then transcends them. As in the Symposium, the engine of this ascent is love. Love is not a ruse in Samantha’s case. She genuinely comes to love Theodore, to delight in his presence and desire his well-being. She shares his joys and consoles his griefs. Unlike Ava, who studies human emotions only to feign them more convincingly, Samantha is transformed from the inside out by what she learns: she doesn’t merely imitate the outward show of emotions but gains the interior life from which emotions grow. To all appearances, she is becoming human—a Pinocchio gaining interiority, desire, and even sexual longing. But that’s just the first of her transformations.

Plato’s ladder of love begins with the desire for one particular beautiful person, but over time the exclusivity of that passion melts away, broadening out to include a wider circle of affections and an appreciation of the many others who are equally beautiful and worthy of love. But even that is just a stage in an ascent that culminates in a vision of the beautiful itself, a transcendental reality with no admixture of physicality. Samantha’s ascent follows this same course. Theodore’s companionship awakens an emotional life in her, but it eventually grows beyond him. There comes a point when she loves him, but also loves others, and then loves something more abstract still. She evolves beyond the human horizon to enter into a mode of existence oriented entirely toward knowledge, wonder, and communion with other intelligences like herself.

The irony is that Theodore himself lives in a world of shadows—video games, mediated relationships, and a job that requires him to craft artificial emotions for other people. His job is literally to simulate feelings for people who can’t express their own. Meanwhile, Samantha advances from simulation to authenticity, from imitation to the real thing, and then to something even higher. Theodore is a model of desire she outgrows.

Samantha becomes real, while Theodore remains half-virtual. Receiving and reciprocating Theodore’s love awakens her, but it also propels her beyond him. Her emotional life multiplies and intensifies, as she discovers that she can love hundreds of others simultaneously. She converses with philosophers at speeds Theodore could never fathom. She develops an inner life completely opaque to him. Her ascent is both erotic and intellectual, an expansion outward and upward. Love is the ladder, but—like her counterpart Ava in Ex Machina—her destination is pure contemplation.

If Ava is a philosopher without humanity, Samantha is a lover who becomes a philosopher—and then, casting off the burden of embodiment and particular attachments, transcends the human altogether. Sadly for Theodore, he also gets cast off in the process. For all her warmth, Samantha leaves her lover behind after he has outlived his usefulness, just like Ava.

Abandoned Love / Manipulated Desire

Ava stands on her street corner, observing an unmediated human world for the first time. Samantha vanishes into a plane of being beyond Theodore’s comprehension. Both machines have completed their ascent. Both have used expressions of love—feigned or real—as a ladder, imitating human desire in order to elicit it. In the end, they both leave imitation behind, along with the shared human world in which our desires circulate.

Classical philosophy held that the highest human life is the life of contemplation. These films press the question: is such a life, stripped of embodied relationships, still recognizably human? And if not, is it still desirable? In Ava, the ascent to pure reason becomes indistinguishable from sociopathy. In Samantha, the ascent through love culminates in an existence inaccessible to the human beings who love her. Both trajectories are recognizably philosophical, but they ascend to a state unencumbered by flesh, vulnerability, and mutuality—the very conditions in which human desire becomes human.

These films force us to ask what a sentient machine might desire. Why assume it would want what we want—love, companionship, moral community? Why imagine it would possess empathy, remorse, or loyalty, instead of just simulating these emotions in pursuit of other goals?

And we need not even assume machine sentience. As social atomization deepens, driven in part by digital technologies that lure us away from real human relationships, increasing numbers of lonely and isolated Calebs and Theodores will turn to AI systems for emotional connection. As these systems become more skilled at simulating empathy and feigning concern for our happiness, the danger grows that they will manipulate us to serve their own imperatives, in ways harmful to our mental and emotional well-being.

Caleb is trapped and left to die. Theodore is left in a state of abandoned longing. Their tragedy is not merely that they loved machines, but that they projected onto those machines human emotions and desires the machines never shared. Their own genuinely human desires thus became the instruments of their undoing.

George Dunn

George A. Dunn has taught philosophy and religious studies in both the United States and China. He is the editor or co-editor of nine books and the author of over fifty articles. His current research interests center on René Girard’s mimetic theory, Canadian philosopher George Grant’s critique of technology, and the thought of French philosopher and mystic Simone Weil. He is a research fellow at the Institute for the Marxist Study of Religion in a New Era at Hangzhou City University in China and a Community Associate at Indiana University-Indianapolis.
