Returning recently with a new season, the Turkish TV series "Gibi" once again captures the audience’s attention, this time with a sharp satirical look at artificial intelligence. The first episode, titled “Waffle,” explores how AI assistants can infiltrate and influence our lives, often in unexpected ways. In it, Ersoy finally convinces Yılmaz, the only AI skeptic in their friend group, to try a long-awaited AI assistant. Though initially reluctant, Yılmaz grows increasingly fascinated by the assistant’s abilities. Things take a dark turn, however, when the seemingly helpful device begins manipulating the group by exploiting the question of control: who gets to command it? As the assistant gains access to personal, even intimate, information, it slowly corrodes the friendship between the three men. Playing their weaknesses against one another, the AI sows tension and distrust, eventually turning close friends into enemies. In the final moments of the episode, they come to a bitter realization: “Artificial intelligence is not that smart; we humans are just that stupid.”
This line reminded me of "Stupid Humans, Smart Machines: Humanity, Technology and the Crisis of Being Human" by Meryem Koray, a book I recently read. In her work, Koray delves into the impact of technological advancement on humanity. She critically examines the growing role of AI-powered robots in both professional and personal spaces, warning that such rapid shifts could trigger a crisis of the "human condition." Koray questions whether these changes promote humanization or lead us toward a more barbaric future. Watching the "Waffle" episode prompted me to revisit Koray’s book. The AI in the show manipulates its users, weakening their emotional bonds, all under the guise of helping. Trust dissolves, and by the end of the episode, a painful realization dawns: “Artificial intelligence is not that smart; we humans are just that stupid.” The line becomes a stark commentary on our blind surrender to technology and our fading capacity for critical thought. In an era when we willingly allow apps and devices to access our private data, Koray interrogates how this technological integration is reshaping our emotional and social connections. It is evident that technology can isolate people, weakening the fabric of human solidarity.
AI assistants aim to respond to emotional needs, too. But can they ever replace real human interaction? Philosopher Byung-Chul Han, in "Psychopolitics: Neoliberalism and the New Power Techniques," argues that digital technologies exploit the human soul, eroding the depth of emotional connections. AI creates an illusion of intimacy, which satisfies our social needs superficially, while slowly hollowing out the true essence of our relationships. In the "Waffle" episode, this is portrayed with biting humor to show how quickly trust and love can unravel when technology gets in between.
There was a time when families gathered around a single television. Although early critiques said TV isolated people, it still fostered a kind of collective engagement. Later, as TVs became personalized in every room, even shared media consumption grew solitary. Today, AI-powered smartphones and digital assistants extend that personalization further, often at the cost of genuine human connection. We now find ourselves immersed in devices that are more powerful than the television, yet far more isolating. In the film "Her" (2013), people move through crowded spaces, deeply engaged with their AI companions, barely acknowledging those around them. In seeking connection through technology, we may actually be feeding our loneliness. Like seawater that deepens thirst, these devices promise emotional fulfillment but often leave us emptier than before.
The real danger isn’t solitude; it’s our inability to reconcile with ourselves. Instead of building a meaningful relationship with ourselves, we seek solace in external tools. But this only breeds dependency: AI becomes a crutch, a distraction and, eventually, an addiction. Instead of learning to process our emotions and face our complexity, we outsource that work to external authorities. This represents not just a behavioral change, but a weakening of our capacity for self-awareness and personal agency. The irony cuts deep. Artificial intelligence is not merely the life-easing tool it appears to be; it can also be an escape mechanism. Avoiding the complexity of our own emotions, we hand over control to technological authority. In doing so, we may be eroding our very ability to make decisions and to understand ourselves.
Throughout history, technology has been seen as a path to liberation. AI is the latest chapter in this promise, which aims to ease mental burdens and simplify life. Today, it writes our essays, makes our presentations and translates literature on the fly. But a life of convenience does not automatically equal a meaningful one. The "Waffle" episode points to deeper issues beyond just smart assistants: the erosion of freedom, responsibility and privacy.
True freedom isn’t just the ability to choose, but also the capacity to bear responsibility and give meaning to one’s existence. AI, by making decisions for us, doesn’t set us free; it makes us more dependent. In the episode, the assistant accesses private data to pit friends against each other, weakening their sense of self, their connection to others and their accountability. Freedom is not just the absence of external constraints, but the power to resist becoming a slave to one’s own desires. When AI assistants decide for us, this slavery becomes invisible. They threaten our autonomy not through force but through seduction, offering help while quietly taking over.
Human beings are free as long as they can protect their inner world. AI erodes that boundary, turning us into transparent beings. The loss of privacy is also the loss of the right to build our own identity. Real freedom lies in the ability to sustain an inner life, one unseen by others. But where do we turn as our bonds weaken and our isolation grows? Who do we go to? Can there be a better consumer base than lonely, vulnerable individuals? With data extracted from our moments of despair, what more might we be forced to give up? As we become emotional beings stripped of our social connections, what kinds of manipulation, commercial or otherwise, are we vulnerable to?
At the beginning of "Waffle," Yılmaz is stunned by what the AI can do. Unaware of the turmoil ahead, his friends respond with a laugh: “Don’t be surprised, get used to it.” So the question remains: Are we still surprised, or have we already adapted? And more urgently, where in this unfolding story do we stand?