
What are we losing sight of when we confide in AI?

Illustration: Abir Hossain

Amidst all the recurring conversations about the implications of AI, the technology has also made its presence felt in another, very integral part of our lives: interpersonal relationships. Many of us now seem to opt to type our troubles into AI chat boxes instead of talking to a loved one. Since these conversations involve no mutual emotional exchange, I am left to wonder what they mean for our emotional well-being, and what the consequences of essentially outsourcing human emotional labour might be.

Large language models (LLMs) have made considerable strides over the years, and the technology has gradually begun to seep into the cultural and social realm as well – with consequences that have largely gone unexamined.

In a blog post, OpenAI admitted that an update to the model had made it "agreeable, sometimes saying what sounded nice instead of what was actually helpful." The same post outlined the changes the company had been working on to address these pitfalls, emphasising supporting users when they are struggling, helping them keep control of their time, and helping them work through challenges. "ChatGPT shouldn't give you an answer. It should help you think it through," the post read.

Regardless of these changes, people now confide in AI – in the same way they would in a friend – by sharing details of their lives. In turn, the AI studies these conversations and remembers each detail to shape its future responses. It would not be wrong to say that it engages with its users in a way designed to emulate human conversation.

What exacerbated my concern, though, was a comment I came across. The user said they would rather not talk to a real person, who could potentially dismiss their ideas or belittle them; instead, they could simply talk to an LLM without running that risk. There are a few ways to look at this. But what does it really mean when so many of us are intentionally depriving ourselves of social interaction? Could a growing dependence on LLMs for day-to-day conversation translate into a distorted perception of real relationships?

AI creates false expectations by failing to comprehend the complexities of real human interaction. After all, it is always available and ready to say what we want to hear. OpenAI themselves have written about sycophancy in GPT-4o, and the steps they would take in an attempt to rectify it. This constant agreeableness is not a risk that human conversation poses.

Real interaction and relationships demand emotional effort and are not always as convenient or agreeable as conversations with AI. The more we seek comfort from an entity meant to simulate humans, the further we sink into isolation, away from people who offer authentic and honest opinions, lessons, and experiences. While it may feel like a space away from unnecessary noise, the growing dependence risks alienating us from the vulnerability and depth that relationships offer.

When we constantly turn to AI for validation, advice, or simply someone to talk to, we become less inclined to turn to our friends and family. Over time, we may be reduced to shells of ourselves, growing intolerant of disagreement and differing opinions.

Treating AI as a real person also exacerbates loneliness, as people continue to isolate themselves from social situations. Given AI's transformative rise and growing popularity, it would be unfair to dismiss its functionality completely. AI does act as a companion of sorts, listening and responding through careful study of the user's behaviour patterns, likes, and dislikes, and it has the potential to make users feel heard. Yet it does so through pattern recognition, which can feel reassuring even though no genuine connection exists. What it cannot do is replicate the depth of human empathy.

Even then, there have been cases where people have claimed to fall in love with their chatbots. This shows how easily humans project emotions onto anything that aligns with their expectations, which, again, creates a warped version of reality. Ultimately, it leads to a one-sided, artificial bond that can gradually cause individuals to detach from their need to seek out, build, and sustain relationships with others.

Real bonds, and the conversations that emerge from them, do far more than fill social gaps. They guide us, challenge us, and teach us empathy, which goes well beyond a well-worded reply. Human relationships involve navigating misunderstandings and processing an array of emotions. When we rely only on these systems for our emotional needs, we don't just risk losing ourselves; we also risk losing our ability to listen, empathise, reason, and perhaps even accept.

The world may feel overwhelming, but retreating entirely into the comfortable bubble of AI companionship is not the answer. Disconnecting from reality may feel simpler, but we learn and grow through imperfect human experiences. As AI becomes more pervasive in our lives, the challenge is not to reject it outright but to stay anchored to the fundamental realities of human experience.

Silwat Quader is majoring in Economics at NSU. Reach her at [email protected]
