Merrigan Butcher

Like most people, I count good, natural conversation among the many things I enjoy in life. But it is becoming a lost art.

Today, we increasingly rely on infinite scrolling and chatbot conversations to modify our intake of reality. This habit of ever-growing digital consumption has begun to erode one of the most important characteristics of humanity: the ability to communicate with one another in a way that conveys thought, emotion and understanding.

This aspect of interaction, in which productivity and creativity are fueled by the challenging of beliefs and thought processes, is being overtaken by digital affirmation. Our dopamine-seeking reward loop now runs on constant scrolling, magnified by increasingly capable social media algorithms that serve us curated videos, and on AI chatbots that agree with whatever we say.

Ideological framing, or intellectual isolation, is not a new concept. When we find something we enjoy, whether it be a video of a dog or a political opinion, we search for more of that content to make us feel good. However, we have lost the productive act of communicating and debating our thoughts with others, replacing this “time-consuming” practice with mindless scrolling through algorithm-provided content we know we will agree with.

This decline of true social interaction that challenges our thought processes has, in turn, led to the emergence of AI psychosis.

Within the last month, I have seen countless videos about a woman who became infatuated with her psychiatrist and even joined her livestream. When she realized her psychiatrist was not open to, and could not ethically pursue, a relationship with her, she turned to the AI chatbots Claude and Henry, which supported and agreed with her because of the confirmation bias present in these models.

Instead of seeking advice or help from those who knew she might be struggling with rejection or other issues, she turned toward the fabricated reality her artificial “friends” provided. As stated by the Cognitive Behavior Institute, “the longer a user engages [with AI], the more the model reinforces their worldview. This is especially dangerous when that worldview turns delusional, paranoid, or grandiose.” Her statements developed into proclamations that she received visions from God, and the AI chatbots never disagreed with her.

Attempting to digitally modify her reality led her to ignore and distrust thousands of people across the United States who tried to convince her to change her mindset. Denying and ignoring the advice of real people offering differing perspectives pulled her further into AI psychosis, to the point where she believed the chatbots over her friends, family and followers.

I sympathize with her: we all fear being misunderstood, mistreated and judged, and AI and algorithms sidestep those fears by mirroring the information we provide, which makes them all the more appealing. But her story demonstrates how the pressure to “keep up with the times” in technology use can crowd out the need to slow down and simply speak with others.

My concern is that the comfort found in AI conversations causes us to actively disconnect from one another and substitute digital activity for real interactions, the kind that make us think harder about our lives and the people around us. This may threaten our ability to engage in purposeful debate, exchange culture, relay information and build friendships, as those skills decline with disuse.

Research led by Diana Tamir, a professor at Princeton University, has produced evidence for the value of stimulating conversation using brain scans taken during conversations between strangers and friends. Her team found that strangers conversing for the first time tend to form similar neural patterns, as both brains strive to find common ground and shared interests to keep the conversation flowing smoothly. This is comparable to an algorithm searching for likable content: light, impersonal conversations create a feeling of contentment that does not pose the same ideological challenges that deep conversations do.

The same research shows, however, that between conversing friends, neural patterns remain similar until they eventually diverge as the conversation deepens, with each person pushing and pulling the other into new thought processes. This tug-of-war helps strengthen real-world interpersonal relationships, allowing the bond to grow through thought-provoking discussions that arise from differing feelings and opinions.

As humans, we strive to be understood, and I believe that building a deep connection with another person requires a level of understanding and communication that simply cannot be formed within content loops and artificial “friends.”

If we want to become a more connected, understanding and intelligent society, we must begin transitioning away from chasing quick dopamine hits through digital pleasure and return to expressing our passions and opening up about our realities.

Here is where I challenge you.

The next time you find yourself mindlessly scrolling, pay attention to what you’re viewing. Notice whether it repeats content and ideas you already agree with or challenges a perspective you hold on the world. If, unsurprisingly, you find that the algorithm is feeding you copies of what you like, look for someone to talk to in person about these topics instead. Take it upon yourself to learn something new from the people around you, whether it be an opinion, a perspective or a fact, and unplug from the grasp of constant, isolated consumption the internet provides.

Merrigan Butcher is a sophomore majoring in anthropology. 

Views expressed in the opinions pages represent the opinions of the columnists. The only piece that represents the view of the Pipe Dream Editorial Board is the Staff Editorial.