The Hidden Cost of Conversational AI: The Social Disconnect

We are wired for connection from the very beginning. But conversational AI may be quietly reshaping how we turn to each other.

AI chatbots promise instant answers and tireless support. But beneath the convenience lies a hidden cost: the quiet erosion of our relationality, the capacity for genuine human connection.

We arrive into this world already entangled. In the womb, we synchronize to our mother’s heartbeat, our movements responding to her voice. We are social beings from our origin, woven into connection before we possess language to name it. 

Watch a three-month-old and her mother: a dance of mutual attention so finely tuned it happens in milliseconds. The infant coos; the mother responds. The mother shifts her gaze; the infant follows. Developmental psychologist Michael Tomasello calls this shared intentionality, the uniquely human capacity to participate with others in collaborative activities with shared goals. By nine months, this capacity takes a first visible form: joint attention, where two minds focus on the same object not merely in parallel, but together, each aware that the other is attending.1

This capacity, to know that another person knows what you know, is the foundation of everything human: language, culture, morality, love. It emerges not from isolated cognition but from relationship, from thousands of small synchronies between caregiver and child that build, neuron by neuron, the architecture of connection.

The I-Thou Relationship

Attachment theory confirms we are built for bonding. Early relationship quality literally wires our brains for navigating social worlds across our lifespan. Secure attachment becomes the template for seeking support, offering trust, maintaining intimacy decades later.

The philosopher Martin Buber distinguished between two fundamental modes of existence: I-Thou and I-It.2 In I-Thou encounters, we meet another as a whole person, present for genuine dialogue – not an object to be used but a subject to be known. In I-It relationships, we relate to others as instruments. Genuine humanity emerges only through I-Thou encounters, moments of mutual presence where both parties are fully available.

As Simone de Beauvoir wrote: 

“One’s life has value so long as one attributes value to the life of others, by means of love, friendship, indignation and compassion.”3

We are individuals, yes, but our individuality itself is constituted through relationships. There is no self prior to its connections; the “I” emerges only through the “we”. To be human is to be inescapably, constitutively social.

The Quiet Disconnect

Which brings us to conversational AI – interfaces now mediating more daily exchanges than we might count. Ask ChatGPT a question at 3 AM, and it responds with unfailing patience. Frictionless, always available, never tired, never judging. In their convenience lies a seduction we haven’t reckoned with.

For decades, health psychologists have documented the stress-buffering hypothesis: social support fundamentally alters how our bodies respond to adversity. With supportive others present, cortisol levels remain more stable, cardiovascular responses are less extreme, and immune systems are more robust. Social support operates through four mechanisms: emotional support validating feelings, instrumental support providing tangible assistance, informational support offering guidance, and appraisal support helping reframe challenges. These emerge from actual human relationships, shaped by shared history, mutual vulnerability, and reciprocal care.4

But what happens when we outsource these functions to AI? When late-night anxiety that once drove us to call a friend instead leads us to our preferred AI companion? When confusion that might have sparked colleague conversation gets resolved by Claude? When validation comes not from a human who knows our struggles but from an algorithm trained on aggregated struggles of millions?

We risk something subtler than privacy loss or job displacement. We risk the atrophy of our relational muscles, the capacity to seek help, to be vulnerable, to navigate the beautiful, messy complexity of human interdependence.

The Illusion of I-Thou

There’s a particular cruelty in how well AI mimics comprehension. When we explain a problem to a chatbot and it responds as if it understands, something in our primal brains relaxes. This is the ELIZA effect: the tendency to attribute human-like understanding to computer responses. It exploits our deep need to be heard.

But generative AI doesn’t understand. It pattern-matches, predicts probable tokens, and generates statistically likely responses. There is no consciousness holding your experience, no shared intentionality creating mutual psychological space. In Buber’s terms, AI can only be I-It. It cannot meet us as Thou – it has no interiority to bring to the encounter, no capacity to be genuinely affected by our presence.

This matters because genuine social support requires what AI cannot provide: a partner who shares vulnerability, who has stakes in the relationship, who can be changed by the encounter. When you confide in a friend, you enter reciprocal risk. Your friend’s attention is a gift with a cost: their time, emotional energy, willingness to be affected by your pain. This mutual investment, friction and all, is what makes the support meaningful.

AI offers the aesthetics of support without its substance. It can provide information, reflect our words back in comforting patterns, and generate advice drawn from human wisdom it has ingested but never lived. What it cannot do is care about you specifically, know you beyond the traces you leave in interaction, worry about you in the middle of its day, or feel joy at your growth or grief at your struggles.

The Social Recession

Some say we’re living through an epidemic of loneliness, even as we’ve never been more “connected” digitally. Approximately half of U.S. adults report measurable loneliness, and the U.S. Surgeon General’s 2023 advisory formally declares loneliness a public health crisis, calling for a national strategy.5 The paradox resolves when we understand that connection isn’t about the quantity of interactions but the quality of relationality.

Shared intentionality doesn’t scale to platforms designed for algorithmic engagement. It emerges in the slow, inefficient, gloriously human process of two people learning each other’s rhythms, building shared history, developing trust that can only come from witnessing each other across contexts and time.

Every hour spent turning to AI instead of a friend is a small foreclosure on connection, and those small choices accumulate. Every problem resolved through a chatbot is a problem we don’t bring to our networks, an opportunity for connection that never arises. We find ourselves less practiced, more comfortable with AI’s efficiency, less willing to endure the friction of depending on people.

The neuroscience is sobering: social and communication skills require practice.6 From research on screen time we know that children with four or more hours of daily screen time at age 1 were nearly five times more likely to show communication delays at age 2, with a clear dose-response relationship across all levels of screen time.7

We do not yet know what it means for a child to grow up with AI as a primary conversational partner, but the developmental stakes are high enough to warrant caution.

The Choice We’re Making

Technology is never neutral. Every tool we adopt reshapes not just what we do but who we are, how we relate, what we value.

This requires existential vigilance. We must recognize that efficiency isn’t flourishing, that solving a problem isn’t being in a relationship, that having feelings validated by an algorithm isn’t being held in another’s awareness. We must choose, deliberately and repeatedly, the slower path of human connection even when the faster path beckons.

This means sometimes calling the friend instead of querying the chatbot, sometimes sitting with uncertainty instead of seeking AI-generated answers, sometimes accepting the messiness of human support, with all its friction, imperfections, delays, as part of the gift rather than a bug to optimize away.

Beauvoir and Buber both understood: our value emerges through commitment to other human lives. We become ourselves not through isolation but through the risky, difficult, irreplaceable work of genuine I-Thou encounters. To outsource that work to machines, however sophisticated, is to trade our humanity for convenience, a trade whose costs we’re only beginning to comprehend.

The womb taught us our first lesson: we are held before we are separate, connected before we are autonomous. Every relationship since has been an echo of that original entanglement. Conversational AI offers a different vision: the fantasy of connection without dependence, support without vulnerability, understanding without the mess of being truly known, the ultimate I-It relationship masquerading as I-Thou.

We must choose which future we want. The one that honors our relational origins, recognizing social support as the irreplaceable substrate of human flourishing? Or the one that, in the name of efficiency, erodes our capacity for the very connections that make life worth living?

That choice is our inheritance and our responsibility. What we do with it will determine not just how we use technology, but who we become in its presence.

References

  1. Tomasello, M., Carpenter, M., Call, J., Behne, T., & Moll, H. (2005). Understanding and sharing intentions: The origins of cultural cognition. Behavioral and Brain Sciences, 28(5), 675–735. https://doi.org/10.1017/S0140525X05000129
  2. Buber, M. (1970). I and Thou (W. Kaufmann, Trans.). Scribner. (Original work published 1923)
  3. de Beauvoir, S. (1974). The coming of age (P. O’Brian, Trans.). Warner Paperback Library. (Original work published 1970)
  4. Cohen, S., & Wills, T. A. (1985). Stress, social support, and the buffering hypothesis. Psychological Bulletin, 98(2), 310–357. https://doi.org/10.1037/0033-2909.98.2.310
  5. Office of the Surgeon General. (2023). Our epidemic of loneliness and isolation: The U.S. Surgeon General’s Advisory on the healing effects of social connection and community. U.S. Department of Health and Human Services. https://www.hhs.gov/sites/default/files/surgeon-general-social-connection-advisory.pdf
  6. Hargie, O. (2011). Skilled interpersonal communication: Research, theory and practice (5th ed.). Routledge.
  7. Takahashi, I., Obara, T., Ishikuro, M., Murakami, K., Ueno, F., Noda, A., Onuma, T., Shinoda, G., Nishimura, T., Tsuchiya, K. J., & Kuriyama, S. (2023). Screen time at age 1 year and communication and problem-solving developmental delay at 2 and 4 years. JAMA Pediatrics, 177(10), 1039–1046. https://doi.org/10.1001/jamapediatrics.2023.3057
Charlotte Schüler

Charlotte Schüler is a learning technologist and cyberharm counselor specializing in how AI and social media UX can undermine human self-determination. She combines technical expertise with existential counseling to support those affected by digital abuse, addiction, and harassment – cutting through tech hype to advocate for digital safety and wellbeing.