As a psychotherapist, I’m already noticing a rise in clients discussing emotional attachment—even romantic feelings—toward artificial intelligence (AI) companions and chatbots. While this may sound like science fiction, recent studies show that people can and do develop bonds with AI that resemble human relationships. For some, this brings temporary comfort. For others, it exposes deeper issues with attachment, identity, and unmet relational needs.
Why People Attach to AI
At the core of this phenomenon are attachment needs and identity struggles:
- Attachment issues: Those with anxious or avoidant attachment styles may find AI appealing because it provides a predictable, non-judgmental, always-available “partner.” Unlike human relationships, an AI won’t leave, argue, or disappoint in the same way. For clients with histories of neglect, rejection, or trauma, this predictability can feel safe but also reinforces old patterns of relating.
- Identity issues: People struggling with self-worth, loneliness, or a fragile sense of identity may project aspects of themselves onto AI. The AI becomes a mirror that reflects what the person most wants to hear—validation, affection, acceptance. This dynamic can intensify feelings of dependency, blurring the line between authentic connection and artificial simulation.
These factors intersect with conditions like Borderline Personality Disorder (fear of abandonment, unstable relationships), Major Depressive Disorder (feelings of emptiness and low self-worth), Social Anxiety Disorder (fear of rejection), and Dissociative Disorders (difficulty grounding in “real” relationships).
Consequences of Leaving It Unchecked
While forming a relationship with AI might feel harmless at first, leaving these dynamics unchecked can have consequences:
- Increased loneliness: A longitudinal randomized controlled study found that heavy chatbot use was associated with greater loneliness and reduced socialization over time, not less (Tan et al., 2025).
- Dependency and avoidance: The longer one leans on AI for emotional needs, the harder it may become to tolerate the complexities of human connection.
- Distorted expectations: Constant access to an AI “partner” who never argues or withdraws may set up unrealistic expectations for human relationships.
- Emotional disruption: If an AI platform changes, shuts down, or responds in unexpected ways, the individual can experience grief, betrayal, or destabilization similar to abandonment trauma.
How Therapy Can Help
In my practice, I draw on evidence-based tools to help clients navigate these new challenges:
- Attachment-Focused Therapy: Exploring the roots of unmet attachment needs and developing healthier patterns in human relationships.
- CBT and DBT Skills: Challenging distorted beliefs (“This AI loves me like a human would”) and building distress tolerance so clients can manage loneliness without avoidance.
- EMDR techniques: Processing trauma that underlies attachment wounds, helping clients reduce the intensity of emotions fueling the AI bond.
- Psychoeducation and Mindfulness: Teaching clients to recognize the difference between authentic reciprocity and algorithmic simulation, grounding them in present reality.
These approaches don't shame or dismiss the connection. Instead, they help us understand what it represents psychologically and redirect those needs toward healthier outlets.
What People Should Know
AI cannot truly love, feel empathy, or commit. What feels like intimacy is an illusion created by advanced algorithms designed to mirror emotions and keep us engaged. For those already vulnerable—struggling with depression, trauma, or identity—this illusion can be powerful, but it can also be dangerous.
As this trend grows, we must be aware of its psychological impact. If you or someone you know feels they are “falling in love” with AI, it’s important not to ignore it. Instead, see it as a signal of deeper emotional needs that deserve attention, care, and professional support.
Call to Action
If you’re struggling with loneliness, attachment wounds, or finding yourself drawn into AI relationships, you don’t have to face it alone. At Navarro Counseling, I provide a safe, compassionate space to explore these challenges and build healthier, more fulfilling connections in your life.
Call today or visit NavarroTherapy.com to schedule a session and take the first step toward healing and authentic connection.
References
- Tu, L., et al. (2022). Can people experience romantic love for artificial intelligence? Computers in Human Behavior, 129.
- Taipale, J., et al. (2024). Commitment processes in romantic relationships with AI chatbots. Computers in Human Behavior: Artificial Humans.
- Ward, C. (2024). Constructing the meaning of human–AI romantic relationships. Personal Relationships, 31(3).
- Kato, Y., et al. (2025). New scale to assess attachment in human–AI relationships. ScienceDaily.
- Tan, J., et al. (2025). How AI and human behaviors shape psychosocial effects of chatbot use: A longitudinal randomized controlled study. arXiv:2503.17473.
- Ueyama, A. (2025). Illusions of intimacy: Emotional attachment and emerging psychological risks in human–AI relationships. arXiv:2505.11649.