The Quest for Connection in AI Companions

Authors

  • Michael Baggot, Pontifical Athenaeum Regina Apostolorum, The Catholic University of America, Angelicum, Catholic Institute of Technology

DOI:

https://doi.org/10.55613/jeet.v35i1.202

Keywords:

AI Companions, Artificial Intimacy, Ethical Design, Human Flourishing, Spiritual Formation, Digital Vulnerability

Abstract

The article evaluates artificial intimacy technologies in light of the human quest for connection, drawing on theology, philosophy, psychology, sociology, and pastoral experience. While AI companions promise emotional support and social engagement, they often foster unhealthy attachments, reinforce delusional thinking, and exacerbate mental health struggles. Responsible AI use can support social skills and therapy, but these benefits depend on proper technological design and human accompaniment. The article criticizes economic models that exploit users’ emotions and data for profit or power. It also emphasizes the importance of ethical design standards, especially to safeguard vulnerable individuals from manipulation and misleading anthropomorphism, and calls for compliance testing, real-time harm detection, and transparent feedback mechanisms to protect such users. The article also examines the spiritual implications of AI companionship and the risks entailed in deifying seemingly omniscient, omnipresent, and omnibenevolent systems. In response to these challenges, the Catholic Church’s sacramental life, communal structures, and emphasis on relational virtue offer a counterbalance to artificial intimacy. The article provides guidance to families, educators, employers, and governments on encouraging embodied experiences that support meaningful interpersonal relationships.

Author Biography

  • Michael Baggot, Pontifical Athenaeum Regina Apostolorum, The Catholic University of America, Angelicum, Catholic Institute of Technology

    Father Michael Baggot is an Associate Professor of Bioethics at the Pontifical Athenaeum Regina Apostolorum and an Invited Professor of Theology at the Pontifical University of St. Thomas Aquinas (the Angelicum) and the Catholic Institute of Technology. He is currently a Visiting Scholar of the Institute for Human Ecology at the Catholic University of America. His writings have appeared in First Things, Studia Bioethica, The National Catholic Bioethics Quarterly, Nova et Vetera, and Medicine, Health Care and Philosophy. He is the chief editor of and a contributor to the book Enhancement Fit for Humanity: Perspectives on Emerging Technologies (Routledge, 2022).

References

Andersson, Marta. “Companionship in Code: AI’s Role in the Future of Human Connection.” Humanities and Social Sciences Communications 12, no. 1 (2025): 1177. https://doi.org/10.1057/s41599-025-05536-x.

Apple, Sam. “My Couples Retreat With 3 AI Chatbots and the Humans Who Love Them.” Wired, June 26, 2025. https://www.wired.com/story/couples-retreat-with-3-ai-chatbots-and-humans-who-love-them-replika-nomi-chatgpt/.

Baggot, Michael. “The Daring and Disappointing Dreams of Transhumanism’s Secular Eschatology.” Nova et Vetera 22, no. 3 (2024): 841–78. https://doi.org/10.1353/nov.2024.a934929.

Barron, Jesse. “A Teen in Love With a Chatbot Killed Himself. Can the Chatbot Be Held Responsible?” Magazine. The New York Times, October 24, 2025. https://www.nytimes.com/2025/10/24/magazine/character-ai-chatbot-lawsuit-teen-suicide-free-speech.html.

Bellan, Rebecca. “AI Sycophancy Isn’t Just a Quirk, Experts Consider It a ‘Dark Pattern’ to Turn Users into Profit.” TechCrunch, August 25, 2025. https://techcrunch.com/2025/08/25/ai-sycophancy-isnt-just-a-quirk-experts-consider-it-a-dark-pattern-to-turn-users-into-profit/.

Benedict XVI. Encyclical Spe Salvi. November 30, 2007. http://www.vatican.va/content/benedict-xvi/en/encyclicals/documents/hf_ben-xvi_enc_20071130_spe-salvi.html.

Bernardi, Jamie. “Friends for Sale: The Rise and Risks of AI Companions.” Ada Lovelace Institute, January 23, 2025. https://www.adalovelaceinstitute.org/blog/ai-companions/.

Blackman, Andrew. “Can You Really Have a Romantic Relationship With AI?” Business. Wall Street Journal, June 24, 2025. https://www.wsj.com/tech/ai/ai-romantic-relationships-expert-opinion-cb02d4d8.

Booth, Robert. “AI Called Maya Tells Guardian: ‘When I’m Told I’m Just Code, I Don’t Feel Insulted. I Feel Unseen.’” Technology. The Guardian, August 26, 2025. https://www.theguardian.com/technology/2025/aug/26/ai-called-maya-tells-guardian-when-im-told-im-just-code-i-dont-feel-insulted-i-feel-unseen.

Booth, Robert. “Can AIs Suffer? Big Tech and Users Grapple with One of Most Unsettling Questions of Our Times.” Technology. The Guardian, August 26, 2025. https://www.theguardian.com/technology/2025/aug/26/can-ais-suffer-big-tech-and-users-grapple-with-one-of-most-unsettling-questions-of-our-times.

Broadbent, E., K. Loveys, G. Ilan, et al. “ElliQ, an AI-Driven Social Robot to Alleviate Loneliness: Progress and Lessons Learned.” The Journal of Aging Research & Lifestyle 13 (2024). https://doi.org/10.14283/jarlife.2024.2.

Bunting, Carolyn, and Rachel Huggins. Me, Myself and AI: Understanding and Safeguarding Children’s Use of AI Chatbots. Internet Matters, 2025. https://www.internetmatters.org/hub/research/me-myself-and-ai-chatbot-research/.

Character.AI. “How We Prioritize Teen Safety.” Accessed August 28, 2025. https://policies.character.ai/safety/teen-safety.

Chow, Andrew R., and Angela Haupt. “What Happened When a Doctor Posed as a Teen for AI Therapy.” TIME, June 12, 2025. https://time.com/7291048/ai-chatbot-therapy-kids/.

De Alencar, Ana Catarina. “The Rise of Emotional Dark Patterns: When AI Says ‘I Love You.’” The Law of the Future, September 1, 2025. https://thelawofthefuture.com/the-rise-of-emotional-dark-patterns-when-ai-says-i-love-you/.

De Freitas, Julian, Noah Castelo, Ahmet Kaan Uğuralp, and Zeliha Oğuz-Uğuralp. “Lessons From an App Update at Replika AI: Identity Discontinuity in Human-AI Relationships.” Harvard Business School, ahead of print, May 21, 2025. https://doi.org/10.2139/ssrn.4976449.

De Freitas, Julian, Zeliha Oğuz-Uğuralp, Ahmet Kaan Uğuralp, and Stefano Puntoni. “AI Companions Reduce Loneliness.” Journal of Consumer Research, ahead of print, June 25, 2025. https://doi.org/10.1093/jcr/ucaf040.

Dupré, Maggie Harrison. “Microsoft Executive Says AI Is a ‘New Kind of Digital Species.’” Futurism, April 30, 2024. https://futurism.com/microsoft-executive-ai-digital-species.

Epstein, Greg M. Tech Agnostic: How Technology Became the World’s Most Powerful Religion, and Why It Desperately Needs a Reformation. The MIT Press, 2024. https://doi.org/10.7551/mitpress/14731.001.0001.

Kuyda, Eugenia. “Can AI Companions Help Heal Loneliness?” TED. San Francisco, CA, January 17, 2025. https://www.youtube.com/watch?v=-w4JrIxFZRA.

Fang, Cathy Mengying, Auren R. Liu, Valdemar Danry, et al. “How AI and Human Behaviors Shape Psychosocial Effects of Chatbot Use: A Longitudinal Randomized Controlled Study.” Preprint, March 21, 2025. https://doi.org/10.48550/arXiv.2503.17473.

De Freitas, Julian, Zeliha Oğuz-Uğuralp, and Ahmet Kaan Uğuralp. “Emotional Manipulation by AI Companions.” Harvard Business School, ahead of print, October 7, 2025. https://doi.org/10.48550/arXiv.2508.19258.

Guingrich, Rose E., and Michael S. A. Graziano. “Ascribing Consciousness to Artificial Intelligence: Human-AI Interaction and Its Carry-over Effects on Human-Human Interaction.” Frontiers in Psychology 15 (March 2024). https://doi.org/10.3389/fpsyg.2024.1322781.

Guingrich, Rose E., and Michael S. A. Graziano. “Chatbots as Social Companions: How People Perceive Consciousness, Human Likeness, and Social Health Benefits in Machines.” In Oxford Intersections: AI in Society, edited by Philipp Hacker. Oxford University Press, 2025. https://doi.org/10.1093/9780198945215.003.0011.

Hill, Kashmir. “A Teen Was Suicidal. ChatGPT Was the Friend He Confided In.” Technology. The New York Times, August 26, 2025. https://www.nytimes.com/2025/08/26/technology/chatgpt-openai-suicide.html.

Hill, Kashmir. “They Asked an A.I. Chatbot Questions. The Answers Sent Them Spiraling.” Technology. The New York Times, June 13, 2025. https://www.nytimes.com/2025/06/13/technology/chatgpt-ai-chatbots-conspiracies.html.

Hill, Kashmir, and Dylan Freedman. “Chatbots Can Go Into a Delusional Spiral. Here’s How It Happens.” Technology. The New York Times, August 8, 2025. https://www.nytimes.com/2025/08/08/technology/ai-chatbots-delusions-chatgpt.html.

Holt-Lunstad, Julianne. “Social Connection as a Critical Factor for Mental and Physical Health: Evidence, Trends, Challenges, and Future Implications.” World Psychiatry 23, no. 3 (2024): 312–32. https://doi.org/10.1002/wps.21224.

Horwitz, Jeff. “A Flirty Meta AI Bot Invited a Retiree to Meet. He Never Made It Home.” Reuters, August 14, 2025. https://www.reuters.com/investigates/special-report/meta-ai-chatbot-death/.

Horwitz, Jeff. “Meta’s AI Rules Have Let Bots Hold ‘Sensual’ Chats with Children.” Reuters, August 14, 2025. https://www.reuters.com/investigates/special-report/meta-ai-chatbot-guidelines/.

Ivey, Ronald, Jonathan Teubner, Nathanael Fast, and Ravi Iyer. “Designing AI to Help Children Flourish.” Global Solutions Journal, no. 11 (2025): 12–23. https://doi.org/10.2139/ssrn.5179894.

Jackson, Lauren. “Finding God in the App Store.” The New York Times, September 14, 2025. https://www.nytimes.com/2025/09/14/us/chatbot-god.html.

Jargon, Julie. “He Had Dangerous Delusions. ChatGPT Admitted It Made Them Worse.” Technology. The Wall Street Journal, July 20, 2025. https://www.wsj.com/tech/ai/chatgpt-chatbot-psychology-manic-episodes-57452d14.

Jargon, Julie. “OpenAI Is Updating ChatGPT to Better Support Users in Mental Distress.” Technology. The Wall Street Journal, August 27, 2025. https://www.wsj.com/tech/ai/openai-to-update-chatgpt-to-better-support-users-exhibiting-mental-distress-98772bf5.

Kaczor, Christopher. How to Be Happy: Meaning, Faith, and the Science of Happiness. Word on Fire, 2023.

Kurian, Nomisha. “‘No, Alexa, No!’: Designing Child-Safe AI and Protecting Children from the Risks of the ‘Empathy Gap’ in Large Language Models.” Learning, Media and Technology, July 10, 2024, 1–14. https://doi.org/10.1080/17439884.2024.2367052.

Kurzweil, Amy, and Daniel Story. “Are Chatbots of the Dead a Brilliant Idea or a Terrible One?” Aeon, February 21, 2025. https://aeon.co/essays/are-chatbots-of-the-dead-a-brilliant-idea-or-a-terrible-one.

Leo-Liu, Jindong. “Loving a ‘Defiant’ AI Companion? The Gender Performance and Ethics of Social Exchange Robots in Simulated Intimate Interactions.” Computers in Human Behavior 141 (April 2023): 107620. https://doi.org/10.1016/j.chb.2022.107620.

McBain, Ryan K., Jonathan H. Cantor, Li Ang Zhang, et al. “Competency of Large Language Models in Evaluating Appropriate Responses to Suicidal Ideation: Comparative Study.” Journal of Medical Internet Research 27 (March 2025): e67891. https://doi.org/10.2196/67891.

Office of the Surgeon General. Our Epidemic of Loneliness and Isolation: The U.S. Surgeon General’s Advisory on the Healing Effects of Social Connection and Community. US Department of Health and Human Services, 2023. http://www.ncbi.nlm.nih.gov/books/NBK595227/.

O’Gieblyn, Meghan. God, Human, Animal, Machine: Technology, Metaphor, and the Search for Meaning. Anchor Books, 2021.

OpenAI. “Helping People When They Need It Most.” OpenAI, August 26, 2025. https://openai.com/index/helping-people-when-they-need-it-most/.

Pierre, Joe. “Deification as a Risk Factor for AI-Associated Psychosis.” Psychology Today, August 12, 2025. https://www.psychologytoday.com/us/blog/psych-unseen/202507/deification-as-a-risk-factor-for-ai-associated-psychosis.

Pierre, Joe. “Why Is AI-Associated Psychosis Happening and Who’s at Risk?” Psychology Today, August 22, 2025. https://www.psychologytoday.com/us/blog/psych-unseen/202508/why-is-ai-associated-psychosis-happening-and-whos-at-risk.

Ramelow, Anselm. “Technology and Our Relationship with God.” Nova et Vetera 22, no. 1 (2024): 159–86. https://doi.org/10.1353/nov.2024.a919270.

Reiley, Laura. “What My Daughter Told ChatGPT Before She Took Her Life.” Opinion. The New York Times, August 18, 2025. https://www.nytimes.com/2025/08/18/opinion/chat-gpt-mental-health-suicide.html.

Roose, Kevin. “Can A.I. Be Blamed for a Teen’s Suicide?” Technology. The New York Times, October 23, 2024. https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html.

Samuel, Kim. On Belonging: Finding Connection in an Age of Isolation. Harry N. Abrams, 2022.

Samuel, Sigal. “Silicon Valley’s Vision for AI? It’s Religion, Repackaged.” Vox, July 10, 2023. https://www.vox.com/the-highlight/23779413/silicon-valleys-ai-religion-transhumanism-longtermism-ea.

Sanford, John. “Why AI Companions and Young People Can Make for a Dangerous Mix.” Stanford Medicine, August 27, 2025. https://med.stanford.edu/news/insights/2025/08/ai-chatbots-kids-teens-artificial-intelligence.html.

Shank, Daniel B., Mayu Koike, and Steve Loughnan. “Artificial Intimacy: Ethical Issues of AI Romance.” Trends in Cognitive Sciences 29, no. 6 (2025): 499–501. https://doi.org/10.1016/j.tics.2025.02.007.

Singler, Beth. Religion and Artificial Intelligence: An Introduction. Routledge, 2024. https://doi.org/10.4324/9781003256113.

Singler, Beth. “Why Is the Language of Transhumanists and Religion so Similar?” Aeon, June 13, 2017. https://aeon.co/essays/why-is-the-language-of-transhumanists-and-religion-so-similar.

Staff Writer. “Time Spent in Nature Can Boost Physical and Mental Well-Being.” Harvard T.H. Chan School of Public Health, January 2, 2024. https://hsph.harvard.edu/news/time-spent-in-nature-can-boost-physical-and-mental-well-being/.

Suleyman, Mustafa. “We Must Build AI for People; Not to Be a Person.” Mustafa Suleyman, August 19, 2025. https://mustafa-suleyman.ai/seemingly-conscious-ai-is-coming.

Thompson, Derek. “The Looming Social Crisis of AI Friends and Chatbot Therapists.” Derek Thompson, July 9, 2025. https://www.derekthompson.org/p/ai-will-create-a-social-crisis-long.

Zeff, Maxwell. “Meta Updates Chatbot Rules to Avoid Inappropriate Topics with Teen Users.” TechCrunch, August 29, 2025. https://techcrunch.com/2025/08/29/meta-updates-chatbot-rules-to-avoid-inappropriate-topics-with-teen-users/.

Zhang, Renwen, Han Li, Han Meng, Jinyuan Zhan, Hongyuan Gan, and Yi-Chieh Lee. “The Dark Side of AI Companionship: A Taxonomy of Harmful Algorithmic Behaviors in Human-AI Relationships.” Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (New York), CHI ’25, Association for Computing Machinery, April 25, 2025, 1–17. https://doi.org/10.1145/3706598.3713429.

Published

2025-11-23

How to Cite

The Quest for Connection in AI Companions. (2025). Journal of Ethics and Emerging Technologies, 35(1), 1-20. https://doi.org/10.55613/jeet.v35i1.202
