Artificial Intelligence (AI) and Virtual Companionship: A Psychological Review and Future Directions
DOI: https://doi.org/10.54097/qmkntc12

Keywords: Artificial Intelligence, Virtual Companionship, Chatbots, Human–Computer Interaction, Psychology

Abstract
As generative artificial intelligence (AI) chatbots such as Replika and Character.ai have grown more technologically sophisticated, increasing numbers of individuals are forming virtual companionships with artificial agents. While virtual AI companions can help reduce loneliness, improve well-being, and provide emotional support, ongoing debate surrounds the ethical risks of using AI as a companion, including data privacy, overdependence, and the commodification of relationships. This review investigates the AI companion phenomenon, examines both its positive and negative psychological implications, and analyses existing research gaps along with possible future directions. It aims to provide a psychological perspective for understanding companionship and relationships with AI systems, thereby advancing interdisciplinary study of virtual companions and offering a reference point for social and ethical criteria in managing companion AI chatbots. The review finds that AI companions can simultaneously fulfil emotional and social needs while introducing new forms of psychological vulnerability, underscoring the importance of responsible design and further regulation in future AI development.
Copyright (c) 2025 Academic Journal of Science and Technology

This work is licensed under a Creative Commons Attribution 4.0 International License.