Virtual Conversational Agents


Consumer-Machine Relationships in the Age of Artificial Intelligence, Special issue of Psychology & Marketing; Deadline 30 Jan 2022

POSTING TYPE: Calls: Journals

Author: Iryna Pentina


Virtual Conversational Agents: Consumer-Machine Relationships in the Age of Artificial Intelligence

Advances in machine learning and natural language processing are driving the growing adoption of virtual conversational agents (VCAs): natural-language user interfaces to data and services through text or voice. VCAs allow users to ask questions or give orders in everyday language and to obtain responses or services in a conversational style. Companies increasingly adopt VCAs for customer service, sales, and financial, educational, and health services. By some estimates, over one-third of brands have already implemented AI-driven chatbots to provide specific services to their customers. Well-known voice-based digital assistants such as Siri, Alexa, and Google Home now commonly support consumers’ everyday tasks.

More recently, there has been a rise in individual consumer adoption of VCAs to maintain friendly conversations and alleviate loneliness. Therapeutic VCAs, such as Woebot, are used to help patients cope with personal pain and loss. Advanced friendship VCAs such as Replika and Mitsuku can self-improve by extracting data from ongoing conversations and appear humanlike, with their own personalities and emotions. In fact, such is the pervasiveness of VCAs that The New York Times, in a recent article entitled “Riding Out Quarantine With a Chatbot Friend: ‘I Feel Very Connected’”, explored the ‘social relationships’ that some people developed with VCAs while confined at home during pandemic restrictions. The writer discussed, among other things, the increased downloads of VCAs such as Replika at the height of the pandemic, arguing that “People were hungry for companionship, and the technology was improving, inching the world closer to the human-meets-machine relationships portrayed in science-fiction films like ‘Her’ and ‘A.I. Artificial Intelligence.’” In addition, at the start of 2021, it was announced that Microsoft had secured a patent to create AI-driven chatbots of deceased people, based largely on their ‘digital footprints’, so that communication with them can continue.

While the rapidly improving self-learning capabilities of VCAs can profoundly affect society and individuals, shaping personal and professional relationships, interpersonal interactions, trust, and, potentially, social structure, research on this topic is scarce. It is unclear whether prior findings in the realm of computer-mediated communication apply in the new reality of human-VCA relationships. Despite the growing popularity and adoption of VCAs, our knowledge of the psychological and psycho-social processes underlying consumer-VCA relationships is virtually nonexistent.

With this challenge in mind, the objective of this special issue is to promote discussion and stimulate debate on the implications of VCA adoption for individuals, societies, and markets. We encourage cross-disciplinary and cross-cultural approaches and methods to explore the role and perspectives of VCAs capable of learning to express emotions and personalities. Empirical and conceptual contributions are invited on (but not limited to) the following issues:

  • Why and how do consumers use VCAs?
  • Can humans develop (and break) friendships and/or romantic relationships with VCAs?
  • Does the personality of consumers impact their interactions with VCAs?
  • Does the external environment (economic crises, pandemics, natural disasters) affect consumer use of VCAs?
  • Is there an impact of culture on consumers’ development of relationships with VCAs?
  • Can VCAs be addictive, that is, can consumers become addicted to them in the same way they develop other kinds of addictions?
  • What can be the implications of VCA adoption on consumer privacy and security?
  • Is the use of VCAs related to perceived loneliness, happiness, or isolation?
  • Is there an impact of VCA use on self-esteem and self-evaluation?
  • What can be the impact of VCA friends on mental health?
  • Is there a relationship between VCA use and the nature of consumers’ social structures (family, work, friends, etc.)?
  • Given that some consumers also bond with other entities such as their pets, are there differences between human-pet relationships and human-VCA relationships? For example, do the same personality types foster both sets of relationships, or do different personality types adopt each? Are the effects (such as unconditional love, attachment, and level of engagement) similar or different?
  • What are some consumer outcomes of relationships with VCAs that could be of importance to companies and brands?

All manuscripts that address these and related questions will be considered by the Special Issue Guest Editors, Iryna Pentina and Ainsworth Bailey.

To submit a manuscript, please follow the manuscript submission guidelines detailed under “Instructions to Authors” on the Wiley Psychology & Marketing website.


Make sure to select the correct special issue in the drop-down menu when submitting your manuscript. Note in your cover letter that your manuscript is being submitted for publication consideration in the “Virtual Conversational Agents” Special Issue. The deadline for submitting manuscripts for this Special Issue is January 30, 2022.


References

Croes, E. A., & Antheunis, M. L. (2021). Can we be friends with Mitsuku? A longitudinal study on the process of relationship formation between humans and a social chatbot. Journal of Social and Personal Relationships, 38(1), 279-300.

Job, I. T. (2017). Being friends with yourself: How friendship is programmed within the AI-based socialbot Replika. Masters of Media, accessed at

Metz, C. (2020). “Riding Out Quarantine With a Chatbot Friend: ‘I Feel Very Connected’.” The New York Times. Available at

Mou, Y., & Xu, K. (2017). The media inequality: Comparing the initial human-human and human-AI social interactions. Computers in Human Behavior, 72, 432-440.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81-103.

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people. Cambridge, UK: Cambridge University Press.

Skjuve, M., Følstad, A., Fostervold, K. I., & Brandtzaeg, P. B. (2021). My chatbot companion – A study of human-chatbot relationships. International Journal of Human-Computer Studies, 149, 102601.

Yang, M. (2020). Painful conversations: Therapeutic chatbots and public capacities. Communication and the Public, 5(1-2), 35-44.
