The Impact of Chatbot Communication Style and Service Remedies on Consumer Adoption Intention

Based on Service Failure Scenario

Authors

  • Rulan Lv
  • Siyi Chen

DOI:

https://doi.org/10.54097/e6vj0q02

Keywords:

Service Failure, Chatbots, Communication Style, Uniqueness Neglect, Service Failure Remediation

Abstract

Chatbots are developing rapidly, yet research to date has focused mainly on successful service encounters, overlooking the fact that chatbot service failure (i.e., making mistakes) is the norm. How to design chatbots that minimize the losses caused by service failure has therefore become an urgent problem for AI producers. Based on a chatbot service failure scenario, this paper investigates how two communication styles, social-oriented and task-oriented, affect consumers' adoption intention. We find that after a service failure, consumers' adoption intention is lower for social-oriented (vs. task-oriented) chatbots, with uniqueness neglect playing a mediating role. In addition, we explore the matching effects between three chatbot service remedies (humor vs. apology vs. compensation) and communication style (social-oriented vs. task-oriented). After a social-oriented chatbot's service failure, humor is a more effective remedy than compensation; after a task-oriented chatbot's service failure, compensation is more effective than humor. Apology is less effective than either humor or compensation in both scenarios. These conclusions enrich research on chatbots in the area of service failure and offer practical suggestions for merchants designing and deploying chatbots.

Published

21-03-2024

Section

Articles

How to Cite

Lv, R., & Chen, S. (2024). The Impact of Chatbot Communication Style and Service Remedies on Consumer Adoption Intention: Based on Service Failure Scenario. Frontiers in Business, Economics and Management, 14(1), 161-166. https://doi.org/10.54097/e6vj0q02