ECREA

European Communication Research
and Education Association


AI as a companion – AI chatbots in the daily lives of young people and in educational contexts

29.12.2025 10:49 | Anonymous member (Administrator)

Deadline: January 12, 2026

Submit abstract HERE.

Supervising editors: Astrid Carolus (Julius-Maximilians-University Würzburg), Julian Ernst (Justus-Liebig-University Giessen) and the Merzwissenschaft editorial team (jff)

Artificial Intelligence (AI) has long been a ubiquitous topic, clearly present and intensively debated in science, education, economics and everyday culture. AI plays a central role in the daily lives of young people in particular. In the latest SINUS study, with data from 2024, only two percent of 14- to 17-year-olds indicated that they had never before heard the term AI, while 71 percent said they knew AI and were able to describe it. Almost one third of the young people indicated that they used AI on a daily or regular basis. The JIM study for 2025 shows that ChatGPT is also by far the most popular application among those surveyed in the twelve- to 19-year-old age segment (mpfs, 2025, p. 62 sqq.). AI is primarily used for learning purposes and for homework, to search for information and to figure out how a given thing works. Around one half of those surveyed indicated that they use AI in class. A total of approximately three quarters of the 14- to 17-year-olds use AI applications for school purposes on a weekly basis. Around 60 percent use AI for private tasks – for example composing personal texts or for fun (ifo Education Survey, 2025). On the whole, young people consider AI to be for the most part positive: remarkably helpful, useful, convenient and fun (Wendt et al., 2024). One half of 16- to 29-year-olds could imagine giving precedence to an AI chatbot over friends or family when seeking advice; one fifth of this group can even imagine establishing a friendship with an AI chatbot (BITKOM, 2025). In the USA, one third of the youth population use AI companions for social interaction and relationships, for example for emotional support, role playing or for friendship-based or romantic interactions (Common Sense Media, 2025).
A common feature shared by all of these fragmentary empirical examinations of the use and relevance of AI chatbots (and of other AI applications) in the everyday lives of young people is the conceptualization of the relationship between AI and the (young) users as an "in order to" relationship: a relationship meant to result either in the technical automation and offloading of certain tasks or in a specific output. This instrumental analysis overlooks a central aspect of the user's experience with AI chatbots, one already described in connection with the early precursors of these technologies: the interaction with and experience of AI as a companion. Joseph Weizenbaum's program ELIZA is regarded as one of the first chatbots. Based on Carl Rogers' client-centered psychotherapy, ELIZA was developed explicitly without formulated therapeutic objectives. Instead, for pragmatic reasons Weizenbaum chose the setting as "one of the few examples of categorized dyadic natural language communication" (Weizenbaum, 1966, p. 42), which in technical terms was comparatively easy to realize. However, experiments showed that users quickly began to confide in the program and to recognize in ELIZA a counterpart to which they attributed understanding, empathy and intentions (Weizenbaum, 1976).

The ELIZA example illustrates that the possible uses of AI chatbots and other technologies do not follow only the intentions of their developers. The affordances of these technologies manifest in the relationships between people and machines (Davis, 2020). Like other "spectacular machines" (Strassberg, 2022), AI chatbots are characterized by a "multistability" (Verbeek, 2005; Ihde, 1990) that entails the potential for quasi-social interaction with them and – over the course of time – for establishing quasi-social relationships with them. The continuous use facilitated by permanently available mobile devices exhibits parallels to Hinde's definition of a relationship as "a series of interactions between two individuals known to each other", one that includes "behavioural, cognitive, and affective (or emotional) aspects" (Hinde, 1979, quoted in Vangelisti & Perlman, 2018, p. 3). Empirical research has shown, for example, that people develop some properties of social relationships with their smartphones, such as presence and trust (Carolus et al., 2019).

The arrival of Large Language Models (LLMs) fundamentally changed the technological basis of human interaction with technology. Earlier systems like Amazon Alexa or Google Home offered only restricted social context cues and made it practically impossible for the illusion of a human counterpart to arise. ChatGPT constituted a change: users found it nearly impossible to tell GPT-4 from a human (Jones et al., 2025), a result that effectively satisfies the Turing test, i.e. the procedure that tests whether a machine can imitate human communication so convincingly that it can no longer be distinguished from communication with another human. More recent developments go even further, leading to increasingly autonomous AI systems. While past AI chatbots responded to instructions reactively, today's AI agents exhibit an increasing degree of proactive behavior, pursue their own objectives and make decisions without direct input.

From media-educational and media-psychological perspectives, this increasing interactivity and autonomy goes hand in hand with a quantitative increase in, and a higher level of differentiation of, social context cues, and thus with a growing potential for social affordances. Consequently, questions arise regarding the social imputations, short-term social interactions and long-term relationships which young users in particular enter into with these systems. In addition, AI chatbots are gaining in importance as a part of pedagogical practice: teaching staff, school social work and counselling services are making increasing use of generative AI in preparing lessons, structuring counselling processes and supporting organizational procedures (Hein et al., 2024; Linnemann et al., 2025, among others). This can result in the entanglement of the quasi-social relationships young people have with chatbots and institutional educational settings in which educational specialists themselves work with AI-assisted systems. This in turn raises new questions relating to professionalization, responsibility and the limits of using AI in educational processes.

We look forward to receiving submissions which explore the various aspects of the quasi-social relationships between AI chatbots and young people as well as the implications of these relationships in various pedagogical contexts. We welcome theoretical-conceptual contributions as well as empirical submissions from media education, media psychology, social work, media sociology, communication science and the field of Human-Computer Interaction (HCI). The submissions should focus on the following questions, among others:

  • What forms of social interaction between young people and AI chatbots can be described?
  • What forms of social relationships do young people enter into with AI chatbots? What role do emotions play in this process?
  • What are the impacts on social relationships between people when young people entrust more to a chatbot than, for example, to a friend?
  • How do young people understand interaction with chatbots, and to what extent does the interaction with chatbots change their understanding of human relationships and the expectations of quality they place on these human relationships?
  • How can human-chatbot interactions and human-chatbot relationships be empirically captured, described and analyzed? What are the corresponding methodological points of access?
  • How do inter-individual differences influence the shaping of encounters with AI chatbots?
  • What new developmental tasks arise for young people in the context of quasi-social interactions and relationships with AI chatbots?
  • What new (media) skill requirements can be formulated in the context of quasi-social interactions and relationships with chatbots? How can these requirements be addressed?
  • What possible (media) educational approaches are there to addressing quasi-social relationships between young people and AI chatbots?
  • To what extent do affect and emotion play a role in interactions between young people and AI chatbots?
  • To what extent should the social character of chatbot use be reflected upon when, for example, AI chatbots are used as learning aids in schools?
  • How does the interaction of young people with AI chatbots impact the design of AI chatbots?
  • To what extent do pedagogical practice and profession change in the context of the use of AI chatbots in educational work?
  • What new media skill requirements arise for educational specialists when AI chatbots are integrated in educational work in schools, youth welfare and counselling?
  • How significant is the interface in interaction with AI chatbots (text-based input/output vs. spoken input/output)?
  • To what extent is it relevant that conventional systems have been programmed as assistants and fundamentally structured to support users? To what extent do contradictions and criticism arise in interaction with AI chatbots?

The deadline for submission of abstracts of no more than 6,000 characters (including spaces) is 12 January 2026. Please upload your abstract at https://merz-zeitschrift.de/fuerautorinnen. Submissions should follow the layout specifications of the merzWissenschaft style guide, available at https://merz-zeitschrift.de/manuskriptrichtlinien. Journal articles should not exceed approximately 35,000 characters (including spaces and references). Please feel free to direct any questions to the merz editorial team, tel.: +49 89 68989 120, e-mail: merz@jff.de

SUMMARY OF DEADLINE

  • 12 January 2026: Submission deadline for abstracts
  • 2 February 2026: Decision on acceptance/rejection of abstracts
  • 18 May 2026: Submission deadline for articles
  • May/June 2026: Evaluation period (double-blind peer review)
  • June/July 2026: Revision phase (when necessary, multi-phase)
  • End of November 2026: Publication of merzWissenschaft 2026

Literature

Bitkom. (2025). Junge Menschen und Künstliche Intelligenz: Einstellungen, Nutzung und Erwartungen. Bitkom e. V. https://www.bitkom.org/Presse/Presseinformation/Freundschaft-KI-Sprachassistent

Carolus, A., Muench, R., Schmidt, C. & Schneider, F. (2019). Impertinent mobiles – effects of politeness and impoliteness in human-smartphone interaction. Computers in Human Behavior, 93, 290–300.

Common Sense Media (2025). Talk, Trust, and Trade-Offs: How and Why Teens Use AI Companions. https://www.commonsensemedia.org/sites/default/files/research/report/talk-trust-and-trade-offs_2025_web.pdf

Davis, J. L. (2020). How Artifacts Afford. The Power and Politics of Everyday Things. MIT Press.

Hein, L., Högemann, M., Illgen, K.-M., Stattkus, D., Kochon, E., Reibold, M.-G., Eckle, J., Seiwert, L., Beinke, J. H., Knopf, J. & Thomas, O. (2024). ChatGPT als Unterstützung von Lehrkräften – Einordnung, Analyse und Anwendungsbeispiele. HMD Praxis der Wirtschaftsinformatik, 61, 449–470.

ifo Institut – Leibniz-Institut für Wirtschaftsforschung an der Universität München e. V. (2025). ifo-Bildungsbarometer 2025. https://www.ifo.de/DocDL/sd-2025-09-wedel-etal-ifo-bildungsbarometer-2025.pdf

Ihde, D. (1990). Technology and the Lifeworld: From Garden to Earth. Indiana University Press.

Jones, C. R., Rathi, I., Taylor, S. & Bergen, B. K. (2025). People cannot distinguish GPT-4 from a human in a Turing test. Proceedings of the 2025 ACM Conference on Fairness, Accountability, and Transparency, 1615–1639.

Linnemann, G., Löhe, J. & Rottkemper, B. (Eds.) (2025). Künstliche Intelligenz in der Sozialen Arbeit: Grundlagen für Theorie und Praxis. Beltz.

Medienpädagogischer Forschungsverbund Südwest (mpfs) (2025). JIM 2025. Jugend, Information, Medien. Basisuntersuchung zum Medienumgang 12- bis 19-Jähriger in Deutschland. https://mpfs.de/studie/jim-studie-2025

Strassberg, D. (2022). Spektakuläre Maschinen. Eine Affektgeschichte der Technik. Matthes & Seitz.

Vangelisti, A. L. & Perlman, D. (Eds.) (2018). The Cambridge handbook of personal relationships. Cambridge University Press.

Verbeek, P.-P. (2005). What Things Do. Philosophical Reflections on Technology, Agency, and Design. Pennsylvania State University Press.

Weizenbaum, J. (1966). ELIZA – A Computer Program For the Study of Natural Language Communication Between Man and Machine. Communications of the ACM, 9(1).

Weizenbaum, J. (1976). Computer power and human reason: From judgment to calculation. W. H. Freeman & Co.

Wendt, R., Riesmeyer, C., Leonhard, L., Hagner, J. & Kühn, J. (2024). Algorithmen und Künstliche Intelligenz im Alltag von Jugendlichen: Forschungsbericht für die Bayerische Landeszentrale für neue Medien (BLM). Nomos.
