The age of synthetic media: Perspectives from communication and media studies

29.11.2024

Studies in Communication and Media (Special issue)

Deadline: April 1, 2025

Guest Editors: Alexander Godulla (University of Leipzig) and Christian Pieter Hoffmann (University of Leipzig)

Ever since a Reddit user called “Deepfake” created a forum for publishing pornographic content based on deep learning technologies, synthetic media have attracted increasing interest in research and practice (Godulla et al., 2021). Deep learning technologies enable users to depict individuals in scenarios that never happened and have them say anything imaginable (Citron & Chesney, 2019; Vaccari & Chadwick, 2020). Rapid advances in these technologies mean that synthetic media are increasingly entering new social domains such as entertainment, education, journalism, and politics (Seibert, 2023).

To date, research has focused primarily on the concept of deepfakes, while the term synthetic media has only recently gained popularity. Although both terms refer to the use of deep learning technologies in the creation of media content, the term “synthetic media” might be more suitable when discussing the benefits of synthetically generated content (e.g., WDR Innovation Hub, 2021), as the term “deepfake” carries connotations of fake news and, thus, misinformation (Altuncu et al., 2022; Dan et al., 2021; Weikmann & Lecheler, 2023). Research into deepfakes is currently dominated by studies in computer science, focusing on the development of tools for the automatic detection of deepfakes. In addition, studies in the field of law discuss legal frameworks to combat the harmful effects of the novel technology (Godulla et al., 2021). Thus far, studies in the social sciences have mostly focused on the implications of deepfakes for audiences (e.g. Dobber et al., 2020; Hameleers et al., 2024; Vaccari & Chadwick, 2020). Initial findings suggest that audiences have difficulty identifying deepfakes as such (Bray et al., 2023; Thaw et al., 2020) and that the mere awareness of the existence of deepfakes can create a sense of uncertainty, skepticism, and even distrust towards online news and media in general (Ternovski et al., 2022; Vaccari & Chadwick, 2020; Hameleers & Marquart, 2023). From the audience's perspective, deepfakes and synthetic media increasingly blur the boundaries between reality and fiction (Bendahan Bitton et al., 2024).

The interdisciplinary nature of research into deepfakes and synthetic media is partly due to the technology’s diverse fields of application. However, research on the emergent technology from the perspective of communication and media studies is still in its infancy.

Therefore, the upcoming special issue of SCM aims to examine deepfakes and synthetic media specifically from the perspective of communication and media studies. We welcome qualitative and quantitative as well as theoretical and methodological contributions addressing the challenges that the public, organizations and institutions, and individual recipients face in dealing with synthetic media and deepfakes. We define synthetic media as media content created using deep learning technologies, with a wide range of potential applications such as education, entertainment, journalism, or advertising. In contrast, we define “deepfakes” as a specific application of synthetic media that primarily serves harmful purposes such as disinformation.

Synthetic media can be used to generate audiovisual recordings for corporate or organizational communication. Further, synthetic media hold the potential to create and enhance journalistic content, for example by illustrating real events or making the reception of news content more engaging through new forms of personalization (e.g. synthetic news anchors). Finally, synthetic media can be used to create entertaining and satirical content, which can, however, mislead audiences if labelling or background information is lacking. Deepfakes can be used to expose individuals to risks (e.g. by means of nonconsensual pornographic content) or to defame public actors and spread disinformation. Politically motivated deepfakes have the potential to influence political knowledge, attitudes, or even voting intentions, and thus to challenge democracy. The public, in turn, could be deceived and manipulated by deepfakes if people lack the digital skills needed to recognize them. The continuous improvement in the quality of deepfakes makes it increasingly difficult to determine the veracity of media content. Consequently, journalists and influencers could fall for a deepfake and accidentally share it with their audience.

Individual submissions could cover, but are not limited to, the following perspectives (or a combination thereof):

  • Media Reception and Effects: How do synthetic media influence recipients' trust in media content? How do they affect recipients' attention and entertainment? What dispositions and boundary conditions influence these relationships? What interventions can reduce deepfake misinformation effects?
  • Political Communication: What role do political deepfakes play in the context of elections? What persuasive effects do they have on voters? How are deepfakes employed in the context of political disinformation (e.g. Ukraine war)? To what extent are synthetic media used in the context of political campaigning?
  • Journalism Studies: To what extent can standards of journalistic work be reconciled with the use of synthetic media? What specific labels should be introduced for synthetic media to ensure transparency for audiences? What skills do journalists need to deal with deepfakes?
  • Visual Communication: To what extent do the persuasiveness and credibility of audiovisual deepfakes differ from text-based content? Which factors favor or impede the credibility of audiovisual deepfakes (e.g. plausibility, background knowledge, attitude, psychological factors)? How do synthetic media and deepfakes change the definition and perception of authenticity of visual content?
  • Media Education: What skills do audiences need to develop to critically question and recognize deepfakes and synthetic media? How can children and young people be protected from negative applications of deepfakes?
  • Media Ethics: To what extent can generated content be used to depict real events? What ethical aspects should be considered when using synthetic media for the creation and distribution of audiovisual content, for example in the context of education or strategic communication?
  • Media Law: What legal framework could prevent the misuse of deepfake technologies without unduly restricting the creative use of synthetic media and freedom of expression? What legal protections of personal rights and user privacy apply in connection with deepfakes and synthetic media? To what extent can the use of synthetic content depicting deceased individuals be justified?
  • Communication History: How can deepfakes be placed in historical contexts of media manipulation (e.g. Photoshop) and propaganda? What role do the negative effects of this new technology on audience trust play in the broader history and development of audiovisual media?

Submission Instructions

SCM is an Open Access Journal of the German Communication Association (DGPuK) and an Affiliate Journal of the International Communication Association (ICA). Accepted papers will be published Open Access at no additional cost.

We invite submissions that fit any of the SCM formats: Extended Paper (50-60 pages), Full Paper (15-20 pages), and Research-in-Brief (5-10 pages). Manuscripts should be prepared in accordance with the SCM guidelines.

Manuscripts are to be submitted to christian.hoffmann@uni-leipzig.de.

The deadline for submissions is April 1, 2025. The special issue will be published in December 2025 (SCM issue 4/2025).
