Support for deepfake regulation: The role of third-person perception, trust, and risk
Bibliographic information

SCM Studies in Communication and Media
Volume 14 (2025), Edition 4
- Publisher
- Nomos, Baden-Baden
- Copyright year
- 2026
- ISSN-Online
- 2192-4007
- ISSN-Print
- 2192-4007
Abstract
Like other emerging technologies, deepfakes present both risks and benefits to society. Due to harmful applications such as disinformation and non-consensual pornography, calls for their regulation have increased recently. However, little is known about public support for deepfake regulation and the factors related to it. This study addresses this gap through a pre-registered online survey (n = 1,361) conducted in Switzerland, where citizens can influence political regulation through direct democratic instruments, such as referendums. Our findings reveal a strong third-person perception, as people believe that deepfakes affect others more than themselves (Cohen’s d = 0.77). This presumed effect on others is a weak but significant predictor of support for regulation (β = 0.07). We do not, however, find evidence for the second-person effect – the idea that individuals who perceive deepfakes as highly influential on both themselves and others are more likely to support regulation. An exploratory analysis nevertheless indicates a potential second-person effect among women, who are specifically affected by deepfakes; this result must be further explored and replicated. Additionally, we find that higher perceived risk and greater trust in institutions are positively associated with support for deepfake regulation.
Bibliography
- Ahmed, S. (2023). Examining public perception and cognitive biases in the presumed influence of deepfakes threat: Empirical evidence of third person perception from three studies. Asian Journal of Communication, 33(3), 308–331. https://doi.org/10.1080/01292986.2023.2194886
- Altay, S., & Acerbi, A. (2024). People believe misinformation is a threat because they assume others are gullible. New Media & Society, 26(11), 6440–6461. https://doi.org/10.1177/14614448231153379
- Baek, Y. M., Kang, H., & Kim, S. (2019). Fake news should be regulated because it influences both “others” and “me”: How and why the influence of presumed influence model should be extended. Mass Communication and Society, 22(3), 301–323. https://doi.org/10.1080/15205436.2018.1562076
- Bendahan Bitton, D. B., Hoffmann, C. P., & Godulla, A. (2024). Deepfakes in the context of AI inequalities: Analysing disparities in knowledge and attitudes. Information, Communication & Society, 295–315. https://doi.org/10.1080/1369118X.2024.2420037
- Birrer, A., & Just, N. (2024). What we know and don’t know about deepfakes: An investigation into the state of the research and regulatory landscape. New Media & Society. Advance online publication. https://doi.org/10.1177/14614448241253138
- Chen, M., Yu, W., & Liu, K. (2023). A meta-analysis of third-person perception related to distorted information: Synthesizing the effect, antecedents, and consequences. Information Processing & Management, 60(5). https://doi.org/10.1016/j.ipm.2023.103425
- Chung, M., & Wihbey, J. (2024). Social media regulation, third-person effect, and public views: A comparative study of the United States, the United Kingdom, South Korea, and Mexico. New Media & Society, 26(8), 4534–4553. https://doi.org/10.1177/14614448221122996
- Corbu, N., Oprea, D.-A., Negrea-Busuioc, E., & Radu, L. (2020). ‘They can’t fool me, but they can fool the others!’ Third person effect and fake news detection. European Journal of Communication, 35(2), 165–180. https://doi.org/10.1177/0267323120903686
- Davison, W. P. (1983). The third-person effect in communication. Public Opinion Quarterly, 47(1), 1–15. https://doi.org/10.1086/268763
- de Ruiter, A. (2021). The distinct wrong of deepfakes. Philosophy & Technology, 34(4), 1311–1332. https://doi.org/10.1007/s13347-021-00459-2
- Gardner, G. T., & Gould, L. C. (1989). Public perceptions of the risks and benefits of technology. Risk Analysis, 9(2), 225–242. https://doi.org/10.1111/j.1539-6924.1989.tb01243.x
- Godulla, A., Hoffmann, C. P., & Seibert, D. (2021). Dealing with deepfakes – An interdisciplinary examination of the state of research and implications for communication studies. SCM Studies in Communication and Media, 10(1), 72–96. https://doi.org/10.5771/2192-4007-2021-1-72
- Gosse, C., & Burkell, J. (2020). Politics and porn: How news media characterizes problems presented by deepfakes. Critical Studies in Media Communication, 37(5), 497–511. https://doi.org/10.1080/15295036.2020.1832697
- Gunther, A. C., & Storey, J. D. (2003). The influence of presumed influence. Journal of Communication, 53(2), 199–215. https://doi.org/10.1111/j.1460-2466.2003.tb02586.x
- Hameleers, M., Van Der Meer, T. G. L. A., & Dobber, T. (2022). You won’t believe what they just said! The effects of political deepfakes embedded as vox populi on social media. Social Media + Society, 8(3). https://doi.org/10.1177/20563051221116346
- Jang, S. M., & Kim, J. K. (2018). Third person effects of fake news: Fake news regulation and media literacy interventions. Computers in Human Behavior, 80, 295–302. https://doi.org/10.1016/j.chb.2017.11.034
- Jungherr, A., & Rauchfleisch, A. (2024). Negative downstream effects of alarmist disinformation discourse: Evidence from the United States. Political Behavior, 46(4), 2123–2143. https://doi.org/10.1007/s11109-024-09911-3
- Jungherr, A., & Rauchfleisch, A. (in press). Public opinion on the politics of AI alignment: Cross-national evidence on expectations for AI moderation from Germany and the United States. Social Media + Society.
- Kalogeropoulos, A., Toff, B., & Fletcher, R. (2022). The watchdog press in the doghouse: A comparative study of attitudes about accountability journalism, trust in news, and news avoidance. The International Journal of Press/Politics, 29(2), 485–506. https://doi.org/10.1177/19401612221112572
- Karaboga, M., Frei, N., Puppis, M., Vogler, D., Raemy, P., Ebbers, F., Runge, G., Rauchfleisch, A., de Seta, G., Gurr, G., Friedewald, M., & Rovelli, S. (2024). Deepfakes und manipulierte Realitäten: Technologiefolgenabschätzung und Handlungsempfehlungen für die Schweiz [Deepfakes and manipulated realities: Technology impact assessment and policy recommendations for Switzerland]. vdf Hochschulverlag AG.
- Kim, M. (2025). A direct and indirect effect of third-person perception of COVID-19 fake news on support for fake news regulations on social media: Investigating the role of negative emotions and political views. Mass Communication and Society, 28(2), 229–252. https://doi.org/10.1080/15205436.2023.2227601
- Lima, M. L., Barnett, J., & Vala, J. (2005). Risk perception and technological development at a societal level. Risk Analysis, 25(5), 1229–1239. https://doi.org/10.1111/j.1539-6924.2005.00664.x
- Liu, P. L., & Huang, L. V. (2020). Digital disinformation about COVID-19 and the third-person effect: Examining the channel differences and negative emotional outcomes. Cyberpsychology, Behavior, and Social Networking, 23(11), 789–793. https://doi.org/10.1089/cyber.2020.0363
- Marien, S., & Hooghe, M. (2011). Does political trust matter? An empirical investigation into the relation between political trust and support for law compliance. European Journal of Political Research, 50(2), 267–291. https://doi.org/10.1111/j.1475-6765.2010.01930.x
- Nguyen, D. (2023). How news media frame data risks in their coverage of big data and AI. Internet Policy Review, 12(2). https://policyreview.info/articles/analysis/how-news-media-frame-data-risks-big-data-and-ai
- Paradise, A., & Sullivan, M. (2012). (In)visible threats? The third-person effect in perceptions of the influence of Facebook. Cyberpsychology, Behavior, and Social Networking, 15(1), 55–60. https://doi.org/10.1089/cyber.2011.0054
- PytlikZillig, L. M., Kimbrough, C. D., Shockley, E., Neal, T. M. S., Herian, M. N., Hamm, J. A., Bornstein, B. H., & Tomkins, A. J. (2017). A longitudinal and experimental study of the impact of knowledge on the bases of institutional trust. PLOS ONE, 12(4). https://doi.org/10.1371/journal.pone.0175387
- Rauchfleisch, A., Vogler, D., & de Seta, G. (2025). Deepfakes or synthetic media? The effect of euphemisms for labeling technology on risk and benefit perceptions. Social Media + Society. https://doi.org/10.1177/20563051251350975
- Riedl, M. J., Whipple, K. N., & Wallace, R. (2022). Antecedents of support for social media content moderation and platform regulation: The role of presumed effects on self and others. Information, Communication & Society, 25(11), 1632–1649. https://doi.org/10.1080/1369118X.2021.1874040
- Six, F. (2013). Trust in regulatory relations: How new insights from trust research improve regulation theory. Public Management Review, 15(2), 163–185. https://doi.org/10.1080/14719037.2012.727461
- Slovic, P., Fischhoff, B., & Lichtenstein, S. (1982). Why study risk perception? Risk Analysis, 2(2), 83–93. https://doi.org/10.1111/j.1539-6924.1982.tb01369.x
- Swissinfo.ch. (2025, May 9). Switzerland rejects deepfake regulation. https://www.swissinfo.ch/eng/ai-governance/switzerland-rejects-deepfake-regulation/89277391
- Thouvenin, F., Eisenegger, M., Volz, S., Vogler, D., & Jaffé, M. (2023). Governance von Desinformation in digitalisierten Öffentlichkeiten. Bericht für das Bundesamt für Kommunikation (BAKOM) [Governance of disinformation in digitalized publics. Report for the Federal Office of Communication]. https://www.bakom.admin.ch/bakom/de/home/elektronische-medien/studien/einzelstudien.html
- Vaccari, C., & Chadwick, A. (2020). Deepfakes and disinformation: Exploring the impact of synthetic political video on deception, uncertainty, and trust in news. Social Media + Society, 6(1). https://doi.org/10.1177/2056305120903408
- Verhoest, K., Redert, B., Maggetti, M., Levi-Faur, D., & Jordana, J. (2025). Trust and regulation. In F. Six, J. A. Hamm, D. Latusek, E. V. Zimmeren, & K. Verhoest (Eds.), Handbook on Trust in Public Governance (pp. 360–380). Edward Elgar Publishing. https://doi.org/10.4337/9781802201406.00030
- Wang, S., & Kim, S. (2022). Users’ emotional and behavioral responses to deepfake videos of K-pop idols. Computers in Human Behavior, 134. https://doi.org/10.1016/j.chb.2022.107305
- Wolf, C. (2021). Public trust and biotech innovation: A theory of trustworthy regulation of (scary!) technology. Social Philosophy and Policy, 38(2), 29–49. https://doi.org/10.1017/S0265052522000036
- Yadlin-Segal, A., & Oppenheim, Y. (2021). Whose dystopia is it anyway? Deepfakes and social media regulation. Convergence: The International Journal of Research into New Media Technologies, 27(1), 36–51. https://doi.org/10.1177/1354856520923963
- Yu, E., Song, H., Jung, J., & Kim, Y. J. (2023). Perception and attitude toward the regulation of online video streaming (in South Korea). Online Media and Global Communication, 2(4), 651–679. https://doi.org/10.1515/omgc-2023-0059