{"id":108481,"date":"2025-06-02T17:24:15","date_gmt":"2025-06-02T15:24:15","guid":{"rendered":"https:\/\/www.hiig.de\/?p=108481"},"modified":"2025-12-10T17:46:26","modified_gmt":"2025-12-10T16:46:26","slug":"unwillingly-naked-deepfake-pornography","status":"publish","type":"post","link":"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/","title":{"rendered":"Unwillingly naked: How deepfake pornography intensifies sexualised violence against women"},"content":{"rendered":"\n<p><strong>A seemingly innocent holiday photo becomes the template for a highly realistic nude image \u2013 generated by artificial intelligence (AI), circulated online, without the subject\u2019s knowledge or consent. What sounds like science fiction is already a disturbing reality: tens of thousands of so-called deepfake pornographic images are created daily using freely accessible AI tools. These are computer-generated images or videos that simulate nudity or sexual acts. Women are disproportionately affected. This is no coincidence: sexualised violence is deeply rooted in society and digital technologies are significantly amplifying this reality. This article explores how deepfake pornography digitally perpetuates existing structures of violence \u2013 and what must be done to offer better protection for those affected.<\/strong><\/p>\n\n\n\n<p>Deepfake pornography is not an isolated phenomenon. It is a particularly <a href=\"https:\/\/www.frauen-gegen-gewalt.de\/de\/aktuelles\/nachrichten\/nachricht\/pressemitteilung-bff-fordert-bildbasierte-gewalt-umfassend-bekaempfen-2.html\" target=\"_blank\" rel=\"noreferrer noopener\">insidious form of image-based sexual violence<\/a> (bff, 2024). This refers to digital assaults that use visual material to humiliate individuals or violate their sexual autonomy. 
Examples include <a href=\"https:\/\/polizei.nrw\/artikel\/upskirting-und-downblousing-ist-strafbar\" target=\"_blank\" rel=\"noreferrer noopener\">upskirting<\/a> (Polizei Nordrhein-Westfalen, 2021) \u2014 secretly photographing under skirts \u2014 and the non-consensual distribution of intimate images, often euphemistically referred to as \u201c<a href=\"https:\/\/www.frauen-gegen-gewalt.de\/de\/ueber-uns\/presse\/informationen-fuer-die-presse\/hinweise-fuer-die-berichterstattung-ueber-gewalt-gegen-frauen-und-kinder.html\" target=\"_blank\" rel=\"noreferrer noopener\">revenge porn<\/a>\u201d (Frauen gegen Gewalt e.V., 2025), or as in this case, the dissemination of fake, AI-generated depictions of nudity or intimacy. These images are frequently distributed anonymously via messaging services, <a href=\"https:\/\/de.wikipedia.org\/wiki\/Imageboard\" target=\"_blank\" rel=\"noreferrer noopener\">imageboards<\/a> (Wikipedia, 2025) or pornographic platforms. Anyone can fall victim to deepfake pornography.<\/p>\n\n\n\n<p>The psychological toll these attacks take becomes evident in the voices of survivors. Danielle Citron, legal scholar and professor at the University of Maryland, describes deepfakes as an \u201cinvasion of sexual privacy\u201d. In an interview with <a href=\"https:\/\/www.vice.com\/en\/article\/deepnude-app-creates-fake-nudes-of-any-woman\/\" target=\"_blank\" rel=\"noreferrer noopener\">Vice magazine<\/a>, she quotes a survivor saying: \u201cYes, it isn\u2019t your actual vagina, but [\u2026] others think that they are seeing you naked.\u201d Citron continues: \u201cAs a deepfake victim said to me \u2014 it felt like thousands saw her naked. She felt her body wasn\u2019t her own anymore\u201d (Cole, S., 2019).<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>A click away from harm: The ease of creating deepfakes<\/strong><\/h2>\n\n\n\n<p>But how are such deepfakes created\u2014and who is behind them? 
What once required technical expertise, time and powerful computers is now (perhaps too) easily within reach. Deepfake images and videos can be generated using a smartphone and a single social media photo. So-called <a href=\"https:\/\/www.vice.com\/en\/article\/deepnude-app-creates-fake-nudes-of-any-woman\/\" target=\"_blank\" rel=\"noreferrer noopener\">nudifier apps<\/a> and browser-based services openly offer their tools online: users upload any image and, within seconds, the person pictured appears undressed (Cole, S., 2019). The first results are often free, followed by an offer of a paid subscription.<\/p>\n\n\n\n<p>These are far from isolated cases. An investigation by <em>netzpolitik.org<\/em> revealed a wide range of providers generating thousands of such images daily. The business is booming. A <a href=\"https:\/\/www.404media.co\/chinese-ai-video-generators-unleash-a-flood-of-new-nonconsensual-porn-3\/?ref=daily-stories-newsletter\" target=\"_blank\" rel=\"noreferrer noopener\">study by 404 Media<\/a> further illustrates the scale: numerous AI-powered video generators \u2014 particularly from Chinese companies \u2014 offer minimal safeguards against the production of non-consensual pornographic content (Maiberg, E., 2025). These tools are already being used en masse to create disturbingly realistic sexualised deepfake videos, using nothing more than a portrait photo. Once uploaded, these videos circulate in dedicated online communities and are nearly impossible to remove.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>From Taylor Swift to schoolgirls: A shifting target group<\/strong><\/h2>\n\n\n\n<p>What makes the issue especially concerning is that most deepfakes feature bodies read as female. One reason lies in the training data behind the systems: many AI models were trained on millions of images of naked women. The result is a structurally biased technology that doesn\u2019t merely replicate gender-based violence \u2014 it amplifies it. 
What emerges is a deeply gendered, automated form of digital violence \u2014 primarily targeting <a href=\"https:\/\/www.securityhero.io\/state-of-deepfakes\/#targeted-individuals\" target=\"_blank\" rel=\"noreferrer noopener\">women<\/a> (SecurityHero, 2023).<\/p>\n\n\n\n<p>Initially, it was public figures who were targeted: <a href=\"https:\/\/regmedia.co.uk\/2019\/10\/08\/deepfake_report.pdf\" target=\"_blank\" rel=\"noreferrer noopener\">actresses, influencers, female politicians<\/a> (Ajder, H. et al., 2019). But as the tools became more accessible, the target group shifted. Today, it is often girls and women from users\u2019 immediate social environments who are affected: classmates, colleagues, neighbours. In Spain, for example, <a href=\"https:\/\/netzpolitik.org\/2023\/deepfakes-in-spanien-gefaelschte-nacktbilder-von-maedchen-sorgen-fuer-aufschrei\/\" target=\"_blank\" rel=\"noreferrer noopener\">AI-generated nude images of schoolgirls<\/a> circulating in messaging groups caused a scandal as early as 2023 (K\u00f6ver, C., 2023). In Pennsylvania, a teenager was arrested the following year for creating <a href=\"https:\/\/www.derstandard.de\/story\/3000000245632\/skandal-um-ki-nacktbilder-legt-us-schule-lahm\" target=\"_blank\" rel=\"noreferrer noopener\">deepfake nudes of his female classmates<\/a> (Der Standard, 2024).<\/p>\n\n\n\n<p>The full extent of the harm remains largely hidden. Reliable data are scarce. Many victims are not even aware that manipulated images of them are being circulated online.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>A systemic form of intersectional violence<\/strong><\/h2>\n\n\n\n<p>This particular form of digital abuse is systemic. As legal scholar <a href=\"https:\/\/regmedia.co.uk\/2019\/10\/08\/deepfake_report.pdf\" target=\"_blank\" rel=\"noreferrer noopener\">Danielle Citron<\/a> accurately observes:<br>\u201cDeepfake technology is being weaponised against women by inserting their faces into porn. 
It is terrifying, embarrassing, demeaning, and silencing. Deepfake sex videos tell individuals their bodies are not their own \u2014 and can make it difficult to stay online, get or keep a job, and feel safe\u201d (Ajder, H. et al., 2019).<\/p>\n\n\n\n<p>The targeted sexualised depiction of women\u2019s \u2014 and increasingly queer \u2014 bodies is not a technical malfunction; it is an expression of patriarchal structures and is being deliberately used. This often occurs in the context of <a href=\"https:\/\/www.bpb.de\/lernen\/bewegtbild-und-politische-bildung\/556843\/strafrecht-und-regulierung-von-deepfake-pornografie\/#footnote-target-25\" target=\"_blank\" rel=\"noreferrer noopener\">anti-feminist campaigns and incel movements<\/a>, which aim to intimidate and exclude certain groups of people (Sittig, J., 2024).<\/p>\n\n\n\n<p>Studies show that over 95% of all deepfakes are sexual in nature. Almost 100% of these depict women. <a href=\"https:\/\/datasociety.net\/wp-content\/uploads\/2019\/09\/DS_Deepfakes_Cheap_FakesFinal-1.pdf\" target=\"_blank\" rel=\"noreferrer noopener\">Marginalised groups are particularly affected: queer individuals, Black women and trans women<\/a> (Paris, B., &amp; Donovan, J., 2019). This deliberate form of digital violence creates what is known as a <a href=\"https:\/\/glossar.neuemedienmacher.de\/glossar\/silencing-effekt\/\" target=\"_blank\" rel=\"noreferrer noopener\">silencing effect<\/a> (NdM-Glossar, 2025): it distorts digital visibility and restricts democratic participation.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Digital violence as a business model<\/strong><\/h2>\n\n\n\n<p>What many still view as a niche phenomenon has become part of a lucrative market. Platforms offering deepfake services typically operate anonymously or from abroad. 
<a href=\"https:\/\/netzpolitik.org\/2024\/ki-nacktbilder-wie-online-shops-mit-sexualisierten-deepfakes-abkassieren\/\" target=\"_blank\" rel=\"noreferrer noopener\">Access is easy<\/a>: an email address suffices; payment is made by credit card, Google Pay or cryptocurrency (Meineck, S., 2024). A simple disclaimer (\u201cno editing without consent\u201d) is often the only nod to legality. Responsibility is shifted to users, while providers distance themselves from accountability.<\/p>\n\n\n\n<p>However, this very business model could offer a leverage point: the case of Pornhub demonstrates what economic pressure can achieve. In 2020, <a href=\"https:\/\/www.derstandard.de\/story\/2000138125747\/nach-vorwuerfen-visa-und-mastercard-sperren-zahlung-von-pornhub-werbung\" target=\"_blank\" rel=\"noreferrer noopener\">Visa and Mastercard cut ties with the platform<\/a> over non-consensual content, prompting significant changes in its upload policies and age verification processes (Der Standard, 2022). A similar mechanism could be applied to deepfake platforms \u2014 such as mandatory <a href=\"https:\/\/verfassungsblog.de\/deepfakes-ncid-ai-regulation\/\" target=\"_blank\" rel=\"noreferrer noopener\">withdrawal of support from payment providers, hosting services or search engines<\/a> that provide their digital infrastructure (Kira, B., 2024).<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Legal gaps and political momentum<\/strong><\/h2>\n\n\n\n<p>But economic pressure alone is not enough. Criminal law has so far struggled to keep pace with deepfake-related offences. 
Although German law includes provisions such as <a href=\"https:\/\/www.gesetze-im-internet.de\/stgb\/__201a.html\" target=\"_blank\" rel=\"noreferrer noopener\">StGB, 1871, \u00a7 201a<\/a> (violation of personal privacy) and the <a href=\"https:\/\/stiftungdatenschutz.org\/ehrenamt\/praxisratgeber\/praxisratgeber-detailseite\/fotos-und-datenschutz-275\" target=\"_blank\" rel=\"noreferrer noopener\">right to one&#8217;s own image<\/a> (Bittner, C., 2021), many deepfake pornography cases fall through the cracks. Legal developments lag behind technological ones, perpetrators remain anonymous, and platforms operate outside EU jurisdiction. The German Women Lawyers\u2019 Association (djb) has <a href=\"https:\/\/www.djb.de\/presse\/stellungnahmen\/detail\/st23-17\" target=\"_blank\" rel=\"noreferrer noopener\">criticised these legal gaps<\/a> and called for a dedicated, discrimination-sensitive criminal offence for the unauthorised creation and dissemination of sexualised deepfakes \u2014 independent of traditional pornography legislation (Deutscher Juristinnenbund, 2023). There is also an urgent need for targeted training for police and the judiciary, as well as the establishment of a network of specialised prosecutors to raise awareness and develop effective solutions.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>EU regulation: A first step, but not a cure-all<\/strong><\/h2>\n\n\n\n<p>While national legislation lags behind, developments at the EU level are gaining momentum. The Digital Services Act (DSA) requires platforms to <a href=\"https:\/\/eur-lex.europa.eu\/legal-content\/EN\/TXT\/PDF\/?uri=CELEX:32022R2065\" target=\"_blank\" rel=\"noreferrer noopener\">swiftly remove reported illegal content<\/a> (DSA, Art. 16, 2022) \u2014 including deepfakes \u2014 where clearly unlawful. The <a href=\"https:\/\/ai-act-law.eu\/de\/artikel\/50\/\" target=\"_blank\" rel=\"noreferrer noopener\">European AI Act introduces transparency obligations<\/a> (AI Act, Art. 
50, 2024), requiring that synthetically generated content be labelled as such. And with the new <a href=\"https:\/\/eur-lex.europa.eu\/legal-content\/DE\/TXT\/PDF\/?uri=OJ:L_202401385\" target=\"_blank\" rel=\"noreferrer noopener\">EU directive to combat violence against women<\/a> (European Parliament and Council, 2024), the non-consensual dissemination of sexualised deepfakes will, for the first time, be criminalised across Europe. Member States \u2014 including Germany \u2014 have until 2027 to incorporate these rules into national law. In parallel, Germany\u2019s proposed <a href=\"https:\/\/www.bundesregierung.de\/breg-de\/bundesregierung\/gesetzesvorhaben\/gewalthilfegesetz-2321756\" target=\"_blank\" rel=\"noreferrer noopener\">Violence Support Act<\/a> aims to improve access to legal advice and assistance for victims (Die Bundesregierung, 2025).<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Digital self-defence and social responsibility<\/strong><\/h2>\n\n\n\n<p>In addition to legal regulation, technical and societal prevention is essential. Tools like <a href=\"https:\/\/glaze.cs.uchicago.edu\/\" target=\"_blank\" rel=\"noreferrer noopener\">Glaze<\/a> and <a href=\"https:\/\/nightshade.cs.uchicago.edu\/whatis.html\" target=\"_blank\" rel=\"noreferrer noopener\">Nightshade<\/a> can, for example, alter images in such a way that they become unusable for AI systems \u2014 preventing original photos from being repurposed for training datasets or the generation of realistic deepfakes. Think of them as a digital cloak of invisibility against deepfake abuse.<\/p>\n\n\n\n<p>At the same time, public awareness must shift. Image-based sexual violence is still trivialised. Victims are subjected to <a href=\"https:\/\/www.fes.de\/wissen\/gender-glossar\/victim-blaming\" target=\"_blank\" rel=\"noreferrer noopener\">victim blaming<\/a> (Friedrich Ebert Stiftung, 2025) rather than support. 
Yet this is not just about individual fates \u2014 it is about structural inequalities that are reproduced and exacerbated in the digital realm.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>A complex problem demands multifaceted solutions<\/strong><\/h2>\n\n\n\n<p>Sexualised deepfakes are more than technical manipulation. They reflect a shift in digital power dynamics \u2014 where existing inequalities are not only reproduced but intensified. The deliberate violation of intimacy and control disproportionately affects those who are already structurally disadvantaged. Deepfakes affect us all \u2014 but not equally. That\u2019s why we need collective responses that are not merely technical, but feminist, human rights-based and rooted in solidarity. Digital violence is not a fringe issue of internet culture. It is its litmus test.<\/p>\n\n\n\n<p>Organisations like <a href=\"https:\/\/hateaid.org\/\" target=\"_blank\" rel=\"noreferrer noopener\">HateAid<\/a>, the <a href=\"https:\/\/www.frauen-gegen-gewalt.de\/de\/aktuelles.html\" target=\"_blank\" rel=\"noreferrer noopener\">bff \u2013 Frauen gegen Gewalt e.V.<\/a> and <a href=\"https:\/\/annanackt.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">anna nackt<\/a> are already taking action against non-consensual sexualised deepfakes. They support victims, offer contact points, and in 2023 jointly <a href=\"https:\/\/hateaid.org\/petition-deepfake-pornos\/\" target=\"_blank\" rel=\"noreferrer noopener\">submitted a petition to German Digital Minister Volker Wissing<\/a>, calling for stronger protections and clearer legal frameworks (HateAid, 2023).<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>References<\/strong><\/h2>\n\n\n\n<p>Ajder, H., Patrini, G., Cavalli, F., &amp; Cullen, L. (2019): The State of Deepfakes: Landscape, Threats, and Impact. <em>Deeptrace<\/em>. 
<a href=\"https:\/\/regmedia.co.uk\/2019\/10\/08\/deepfake_report.pdf\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/regmedia.co.uk\/2019\/10\/08\/deepfake_report.pdf<\/a><\/p>\n\n\n\n<p>Bittner, C. (2021): (Fast) keine Fotos ohne Datenschutz. <em>Stiftung Datenschutz<\/em>. <a href=\"https:\/\/stiftungdatenschutz.org\/ehrenamt\/praxisratgeber\/praxisratgeber-detailseite\/fotos-und-datenschutz-275\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/stiftungdatenschutz.org\/ehrenamt\/praxisratgeber\/praxisratgeber-detailseite\/fotos-und-datenschutz-275<\/a>\u00a0<\/p>\n\n\n\n<p>Bundesministerium der Justiz (Hrsg.) (1871). <em>Strafgesetzbuch (StGB)<\/em>. <a href=\"https:\/\/www.gesetze-im-internet.de\/stgb\/\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/www.gesetze-im-internet.de\/stgb\/<\/a>\u00a0<\/p>\n\n\n\n<p>Cole, S. (2019): This Horrifying App Undresses a Photo of Any Woman With a Single Click. <em>Vice<\/em>. <a href=\"https:\/\/www.vice.com\/en\/article\/deepnude-app-creates-fake-nudes-of-any-woman\/\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/www.vice.com\/en\/article\/deepnude-app-creates-fake-nudes-of-any-woman\/<\/a><\/p>\n\n\n\n<p>Deutscher Juristinnenbund (2023): Bek\u00e4mpfung bildbasierter sexualisierter Gewalt. <em>Policy Paper<\/em>, 17. <a href=\"https:\/\/www.djb.de\/presse\/stellungnahmen\/detail\/st23-17\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/www.djb.de\/presse\/stellungnahmen\/detail\/st23-17<\/a><\/p>\n\n\n\n<p>Der Standard (2024). Skandal um KI-Nacktbilder legt US-Schule lahm. <em>Der Standard<\/em>. <a href=\"https:\/\/www.derstandard.de\/story\/3000000245632\/skandal-um-ki-nacktbilder-legt-us-schule-lahm\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/www.derstandard.de\/story\/3000000245632\/skandal-um-ki-nacktbilder-legt-us-schule-lahm<\/a>\u00a0<\/p>\n\n\n\n<p>Der Standard (2022). Nach Vorw\u00fcrfen: Visa und Mastercard sperren Zahlung von Pornhub-Werbung. 
<em>Der Standard<\/em>. <a href=\"https:\/\/www.derstandard.de\/story\/2000138125747\/nach-vorwuerfen-visa-und-mastercard-sperren-zahlung-von-pornhub-werbung\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/www.derstandard.de\/story\/2000138125747\/nach-vorwuerfen-visa-und-mastercard-sperren-zahlung-von-pornhub-werbung<\/a><\/p>\n\n\n\n<p>Die Bundesregierung (2025). Bessere Unterst\u00fctzung f\u00fcr Gewaltopfer.<em> Die Bundesregierung<\/em>. <a href=\"https:\/\/www.bundesregierung.de\/breg-de\/service\/archiv-bundesregierung\/gewalthilfegesetz-2321756\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/www.bundesregierung.de\/breg-de\/service\/archiv-bundesregierung\/gewalthilfegesetz-2321756<\/a><\/p>\n\n\n\n<p>European Commission (2024). <em>Regulation (EU) 2024\/1689 of the European Parliament and of the Council of 13 June 2024 on artificial intelligence and amending certain Union legislative acts (AI Act)<\/em>, Art.\u202f50. <a href=\"https:\/\/eur-lex.europa.eu\/eli\/reg\/2024\/1689\/oj\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/eur-lex.europa.eu\/eli\/reg\/2024\/1689\/oj<\/a>\u00a0<\/p>\n\n\n\n<p>European Commission (2022). <em>Regulation (EU) 2022\/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000\/31\/EC (Digital Services Act)<\/em>, Art. 16. <a href=\"https:\/\/eur-lex.europa.eu\/eli\/reg\/2022\/2065\/oj\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/eur-lex.europa.eu\/eli\/reg\/2022\/2065\/oj<\/a>\u00a0<\/p>\n\n\n\n<p>European Commission (2016). <em>Regulation (EU) 2016\/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation)<\/em>. 
<a href=\"https:\/\/eur-lex.europa.eu\/eli\/reg\/2016\/679\/oj\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/eur-lex.europa.eu\/eli\/reg\/2016\/679\/oj<\/a>\u00a0<\/p>\n\n\n\n<p>European Parliament and Council (2024). Directive (EU) 2024\/1385 of 14 May 2024 on combating violence against women and domestic violence. <em>Official Journal of the European Union<\/em>. <a href=\"https:\/\/eur-lex.europa.eu\/legal-content\/DE\/TXT\/PDF\/?uri=OJ:L_202401385\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/eur-lex.europa.eu\/legal-content\/DE\/TXT\/PDF\/?uri=OJ:L_202401385<\/a><\/p>\n\n\n\n<p>Frauen gegen Gewalt e.V. (2025). Hinweise f\u00fcr die Berichterstattung \u00fcber Gewalt gegen Frauen und Kinder. <em>Bundesverband Frauenberatungsstellen und Frauennotrufe \u2013 Frauen gegen Gewalt (bff). <\/em><a href=\"https:\/\/www.frauen-gegen-gewalt.de\/de\/ueber-uns\/presse\/informationen-fuer-die-presse\/hinweise-fuer-die-berichterstattung-ueber-gewalt-gegen-frauen-und-kinder.html\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/www.frauen-gegen-gewalt.de\/de\/ueber-uns\/presse\/informationen-fuer-die-presse\/hinweise-fuer-die-berichterstattung-ueber-gewalt-gegen-frauen-und-kinder.html<\/a><\/p>\n\n\n\n<p>Frauen gegen Gewalt e.V. (2024). bff fordert: Bildbasierte Gewalt umfassend bek\u00e4mpfen.<em> Bundesverband Frauenberatungsstellen und Frauennotrufe \u2013 Frauen gegen Gewalt (bff)<\/em>. <a href=\"https:\/\/www.frauen-gegen-gewalt.de\/de\/aktuelles\/nachrichten\/nachricht\/pressemitteilung-bff-fordert-bildbasierte-gewalt-umfassend-bekaempfen-2.html\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/www.frauen-gegen-gewalt.de\/de\/aktuelles\/nachrichten\/nachricht\/pressemitteilung-bff-fordert-bildbasierte-gewalt-umfassend-bekaempfen-2.html<\/a>\u00a0<\/p>\n\n\n\n<p>Friedrich Ebert Stiftung (2025). Victim Blaming. <em>Friedrich Ebert Stiftung<\/em>. 
<a href=\"https:\/\/www.fes.de\/wissen\/gender-glossar\/victim-blaming\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/www.fes.de\/wissen\/gender-glossar\/victim-blaming<\/a>\u00a0<\/p>\n\n\n\n<p>Hate Aid (2023): Deepfake-Pornos: Betroffene konfrontieren Wissing. <em>Hate Aid gGmbH<\/em>. <a href=\"https:\/\/hateaid.org\/wp-content\/uploads\/2023\/10\/Pressemitteilung-Schutz-vor-Deepfake-Pornos.pdf\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/hateaid.org\/wp-content\/uploads\/2023\/10\/Pressemitteilung-Schutz-vor-Deepfake-Pornos.pdf<\/a><\/p>\n\n\n\n<p>Kira, B. (2024): Deepfakes, the Weaponisation of AI Against Women and Possible Solutions. <em>Verfassungsblog<\/em>. <a href=\"https:\/\/doi.org\/10.59704\/9987d92e2c183c7f\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/doi.org\/10.59704\/9987d92e2c183c7f<\/a><\/p>\n\n\n\n<p>K\u00f6ver, C. (2023): Gef\u00e4lschte Nacktbilder von M\u00e4dchen sorgen f\u00fcr Aufschrei. <em>netzpolitik.org<\/em>. <a href=\"https:\/\/netzpolitik.org\/2023\/deepfakes-in-spanien-gefaelschte-nacktbilder-von-maedchen-sorgen-fuer-aufschrei\/\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/netzpolitik.org\/2023\/deepfakes-in-spanien-gefaelschte-nacktbilder-von-maedchen-sorgen-fuer-aufschrei\/<\/a>\u00a0<\/p>\n\n\n\n<p>Maiberg, E. (2025): Chinese AI Video Generators Unleash a Flood of New Nonconsensual Porn. <em>404 Media<\/em>. <a href=\"https:\/\/www.404media.co\/chinese-ai-video-generators-unleash-a-flood-of-new-nonconsensual-porn-3\/?ref=daily-stories-newsletter\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/www.404media.co\/chinese-ai-video-generators-unleash-a-flood-of-new-nonconsensual-porn-3\/?ref=daily-stories-newsletter<\/a>\u00a0<\/p>\n\n\n\n<p>Meineck, S. (2024): Wie Online-Shops mit sexualisierten Deepfakes abkassieren. <em>netzpolitik.org<\/em>. 
<a href=\"https:\/\/netzpolitik.org\/2024\/ki-nacktbilder-wie-online-shops-mit-sexualisierten-deepfakes-abkassieren\/\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/netzpolitik.org\/2024\/ki-nacktbilder-wie-online-shops-mit-sexualisierten-deepfakes-abkassieren\/<\/a><\/p>\n\n\n\n<p>NdM-Glossar (2025): Silencing(-Effekt). <em>W\u00f6rterverzeichnis der Neuen deutschen Medienmacher*innen (NdM)<\/em>. <a href=\"https:\/\/glossar.neuemedienmacher.de\/glossar\/silencing-effekt\/\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/glossar.neuemedienmacher.de\/glossar\/silencing-effekt\/<\/a><\/p>\n\n\n\n<p>Paris, B., &amp; Donovan, J. (2019): Deepfakes and Cheap Fakes. The Manipulation of Audio and Visual Evidence. <em>Data &amp; Society<\/em>. <a href=\"https:\/\/datasociety.net\/wp-content\/uploads\/2019\/09\/DS_Deepfakes_Cheap_FakesFinal-1.pdf\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/datasociety.net\/wp-content\/uploads\/2019\/09\/DS_Deepfakes_Cheap_FakesFinal-1.pdf<\/a>\u00a0<\/p>\n\n\n\n<p>Polizei Nordrhein-Westfalen (2021). \u201eUpskirting\u201c und \u201eDownblousing\u201c ist strafbar. <em>Polizei Nordrhein-Westfalen<\/em>. <a href=\"https:\/\/polizei.nrw\/artikel\/upskirting-und-downblousing-ist-strafbar\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/polizei.nrw\/artikel\/upskirting-und-downblousing-ist-strafbar<\/a>\u00a0<\/p>\n\n\n\n<p>Reuther, J. (2021): Digital Rape: Women Are Most Likely to Fall Victim to Deepfakes. <em>The Deepfake Report<\/em>. <a href=\"https:\/\/www.thedeepfake.report\/en\/09-digital-rape-en\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/www.thedeepfake.report\/en\/09-digital-rape-en<\/a>\u00a0<\/p>\n\n\n\n<p>Sittig, J. (2024): Strafrecht und Regulierung von Deepfake-Pornografie. <em>Bundeszentrale f\u00fcr politische Bildung<\/em>. 
<a href=\"https:\/\/www.bpb.de\/lernen\/bewegtbild-und-politische-bildung\/556843\/strafrecht-und-regulierung-von-deepfake-pornografie\/#footnote-target-25\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/www.bpb.de\/lernen\/bewegtbild-und-politische-bildung\/556843\/strafrecht-und-regulierung-von-deepfake-pornografie\/#footnote-target-25<\/a><\/p>\n\n\n\n<p>SecurityHero (2023). State of Deepfakes \u2013 Realities, Threats, and Impact. <em>SecurityHero<\/em>. <a href=\"https:\/\/www.securityhero.io\/state-of-deepfakes\/#targeted-individuals\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/www.securityhero.io\/state-of-deepfakes\/#targeted-individuals<\/a><\/p>\n\n\n\n<p>Wikipedia (2025). Imageboard. <em>Wikipedia<\/em>. <a href=\"https:\/\/de.wikipedia.org\/wiki\/Imageboard\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/de.wikipedia.org\/wiki\/Imageboard<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Deepfake pornography uses AI to create fake nude images without consent, primarily targeting women. Learn how it amplifies inequality and what must change.<\/p>\n","protected":false},"author":313,"featured_media":108483,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1289,1577,1582],"tags":[],"class_list":["post-108481","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-artificial-intelligence","category-digital-so","category-ftif-ai-and-society"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.6 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Unwillingly naked &#8211; Digital Society Blog<\/title>\n<meta name=\"description\" content=\"Deepfake pornography uses AI to create fake nude images without consent, primarily targeting women. Learn how it amplifies inequality and what must change.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Unwillingly naked &#8211; Digital Society Blog\" \/>\n<meta property=\"og:description\" content=\"Deepfake pornography uses AI to create fake nude images without consent, primarily targeting women. 
Learn how it amplifies inequality and what must change.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/\" \/>\n<meta property=\"og:site_name\" content=\"HIIG\" \/>\n<meta property=\"article:published_time\" content=\"2025-06-02T15:24:15+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-12-10T16:46:26+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/06\/Titelbild_Deepfake.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1144\" \/>\n\t<meta property=\"og:image:height\" content=\"643\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Digital Society Blog\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Digital Society Blog\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"12 minutes\" \/>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Unwillingly naked &#8211; Digital Society Blog","description":"Deepfake pornography uses AI to create fake nude images without consent, primarily targeting women. Learn how it amplifies inequality and what must change.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/","og_locale":"en_US","og_type":"article","og_title":"Unwillingly naked &#8211; Digital Society Blog","og_description":"Deepfake pornography uses AI to create fake nude images without consent, primarily targeting women. 
Learn how it amplifies inequality and what must change.","og_url":"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/","og_site_name":"HIIG","article_published_time":"2025-06-02T15:24:15+00:00","article_modified_time":"2025-12-10T16:46:26+00:00","og_image":[{"width":1144,"height":643,"url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/06\/Titelbild_Deepfake.png","type":"image\/png"}],"author":"Digital Society Blog","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Digital Society Blog","Est. reading time":"12 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/#article","isPartOf":{"@id":"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/"},"author":{"name":"Digital Society Blog","@id":"https:\/\/www.hiig.de\/#\/schema\/person\/a921ecfdfcb94cb9c718b90c3a5dedbd"},"headline":"Unwillingly naked: How deepfake pornography intensifies sexualised violence against women","datePublished":"2025-06-02T15:24:15+00:00","dateModified":"2025-12-10T16:46:26+00:00","mainEntityOfPage":{"@id":"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/"},"wordCount":2111,"publisher":{"@id":"https:\/\/www.hiig.de\/#organization"},"image":{"@id":"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/#primaryimage"},"thumbnailUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/06\/Titelbild_Deepfake.png","articleSection":["Artificial Intelligence","Digital Society Blog","ftif AI and Society"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/","url":"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/","name":"Unwillingly naked &#8211; Digital Society 
Blog","isPartOf":{"@id":"https:\/\/www.hiig.de\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/#primaryimage"},"image":{"@id":"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/#primaryimage"},"thumbnailUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/06\/Titelbild_Deepfake.png","datePublished":"2025-06-02T15:24:15+00:00","dateModified":"2025-12-10T16:46:26+00:00","description":"Deepfake pornography uses AI to create fake nude images without consent, primarily targeting women. Learn how it amplifies inequality and what must change.","breadcrumb":{"@id":"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/#primaryimage","url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/06\/Titelbild_Deepfake.png","contentUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/06\/Titelbild_Deepfake.png","width":1144,"height":643,"caption":"Deepfake pornography uses AI to create fake nude images without consent, primarily targeting women. 
Learn how it amplifies inequality and what must change."},{"@type":"BreadcrumbList","@id":"https:\/\/www.hiig.de\/en\/unwillingly-naked-deepfake-pornography\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.hiig.de\/en\/"},{"@type":"ListItem","position":2,"name":"Unwillingly naked: How deepfake pornography intensifies sexualised violence against women"}]},{"@type":"WebSite","@id":"https:\/\/www.hiig.de\/#website","url":"https:\/\/www.hiig.de\/","name":"HIIG","description":"Alexander von Humboldt Institute for Internet and Society","publisher":{"@id":"https:\/\/www.hiig.de\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.hiig.de\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.hiig.de\/#organization","name":"HIIG","url":"https:\/\/www.hiig.de\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.hiig.de\/#\/schema\/logo\/image\/","url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2019\/06\/hiig.png","contentUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2019\/06\/hiig.png","width":320,"height":80,"caption":"HIIG"},"image":{"@id":"https:\/\/www.hiig.de\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/www.hiig.de\/#\/schema\/person\/a921ecfdfcb94cb9c718b90c3a5dedbd","name":"Digital Society 
Blog"}]}},"_links":{"self":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/108481","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/users\/313"}],"replies":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/comments?post=108481"}],"version-history":[{"count":7,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/108481\/revisions"}],"predecessor-version":[{"id":112026,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/108481\/revisions\/112026"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/media\/108483"}],"wp:attachment":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/media?parent=108481"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/categories?post=108481"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/tags?post=108481"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}