In an alarming trend that threatens the integrity of democratic participation, AI-generated deepfakes are being weaponized against women politicians worldwide, with particularly disturbing cases reported in the United States. The American Sunlight Project (ASP), a disinformation research group, has uncovered more than 35,000 instances of deepfake content targeting 26 members of the U.S. Congress, 25 of them women. The content was found spread across pornographic websites, highlighting the sinister use of AI technology to undermine female leadership.
Nina Jankowicz, CEO of ASP, stated, "Female lawmakers are being targeted by AI-generated deepfake pornography at an alarming rate. This isn't just a tech problem — it's a direct assault on women in leadership and democracy itself." The study revealed that nearly 16% of all women currently serving in Congress, or about 1 in 6 congresswomen, have become victims of nonconsensual intimate imagery (NCII) generated by AI.
This issue is not confined to the United States; from Italy to the UK and Pakistan, female politicians are increasingly falling prey to similar tactics. In Italy, Prime Minister Giorgia Meloni is seeking damages over deepfake porn videos featuring her, calling the practice a "form of violence against women." In the UK, Deputy Prime Minister Angela Rayner was among more than 30 female politicians targeted by a deepfake porn site. Meanwhile, in Pakistan, a deepfake video falsely depicting a lawmaker in a compromising situation damaged her reputation in a conservative society.
The ease of access to AI tools capable of creating these deepfakes has outpaced regulatory efforts, prompting calls for urgent action. The intimate imagery is often used to tarnish reputations, jeopardize careers, and even threaten national security through blackmail or harassment. ASP privately notified the offices of the targeted U.S. congresswomen, and almost all of the content was subsequently removed from the sites. The organization nonetheless highlighted a "disparity of privilege," noting that women without the resources available to members of Congress would struggle to achieve such rapid removal.
Globally, the response to this crisis has been mixed. Some countries are moving to criminalize the creation and distribution of deepfake pornography, while others lag behind. The UK is considering laws to ban the creation of deepfake porn, and in the U.S., states such as California and Florida have made it a punishable offense. However, the absence of federal legislation and the global nature of the internet pose significant challenges.
This phenomenon is part of a broader trend of tech-facilitated gender-based violence aimed at silencing women's voices in public life. Experts warn that without comprehensive legal frameworks and tech industry cooperation, the chilling effect of these deepfakes could deter women from entering politics, thus undermining gender equality in governance.
The fight against AI-generated deepfakes targeting women politicians is not just about protecting individuals but safeguarding the democratic process itself. It demands a concerted effort from lawmakers, tech companies, and society at large to ensure that technology is not used to perpetuate gender-based violence but rather to foster a more inclusive and fair political landscape.