Most of the imagery was quickly removed as researchers shared their findings with impacted members of Congress.
There is currently limited policy to restrict its creation and spread.
ASP shared the first-of-its-kind findings exclusively with The 19th.
© Photo by Win McNamee/Getty Images
The removal may be coincidental.
Jankowicz has been the target of online harassment and threats for her domestic and international work dismantling disinformation.
AI-generated nonconsensual intimate imagery also opens up threats to national security by creating conditions for blackmail and geopolitical concessions.
That could have ripple effects on policymakers irrespective of whether they're directly the target of the imagery.
It's affecting their own colleagues.
And this is happening simply because they are in the public eye.
Image-based sexual abuse is a unique risk for women running for office.
Maddocks has studied how women who speak out in public are more likely to experience digital sexual violence.
"So when women speak publicly, it's almost like: OK, time to shame them. Time to strip them. Time to get them back in the house. Time to shame them into silence."
ASP is encouraging Congress to pass federal legislation.
It's not a future harm. It's not something that we have to imagine.
But critics aren't optimistic about Big Tech's ability to regulate itself, given the history of harm caused by its platforms.
This article was originally published on The Markup and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.