Spurred by Teen Girls, States Move to Ban Deepfake Nudes

Caroline Mullet, a ninth grader at Issaquah High School near Seattle, went to her first homecoming dance last fall, a James Bond-themed bash with blackjack tables attended by hundreds of girls dressed up in party frocks.

A few weeks later, she and other female students learned that a male classmate was circulating fake nude images of girls who had attended the dance, sexually explicit pictures that he had fabricated using an artificial intelligence app designed to automatically “strip” clothed photos of real girls and women.

Ms. Mullet, 15, alerted her father, Mark, a Democratic Washington State senator. Although she was not among the girls in the pictures, she asked if something could be done to help her friends, who felt “extremely uncomfortable” that male classmates had seen simulated nude images of them. Soon, Senator Mullet and a colleague in the State House proposed legislation to prohibit the sharing of A.I.-generated sexually explicit depictions of real minors.

“I hate the idea that I should have to worry about this happening again to any of my female friends, my sisters or even myself,” Ms. Mullet told state lawmakers during a hearing on the bill in January.

The State Legislature passed the bill without opposition. Gov. Jay Inslee, a Democrat, signed it last month.

States are on the front lines of a rapidly spreading new form of peer sexual exploitation and harassment in schools. Boys across the United States have used widely available “nudification” apps to surreptitiously concoct sexually explicit images of their female classmates and then circulate the simulated nudes via group chats on apps like Snapchat and Instagram.