Opinion editor’s note: Strib Voices publishes a mix of material from 11 contributing columnists, along with other commentary online and in print each day.
•••
In the long, hard struggle for women’s rights, we have often been forced to defend ourselves against new and insidious forms of exploitation. Today, artificial intelligence has introduced yet another weapon in the war against women’s autonomy: AI-powered “nudification” technology.
Minnesota lawmakers have a chance to take a stand by passing landmark legislation that would ban the creation of nonconsensual explicit images before they can ever be spread. It is a necessary and urgent step to protect women, children and all vulnerable individuals from digital abuse.
For as long as women have fought for equality, we have also fought against the sexual objectification of women. Now, with just a few clicks, anyone can upload a fully clothed image of a woman — or a child — and an AI tool will generate a hyper-realistic nude image. It is the digital equivalent of assault, stripping away dignity and control in mere moments. The harm isn’t just in distribution; it lies in the image’s very existence. These images — whether used for blackmail, humiliation, revenge, voyeurism, or kicks and giggles — rob individuals of their agency and privacy.
Minnesota’s proposed bill, led by Democratic Sen. Erin Maye Quade, acknowledges this fundamental truth. Unlike other state and federal efforts that focus on punishing those who distribute deepfake pornography, this bill targets the companies that create the software in the first place. It recognizes that prevention is the best form of protection. If these tools are inaccessible in Minnesota, they cannot be weaponized against our residents.
Opponents argue that such a law might be unconstitutional, a violation of free speech. But let’s be clear: Free speech does not include the right to fabricate and distribute nonconsensual sexual imagery. The First Amendment does not protect harassment. It does not protect digital sexual assault. And it certainly does not protect technology that exists solely to violate another person’s rights.
Women and girls are the primary targets of this predatory AI. A recent lawsuit by the San Francisco city attorney’s office goes after 16 of the most frequented nudification sites — platforms overwhelmingly used to victimize women and children. Once an explicit deepfake is created, it can live forever on the internet, resurfacing on social media, being reshared on pornography sites, and fueling blackmail schemes. The psychological damage inflicted on victims is devastating, and for young girls, it can derail their entire future.