President Donald Trump on Monday signed the TAKE IT DOWN Act, a bill criminalizing revenge porn – both real and AI-generated. But internet rights groups have repeatedly warned the law is overly broad and vague, and could be used to order the takedown of protected speech.
The legislation [PDF] (officially the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act) establishes a federal law requiring any online site or service hosting “non-consensual intimate imagery” (NCII) to remove it within 48 hours of a complaint or face the wrath of America’s consumer watchdog, the FTC.
Individuals who create and post such images of adults may be sentenced to two years in prison, or three years in cases involving children. Sites that fail to remove NCII face a $50,120 fine per violation from the FTC.
“The passage of the TAKE IT DOWN Act is a historic win in the fight to protect victims of revenge porn and deepfake abuse,” said Senator Ted Cruz (R-TX), while also thanking First Lady Melania Trump for her support of the legislation.
Sounds good, and the law passed the Senate unanimously and cleared the House of Representatives with only two nay votes – both of them from Republicans, and both on free speech grounds. Representative Eric Burlison (R-MO) said the law was both unnecessary – since most states have their own rules on this – and a potential tool for chilling free speech, a point Thomas Massie (R-KY) agreed with, calling it “a slippery slope, ripe for abuse, with unintended consequences.”
Internet freedom advocates agree. The main fear is that, faced with a 48-hour turnaround, many websites and apps will pull material down immediately upon request rather than risk a hefty fine. Meta, Google, and other tech giants supported the bill – they have more than enough budget to hire moderators – but smaller platforms are likely to choose to censor first and check later.
Then there’s the complaints process itself. Anyone can complain about NCII content, whether or not they are in it themselves, India McKinney, director of federal affairs for the Electronic Frontier Foundation (EFF), told The Register. One can imagine all sorts of bad-faith complaints used to stall competition, seek revenge against an enemy, or cause general mischief.
Also, the law specifically criminalizes an “intimate visual depiction” without defining what that may be.
“We have seen things from some of the more conservative members of the elected body try to claim that anything depicting queer couples living their lives in public is explicit sexual content,” McKinney said.
Finally, McKinney said, there’s the encryption issue. Internet advocacy groups have warned that the bill would cover encrypted images stored on cloud services or sent via end-to-end encrypted messaging systems like WhatsApp. This would give service providers an incentive to create some kind of content filtering system that would side-step or weaken strong end-to-end encryption, according to a letter [PDF] sent to Congress in February from orgs including the Electronic Frontier Foundation and Center for Democracy and Technology.
All in all, though there may well be good intentions behind the act, the fact that it could be abused to take down images that annoy the rich and powerful, and the petty and thin-skinned, is a real and present danger.
McKinney said Democrats made several attempts to amend the bill to clarify these open issues, though the Republican majority shot them all down.
President Trump declared he was a big supporter of the legislation, telling a joint session of Congress last month, “I’m going to use that bill for myself, too, if you don’t mind, because nobody gets treated worse than I do online. Nobody.”
Rather than tackling actual NCII of himself with this fresh law, the commander-in-chief, rather worryingly, sounds to us as though he has protected critical speech in his sights. As McKinney pointed out, the law would be easy to abuse if you had the resources.
Most likely, court battles will be required to define the limits of the act. As a hypothetical, McKinney wondered what would happen if Melania Trump objected to certain revealing photoshoots from her early modeling days and tried to claim they were NCII because she hadn’t given her consent for them to be reshared in that format. Would her rights trump those of whoever owns the rights to the original images? The courts would have to decide.
Another court decision is almost certainly coming as to whether the new law falls foul of the US First Amendment’s free speech guarantee. ®