X Corp., the social media platform owned by Elon Musk, has filed a federal lawsuit against a Minnesota law banning the use of deepfakes to influence elections or harm political candidates. The company argues the law violates First Amendment rights and is preempted by a 1996 federal statute that shields platforms from liability for user-generated content.

The lawsuit, filed this week, challenges the 2023 law on the grounds that it criminalizes even harmless political content, such as satire or jokes, and could hold social media companies accountable for failing to censor such material. X Corp. claims the law would do more harm than good, stating, “Instead of defending democracy, this law would erode it.”

Minnesota’s law sets criminal penalties — including jail — for anyone who knowingly or recklessly spreads AI-generated deepfakes within 90 days of a nominating convention or after early voting begins in an election. To be penalized, the material must appear real enough to fool a reasonable viewer and must be shared with the intent to damage a candidate or manipulate an election outcome.

Democratic state Sen. Erin Maye Quade, the law’s author, criticized Musk’s challenge, stating he is upset because the law stands in the way of spreading harmful deepfakes. “This lawsuit is petty, misguided, and a waste of time and state resources,” she said, referencing Musk’s heavy involvement in the 2024 election and claims that he attempted to influence judicial outcomes.

The office of Minnesota Attorney General Keith Ellison, responsible for defending state laws, acknowledged the lawsuit and stated it is currently under review, with a response forthcoming.

This isn’t the first challenge to the law. A previous case filed by content creator Christopher Kohls and Republican state Rep. Mary Franson, both known for sharing AI-generated political parodies, is currently paused as they appeal a judge’s refusal to block the law.

In defending the legislation, the attorney general’s office has maintained that deepfakes are a serious and growing threat to democratic processes. The law, they argue, is narrowly crafted to target genuinely harmful uses while still allowing for parody and satire.

X Corp. noted that it is currently the only platform contesting the Minnesota law. It has also filed legal challenges against other regulations it sees as restricting free speech, such as a California political deepfake law that was blocked by a judge earlier this year. The company emphasized that tools like its “Community Notes” system and the AI-powered “Grok” tool help flag and manage misleading content effectively, similar to efforts adopted by platforms like Facebook and TikTok.

Legal expert and University of Minnesota law professor Alan Rozenshtein commented that the law is likely to be overturned. He emphasized that the First Amendment doesn’t have an exception for false or misleading political speech — even lies — and laws like this create strong pressure on platforms to over-censor just to avoid risk.

While acknowledging that deepfakes are problematic, Rozenshtein warned that such legal restrictions could damage free speech more than they curb the harms they target. “People want to be fooled,” he said. “And that demand is the real threat to democracy — not just the technology.”

By DNN18
