New legislation would make tech companies responsible for monitoring and removing deepfake pornography.
- Congress is weighing a new bill that would hold social media platforms responsible for the publication and spread of AI-generated images that superimpose fake bodies onto real faces without consent.
- Deepfake porn videos increased 464% in 2023 over the prior year.
- Two dueling bills in the Senate could complicate the legislative process.
On Capitol Hill, lawmakers are rushing to tackle the surge of deepfake AI pornographic images that have affected a wide range of individuals, including celebrities and high school students.
A newly introduced bill would make social media companies responsible for monitoring and removing deepfake porn images on their platforms, and would make it illegal to publish, or threaten to publish, such images.
The bill's main sponsor, Sen. Ted Cruz, R-Texas, shared exclusive details about the bill with CNBC through his office.
The Take It Down Act would require social media operators to establish a process for removing images within 48 hours of receiving a valid request from a victim, and to make reasonable efforts to remove any other copies of the images, including those shared in private groups.
Enforcement of the new rules would fall to the Federal Trade Commission, which oversees consumer protection regulations.
On Tuesday, a bipartisan group of senators will formally introduce Cruz's legislation, with victims of deepfake porn, including high school students, joining them in the Capitol.
Nonconsensual AI-generated images have affected various individuals, including celebrities like Taylor Swift and politicians like Rep. Alexandria Ocasio-Cortez, as well as high school students whose classmates have used apps and AI tools to create nude or pornographic photos of them without their consent.
In a CNBC interview, Cruz said the bill would safeguard and strengthen the rights of victims of this vile act by establishing a level playing field at the federal level and requiring websites to put measures in place to remove these images.
Dueling Senate bills
A 2023 report from Home Security Heroes found that the output of deepfake porn producers rose 464% year over year.
While members of Congress broadly agree on the need to tackle deepfake AI pornography, there is no consensus on how best to do it.
Instead, there are two competing bills in the Senate.
This year, Sen. Dick Durbin, D-Ill., introduced a bipartisan bill that would allow victims of nonconsensual deepfakes to sue individuals who created, possessed, distributed or held the image.
Under Cruz's bill, deepfake AI porn would be treated as extremely offensive online content, requiring social media companies to moderate and remove the images.
Last week, Sen. Cynthia Lummis, R-Wyo., blocked a floor vote on Durbin's bill, arguing it was overly broad and could stifle American technological innovation.
Durbin argued that his bill does not impose any liability on tech platforms.
Lummis is an original co-sponsor of Cruz's bill, along with Republican Sen. Shelley Moore Capito and Democratic Sens. Amy Klobuchar, Richard Blumenthal and Jacky Rosen.
As Senate Majority Leader Chuck Schumer, D-N.Y., pushes his chamber to move on AI legislation, a Senate task force on AI released a "roadmap" of key AI issues that includes developing legislation to address the "nonconsensual distribution of intimate images and other harmful deepfakes."