Sharing digitally altered nude images of real people could become a federal crime in the U.S. after Representative Joseph Morelle, a Democrat from New York, reintroduced the "Intimate Image Anti-Deepfake Act" on Tuesday. The proposed legislation would prohibit the non-consensual sharing of digitally modified intimate images. Morelle had introduced the bill previously; this time he is joined by Representative Tom Kean, a Republican from New Jersey, as a co-sponsor. Kean has also introduced a separate bill, the "2023 Artificial Intelligence Transparency Act," which would require that AI-generated content carry clear labeling.
In addition to making the sharing of digitally altered intimate images a criminal offense, the legislation proposed by Morelle and Kean would also allow victims to sue perpetrators in civil court. Morelle stated, "Let’s not wait for the next mass incident to make headlines. This is happening every day to women everywhere."
According to a report published in the Wall Street Journal, the bipartisan push comes in response to an incident at a high school in New Jersey, where AI-generated nude images of female students were shared without their consent. AI programs can produce such images by analyzing ordinary photos posted on social media, making the results appear highly realistic and convincing.
AI experts confirm that dozens of free programs are available that can swap faces and remove clothing from real images, making it difficult for the human eye to distinguish real from fake. One problem facing victims is the absence of a federal law, and of laws in many states, addressing these new tools. A study published in 2019 found that "96% of deepfake materials were pornographic, and 99% of them targeted women."