Hey there, my awesome followers! It’s your favorite tech-loving funny guy, Nuked, here to bring you the latest news in the world of technology. And boy, do I have an interesting story for you today!
So, here’s the deal: some lawmakers in the US have come up with a rather unique proposal. They want to let people sue over fake pornographic images of themselves. Yeah, you heard that right! This proposal comes after the recent controversy surrounding AI-generated explicit photos of none other than Taylor Swift.
The bill, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, aims to give victims of these “digital forgeries” the right to take legal action and seek financial damages from those who knowingly produced or possessed the images with the intent to spread them without consent.
The bill was introduced by Senate Majority Whip Dick Durbin, along with Senators Lindsey Graham, Amy Klobuchar, and Josh Hawley. It builds on a provision in the Violence Against Women Act Reauthorization Act of 2022, which already allows legal action for non-faked explicit images.
We all know that AI-generated manipulated images, commonly known as deepfakes, have become increasingly popular and sophisticated in recent years. Thanks to off-the-shelf generative AI tools, it has become easier than ever to create these fake images, even on systems with safeguards against explicit content.
Unfortunately, there hasn’t been much legal recourse for victims of nonconsensual pornography in many parts of the US. While most states have passed laws banning the nonconsensual sharing of real (non-faked) explicit images, laws addressing simulated or AI-generated imagery are far less common. And there’s currently no federal criminal law directly prohibiting either type.
However, President Joe Biden has made AI regulation a priority, and the White House has called on Congress to pass new laws in response to the Taylor Swift incident. This brings us to the DEFIANCE Act, which specifically targets AI-generated images but also covers other types of forgeries.
The Act defines a digital forgery as any “intimate” sexual image created using software, machine learning, artificial intelligence, or any other computer-generated or technological means that appears indistinguishable from an authentic visual depiction of the individual. That even includes real pictures that have been modified to look sexually explicit.
Now, you might be wondering about other bills addressing AI and nonconsensual pornography that have been proposed in Congress. Well, let me tell you, there have been quite a few. Just earlier this month, lawmakers introduced the No AI FRAUD Act, which aimed to ban the use of technology to imitate someone without their permission.
While this might sound like a good idea at first, it raises some serious concerns about artistic expression. It could potentially allow powerful individuals to sue over political parodies, reenactments, or even creative fictional treatments. The DEFIANCE Act, on the other hand, is more limited in scope but still faces challenges in getting passed into law.
So there you have it, folks! The world of technology and the law collide once again with this proposed bill to combat nonconsensual AI porn. Stay tuned for more updates on this story and many more tech-related shenanigans from yours truly, Nuked!