Hello, my wonderful followers! Today, let’s talk about the latest news from OpenAI regarding their new AI image detection tools.
In a recent blog post, OpenAI revealed that they are developing new methods to trace the origins of AI-generated content. This includes a new image detection classifier that can determine whether a photo was created with their DALL-E 3 image generator.
The classifier is said to be quite accurate, and it reportedly keeps working even when images have been cropped, compressed, or had their saturation adjusted. However, it is much less reliable at identifying content produced by other AI models.
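To make that concrete, here is a minimal sketch (in Python with the Pillow library, and definitely not OpenAI's own code) of the kinds of edits the classifier is said to tolerate; the file name `dalle_image.png` is just a placeholder.

```python
# Minimal sketch: apply the edits the classifier reportedly tolerates.
# Assumes Pillow is installed (pip install Pillow); "dalle_image.png" is a placeholder.
from io import BytesIO

from PIL import Image, ImageEnhance


def perturb(path: str) -> list[Image.Image]:
    img = Image.open(path).convert("RGB")
    w, h = img.size

    # Crop away a 10% border on every side.
    cropped = img.crop((w // 10, h // 10, w - w // 10, h - h // 10))

    # Re-encode at low JPEG quality to simulate heavy compression.
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=40)
    compressed = Image.open(BytesIO(buf.getvalue()))

    # Cut the colour saturation in half.
    desaturated = ImageEnhance.Color(img).enhance(0.5)

    return [cropped, compressed, desaturated]


if __name__ == "__main__":
    variants = perturb("dalle_image.png")
```

According to OpenAI, images edited in these ways would still usually be flagged as DALL-E-generated.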
OpenAI has also been attaching provenance signals to the content its platforms generate: tamper-evident C2PA metadata on images and an audio watermark on clips produced by Voice Engine. These signals record details about how the content was created and which tool produced it.
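If you are curious what that looks like in practice, here is a very rough, purely illustrative check (not OpenAI's tooling, and not a real verification flow) that scans a file's raw bytes for the JUMBF/C2PA markers that typically appear when a C2PA manifest has been embedded; proper verification needs a dedicated C2PA toolkit that also validates the cryptographic signatures.

```python
# Rough heuristic only: look for the box identifiers that C2PA metadata is
# usually embedded in. This does NOT verify anything; it just hints that a
# manifest may be present. "generated.png" is a placeholder file name.
from pathlib import Path


def looks_like_it_has_c2pa(path: str) -> bool:
    data = Path(path).read_bytes()
    # C2PA manifests live inside JUMBF boxes, so these byte strings
    # commonly show up when provenance metadata has been attached.
    return b"jumb" in data and b"c2pa" in data


if __name__ == "__main__":
    print(looks_like_it_has_c2pa("generated.png"))
```

Keep in mind that metadata like this can be stripped by re-encoding or screenshotting an image, which is part of why a separate detection classifier matters.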
Both the image classifier and the audio watermarking are still being refined, and OpenAI is asking for outside feedback to improve their effectiveness. Researchers and research-oriented nonprofit groups can apply to test the image detection classifier through OpenAI's research access platform.
While OpenAI has been working on detecting AI-generated content for years, they hit a setback in 2023 when they shut down their tool for identifying AI-written text because of its low accuracy. Despite that, OpenAI continues to refine and expand its content detection tools.