Hey there, my tech-loving followers! I’ve got an interesting piece of news for you today. It seems that our beloved decentralized network, Mastodon, has run into a bit of a problem. According to a study conducted by Stanford’s Internet Observatory, Mastodon is facing a significant issue with child sexual abuse material (CSAM). Yep, you heard that right. Let’s dive into the details!
During their research, the Internet Observatory team examined the 25 most popular Mastodon instances for CSAM. And what they found was quite alarming. In just two days, they turned up 112 matches of known CSAM across roughly 325,000 posts analyzed on the platform. And get this, the first match showed up after only about five minutes of searching. That's disturbingly fast.
To identify explicit images, the researchers ran media through Google’s SafeSearch API and through PhotoDNA, a hash-matching tool that detects known, previously flagged CSAM. With these tools in action, they came across a whopping 554 pieces of content matching hashtags or keywords commonly used by online child sexual abuse groups, all of which Google SafeSearch flagged as explicit with the “highest confidence.”
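If you’re curious what that kind of scanning looks like in code, here’s a minimal sketch of explicit-image detection with Google Cloud Vision’s SafeSearch feature. To be clear, this isn’t the researchers’ actual pipeline; it just leans on the publicly documented Python client, and the image path is a placeholder.

```python
# Minimal sketch: check whether SafeSearch flags an image as adult content
# with the highest confidence. Not the researchers' pipeline; the image
# path is a placeholder and error handling is omitted for brevity.
from google.cloud import vision


def flagged_explicit(image_path: str) -> bool:
    """Return True if SafeSearch rates the image as adult with VERY_LIKELY confidence."""
    client = vision.ImageAnnotatorClient()

    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())

    annotation = client.safe_search_detection(image=image).safe_search_annotation
    return annotation.adult == vision.Likelihood.VERY_LIKELY


if __name__ == "__main__":
    print(flagged_explicit("sample.jpg"))  # placeholder path
```

PhotoDNA, by contrast, is a hash-matching service restricted to vetted organizations, so there’s no drop-in public snippet for it.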
But wait, there’s more! The researchers also stumbled upon 713 uses of the top 20 CSAM-related hashtags across the Fediverse (the interconnected network of decentralized instances) in posts containing media. They even found 1,217 text-only posts that pointed to “off-site CSAM trading or grooming of minors.” It’s safe to say that the open posting of CSAM is sadly all too prevalent on Mastodon.
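For a sense of how hashtag-level scanning can work at all, here’s a generic illustration against Mastodon’s public API: it pulls a hashtag timeline from an instance and keeps only the posts that carry media attachments. It’s not the Observatory’s tooling, and the instance and hashtag below are harmless placeholders.

```python
# Generic illustration: fetch a hashtag timeline from a Mastodon instance and
# keep only posts with media attachments. Not the Observatory's tooling; the
# instance and hashtag used below are harmless placeholders.
import requests


def posts_with_media(instance: str, hashtag: str, limit: int = 40) -> list[dict]:
    """Return posts from a hashtag timeline that contain media attachments."""
    url = f"https://{instance}/api/v1/timelines/tag/{hashtag}"
    resp = requests.get(url, params={"limit": limit}, timeout=10)
    resp.raise_for_status()
    return [post for post in resp.json() if post.get("media_attachments")]


if __name__ == "__main__":
    for post in posts_with_media("mastodon.social", "photography"):
        print(post["uri"], "->", len(post["media_attachments"]), "attachment(s)")
```

Keep in mind that some instances require authentication for their public timelines, so a real crawler would also need API tokens and rate-limit handling.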
One incident worth mentioning is the extended server outage on mastodon.xyz earlier this month, which traces back to CSAM posted on Mastodon. In a post about the incident, the server’s sole maintainer said they were alerted to content containing CSAM, but since moderation there isn’t handled by a large team, it took a few days for action to happen. In the meantime, the mastodon.xyz domain was suspended over the content, leaving the instance inaccessible to users until the issue was resolved.
According to David Thiel, one of the researchers behind the study, the team got more PhotoDNA hits in those two days than they probably had in their entire history of doing social media analysis. That’s a sobering statistic. Thiel chalked much of it up to decentralized platforms lacking the tooling that centralized social media platforms use to address child safety concerns.
As decentralized networks like Mastodon gain popularity, safety concerns have grown along with them. Unlike mainstream platforms such as Facebook and Instagram, which moderate centrally, Mastodon leaves moderation up to each individual instance. That approach has its benefits, but it can also lead to inconsistent enforcement across the Fediverse.
To tackle this issue head-on, the researchers suggest that networks like Mastodon give moderators more robust tools, and that they integrate PhotoDNA scanning and CyberTipline reporting directly into the platform. Measures like these could go a long way toward protecting users and curbing the spread of CSAM.
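What might that integration look like? Here’s a hypothetical sketch of an upload-time moderation hook, with the big caveat that PhotoDNA and NCMEC’s CyberTipline reporting are restricted to vetted organizations, so the hash lookup and report submission below are explicit stand-ins rather than real client calls.

```python
# Hypothetical sketch of the kind of upload-time hook the researchers recommend:
# match new media against a known-CSAM hash list and escalate hits for reporting.
# PhotoDNA and the CyberTipline API require vetted access, so the hash check and
# report call below are explicit stand-ins, not real integrations.
import hashlib
from dataclasses import dataclass

# In practice this would be a perceptual-hash list supplied via PhotoDNA/NCMEC;
# a plain SHA-256 set is used here purely for illustration.
KNOWN_HASHES: set[str] = set()


@dataclass
class Upload:
    media_id: str
    uploader: str
    content: bytes


def matches_known_csam(content: bytes) -> bool:
    """Stand-in for a PhotoDNA lookup (exact hashing instead of perceptual hashing)."""
    return hashlib.sha256(content).hexdigest() in KNOWN_HASHES


def file_cybertip_report(upload: Upload) -> None:
    """Stand-in for submitting a CyberTipline report to NCMEC."""
    print(f"[report] media {upload.media_id} from {upload.uploader} escalated")


def on_media_upload(upload: Upload) -> bool:
    """Return True if the upload can be published, False if it was blocked."""
    if matches_known_csam(upload.content):
        file_cybertip_report(upload)
        return False  # keep the media out of public timelines entirely
    return True
```

The point of a hook like this is that the hash check happens before anything reaches a public timeline, rather than relying on after-the-fact reports to a single volunteer moderator.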
So, my fellow tech enthusiasts, it seems that our beloved Mastodon has encountered a serious problem that needs to be addressed. Let’s hope that the findings from this study prompt action and improvements in moderation and safety measures. Stay safe out there!