Hey there, my tech-loving followers! It’s your funny guy Nuked here, ready to bring you some interesting news from the world of technology. So, let’s dive right in!
Meta, the company behind Instagram and Facebook, recently announced a new feature for Threads, its text-based social app. In a blog post update, they revealed that Threads will now have an option to control the amount of fact-checked content users see in their feed. This means you can decide how prominently that kind of content shows up for you.
The feature comes with three levels of control: “Don’t reduce,” “Reduce,” and “Reduce more.” These options won’t completely hide content, but they will lower the ranking of posts deemed to contain false or misleading information. To access this setting on Threads, tap the two-line menu icon in the upper-right corner, then go to Account > Other account settings > Content preferences > Reduced by fact-checking.
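To picture how a preference like this could interact with feed ranking, here’s a minimal sketch. To be clear: the multiplier values, function names, and data shapes below are my own illustration, not Meta’s actual implementation — the only things taken from the announcement are the three option names and the fact that flagged posts get demoted rather than hidden.

```python
# Hypothetical sketch: demote (never hide) fact-checked posts in a ranked
# feed according to the user's chosen preference level.
# The preference names mirror Threads' options; the multipliers are invented.

DEMOTION = {
    "dont_reduce": 1.0,   # leave the post's score untouched
    "reduce": 0.5,        # push flagged posts lower in the feed
    "reduce_more": 0.2,   # push them lower still, but keep them visible
}

def rank_feed(posts, preference="reduce"):
    """Sort posts by score, demoting fact-checked ones per user preference."""
    factor = DEMOTION[preference]

    def effective_score(post):
        score = post["score"]
        if post.get("fact_checked"):
            score *= factor
        return score

    return sorted(posts, key=effective_score, reverse=True)

feed = [
    {"id": 1, "score": 90, "fact_checked": True},
    {"id": 2, "score": 60, "fact_checked": False},
    {"id": 3, "score": 40, "fact_checked": False},
]
print([p["id"] for p in rank_feed(feed, "reduce_more")])  # → [2, 3, 1]
```

Note that even under “Reduce more,” the flagged post stays in the list — it just sinks, which matches Meta’s description of demotion rather than removal.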
Now, you might be wondering why this feature is so compelling. Well, think of it as a “drama” filter for your social media life! Who hasn’t wished for that at some point? Meta stated that they want to give users more power to control the algorithm that ranks posts in their feed, responding to their demands for a greater ability to decide what they see on their apps.
However, NBC News raised concerns about potential censorship with this tool, pointing out that it could be used to suppress content related to sensitive topics like the Israel-Hamas war. Whether or not those concerns pan out, it’s clear there is room for controversy when users themselves are handed a role in deciding what they see.
Meta relies on third-party fact-checkers to rate content on Instagram and Facebook. Although these fact-checkers can’t directly rate Threads content, Meta will transfer ratings from Instagram and Facebook to similar content on Threads. So, the fact-checking process indirectly applies to Threads as well.
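As a toy illustration of how a rating on one platform might carry over to near-identical posts on another, here’s a sketch that matches content by a normalized text fingerprint. This matching approach and all the names in it are my own simplification — the source only says Meta transfers ratings from Instagram and Facebook to “similar content” on Threads, without explaining how similarity is determined.

```python
# Toy illustration of carrying a fact-check rating from one platform's posts
# to matching content on another. Real systems use far more robust text and
# media matching; this just normalizes whitespace/case and compares hashes.

import hashlib

def fingerprint(text):
    """Normalize text and hash it so near-identical copies collide."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode()).hexdigest()

def transfer_ratings(rated_posts, threads_posts):
    """Apply existing ratings to Threads posts with matching fingerprints."""
    ratings = {fingerprint(p["text"]): p["rating"] for p in rated_posts}
    for post in threads_posts:
        rating = ratings.get(fingerprint(post["text"]))
        if rating:
            post["rating"] = rating
    return threads_posts

facebook_posts = [{"text": "The moon is made of cheese.", "rating": "false"}]
threads_posts = [{"text": "The  moon is made of  CHEESE."}]
print(transfer_ratings(facebook_posts, threads_posts))
# → [{'text': 'The  moon is made of  CHEESE.', 'rating': 'false'}]
```

The point of the sketch is simply that fact-checkers never touch Threads directly: the rating rides along from the platform where it was assigned.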
Interestingly, Meta has had fact-check ranking options on Instagram for years, but they never properly announced it. They added the feature to Facebook in May to make user controls more consistent across their platforms. Moderation has been a challenge for social networks, especially with the rapid expansion of online communication. Meta understands the importance of moderating their platform, not just because of legal requirements, but also because advertisers play a significant role in their success.
Let’s take X (formerly Twitter) as an example. Their revenue reportedly plummeted due to unmoderated rhetoric and controversial content, which drove advertisers away. Meta knows that they need to strike a balance and ensure a safe and engaging environment for users and advertisers alike.
And that’s all for today, folks! Stay tuned for more tech updates and remember, technology can be funny too! Signing off, your tech-loving comedian Nuked.