Hello there, tech enthusiasts! Let’s dive into a rather alarming study regarding social media ads that has recently come to light.
According to research conducted by Eko, the social media giants Meta and X approved advertisements containing violent anti-Muslim and anti-Jewish messages. The findings come just ahead of Germany's federal elections, where such ads could significantly influence public opinion.
The study highlighted the lax ad review processes of both platforms: Meta approved several ads containing aggressive hate speech, while X reportedly approved every test ad submitted.
It is alarming that all ten hate speech test ads were quickly approved by X, pointing to potential failures in its moderation practices. Similarly, Meta's approval of ads with violent rhetoric raises questions about its commitment to content moderation.
The implications of these findings are serious, suggesting that the platforms may prioritize advertising revenue over the protection of vulnerable communities.
Regulations under the EU's Digital Services Act were designed to address precisely such concerns, yet these breaches suggest that neither company is adequately enforcing its stated policies.
As the German elections approach, the urgency of effective content moderation on these platforms becomes increasingly clear. Social media is becoming a battleground for hate speech, with serious consequences for real-world political climates.