In the lead-up to Germany’s federal elections, the social media platforms Meta and X (formerly Twitter) approved advertisements containing violent anti-Muslim and anti-Jewish hate speech, according to Eko, a non-profit organization focused on corporate accountability. The group ran tests to determine whether the platforms’ ad review systems would accept or reject ads featuring hateful and violent messages aimed at minorities at a time when immigration dominates political discourse.
The researchers submitted test ads containing anti-Muslim slurs, calls to imprison immigrants in concentration camps or gas them, and AI-generated images of burning mosques and synagogues. Most of the ads were approved within hours of submission in mid-February; Germany’s federal elections were scheduled for Sunday, February 23.
According to Eko, X approved all ten hate speech ads its researchers submitted shortly before the election, while Meta approved five of ten, clearing them to run on Facebook and potentially on Instagram. Meta rejected the other five on the grounds that they posed political or social risks that could influence the election. The five ads Meta did approve, however, contained violent hate speech that described Muslim refugees in derogatory terms and called for violence against them and against synagogues.
The AI-generated imagery in the ads was not labeled as artificially created, yet Meta approved half of them despite its own policy requiring such disclosure for ads about social issues, elections, or politics.
X not only approved the hate-filled ads targeting Muslims and Jews but also accepted additional ads spreading similar messages, including accusations against immigrants and Jews, AI-generated antisemitic imagery, and false claims attacking political parties that were designed to incite violence.
Elon Musk, the owner of X, has actively influenced the German election by using the platform to endorse the far-right AfD party. Eko’s researchers disabled all of their test ads before they could run, ensuring none were actually published.
The tests exposed significant weaknesses in how these platforms moderate ad content, content from which they stand to profit financially. Eko has submitted its findings to the European Commission, which oversees enforcement of the Digital Services Act (DSA) on both platforms. The EU’s ongoing investigations into Meta and X cover issues including election security and illegal content; the DSA provides for financial penalties for non-compliance and, ultimately, for blocking access to offending platforms.
With Germany’s election imminent, recent studies suggest that the EU’s online governance rules have not effectively shielded the democratic process from tech-related threats. Reports from Global Witness and other civil society groups indicate that platforms such as X and TikTok display bias in their content promotion, potentially limiting voters’ access to diverse political information.
Eko commended the European Commission for opening investigations into Meta and X but stressed the need for decisive corrective action. The organization criticized the platforms for failing to self-regulate, noting that they continue to allow hate speech and misinformation, and called on regulators to enforce pre-election mitigation measures. It also raised concerns about potential U.S. political pressure on the EU to weaken its tech regulations.