EU adopts ‘censorship superweapon.’

The DSA imposes strict censorship obligations on large online platforms, penalizing those that fail to address “systemic risks” such as hate speech.

The Digital Services Act (DSA), which became fully applicable in February 2024, is the European Union’s flagship online censorship law. It aims to regulate online intermediaries and platforms such as marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms. The DSA’s key goals include protecting consumers’ fundamental rights, fostering innovation, and creating a fair and open online environment. It also addresses issues such as cyberbullying and illegal content, and requires platforms to present their terms and conditions in plain language. Very large online platforms and search engines with significant reach in Europe are subject to the law’s most stringent rules.

via foundationforfreedomonline:

    • The Digital Services Act (DSA) creates a unified framework for government-directed content moderation across the European Union.

    • Each EU member state now has a “digital services coordinator,” with the power to penalize online platforms if they fail to adequately address “systemic risks,” including hate speech and misinformation.

    • These official speech commissars can deputize third-party entities to act as “trusted flaggers,” empowering the global network of NGOs, research institutes, and private companies that make up the censorship industry.

    • Elon Musk’s X became the first platform to be investigated under the DSA, following months of threats from EU officials over Musk’s attempts to restore free speech to the platform.


Other than China’s Great Firewall, the DSA is arguably the most elaborate and wide-reaching instrument of government control of online content in the world.

The law creates censorship obligations for what it terms Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). This category, which includes any platform or search engine with more than 45 million monthly active users in the EU, is subject to the law’s most stringent censorship requirements, with fines of up to 6% of a provider’s global annual turnover for non-compliance.

What is required of the online platforms? The core requirement is that they develop tools to “identify, analyse, and assess systemic risks” related to their service, and then “put measures in place to mitigate these risks,” including “adapting the design or functioning of their services or changing their [recommendation] systems.” In other words, the EU wants platforms to identify and suppress content proactively — something that, realistically, can only be accomplished at scale using AI censorship tools.

The EU’s requirements for the type of content that ought to be suppressed are similarly far-reaching. The DSA identifies several categories of “systemic risk,” including threats to “public security and electoral processes,” as well as “gender-based violence,” “discrimination,” and “illegal content.”

 
