European Commission Implements Digital Services Act to Combat Misinformation and Disinformation


“This legislation… is a Trojan horse: it presents a facade of respecting democratic principles… But behind this liberal facade, the exact opposite is happening: an attack is taking place against the constitutional order.”

European Commission President Ursula von der Leyen opened her speech at Davos this year by underscoring the “top concern” among the World Economic Forum’s partner companies, which also happens to be one of the Commission’s biggest worries as well: “misinformation and disinformation.” These two risks, she said, are “serious because they limit our ability to tackle the big global challenges we are facing – climate, demographics and technological changes, and spiralling regional conflicts and intensified geopolitical competition.”

The primary solution to the problem of mis- and disinformation, according to von der Leyen, is to forge a grand coalition of sorts between "business and governments" — which, as luck would have it, fits snugly with the WEF's primary mission: promoting public-private partnerships at all levels and in all areas of government, primarily for the benefit of its partner companies.


"It has never been more important," von der Leyen said, "for the public and private sector to create new connective tissue. Because none of these challenges respect borders. They each require collaboration to manage risks and forge a path forward."

Through its Digital Services Act (DSA), the European Commission has already put into operation arguably the most ambitious manifestation yet of this grand coalition between government and business.


The DSA imposes a legal requirement on very large online platforms (VLOPs) and very large online search engines (VLOSEs) to take prompt action against illegal content hosted on their platforms (e.g. removing it, blocking it, or providing certain information to the authorities concerned). Platforms are also required to take action against hate speech and dis- or misinformation if such content is deemed to have "actual or foreseeable negative effects on civic discourse and electoral processes, and public security" and/or "actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person's physical and mental well-being."
