The European Union flag blowing in a breeze.

The Digital Services Act was presented by the European Commission in Brussels at the end of 2020, setting out detailed content moderation rules for online platforms.

What would the Act do?

The Digital Services Act would require social media platforms to manage and take responsibility for removing illegal content from their services. Illegal content would range from hate speech to the sale of counterfeit goods. The Act would also safeguard users whose content has been erroneously removed from a platform.

The European Commission is also pushing for more transparency around the platforms’ online advertising and the algorithms used to recommend content to their users. By implementing more rigorous checks, EU lawmakers seek greater traceability of business users in online marketplaces, making criminals who sell counterfeit goods easier to catch and bring to justice. If enshrined in law, the Digital Services Act stands to further protect EU citizens against cyber crime and hate speech.

Following closer examination of online social media platforms by European officials, regulators have designated the largest platforms as ‘gatekeepers’. Large platforms such as Facebook and Twitter boast more than 45 million users, the equivalent of 10 percent of the bloc’s total population, so the potential repercussions of cyber crime are significant. Penalties for non-compliance are set to include fines of up to 6 percent of a company’s global turnover, running into the billions.

Some Countries Are Already Instigating a Crackdown of Their Own

Some European countries have already begun to implement greater checks on digital platforms within their own domestic law. France, for example, is planning to require greater transparency around the online content moderation process. Austria is similarly formulating its own proposals in a bid to curtail online hate speech through law. Vienna is particularly keen to grant authorities the power to actively remove offensive content from social media and digital networks.

The European Commission has previously launched multiple investigations into ‘GAFA’ (Google, Apple, Facebook and Amazon) and handed out substantial fines to the companies when deemed necessary. Margrethe Vestager, the Executive Vice President of the European Commission, has headed the majority of previous investigations. Vestager has reiterated that ‘complaints keep coming through our door, so we have many more investigations.’ Whilst the major GAFA digital companies have been operational for years, the emergence of other platforms, such as Twitter, Instagram, YouTube and TikTok, has radically reshaped the technological and democratic landscape of the digital world. According to the Commission, updated laws and regulations are needed.

Looking to the Future

So what are the implications of the planned Act? The proposed law is set to have a great impact on marketers, social media managers, content creators and broader businesses as they continue to advertise and digitally interact with European citizens. For those who provide a digital service to consumers, the Act will provide certainty as to what constitutes legal content and what doesn’t. Moreover, launching a startup and scaling it up will be significantly easier to manage. Businesses that use digital services will benefit from greater choice and lower prices when selecting the service they wish to use, as illegal providers are increasingly removed from the market.

The proposed Digital Services Act is still in the pipeline. It could take months, if not years, to be officially enshrined in law, as the Act has yet to be approved by both the European Parliament and the European Council. That said, Vestager has said she hopes the Digital Services Act could become law in approximately 18 months.
