The EU identified 19 online platforms, including services run by Meta, TikTok, and Twitter, as having such large user bases that they will be subject to stricter content-moderation rules.
The designated services, operated by companies including Amazon, Google, Meta, Twitter, TikTok and Microsoft, all have more than 45 million monthly active users in the EU.
The designation places them under a new EU regulation known as the Digital Services Act (DSA), which takes effect in August and imposes measures such as annual independent audits and a duty to effectively combat disinformation and hateful content.
“These platforms and search engines will not be able to act as if they were ‘too big to care’ in four months,” said Thierry Breton, the EU’s internal market commissioner, in a statement.
“This new supervision system will cast a wide and tight net, catching all points of failure in a platform’s compliance,” he continued.
The platforms with 45 million or more users include Twitter; Alphabet’s Google Search, Google Maps, Google Shopping, and Google Play services, along with its YouTube subsidiary; and Meta’s Facebook and Instagram.
Others include Microsoft’s LinkedIn, Apple’s iOS App Store, the online encyclopaedia Wikipedia, Snapchat, and the image-sharing platform Pinterest.
They are classified as a “Very Large Online Platform” (VLOP) or a “Very Large Online Search Engine” (VLOSE) under the DSA.
The majority of the companies on the list are based in the United States, although Chinese-owned platforms TikTok and e-commerce site AliExpress are also included. The commission also named Zalando, a German online fashion company.
Platforms to better protect children
Breton told reporters on Tuesday that his team will conduct “stress tests” to ensure Twitter’s compliance readiness “at the end of June.”
He went on to say that TikTok had expressed an interest in working with him to ensure compliance.
Tuesday’s announcement follows a February deadline for online companies operating in Europe to report their user numbers.
The DSA has a wide range of goals, including requiring platforms to better protect children, increasing transparency around digital services, prohibiting the online sale of dangerous items, and providing customers with more options when shopping online in the EU.
The rules empower the EU to levy fines of up to 6% of a platform’s annual global turnover for repeated violations.
By August 25, 2023, the 19 platforms must have an independent compliance framework in place and provide the European Commission with their first annual risk assessment, including how they intend to handle content on mental health and gender-based violence.
The commission will then provide oversight, alongside independent audits of each platform.
Margrethe Vestager, vice president of the European Commission, said the designations were a “huge step forward” for the DSA in bringing “meaningful transparency and accountability of platforms and search engines and giving consumers more control.”
Swedish music-streaming service Spotify, US dating app Tinder, and home-rental platform Airbnb are among the online businesses that have declared that they have fewer than 45 million users.
Breton stated that “four to five” more platforms could be added to the list “in the coming weeks,” but did not specify which ones.
The DSA is one of two major regulations established by the EU last year to regulate digital platforms in order to safeguard EU users.
The specific obligations for very large platforms come on top of the general DSA rules, which will apply to all digital services beginning February 17, 2024.
The Digital Markets Act, the second law, outlaws anti-competitive action by so-called “gatekeepers” of the internet.
European Union Digital Services Act
The EU Digital Services Act (DSA) is a legislative framework aimed at regulating online platforms and digital services within the European Union (EU). The DSA is part of the EU’s larger Digital Single Market strategy, which seeks to harmonize digital regulations and create a level playing field for businesses operating in the EU.
The main goal of the DSA is to establish clear rules for digital services, particularly online platforms, to ensure the safety and security of users and protect fundamental rights such as freedom of expression and privacy. It also aims to promote competition, innovation, and accountability in the digital marketplace.
Some key elements of the DSA include:
– New obligations for online platforms to detect, remove and prevent the spread of illegal content, including hate speech, terrorist propaganda, and child sexual abuse material.
– New categories of “Very Large Online Platforms” and “Very Large Online Search Engines,” which have significant reach and are subject to additional regulatory obligations.
– Enhanced transparency and reporting requirements for online platforms, including clear terms and conditions for users and increased access to data for researchers and regulators.
– Improved user rights, including the right to challenge content moderation decisions and the right to access data that platforms collect about users’ activities.
The DSA was adopted in 2022 and, as an EU regulation, applies directly across all member states without needing to be transposed into national law. Its obligations are phasing in, with full application to all covered services from February 2024.