
EU Fines TikTok 345 Million Euros for Child Data Breaches


On Friday, a European Union regulator fined the Chinese-owned social media platform TikTok 345 million euros for breaches of children's data protection rules, the bloc's latest salvo against internet titans' commercial practices. The fine, equivalent to roughly $369 million, is the result of a two-year investigation by Ireland's Data Protection Commission (DPC).

The Irish watchdog, which plays a central role in enforcing the EU's stringent General Data Protection Regulation, gave TikTok three months to "bring its processing into compliance" with its standards.

In September 2021, the DPC began investigating TikTok’s GDPR compliance in relation to platform settings and personal data processing for users under the age of 18.

It also investigated TikTok's age verification mechanisms for users under the age of 13 and found no violations, but concluded that the platform had failed to properly assess the risks to younger people joining the service. In its judgement, the regulator emphasised that minors who signed up had their TikTok accounts set to public by default, allowing anyone to view or comment on their videos.

It also criticised TikTok's "family pairing" option, which is intended to link parents' accounts with those of their adolescent children; the DPC found that the company did not verify parent or guardian status.

Because TikTok, Google, Meta, and X, formerly Twitter, have their European headquarters in Dublin, Ireland sits at the heart of the GDPR system.

The DPC penalised Meta a record 1.2 billion euros in May for transferring EU user data to the US in violation of a previous court judgement.

TikTok, a part of the Chinese digital behemoth ByteDance, has 150 million users in the United States and 134 million in the European Union.


TikTok responded to the penalty by saying it "respectfully disagrees" with the ruling and was "evaluating" its next steps.

“The DPC’s criticisms focus on features and settings that were in place three years ago, and that we changed well before the investigation even began, such as setting all under 16 accounts to private by default,” a TikTok representative told AP.

The platform says it regularly checks its users' ages and takes appropriate action when necessary.

TikTok claims to have deleted nearly 17 million accounts in the first three months of this year owing to concerns that they belonged to people under the age of 13.

Earlier this month, the social media behemoth opened a long-promised data centre in Ireland in an attempt to allay European concerns over data protection.

GDPR, which took effect in 2018, is the EU's strictest and most well-known tech law, requiring companies to obtain users' consent for how their data is used.

The fine comes as the EU unveiled a list of digital behemoths last week, including Apple, Facebook owner Meta, and ByteDance, that would face new commercial restrictions.

Child Predators on TikTok

Many TikTok accounts have become gateways to some of the Internet's most dangerous and disturbing content. Despite being private, almost anyone can gain access.

TikTok makes the posts easy to find. They often look like adverts and come from otherwise unremarkable accounts.

However, they are frequently gateways to illegal child sexual abuse content hidden in plain sight: posted in private accounts with a setting that makes it viewable only to the person logged in.

From the outside, nothing is visible; inside are violent videos of minors stripping naked, masturbating, and engaging in other exploitative behaviour. Gaining access is as easy as asking a stranger on TikTok for the password.

TikTok's security rules expressly forbid users from sharing their login credentials with others. However, a Forbes investigation found that this is exactly what is happening.

The reporting, which followed the advice of a legal expert, revealed how easily underage victims of sexual exploitation and predators can meet and trade illegal images on one of the world's largest social media platforms.

Forbes identified a major blind spot where moderation is falling short and TikTok is struggling to enforce its own guidelines, despite a “zero tolerance” policy for child sexual abuse material.

The problem of closed social media platforms creating breeding grounds for unlawful or violative behaviour is not limited to TikTok; for example, groups facilitating child predation have been discovered on Facebook. (Its parent, Meta, did not respond.) However, TikTok’s burgeoning popularity among young Americans—more than half of U.S. children now use the app at least once a day—has made the issue prevalent enough to catch the notice of state and federal authorities.

TikTok has "zero tolerance for child sexual abuse material and this abhorrent behaviour, which is strictly prohibited on our platform," according to spokesperson Mahsau Cullinane.

"When we become aware of any content, we immediately remove it, ban accounts, and make reports to [the National Center for Missing & Exploited Children]."

TikTok also stated that all videos posted to the platform, public and private, even those visible only to the account holder, are subject to AI moderation and, in some cases, additional human review.

Direct messages are also subject to monitoring. According to TikTok, accounts found to be seeking or distributing child sexual abuse material are removed.

