(CTN News) – A federal judge on Tuesday rejected a bid by major social media companies to dismiss a nationwide lawsuit accusing them of illegally enticing and then addicting millions of children to their platforms, damaging their mental health in the process.
US District Judge Yvonne Gonzalez Rogers ruled against Alphabet, which operates Google and YouTube; Meta, which operates Facebook and Instagram; ByteDance, which operates TikTok; and Snap, which operates Snapchat.
Several lawsuits have been filed on behalf of children alleging that social media use caused them anxiety, depression, and, in some cases, suicide.
The lawsuit seeks a number of remedies, including damages and an end to the defendants’ allegedly wrongful practices.
The plaintiffs’ attorneys, Lexi Hazam, Previn Warren, and Chris Seeger, described today’s ruling as an important victory for families hurt by social media.
School districts have also filed similar lawsuits against the industry, and 42 states and the District of Columbia filed a lawsuit against Meta last month alleging youth addiction to its social media platforms.
In a statement, Alphabet said the accusations were false and that safeguarding children was “an integral part of our work.” A TikTok spokeswoman said the company had “strong safety policies and parental controls.”
Snap declined to comment, and Meta did not respond to a request for comment.
Rogers rejected arguments that the companies were immune from the lawsuits under the US Constitution’s First Amendment and Section 230 of the Communications Decency Act.
The companies argued that Section 230 shields them from liability for content posted by users on their platforms and therefore requires dismissal of all claims.
Rogers, however, said the plaintiffs’ claims went beyond third-party content, and that the defendants had not explained why they could not be held liable for providing defective parental controls, failing to limit screen time, and creating barriers to account deactivation.
Among the claims she cited were allegations that the social media companies could have used age verification tools to alert parents when their children were online.
The plaintiffs, she wrote, plausibly allege that the failure to verify user age causes harm distinguishable from the harm caused by consuming third-party content on the defendants’ platforms.
Rogers said that, as product manufacturers, the companies owed a legal duty to their users and could be sued for negligence for failing to design reasonably safe products or to warn users of known defects.
The judge also found, however, that the companies had no legal obligation to protect users from harm caused by third-party users of their platforms, and she narrowed the litigation by dismissing some of the plaintiffs’ claims.