(CTN News) – Meta announced on Tuesday that it would tighten content restrictions on Facebook and Instagram as part of efforts to ensure the safety and well-being of teenage users.
The social media giant, formerly known as Facebook, is set to implement measures that restrict teenagers’ access to sensitive and “age-inappropriate” content by curtailing its visibility on its platforms.
In a statement released on its official website, Meta announced that it is expanding its existing restrictions on content related to self-harm, eating disorders, and mental health struggles.
These restrictions, initially limited to Reels and the Explore page, will now apply to teens’ regular feeds and Stories, even when the content comes from accounts they follow.
With the new update, Meta is also hiding more search results and terms associated with suicide, self-harm, and eating disorders for all users. The company aims to direct people searching for such content to appropriate support resources instead.
In announcing the changes, Meta emphasized that it wants teens to experience its apps in a safe, age-appropriate manner. Teen accounts on Facebook and Instagram will default to the most restrictive content control setting, limiting their exposure to potentially harmful material.
Meta will also actively encourage teens to update their privacy and security settings through notifications and prompts.
The decision comes at a time of heightened scrutiny over Meta’s impact on young users. In 2022, a family filed a lawsuit against Instagram after their teenage daughter was exposed to content on the platform that glorified anorexia and self-harm.
The “Facebook Papers,” leaked in 2021, also contained evidence that Meta’s own internal research had made it aware of Instagram’s negative effects on teen girls.
A former Meta engineering director and consultant who testified at a congressional hearing in November emphasized that the company needed to do more to protect children online.
Furthermore, a bipartisan group of 33 state attorneys general filed a lawsuit against Meta in October, claiming the company had built addictive features deliberately aimed at young users.