Meta Is Rolling Out Parental Controls And Tighter Messaging Limits For Teens
(CTN News) – Meta announced today that it is rolling out updated DM restrictions for teens on both Facebook and Instagram to prevent unsolicited messaging.
Until now, Instagram has restricted adults over 18 from messaging teens who don’t follow them. In some geographies, the new limits will also apply to users under 16. Meta will notify existing users of the change.
On Messenger, teens will only receive messages from Facebook friends and their contacts.
Further, Meta is enhancing its parental controls so guardians can approve or deny changes teens make to default privacy settings. Previously, guardians were notified when teens changed these settings but could not intervene.
For example, a guardian can block a teen’s attempt to change who can DM them or to switch their account from private to public.
Instagram’s parental supervision tools were first introduced in 2022, giving guardians a sense of how their teens are using the app.
A new feature will also protect teens from receiving unwanted and inappropriate images from people connected to them in their DMs, according to the social media giant. This feature will work in end-to-end encrypted chats as well, and will discourage teens from sending these types of images themselves.
Meta didn’t specify how it will preserve teens’ privacy while implementing these features, nor did it define what counts as “inappropriate.” Since announcing new tools earlier this month, Meta has restricted Facebook and Instagram from showing teens content related to self-harm and eating disorders.
Meta received a formal information request from EU regulators last month, who asked for more details about the company’s efforts to prevent the sharing of self-generated child sexual abuse material (SG-CSAM).
In addition, Meta faces a civil lawsuit in New Mexico state court alleging that its social networks serve sexual content to teens and promote underage accounts to predators. In October, more than 40 US states sued the company in a federal court in California, accusing it of designing products in ways that harm kids’ mental health.
On January 31 this year, the company, along with other social networks including TikTok, Snap, Discord, and X (formerly Twitter), will testify before the Senate on child safety issues.