SAN FRANCISCO – Discord, the widely used chat app for gamers, creators, and online communities, is changing how accounts are set up and what users can access. In early March, Discord will begin a phased worldwide rollout of "teen-by-default" settings along with new age assurance tools. Under the plan, both new and existing users will be placed in a more limited, age-appropriate experience unless they confirm they are adults.
Discord says these updates build on protections it already launched in the United Kingdom and Australia last year. The company frames the shift as a way to give teens stronger safety controls, while still letting verified adults use more features and view restricted content.
Discord Says It’s About Teen Safety
Discord puts teen protection at the center of the update. The platform reports more than 200 million monthly active users and has long required users to be at least 13 under its terms. Still, worries about minors seeing mature or adult material have grown, along with pressure from regulators and lawmakers.
With the new defaults, Discord says teens will get tighter guardrails right away. These include filters for possibly mature photos and videos, stricter limits on message requests from people they do not know, alerts tied to friend requests, and reduced access to certain features like speaking in live Stage channels. Servers and channels marked age-restricted, plus some commands, will be available only to people confirmed as adults.
Discord also points to legal and industry shifts that are pushing age gates across social platforms. The UK's Online Safety Act 2023 is one example, raising the bar for child safety and increasing platforms' responsibility for the content minors can reach. Discord says similar rules are appearing in other regions, and companies are being pushed to adopt age assurance methods.
To confirm age, Discord says users can choose one of two options. The first is facial age estimation using a short video selfie that is processed on-device, meaning the scan does not leave the phone. The second is uploading a government-issued ID through vendor partners. Discord says it receives only an age result, not identity details: IDs are deleted once the age is extracted, and facial data is neither stored nor transmitted.
On top of that, Discord is introducing an “age inference model.” The company describes it as a background machine-learning system that reviews signals like account behavior, account age, device data, and community patterns to predict whether someone is an adult with high confidence. Discord claims many adults will not need to complete manual verification because the system will grant full access automatically.
“Rolling out teen-by-default settings globally builds on Discord’s existing safety architecture, giving teens strong protections while allowing verified adults flexibility,” Savannah Badalich, Discord’s head of product policy, said in the company’s press release.
Privacy Concerns and More Talk of Leaving
The news sparked a heated debate online, including on Reddit and X (formerly Twitter). Some users agree with the focus on minors. Others say the approach goes too far, voicing worries about privacy and the risks tied to verification.
Some critics point to a third-party vendor breach from September 2025 that exposed user information submitted through support tickets. Discord says the age assurance vendors were not part of that incident and repeats that its process is built to limit what data is shared. Even so, many users remain doubtful.
On Reddit, some posters called the change “a great way to kill your community,” and said they plan to cancel Nitro and move servers to other platforms. Others wrote that they refuse to upload a face scan or ID because they do not trust any system to stay secure.
On X, reactions range from blunt refusals to broader concerns about company leadership and outsourcing. Several users said they already canceled subscriptions and are looking at options such as Matrix, Signal, or Revolt.
Discord has tried to calm the criticism by saying verification will not be required for basic use. It also says most people who do not try to access age-restricted areas will see little change. The company also plans to form a Teen Council to bring younger users into future safety discussions.
Even with those assurances, the rollout highlights a growing conflict between online safety rules and the demand for privacy and anonymity. As more governments push for stricter age checks, platforms like Discord face a tough balance: protecting minors without driving away adults who want to share less personal data.
Discord's global rollout begins in March and will expand in phases to reduce disruption. It remains unclear whether this approach will become a common model for youth safety, or accelerate a shift toward smaller, more decentralized services.