Discord to require face scan or ID for adult content
Discord will soon require users worldwide to verify their age through a face scan or by uploading an official ID to access adult content, as the platform rolls out stricter safety measures aimed at protecting teenagers.
The online chat service, which has more than 200 million monthly users, said the new system will place everyone into a teen-appropriate experience by default.
Only users who successfully verify that they are adults will be able to access age-restricted communities, unblur sensitive material or receive direct messages from people they do not know.
Discord already requires age verification for some users in the UK and Australia to comply with local online safety laws. The company said the expanded checks will be introduced globally from early March.
“Nowhere is our safety work more important than when it comes to teen users,” said Savannah Badalich, Discord’s head of policy. She said the global rollout of teen-by-default settings would strengthen existing safety measures while still giving verified adults more flexibility.
Under the new system, users can either upload a photo of an identity document or take a short video selfie, with artificial intelligence used to estimate facial age. Discord said information used for age checks would not be stored by the platform or the verification provider, adding that face scans would not be collected and ID images would be deleted once verification is complete.
The company’s move has drawn mixed reactions. Drew Benvie, head of social media consultancy Battenhall, said the push for safer online communities was positive but warned that implementing age checks across millions of Discord communities could be challenging. He said the platform could lose users if the system backfires, but might also attract new users who value stronger safety standards.
Privacy advocates have previously raised concerns about age verification tools. In October, Discord faced criticism after ID photos of about 70,000 users were potentially exposed following a hack of a third-party firm involved in age checks.
The announcement comes amid growing pressure on social media companies from lawmakers to better protect children online. Discord’s chief executive Jason Citron was questioned about child safety at a US Senate hearing in 2024 alongside executives from Meta, Snap and TikTok.
The new measures, which also include the creation of a teen advisory council, put Discord in step with a broader industry trend seen at platforms such as Facebook, Instagram, TikTok and Roblox, as regulators worldwide push for safer online environments for young users.