Discord enforces identity verification in March! 70,000 users' personal data leaked, sparking privacy panic

MarketWhisper

Discord announced that starting in early March this year, it will implement stricter age verification measures worldwide. To create a safer experience for teenagers, users who do not complete age verification will automatically be treated as minors, with restricted access to adult content and certain real-time interaction features. The announcement has revived memories of October 2025, when the platform's third-party customer service provider 5CA was hacked, leaking approximately 70,000 users' personal data and sparking widespread concern.

Starting in March, Unverified Users Are Automatically Considered Minors

Yesterday, Discord issued a statement announcing that from early March this year, it will enforce more rigorous age verification measures globally. To foster a safer environment for teenagers, Discord said, users who do not complete age verification will automatically be treated as minors: they will be restricted from viewing sensitive content and from accessing age-restricted servers and channels, they will be unable to speak or broadcast in Stage channels, their direct message requests will be filtered into a separate inbox, and their friend requests will carry warning labels. This "default minor" approach effectively shifts the burden of proof onto users: prove you are an adult, or be treated as a minor.

The core controversy surrounding this design is: why do adults need to prove their age to the platform? In traditional society, minors are required to show ID when entering bars or purchasing cigarettes and alcohol, but adults generally do not need to constantly verify their age in daily life. Discord’s new policy effectively requires all users to “prove their age online,” a shift critics see as a fundamental infringement on online anonymity and freedom.

To unlock the above restrictions, users must verify their age in one of two ways: either by providing a selfie for “facial age estimation” or by uploading a government-issued ID. Discord emphasizes privacy: facial scans are performed only on the user’s device and are not uploaded to servers, and IDs submitted to partners for verification are deleted immediately after verification.

Two Methods of Discord Age Verification

Facial Age Estimation: Selfie analyzed locally on device, not uploaded to server (claimed)

ID Verification: Upload government ID to third-party verification service, then deleted (promised)

However, these "privacy guarantees" have done little to reassure users, especially given Discord's past security record. Even if facial scans really are performed only locally today, who can guarantee the policy won't change in the future? Even if IDs are "deleted after verification," the process routes them through third-party systems, creating potential points of failure. And Discord has already suffered a painful data breach.

The Painful Lesson of 2025: 70,000 Users' Personal Data Leaked

Although Discord has repeatedly offered assurances about the privacy of the verification process, users have not forgotten that in October 2025, the third-party customer service provider 5CA was hacked, leading to the leak of about 70,000 users' personal data. The leaked data included ID images uploaded for age verification, as well as names, email addresses, and the last four digits of credit card numbers.

While Discord emphasized that its own systems were not compromised and refused to pay ransom to hackers, the incident proved that handling IDs through third parties carries significant risks. Electronic Frontier Foundation (EFF) policy director Maddie Daly pointed out that the Discord incident highlights the need for stricter oversight and encryption protections when outsourcing age verification and customer service processes.

The leak of 70,000 records is a moderate-scale data breach, but its impact is extremely serious. IDs contain names, birth dates, addresses, ID numbers, and photos—information sufficient for identity theft, loan applications, account opening, or even criminal activity. Once leaked, victims may face years of identity theft risks, with high costs to repair credit records and resolve legal issues.

Even more concerning, Discord suffered this breach in October 2025, yet just four months later plans to roll out the same age verification system globally. This apparent refusal to learn from the incident makes it difficult for users to trust the platform's privacy guarantees. If Discord cannot absorb the lessons of a breach only four months old, how can it ensure future data security?

Community backlash has been intense. Many users say they would rather give up access to adult content than risk uploading ID documents. Some threaten to switch to platforms like Telegram if forced to comply. The potential loss of users may be the biggest pressure on Discord.

Dilemma for VTubers and Anonymous Creators

Discord's user base includes many anonymous creator communities, such as virtual YouTubers (VTubers), who rely on virtual avatars to keep their real identities private and avoid doxxing. The new policy forces them to upload IDs to a platform with a known breach history, leaving them caught between losing their privacy and losing platform functionality.

The VTuber industry relies heavily on anonymity. Many VTubers’ appeal lies in their virtual personas and separation from their real identities; fans enjoy the characters, not the real people. If their real identities are exposed, it could lead to fan disappointment, doxxing harassment, or even threats to personal safety. There have been cases in Japan where VTuber identities were doxxed, leading to stalking and threats. For these creators, uploading ID to Discord is like entrusting their fate to a platform that has already proven unreliable.

Some netizens criticize Discord for shifting the responsibility of guardianship, traditionally a parental duty, onto all adult users. Defaulting every adult to a monitored state and demanding personal data to escape it amounts to a "presumption of guilt" that many find hard to accept. Protecting minors is a societal responsibility, but it should be fulfilled through parental supervision and education, not by forcing all adults to prove they are not minors.

Global trends toward stricter age controls on social platforms are increasing. Last year, Discord cooperated with regulations in the UK and Australia to implement similar measures, and Spain is also planning to follow suit, proposing to ban users under 16 from social platforms. However, Telegram founder Pavel Durov criticized such policies, arguing that they are nominally aimed at protecting minors but in reality serve as government surveillance tools, threatening online freedom and potentially enabling political censorship.

Unlike regulations enforced only in certain jurisdictions, Discord is proactively extending its age verification measures worldwide, including to countries that do not mandate social media restrictions for minors. This expansive rollout has raised questions about Discord's motives: is it genuinely trying to protect children, or is it building a global database of user identities for future commercialization or government monitoring?
