The description still contains: "Age-restricted channels are exempt from the explicit content filter."
EDIT: IANAL (or American), but if Discord was policing content for legality rather than age-appropriateness, wouldn't they lose DMCA Safe Harbor protections?
> The description still contains: "Age-restricted channels are exempt from the explicit content filter."
Wait! This does not mean they do not scan it. What I understand from that statement is that they filter explicit content, as in they prevent it from appearing on the user's screen.
When you enable the "NSFW" mode, you tell Discord "it's okay, don't filter out anything". But Discord probably still scans everything.
So that makes sense to me: if you don't validate your age, then Discord will not allow you to join channels that disable the "adult" filtering. I can personally live without adult content on Discord...
I wonder if Discord is legally forced to do that, or if they would rather do it themselves (and collect the data, $$$) than wait for a solution they don't own to be imposed on them.
I feel like age verification will come; there is no way around it (unlike ChatControl and the like, age verification seems reasonably feasible and has a lot of political traction right now).
But I would rather have a privacy-preserving solution for that, e.g. from the government (which already knows my age).
There are probably enough regions where it is required or will be required soon, that it makes sense to just get it over with.
The Internet is more or less becoming a locked down, controlled and fully observed thing for end users and citizens, so adapting to that world sooner and working within it is just sensible future-proofing.
This also lets them more safely target older users with ads, purchase requests, etc. and new integrations for gambling and other high ROI systems.
If you're a Slack user, I don't think they need your ID to tell that you're an adult
More seriously, it will become a problem if there is a significant user migration there and a repeat of the mass hysteria. Due to being more niche, these smaller platforms are probably not in danger right now.
There is a bit of an arms race between ID verification systems and users bypassing them with AI generation. Which is really just AI-generated images vs. AI-generated-image detection.
In practice, nothing will stop it: the tooling will gradually get better at detecting prior fakes and banning those users, while newer fakes will go undetected for longer.
Putting up the requirement satisfies their CYA requirements here. The race between AI fraud vs. detection is something they can just ignore and let happen on its own.
Another company jumping on the bandwagon to data-farm under the pretext of safeguarding children. I really wonder if there's a method to actually safeguard children while also not holding on to data. Because, genuinely, you can't question this: companies will just say "we are trying to protect kids," and that's the end of the argument.
I really wonder whether, once this is fully implemented, they will have any safeguards against selling "adult verified" accounts. With AI being a possible workaround for those who don't want to share an ID, selling accounts would be another big issue unless they check IP addresses and block based on locations and logins. EDIT: I see in another comment that it's against the TOS to sell accounts; I doubt that has stopped anyone before, though.
So my friend group has been looking for alternatives for a while now: something that feels like Discord, works on mobile and desktop, and has voice chat.
I use Signal but the UI is very different from Discord.
I've had very mixed experiences with Element + Matrix, Element keeps crashing on mobile, and while voice chat kinda exists in Element it's not been great imho.
I looked into hosting Rocket.Chat, Zulip, and Mattermost, but from what I recall, voice + mobile were either missing or paywalled at a per-user price.
> How do you know one party isn’t 15 when the other is 25?
You don't. That's why parents need to be involved in their children's lives.
CSAM is the easy excuse, anyway. That's the one lawmakers use, and most people are against CSAM, myself included, so the excuse goes down easy. But the impetus they don't talk about is monitoring and control.
The answer isn't to destroy privacy for everyone. The government and these corporations don't need to know what you're doing every second of the day.
You have got to be kidding me. What is it with these lawmakers and websites demanding people do all of this stuff using services that nobody has ever heard of? I myself (as someone who is blind) have never been able to do the face-scanning thing, because the information they provide (for, you know, getting my face focused) is just massively insufficient. And a lot of the ones I've seen also require me to (as an alternative) do some weird ID scanning with my camera instead of, you know, just allowing me to upload my ID or something? (Then again, I really wouldn't want to give my ID to some service nobody has ever heard of either, so there.) I'm also concerned about TFA saying "a photo of an identity document": what does this mean? If I have to scan my ID with my camera, that's not exactly going to be simple for me to pull off. I get that we need to protect kids, but this is not the way. Not when it is discrimination by another name for individuals with disabilities (as just one example).
How does anyone know whether a family is engaging in that time-honored tradition of passing down accounts from grandfather, to father, to son, to child, and their posterity, in perpetuity?
Seriously though, unless you have positively identified the person who created the account in the first place, you have 0% chance of knowing whether it is the same person using it today.
Gamers sell their high-level accounts all the time. It would be a simple matter of economics that the Discord users with the oldest accounts sell them to 12-year-olds. Likewise, accounts are shared willy-nilly, whether or not that violates the rules. And accounts can be stolen or compromised, if you're really hard up.
But under that argument, you would have to prove your age on a regular basis; the plan right now appears to be that each account would only need to do so once.
You agree not to license, sell, lend, or transfer your account, Discord username, vanity URL, or other unique identifier without our prior written approval. We also reserve the right to delete, change, or reclaim your username, URL, or other identifier.
If transfer of accounts is a policy violation, then Discord has legal cover to confidently assert that, once ID is verified, the ID'd person is the owner and controller of the account thereafter.
Account selling, stealing, and sharing will certainly still happen, but that's grounds for banning, and not Discord's legal liability anymore.
Just ban that in the TOS. As we know, the TOS is inviolable. As such, it is not possible to sell, gift, or otherwise transfer an account. At least this should be considered how it works for age verification. If an account transfer is found out, the account can be terminated, thus closing the loophole.
No law or regulation is ever 100% effective in real life. Income tax is not collected 100% effectively. Should we not do it? Criminals are not caught 100% of the time, should we not do it?
Of course this won't be 100% effective, maybe 80-90% effective. That's all they need and expect from this system.
HN is constantly obsessed with "is it perfectly effective?"
No law, none, is perfectly effective. Speed limits certainly aren't self-enforcing, but remove your neighborhood's speed limits first if you truly believe laws must be proven perfect.
Yeah, my YouTube/Google account is almost as old as YouTube itself, but it will constantly ask me to verify my age when clicking on something marked "not for kids". Can we just get the Leisure Suit Larry age-verification system? ;)
This is just the latest in a long trend of increasing spying on users. Why bother having to guess who your user is, or fingerprint a browser if you can just force them to show you their national ID?
This is transparently about spying on people, not "protecting children". The real world doesn't require you to show your ID to every business you frequent, or every advertiser you walk by. Someone can yell a swear word on the sidewalk, and not everyone within ear shot has to show ID.
Alternative: run your own self-hosted messaging server for you, your family and friends. No company should ever get such sensitive data as private conversations.
Use Discord with a throw-away account. Create a character in GTA 5 on your laptop and show its face (in "selfie" mode) to the web-camera on another computer with Discord open. All face scan checks so far gladly accept it. Instagram has been requiring occasional face checks for ages already.
Honestly they're probably big enough to get away with it.
If it was only friend groups, it would kill them for sure; we've seen that many times. But given the absurd number of large online communities on Discord, I'd wager they can force it down and be relatively unscathed.
They played the long game - they provided a good service for 10 years, and got REALLY big before they started the enshittification process.
Ratings aren't legally binding, though, are they? I bought games rated older than I was, and it's totally up to people's parents what they're allowed to play. Are you suggesting a 15-year-old should be allowed to play the 16-rated game but not discuss it?
Haven't cared about Discord in a long time. In fact I'm glad they're continuing to shoot themselves in the foot.
During the pandemic, I was on a Discord server for folks to socialize and blow off steam about the whole situation. Yes, there were some anti-vaxx wackos, but overall the place was civil and balanced, and I met some interesting people through it. We cracked jokes and it was a little bit of fun in a tough time.
One day I came to discover that Discord had banned the server for allegedly violating... something. I wish I had written down everyone's emails because I permanently lost contact with a bunch of friends in an instant.
I never signed in to Discord again, in spite of times where some other social group wanted to use it. I vowed never to use Discord again. Fuck those guys and the Teslas they rode in on. I hope this ID verification thing is another big step towards their irrelevancy.
The difference with Reddit is it has way more persistent value. Everything on Discord is throwaway, but valuable posts on Reddit from years past are easily retrievable. The two aren't so comparable.
One of the unspoken reasons many people have for using Discord is they don't want what they say to easily be associated with them in perpetuity. Requiring ID really chips away at that, in spite of what Discord has to say about privacy around ID.
By no means am I saying that Discord will go extinct. I just haven't observed anything about it that's irreplaceable. Reddit, on the other hand, has a wealth of discussion dating back to the mid-to-late 00's.
Hard no. Reality is that this push is everywhere. Authoritarian governments are cracking down hard on dissent, they're not going to leave huge platforms for communication untouched. We'll need open source decentralized alternatives.
Here's the October 2025 Discord data breach mentioned at the end of the article:
https://www.bbc.com/news/articles/c8jmzd972leo
> Discord, a messaging platform popular with gamers, says official ID photos of around 70,000 users have potentially been leaked after a cyber-attack.
However, their senior director states in this Verge article:
> The ID is immediately deleted. We do not keep any information around like your name, the city that you live in, if you used a birth certificate or something else, any of that information.
Why didn't they do that the first time?
> The ID is immediately deleted. We do not keep any information around like your name, the city that you live in, if you used a birth certificate or something else, any of that information.
This is also contradicted by what Discord actually says:
> Quick deletion: Identity documents submitted to our vendor partners are deleted quickly— in most cases, immediately after age confirmation.
What are the non-most cases?
Also, _Discord_ deleting them is really only half the battle; random vendors deleting them remains an issue.
And do they really actually delete it this time?
>Why they didn't do that the first time?
The company they hired to do the support tickets archived them, including attachments, rather than deleting them.
Ah, so it was the "staffer" excuse.
How convenient.
What realistic open source alternatives to Discord are there? I'm currently considering moving to one of these with my friend group:
- Matrix
- Stoat, previously Revolt (https://stoat.chat/)
- IRC + Mumble
- Signal
Snikket (https://snikket.org) with Monal as the iOS client
One thing most of those lack is an easy way to share screen.
Now, if anyone wants to differentiate their Discord alternative, they should cover most of Discord's functionality and add the ability to be in multiple voice chats at once (maybe with permissions, a channel hierarchy, and separate push-to-talk binds). It's a missing feature when coordinating huge operations in games, and using the Canary client is not always enough.
Stoat has screen sharing / video calling in the pipeline at least: https://github.com/stoatchat/stoatchat/issues/313
Does matrix have decent 1:N client desktop broadcasting with low latency (and high fps) yet? I use discord for "watch parties", video and tabletop gaming...
Last I checked, Signal was not fully open source, which is iffy; I believe their encryption protocol is still closed. That said, it's the best of a bad bunch for E2EE messaging. If you're on Android, I'd recommend doing what I do: install from the APK on the site, manually verify the signature locally (you can use Termux for this), and lag ever so slightly behind on updates to avoid potential supply-chain or hostile-takeover attacks. This is probably overcautious for most threat profiles, but better safe than sorry, imo. Also, their server-side stuff is closed source; technically this isn't an issue as long as the E2EE holds up to scrutiny.
Edit: My information may be out of date; I cannot find any sources saying any part of the app is closed source these days. Do your own research, of course, but I'm comfortable saying it's the most accessible secure platform.
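For anyone wanting to try the manual-verification route described above, here is a rough sketch of what the check can look like. It assumes `apksigner` (from the Android SDK build-tools) is on your PATH, and the fingerprint below is a placeholder — copy the real signing-certificate fingerprint from Signal's own APK download page:

```python
import re
import subprocess

# PLACEHOLDER: copy the real SHA-256 signing-certificate fingerprint
# from Signal's APK download page before relying on this check.
EXPECTED = "29:F3:..."

def normalize(fp: str) -> str:
    """Lowercase a fingerprint and strip colons/whitespace so the two
    common display formats compare equal."""
    return re.sub(r"[:\s]", "", fp).lower()

def cert_sha256(apk_path: str) -> str:
    """Extract the signing certificate's SHA-256 digest from
    `apksigner verify --print-certs` output."""
    out = subprocess.run(
        ["apksigner", "verify", "--print-certs", apk_path],
        capture_output=True, text=True, check=True,
    ).stdout
    m = re.search(r"SHA-256 digest:\s*([0-9a-fA-F]+)", out)
    if m is None:
        raise ValueError("no SHA-256 digest in apksigner output")
    return m.group(1)

def apk_signature_ok(apk_path: str) -> bool:
    """True if the APK was signed with the expected certificate."""
    return normalize(cert_sha256(apk_path)) == normalize(EXPECTED)
```

The normalization step matters because Signal publishes the fingerprint colon-separated while `apksigner` prints a bare hex string.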
I wonder how Stoat will fare, and how it is currently maintained, in terms of "making money"; my fear is that it would steer into the direction of Discord itself.
Which of these has been around for over three decades?
That would be my answer.
I have found Element and Matrix to be totally unusable on iOS.
Element’s awful, but I’ve found FluffyChat, another matrix client, to be a lot better, albeit with a very silly name.
Revolt's rename to Stoat is probably worse than any rebranding MSFT has ever done.
It's because of the trademark: https://stoat.chat/updates/long-live-stoat
Nevertheless, I don't like the new name either, oh well...
I like this comment though:
Imagine you make a free software project and it runs into trademark issues because people have more money than you to register in more classes than your project.
And then even though your project existed first, they still come after you anyway.
And then, on top of that, an even more expensive rebrand as a result.
from: https://news.ycombinator.com/item?id=45626225, not sure how accurate it is, but it makes me want to revolt.
"[beaver emoji] Revolt is Stoat now"
Argh. If there's no stoat emoji, petition the Unicode Consortium for one, don't just use a beaver. It's not even the right family; the badger emoji would be closer.
It's open source, I'm tempted to fork it and do nothing other than change the branding.
For me, the closest alternative to Discord is Stoat. Matrix with Element (or other clients) would be great, but it feels so slow on both desktop and mobile.
IRC was here before Discord, and it will still be here after.
I've never heard of Stoat. Looks like IRC but it's Electron. Total waste of time.
IRC does not support group voice & video calls, which is one of the primary features of Discord (and previously Skype, from which everyone migrated to Discord in the first place)
Kids these days...
The sad thing is that I think many people will en masse pony up their ID or snapshot without a second thought. I'm not sure if enough people will refuse to actually force Discord to back off this decision (unless their idea is to grab as much data as possible at once with the understanding that they are going to back off either way).
Especially if it's presented as a pop-up upon launching the app that suggests the user won't be able to talk to their friends/servers without showing ID. Carefully worded language could spur some percentage of users to panic at losing years of history and immediately show ID. Folks with less privacy discernment hear "jump" and reply "how high".
> panic at losing years of history
I used to be like that. It was unsustainable and ultimately mentally unhealthy.
I don't imagine this was 100% their decision; it's more like a response to the epidemic of all the world's governments suddenly coming up with adult verification schemes. Discord has already required it in some countries, and it's definitely easier to get everybody to verify themselves than to require it on a per-jurisdiction basis. The personal data they get is a cherry on top.
Also, this is just the beginning, more social networks will require the same soon.
They don't have to comply in advance.
I have done that for Stripchat, which was also requiring it. Not happy with it, but I'd rather use a selfie than a whole ID document, which includes an image anyway.
The thing is, what other option do I have?
I'll continue using Discord in teen mode, I guess. I'd rather not lose the current connections & servers I have on there, and I'm not optimistic about people migrating away, especially non-tech people.
I get the draconian side of things, but I am also tired of thousands of Russian, Indian, domestic-funded, etc. bots flooding the zone with divisive propaganda.
In theory, this seems like it would at least be a step in the direction of combating disinformation.
I'm curious if there are any better ways to suppress these propaganda machines?
I don't see how disallowing viewing "age-restricted" content through Discord without giving them your ID would have any impact on the spread of disinformation, outside of, like, disinfo in the form of pornographic or gory images.
It's kind of surprising that no-one has really come out with a proper privacy-preserving approach to this yet. It is clearly _possible_; there are reasonable-looking designs for this. But no-one's doing it; they're just collecting photos and IDs, and then leaking them all over the place.
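For what it's worth, one family of those reasonable-looking designs is blind-signature tokens: an issuer who already knows your age (say, a government ID service) signs a blinded "over 18" token, so a site can later verify the signature without the token ever being linkable back to the ID check. Below is a deliberately toy sketch using insecure textbook-RSA parameters; nothing here is production crypto, and real deployments use standardized schemes rather than raw RSA:

```python
# Toy blind-signature age token. The issuer signs a BLINDED value, so it
# never sees the token the site eventually receives - verification and
# identification are unlinkable. Tiny textbook-RSA keys: illustration only.

import secrets

# Issuer's textbook-RSA keypair (p=61, q=53, so n=3233, phi=3120).
N, E, D = 3233, 17, 2753

def blind(token: int, n: int = N, e: int = E):
    """User blinds the token with a random factor r before sending it."""
    while True:
        r = secrets.randbelow(n - 2) + 2
        try:
            r_inv = pow(r, -1, n)      # r must be invertible mod n
        except ValueError:
            continue                    # rare: r shared a factor with n
        return (token * pow(r, e, n)) % n, r_inv

def issue(blinded: int, d: int = D, n: int = N) -> int:
    """Issuer signs the blinded value after checking the user's ID once."""
    return pow(blinded, d, n)

def unblind(blind_sig: int, r_inv: int, n: int = N) -> int:
    """User strips the blinding factor, leaving a plain signature."""
    return (blind_sig * r_inv) % n

def verify(token: int, sig: int, e: int = E, n: int = N) -> bool:
    """Any site can check the signature without contacting the issuer."""
    return pow(sig, e, n) == token % n

token = 42                              # random per-site token in practice
blinded, r_inv = blind(token)
sig = unblind(issue(blinded), r_inv)
assert verify(token, sig)               # site accepts; issuer never saw token
```

The key property: the issuer only ever sees `blinded`, so even a colluding issuer and site cannot tie a verified-adult token back to a specific ID check.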
> and will see content filters for any content Discord detects as graphic or sensitive.
I didn't even realise Discord scans all the images that I send and receive.
Really, I've come to the conclusion that anything I send out of my LAN is probably kept on a server forever, ingested by LLMs, and indexed to be used against me in perpetuity, regardless of what the terms and conditions of the site I'm using actually say.
Speaking of hosting, Discord used to be one of the biggest (inadvertent) image hosts, so they might have set up the system more to reduce legal exposure than to monitor conversations per se.[1]
A lot of the internet broke the day they flipped that switch off.
Weren't external Tumblr hotlinks also a thing back in the day?
[1]: https://www.reddit.com/r/discordapp/comments/16uy0an/not_sur...
To be fair, the terms and conditions probably say that they can do whatever they want with that data :-).
Don’t forget all the government creeps snooping on the wires.
Until the current administration, I was much more bothered by private misuse/abuse of data than by the government. Now I worry about both.
Good. Being OK with authoritarianism because they are on your side is never good.
Why? People who volunteer to work for these government drag nets must be total psychos.
Volunteer? I mean they do get paid.
The thing is it's a mix of both.
You have the fervent ones who love recording everything "for the good of the people". But then you'll also have piles of people with separation of duties, doing things with very little understanding of where they fit in the process and very little care to find out.
We gave those brogrammers the keys to the machine when we made programming more accessible.
Pretty much every non-E2EE platform is scanning every uploaded image for CSAM at least, that's a baseline ass-covering measure.
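For context, the usual mechanism behind that scanning is perceptual-hash matching against databases of known images (PhotoDNA, PDQ, and similar), not a human or model "looking at" every picture. Here is a toy average-hash version of the idea, operating on an 8x8 grayscale grid rather than a real image format; real systems use far more robust hashes:

```python
# Toy perceptual-hash matching, the pattern behind known-image scanning.
# Real platforms use PhotoDNA/PDQ, not this simplified average hash.

def average_hash(pixels):
    """64-bit hash of an 8x8 grayscale grid: bit is 1 where the pixel
    is brighter than the grid's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(upload_hash, known_hashes, threshold=5):
    """Flag an upload whose hash lands within `threshold` bits of any
    hash in the known-bad database."""
    return any(hamming(upload_hash, k) <= threshold for k in known_hashes)
```

The threshold is what makes this "perceptual": small edits (recompression, slight brightness shifts) barely move the hash, so near-duplicates of a known image still match, while unrelated images land many bits away.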
And E2EE platforms like Mega are now being censored on some platforms specifically because they're E2EE, and so the name itself must be treated as CSAM.
As people who want to talk about words like "megabytes" or "megapixels" or "megaphones" or "Megaman" or "Megan" on Facebook are finding out.
They have to at least for CSAM.
Everything that is not end-to-end encrypted understandably has to do it.
F** that, guess I'm leaving that platform too now...
In case anyone else can’t read it: https://archive.is/PvpAx
To add context to the discussion, it is important to recall that Discord was reported to have recently filed paperwork with the SEC for an IPO [1]. Thus it seems likely that the real reason for the age verification (i.e., user identification) policy is to boost its perceived earnings potential among Wall Street investors. According to this theory, Discord is the new Facebook.
[1] https://techcrunch.com/2026/01/07/discords-ipo-could-happen-...
Are they going to leak IDs of minors again like they did last time? Who does this protect exactly?
It protects the investors so they can IPO
Credit card verification is not an option. It's facial video estimation or submitting an ID card.
Option 3: if we analyze all of your data we have and see you are not going to bed at 8pm for middle school, you get adult status.
Great news, there’s finally going to be sufficient motivation for people to both build out and use open source alternatives.
I predict out-of-the-box deepfake live-camera software will get a bump in popularity; there are already plenty of solutions available that need minimal tinkering. It should be trivial to set up for the purpose of verification, and I don't see those identity-verification providers being able to do anything about it. Of course, that'll only mean stricter verification through ID only later on, much to the present-and-future surveillance state's benefit.
https://github.com/hacksider/Deep-Live-Cam
> Users who aren’t verified as adults will not be able to access age-restricted servers and channels
I genuinely wonder what proportion of the users want access to age-restricted servers and channels...
Feels like it should be just fine not to verify the age.
Here's how Discord works. A third or so of its features, such as forum channels (EDIT: I think this specific example was wrong; stage and announcement channels, but not forum channels) or role self-assignment, are locked behind Community Mode. After enabling Community Mode, server owners are NOT ALLOWED to turn off content filtering anymore, meaning that by default, content in every channel may be filtered out by systems you cannot configure.
The only way for the server owner to circumvent the filter is to mark a channel as "NSFW", which doesn't necessarily mean the channel actually contains any NSFW content.
This change will not actually require ID for content confirmed to be NSFW. It will require ID for each and every "NSFW mode" (unfiltered) channel. The end result is that you have three choices:
- Ditch Discord features implemented in recent years (or at least this is currently possible) - this prevents a server from being listed as public;
- Require ID checks from all your users (per channel);
- Have everything scanned from all your users (per channel).
Are you saying that you can "mark" the channel as "NSFW", and Discord will stop scanning your content, possibly allowing you to share very illegal content through their servers?
Sounds weird to me. Pretty sure that they legally have to make sure that they don't host illegal content. Or does "NSFW" enable some kind of end-to-end encryption?
That has always been the case, yes, though I'm not sure what you mean by "illegal" content. There is only a small overlap between NSFW and illegal content, and the NSFW filter has never been concerned with, uh, violating photograph copyright or something.
You don't have to take my word for it, just check it yourself, although it seems that this week, they renamed the NSFW setting to "Age-Restricted Channel" (in preparation for this change, no doubt). The verification-related portion of the behavior I described was implemented for the UK months ago.
The description still contains: "Age-restricted channels are exempt from the explicit content filter."
EDIT: IANAL (or american) but if Discord was policing content for legality rather than age-appropriateness, wouldn't they lose DMCA Safe Harbor protections?
> The description still contains: "Age-restricted channels are exempt from the explicit content filter."
Wait! This does not mean they do not scan it. What I understand from that statement is that they filter explicit content, as in they prevent it from appearing on the user's screen.
When you enable the "NSFW" mode, you tell Discord "it's okay, don't filter out anything". But Discord probably still scans everything.
So that makes sense to me: if you don't validate your age, then Discord will not allow you to join channels that disable the "adult" filtering. I can personally live without adult content on Discord...
OK, but you're not the one making that decision and you don't know/can't control how that decision is being made.
> I genuinely wonder what proportion of the users want access to age-restricted servers and channels...
Way more than you think. There are tons of Discord servers that only exist to share pornography.
I wonder if Discord is legally forced to do that, or if they would rather do it themselves (and collect the data $$$) than wait for a solution they don't control to be imposed on them.
I feel like age verification will come, there is no way around it (unlike ChatControl and the likes, age verification seems reasonably feasible and has a lot of political traction right now).
But I would rather have a privacy-preserving solution for that, e.g. from the government (which already knows my age).
There are probably enough regions where it is required or will be required soon, that it makes sense to just get it over with.
The Internet is more or less becoming a locked down, controlled and fully observed thing for end users and citizens, so adapting to that world sooner and working within it is just sensible future-proofing.
This also lets them more safely target older users with ads, purchase requests, etc. and new integrations for gambling and other high ROI systems.
Good riddance Discord. Any alternative for the masses?
They’re not gonna use Slack or phpBB.
Why would Slack not be affected by the same stupid laws?
If you're a Slack user, I don't think they need your ID to tell that you're an adult
More seriously, it will become a problem once there is a significant user migration there and a repeat of the mass hysteria. Being more niche, these smaller platforms are probably not in danger right now.
“We will find ways to bring people back” yeah because that usually works. I imagine this gets rolled back or siloed to only adult specific channels.
Genuine question, what is stopping users from using AI to generate a fake face or ID to bypass this restriction?
There is a bit of an arms race between ID verification systems and users bypassing them with AI generation, which is really just AI-generated images vs. AI-generated-image detection.
In practice, nothing will stop it, the tooling will gradually get better at detecting prior fakes and banning those users while the newer fakes will go undetected for longer.
Putting up the requirement satisfies their CYA requirements here. The race between AI fraud vs. detection is something they can just ignore and let happen on its own.
> prior fakes
But they assured me my biometrics are deleted after uploading!
To be honest it kinda sounds like a benefit for my use-case. I don’t engage with adult content on there and use it for one server with friends.
And this will reduce spam from random accounts. We'll see if it remains usable without uploading my ID.
Another company jumping on the bandwagon to data-farm under the pretext of safeguarding children. I really wonder if there's a method to actually safeguard children while also not holding on to data. Because, genuinely, you can't question this: companies just say "we are trying to protect kids" and that's the end of the argument.
I really wonder whether, once this is fully implemented, they will have any safeguards against selling "adult verified" accounts. With AI being a possible workaround for those who don't want to share an ID, selling accounts would be another big issue unless they check IP addresses and block based on locations and logins. EDIT: I see in another comment that it's against the TOS to sell accounts; I doubt that has stopped anyone before, though.
I foresee Discord receiving a lot of identification documents from the likes of Ben Dover
So my friend group has been looking for alternatives for a while now that feel like discord, works on mobile and desktop, and has voice chat.
I use Signal but the UI is very different from Discord.
I've had very mixed experiences with Element + Matrix, Element keeps crashing on mobile, and while voice chat kinda exists in Element it's not been great imho.
I looked into hosting Rocket.chat, Zulip, and Mattermost, but from what I recall voice + mobile were either missing or paywalled at a per-user price.
Any recommendations?
I seem to recall Jitsi working pretty well.
Jitsi is great but the element integration felt clunky. Maybe I'll have to revisit it.
> Content Filters: Discord users will need to be age-assured as adults in order to unblur sensitive content or turn off the setting. [1]
That presumably includes selfies?
That means that to exchange racy photos on Discord, each person must first record a facial age estimation video or upload identification documents.
That seems dystopian.
1: https://discord.com/press-releases/discord-launches-teen-by-...
How do you know one party isn’t 15 when the other is 25?
You’re never going to convince a parent or a lawmaker or even me that this is dystopian. Seems like a perfectly reasonable safeguard.
> How do you know one party isn’t 15 when the other is 25?
You don't. That's why parents need to be involved in their children's lives.
CSAM is the easy excuse, anyway. That's the one lawmakers use, and most people are against CSAM, myself included, so the excuse goes down easy. But the impetus they don't talk about is monitoring and control.
The answer isn't to destroy privacy for everyone. The government and these corporations don't need to know what you're doing every second of the day.
> That's why parents need to be involved in their children's lives.
Can't, aren't, look at iPad kids, won't. This is about as logical as saying people should drive safely, so we don't need guardrails.
They'll now have kompromat associated with a name, address, and ID number (be it social security, BSN, or whatever your country calls it).
You have got to be kidding me. What is it with these lawmakers and websites demanding people do all of this stuff using services that nobody has ever heard of? I myself (as someone who is blind) have never been able to do the face scanning thing, because the information they provide (for, you know, getting my face focused) is just massively insufficient. And a lot of the ones I've seen also require me to (as an alternative) do some weird ID scanning with my camera instead of, you know, just allowing me to upload my ID or something. (Then again, I really wouldn't want to give my ID to some service nobody has ever heard of either, so there.) I'm also concerned when TFA says "a photo of an identity document": what does that mean? If I have to scan my ID with my camera, that's not exactly going to be simple for me to pull off. I get that we need to protect kids, but this is not the way. Not when it is discrimination by another name for individuals with disabilities (as just one example).
Any age verification process that does not consider the age of the account as a verification option is a data trap, plain and simple.
How does anyone know whether a family is engaging in that time-honored tradition of passing down accounts from grandfather, to father, to son, to child, and their posterity, in perpetuity?
Seriously though, unless you have positively identified the person who created the account in the first place, you have 0% chance of knowing whether it is the same person using it today.
Gamers sell their high-level accounts all the time. It would be a simple matter of economics that the Discord users with the oldest accounts sell them to 12-year-olds. Likewise, accounts are shared willy-nilly, whether or not that violates the rules. And accounts can be stolen or compromised, if you're really hard up.
How often do you suppose they will be re-checking your ID? Once every... never?
But under that argument, you would have to prove your age on a regular basis; the plan right now appears to be that each account would only need to do so once.
You agree not to license, sell, lend, or transfer your account, Discord username, vanity URL, or other unique identifier without our prior written approval. We also reserve the right to delete, change, or reclaim your username, URL, or other identifier.
If transfer of accounts is a policy violation, then Discord has legal cover to confidently assert that, once ID is verified, the ID'd person is the owner and controller of the account thereafter.
Account selling, stealing, and sharing will certainly still happen, but that's grounds for banning, and not Discord's legal liability anymore.
Then why could they not also legally get away with using account age as a proxy?
Just remember that the Terms of Service you agreed to are about as firm as explosive diarrhea.
Just ban that in the TOS. As we know, the TOS is inviolable, so it is not possible to sell, gift, or otherwise transfer an account. At least that's how it should be treated for age verification purposes. If an account transfer is found out, the account can be terminated, thus closing the loophole.
No law or regulation is ever 100% effective in real life. Income tax is not collected 100% effectively. Should we not do it? Criminals are not caught 100% of the time, should we not do it?
Of course this won't be 100% effective, maybe 80-90% effective. That's all they need and expect from this system.
Exactly.
HN is constantly obsessed with "is it perfectly effective?"
No law, none, is perfectly effective. Speed limits certainly aren't self-enforcing, but remove your neighborhood's speed limits first if you truly believe laws must be demonstrably perfect.
Has discord even been around for 18 years?
Yeah, my youtube/google account is almost as old as youtube itself is, but will constantly ask me to verify my age when clicking on something as marked 'not for kids'. Can we just get the leisure-suit-larry age-verification system ;)
Apple deleted many legacy mac-dot-com accounts without qualms, not long ago. It was the phone accounts, in so many ways, driving it, IMHO.
No thanks. Discord, it has been fun, but I decline.
This is just the latest in a long trend of increasing spying on users. Why bother having to guess who your user is, or fingerprint a browser if you can just force them to show you their national ID?
This is transparently about spying on people, not "protecting children". The real world doesn't require you to show your ID to every business you frequent, or every advertiser you walk by. Someone can yell a swear word on the sidewalk, and not everyone within ear shot has to show ID.
Source: https://discord.com/press-releases/discord-launches-teen-by-...
Alternative: run your own self-hosted messaging server for you, your family and friends. No company should ever get such sensitive data as private conversations.
Use Discord with a throw-away account. Create a character in GTA 5 on your laptop and show its face (in "selfie" mode) to the web-camera on another computer with Discord open. All face scan checks so far gladly accept it. Instagram has been requiring occasional face checks for ages already.
Looks like it might be opt-in by server.
Honestly they're probably big enough to get away with it.
If it was only friend groups it would kill them for sure, we've seen that many times, but given the absurd number of large online communities on Discord, I'd wager they can force it down and be relatively unscathed.
They played the long game - they provided a good service for 10 years, and got REALLY big before they started the enshittification process.
The CEO of Discord is Humam Sakhnini. He's from McKinsey. So that tracks.
How many people are doing age-restricted stuff on Discord (besides the crowd specifically there for adult content and gooning)?
All of my use is primarily professional and gaming and has no age concerns
Gaming certainly has age concerns; many games are rated 13/15/16+ or 18+.
But yeah, leaving discord... they are not getting my ID/Photo
Ratings aren't legally binding though, are they? I bought games rated older than I was, and it's totally up to people's parents what they're allowed to play. Are you suggesting a 15-year-old should be allowed to play the 16-rated game but not discuss it?
Can their parents also approve their discord usage?
Are you saying they need parents to buy the game, but shouldn't to join chats about the same game?
At least Google is pushing on zero-knowledge solutions
Maybe they can force everyone's hand like they did for https
https://blog.google/innovation-and-ai/technology/safety-secu...
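For the curious, the core idea behind a privacy-preserving age check is that an issuer who already knows your age (e.g., a government ID service) hands the platform a single attestation bit rather than your identity. Here's a minimal stdlib-only sketch of that flow; all names are illustrative, and it uses an HMAC with a shared key purely as a stand-in, where real deployments would use asymmetric signatures or actual zero-knowledge proofs so the platform never holds the issuer's signing key:

```python
import hashlib
import hmac
import json
import secrets

# Hypothetical sketch: the issuer attests "over_18: true" without
# revealing name or birthdate. HMAC with a shared key is a stdlib-only
# simplification for illustration only.

ISSUER_KEY = secrets.token_bytes(32)  # held by the attestation issuer

def issue_attestation(over_18: bool) -> dict:
    # The claim carries only the single bit the platform needs, plus a
    # random nonce so two attestations aren't linkable to each other.
    claim = {"over_18": over_18, "nonce": secrets.token_hex(16)}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def verify_attestation(att: dict) -> bool:
    # The platform checks the tag, then reads the one bit. It never
    # sees an ID document, a face, or a birthdate.
    payload = json.dumps(att["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, att["tag"]) and att["claim"]["over_18"]

att = issue_attestation(True)
print(verify_attestation(att))  # True: platform learns only "over 18"
```

The point of the exercise: tampering with the claim invalidates the tag, and the attestation contains nothing worth leaking, which is the property the ID-upload approach lacks.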
Haven't cared about Discord in a long time. In fact I'm glad they're continuing to shoot themselves in the foot.
During the pandemic, I was on a Discord server for folks to socialize and blow off steam about the whole situation. Yes, there were some anti-vaxx wackos, but overall the place was civil and balanced, and I met some interesting people through it. We cracked jokes and it was a little bit of fun in a tough time.
One day I came to discover that Discord had banned the server for allegedly violating... something. I wish I had written down everyone's emails because I permanently lost contact with a bunch of friends in an instant.
I never signed in to Discord again, in spite of times where some other social group wanted to use it. I vowed never to use Discord again. Fuck those guys and the Teslas they rode in on. I hope this ID verification thing is another big step towards their irrelevancy.
Discord has 150 million monthly active users.
They’ll be fine. To them, this is just another internet boycott, with all that entails. Reddit survived a worse one and grew afterward.
The difference with Reddit is it has way more persistent value. Everything on Discord is throwaway, but valuable posts on Reddit from years past are easily retrievable. The two aren't so comparable.
One of the unspoken reasons many people have for using Discord is they don't want what they say to easily be associated with them in perpetuity. Requiring ID really chips away at that, in spite of what Discord has to say about privacy around ID.
By no means am I saying that Discord will go extinct. I just haven't observed anything about it that's irreplaceable. Reddit, on the other hand, has a wealth of discussion dating back to the mid-to-late 00's.
>valuable posts on Reddit
[removed]
[removed]
[removed]
[removed]
[removed]
There's this thing called the Wayback Machine, but I lol'd at your response. It's not untrue. xD
Hard no. Reality is that this push is everywhere. Authoritarian governments are cracking down hard on dissent, they're not going to leave huge platforms for communication untouched. We'll need open source decentralized alternatives.
Indeed, the article basically says as much in more pacifying terms:
> driven by an international legal push for age checks and stronger child safety measures
another one bites the dust.
No thank you, get fucked
This is not OK.