Shouldn't the compilers be on the bleeding edge of the standards? What is the downside of switching to the newest standard when it's properly supported?
It's the type of dogfooding they should be doing! It's one reason people care so much about self-hosted compilers: it's a demonstration of the maturity of the language/compiler.
There's a bootstrapping process that has to happen to compile the compiler. Moving up the language standard chain requires that the compilers compiling the compiler also migrate up the chain.
So you can never be perfectly bleeding edge, as that would keep you from being able to build your compiler with an older compiler that doesn't support those bleeding-edge features.
Imagine, for example, that you are Debian and you want to prep for the next stable version. It's reasonable that for the next release you'd bootstrap with the prior release's toolset. That gives you a stable starting point.
This is not the case. They are discussing the default value of `g++ -std=...`. That does not complicate bootstrapping as long as the C++ sources of GCC are compatible with older and newer versions of the C++ standard.
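For reference, a quick way to see what a given g++ defaults to is to print the `__cplusplus` macro, which encodes the standard in effect (201103L for C++11, 201703L for C++17, 202002L for C++20). A minimal sketch (file name hypothetical):

    // probe.cpp -- compiled with a plain `g++ probe.cpp`, i.e. no -std
    // flag, this prints the compiler's default language standard.
    #include <iostream>

    int main() {
        std::cout << __cplusplus << '\n';
    }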
> as long as the C++ sources of GCC are compatible with older and newer versions of the C++ standard.
I've worked on a number of pretty large projects. If the target for the source code changes, it can be really hard to keep C++20 features from creeping in. It means that either you need to explicitly build targeting 11, or whoever does code reviews needs encyclopedic knowledge of whether a change sneaked in a feature from a newer standard.
It is "doable" but why would you do it when you can simply keep the compiler targeting 11 and let it do the code review for you.
> ... why would you do it when you can simply keep the compiler targeting 11 ...
It doesn't appear to me that the parent comment was implying otherwise.
The default is changing for any compilation that doesn't explicitly specify a standard version. I would have thought that the build process for a compiler is likely careful enough that it does explicitly specify a version.
> It's the type of dog fooding they should be doing! It's one reason why people care so much about self-hosted compilers, it's a demonstration of maturity of the language/compiler.
I could be misreading this, but unless they have a different understanding of dogfooding than I do, it seems like the proposal is to use C++20 features in the compiler bootstrapping.
Counterpoint: you could write a C++ compiler in a non-C/C++ language such that the compiler’s implementation language doesn’t even have the notion of C++20.
A compiler is perfectly capable of compiling programs which use features that its own source does not.
That's not a counterpoint—at least not to anything in the comment that you're (nominally) "responding" to.
So why has it been posted as a reply, and why label it a counterpoint?
Aren't they talking about the C++ dialect the compiler expects without any further -std=... arguments? How does that affect the bootstrapping process? This https://gcc.gnu.org/codingconventions.html should define which C/C++ standard is acceptable in GCC itself.
The way I read withzombies's comment (and I could be wrong) was that they were talking about the language version of the compiler's source. I assumed that from the "dogfooding" portion of the comment.
Well, there are still some C++20 items that aren't fully supported, at least according to cppreference:
https://en.cppreference.com/w/cpp/compiler_support/20.html
> Shouldn't the compilers be on the bleeding edge of the standards? What is the downside of switching to the newest standard when it's properly supported?
C++ standards support and why C++23 and C++26 are not the default: https://gcc.gnu.org/projects/cxx-status.html
> What is the downside of switching to the newest standard when it's properly supported?
They are discussing in this email thread whether it is already properly supported.
> It's one reason why people care so much about self-hosted compilers
For self-hosting and bootstrapping, you want the compiler to be compilable with as old a version as possible.
When a language changes significantly faster than release cycles (i.e., Rust being a different compiler every 3 months), it means that distros cannot self-host if they use Rust code in their software. E.g., with Debian's Apt now having Rust code, and Debian's release cycle being 4 years for LTS, Debian's shipped rustc won't be able to compile Apt, since nearly all Rust devs target the bleeding edge. The entire language culture is built around this rapid improvement.
I love that C++ leaves a long enough time between changing targets to actually be useful, and that its culture is about stability and usefulness for users trying to compile things rather than dev-side improvements über alles.
> What is the downside of switching to the newest standard when it's properly supported?
Backwards compatibility. Not all legal old syntax is necessarily legal new syntax[1], so there is the possibility that perfectly valid C++11 code exists in the wild that won't build with a new GCC.
[1] The big one is obviously new keywords[2]. In older C++, it's legal to have a variable named "requires" or "consteval", and now it's not (a concrete example follows the notes below). Obviously these aren't huge problems, but compatibility is important for legacy code, and there is a lot of legacy C++.
[2] Something where C++ and C standards writers have diverged in philosophy. C++ makes breaking changes all the time, where C really doesn't (new keywords are added in an underscored namespace and you have to use new headers to expose them with the official syntax). You can build a 1978 K&R program with "cc" at the command line of a freshly installed Debian Unstable in 2025 and it works[3], which is pretty amazing.
[3] Well, as long as it worked on a VAX. PDP-11 code is obviously likely to break due to word size issues.
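To make [1] concrete, a sketch (hypothetical file name) of perfectly valid C++11 that a C++20 default would reject:

    // legacy.cpp -- accepted by `g++ -std=c++11 legacy.cpp`, rejected by
    // `g++ -std=c++20 legacy.cpp`: both names became keywords in C++20.
    #include <iostream>

    int main() {
        int requires = 2;   // fine in C++11, hard error in C++20
        int consteval = 3;  // likewise
        std::cout << requires + consteval << '\n';
    }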
That anime gating is very jarring; I thought I had clicked on the wrong link and went back.
Right? I hope it never goes away, we should make the web more fun instead of sad and clean!
I think if you were to poll people, a significant portion would be repulsed by this catgirl aesthetic, or (though this isn't the case for Anubis) by the clichéd, inappropriately dressed, inappropriately young anime characters adopted as mascots in an ever-increasing number of projects. People can do whatever they want with their projects, but I feel like the people who like this crap perhaps don't understand how repulsive it is to a large number of people. Personally it creeps me out.
I'm not repulsed by it, but I do wish the people who force this stuff into their software/hardware realized how juvenile it makes their product look. There's a decent cheap Chinese pair of Bluetooth earbuds on Amazon that's been very popular among audiophiles, but the feedback sounds are an anime girl making noises and there's no way to turn them off, so I lost interest in purchasing them.
The internet was better when it repulsed a significant portion of people.
> inappropriately dressed
How do you think Anubis should dress?
Perhaps like he is depicted in temples, like this one from the tomb of Horemheb (1323-1295 BC): https://commons.wikimedia.org/wiki/File:The_King_with_Anubis...
A dog-man wearing a short skirt is also inappropriate, in my opinion.
Other options would be: just the head, a black dog (a common depiction), or, perhaps most fitting to what Anubis does, the scales.
It sounds like something you might benefit from talking to a therapist. It's not normal to have such a strong reaction. I hope you can get the help you need!
What? She's wearing a hoodie and a tee-shirt; how is that inappropriate? And how is being young inappropriate?
The whole Japanese cartoon schoolgirl thing is 100% creepy.
Anubis has been around for almost a year now, but it's also not particularly relevant to the content of the email thread.
It's particularly jarring to basically every site I've seen it on which is usually some serious and professional looking open source site.
I wonder why nobody configures this, is this not something that they can configure themselves to a more relevant image, like the GCC logo or something?
Because that's the difference between the paid and free versions
Anubis asks that you don’t change the logo and if you want to, pay them: https://anubis.techaro.lol/docs/funding/
I think they might also want to bring attention to the problem and advertise for an open-source solution.
Anubis is open-source (MIT).
I’m sure if you want you can offer to pay like $500/mo on their behalf and they’ll change it for everyone.
That's the paid upgrade for "enterprise" level quality.
Anubis is a bit annoying over crappy internet connections, especially in front of a webpage that would otherwise work quite well under those conditions, but it still performs way better than Cloudflare in this regard.
Recently, on HN: https://news.ycombinator.com/item?id=44962529
I wouldn't have known that this is anime, if not for all the HN comments pointing that out.
Anubis is significantly less jarring than Cloudflare blocks preventing any access at all. At least Anubis lets me read the content of pages. Cloudflare is so bleeding edge and commercial that they do not care about broad browser support (because it doesn't matter for commercial/sales). But for websites where you actually want everyone to be able to load the page, Anubis is by far the best.
That said, more on topic, I am really glad that C++ actually considers the implications of switching default targets and only does this every 5 years. That's a decent amount of time and longer than most distros' release cycles.
When a language changes significantly faster than release cycles (i.e., rustc being a different compiler every 3 months), it means that distros cannot self-host if they use Rust code in their software. E.g., with Apt now having Rust code, and Debian's release cycle being 4 years for LTS, Debian's shipped rustc won't be able to compile Apt.
See also discussion on https://news.ycombinator.com/item?id=44962529
Many people have said they don't like it, and all that did is make its supporters even happier that it's there, because it makes them feel special in some strange way.
Who cares tbh
Good. Let me use modules!
You can always specify the language version in your compiler invocation.
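E.g., a minimal sketch of opting in explicitly (file names hypothetical; GCC keeps its modules support behind the -fmodules-ts flag, and exact invocations vary by version):

    // math.cpp -- a minimal named-module interface unit
    export module math;
    export int add(int a, int b) { return a + b; }

    // main.cpp -- imports the module instead of including a header
    import math;
    int main() { return add(2, 3) == 5 ? 0 : 1; }

    // Interface unit first, then the importer:
    //   g++ -std=c++20 -fmodules-ts -c math.cpp
    //   g++ -std=c++20 -fmodules-ts main.cpp math.o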
> Presumably we still wouldn't enable Modules by default.
Seriously, why? They are broken. https://vector-of-bool.github.io/2019/01/27/modules-doa.html
This is from 2019, prior to the finalization of modules in the standard. I'd be interested in how many of these issues remained unaddressed in the final version that shipped.
The coroutine convo is interesting. Does it mean that, for example, a GCC-compiled program may not run correctly when linked against a Clang-built binary if both use coroutines?