I have a lot of respect for the .NET team. They often publish great in-depth articles and their pursuit for performance is relentless (e.g. see Kestrel and Entity Framework evolution).
And ASP.NET is one of the few large projects which managed to survive large breaking changes. Almost at Python 2->3 level. If you relied on their magic session state, which worked hard to keep state synced between back end and front end, you had to completely change how your web app behaved.
Feels good to have a three-trillion-dollar company interested in improving the stack you use, and actually caring.
Developers! Developers! Developers!
.NET was a solid choice for backend builds before Node became so popular (And .NET is generally more performant than Node).
I hope this churn in .NET builds is temporary because a lot of people might be looking to go back to something stable especially after the recent supply chain attacks on the Node ecosystem.
Not sure about the past tense here. .NET is still excellent and getting even better with every release. What instability are you talking about? There was the leap to .NET Core which was majorly breaking, but that was almost 10 years ago now.
> I hope this churn in .NET builds is temporary because a lot of people might be looking to go back to something stable especially after the recent supply chain attacks on the Node ecosystem.
Can you elaborate a bit? This article talks about internal machinery of building .net releases. What does that have to do with "this churn", whatever that is?
My guess is that if you build with .NET Framework you can just run your builds forever, but if your source code is based on newer .NET you have to update to a new version each year. That means dealing with all the work of upgrading your entire project, which also means everyone on your team upgrading their dev environment, plus new things in the language and the runtime to deal with, deprecations and all that. On top of that, lots of packages don't update quickly when version changes occur, so chances are you'll take on more work and use as few dependencies as possible, if any at all. If you do need to depend on something, it's best for it to be a very big Swiss Army knife of a library.
I think Node is just more flexible, and unless .NET Framework-style forever releases or much longer-term support make a comeback, there's no good trade-off over Node, since you don't even get more stability.
The past three years of dotnet upgrades have been completely painless for me.
What do you mean? The .Net ecosystem has been generalized chaos for the past 10 years.
A few years ago even most people actively working in .Net development couldn't tell what the hell was going on. It's better now. I distinctly recall when .Net Framework v4.8 had been released and a few months later .Net Core 3.0 came out and they announced that .Net Standard 2.0 was going to be the last version of that. Nobody had any idea what anything was.
.Net 5 helped a lot. Even then, MS has been releasing new versions of .Net at a breakneck pace. We're on .Net 10, and .Net Core 1.0 was 9 years ago. There's literally been a major version release every year for almost a decade. This is for a standard software framework! v10 is an LTS version of a software framework with all of 3 years of support. Yeah, it's only supported until 2028, and that's the LTS version.
This isn't really anything user facing. It's just yet again an example of why monorepos are better.
Anything is a monorepo if you submodule hard enough lol
.NET needs a Node-level developer experience and the performance of Rust/Zig, since the Node/Python ecosystem rewrites have made them more performant than ever.
I can't see .NET winning against those odds tbh.
I love working with dotnet, but lately I’ve been writing more backend applications in Python. The code is simpler, testing is simpler since method privacy doesn’t really exist, and code is quicker to deploy because you do not have to compile it.
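To make the point about testing concrete: Python's method "privacy" is a naming convention, not an access modifier, so tests can call internal helpers directly. A minimal sketch (the `UserService` class and `_normalize_email` helper are hypothetical, not from any real codebase):

```python
# Python "privacy" is convention only: a leading underscore signals
# internal use, but nothing stops a test from calling the method.

class UserService:
    def _normalize_email(self, email: str) -> str:
        # Internal helper: trim whitespace and lowercase.
        return email.strip().lower()

    def register(self, email: str) -> str:
        return self._normalize_email(email)

# A test can exercise the internal helper directly -- no access
# modifiers or assembly attributes required.
def test_normalize_email():
    svc = UserService()
    assert svc._normalize_email("  Alice@Example.COM ") == "alice@example.com"

test_normalize_email()
```

In C# the equivalent helper would be `private` or `internal`, and testing it directly requires extra ceremony (see the `[InternalsVisibleTo]` suggestion further down the thread).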
This could also change but in my experience AI is better at generating Python code versus dotnet.
Problem is though Python is slow at runtime. May not matter for many use cases, but I've worked with a lot of startups that suffered terrible reliability problems because they chose Python (or Rails, or Node to some extent) and the service cannot handle peak time load without a lot of refactoring and additional app servers.
Depending on your framework Python is at best ~3x slower (FastAPI) and at worst ~20x (Django) than asp.net on the techempower benchmarks, which maps pretty well to my real world experience.
Most web apps are waiting on the DB anyway. Rarely have I seen the speed of the actual framework make any meaningful difference.
I don't spend a lot of time building services, but the last few I've done, I actually went straight to Rust. The downside is that it's quite slow to develop -- I probably don't have the knowledge that others do, but it seems that frameworks could really use some work. That said, I love that I can find and fix most my problems during development. Building a service in Python means I'm constantly fixing issues in production.
.NET is certainly better than Python, but I'm not very happy with the type system and the code organization versus my Rust projects.
Can confirm. Just finished load testing a FastAPI service. That said, the biggest caveat is that a lot of real backends never experience the level of load where this actually matters.
If you don't want your methods to be private make them public?
Just make them internal and use [InternalsVisibleTo] on the assembly.
I'm moving from Python to Java because of how much easier it is to actually use all CPU cores in Java and strict typing prevents so many bugs and it is much faster. I don't think it is actually that much more complicated than Python in 2025.
Agreed. It's sort of crazy how little people understand about multicore software design given nearly everyone is using machines with >8 CPU cores these days (even a cheap Android phone tends to have 8 cores).
In Python and Node it is _so_ painful to use multiple cores, whereas .NET has had parallel for loops and Task.WhenAll for over a decade. Java is similar, in the sense that you don't have to do anything special to use multiple cores and can just run multiple tasks without having to worry about passing state between 'workers'.
This actually becomes a really big problem for web performance, something I'm deeply passionate about. Not everything is just IO driven holdups, sometimes you do need to use a fair bit of CPU to solve a problem, and when you can't do it in parallel easily it ends up causing a lot of UX issues.
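For contrast with .NET's in-process `Parallel.For`/`Task.WhenAll`, here is a minimal sketch of what CPU-bound parallelism takes in Python (the `cpu_bound` function is a made-up stand-in for real work): you have to ship tasks to separate worker *processes*, because the GIL keeps threads from running Python bytecode on multiple cores at once.

```python
# CPU-bound work in Python needs separate *processes* to use multiple
# cores, since the GIL serializes Python bytecode across threads.
from concurrent.futures import ProcessPoolExecutor

def cpu_bound(n: int) -> int:
    # Stand-in for real CPU work: sum of squares below n.
    return sum(i * i for i in range(n))

def run_parallel(inputs):
    # Each task is pickled and shipped to a worker process; arguments
    # and results must be picklable, unlike .NET tasks, which can
    # simply share in-process memory.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(cpu_bound, inputs))

if __name__ == "__main__":
    # The __main__ guard is itself part of the pain: without it,
    # spawn-based platforms (Windows/macOS) re-import this module
    # in every worker and crash.
    print(run_parallel([10_000, 20_000, 30_000]))
```

The pickling requirement and the `__main__` guard are exactly the "passing state between workers" friction the comment above describes.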
That’s one reason I’ve preferred .Net. Put ahead of time compilation on top and it is glorious.
out of curiosity, why not kotlin? I had the impression it was the jvm language to reach for by default these days.
Has too much sugar, and without a JetBrains IDE you're stuck with a plain text editor. Not sure whether it generalizes to normal Kotlin or not, but learning the Gradle Kotlin DSL made me want to rip my hair out when trying to understand what happens under the hood.
one thing that struck me was that the foundation for this effort was the linux distro build system. in other words, the work they put into making .net open-source and cross-platform eventually made everyone's lives easier.
I can see that high-level overviews of complex systems are useful for gaining insight, but at the same time I have the feeling that this mentality of high-level, abstract organization is the root of the problem. If you have a complex system and simplify the components into abstractions, you will repeatedly run into difficulties because you've actively ignored the dirty bits. It's a top-down approach that tries to tackle all issues, whereas a bottom-up approach could eradicate myriad issues.
Oh, wow, I didn't expect that the best thing I'd read about software engineering, like, this year would come out of Microsoft! Don't get me wrong: I like .NET, especially its recent incarnation, but until just now, I would have expected its robustness to be an against-all-odds under-the-radar lucky escape from the general enshittification that seems to be the norm for the industry.
Reading something like this, which outlines a coordinated effort (diagrams and even a realistic use case for agentic LLM usage and all!) to actually and effectively make things better was a breath of fresh air, even if towards the end it notes that the remarkable investment in quality will not be in full force in the future.
Even if you don't care about .NET and/or Microsoft, this is worth reading, doubly so if you're in charge of re-engineering just about anything -- this is how it's done!
Must have been an amazing effort to be involved in.
> We’re asking how much it will cost to build 3-4 major versions with a dozen .NET SDK bands between them each month.
Why so many variants?
Well you've got .NET 8 (LTS), .NET 9 (standard support), .NET 10 (LTS). These are all supported at once.
Then you've got the .NET SDK/aspnet/runtime (on x64/arm32/arm64 linux/mac/windows), and also the various SDK packages themselves.
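As a rough back-of-the-envelope sketch of how that matrix multiplies out (the exact dimension lists below are illustrative guesses, not the official support matrix):

```python
# Rough sketch of the build matrix. Dimensions are illustrative
# guesses, not the official list of supported combinations.
from itertools import product

versions = ["net8.0", "net9.0", "net10.0"]        # LTS, STS, LTS
components = ["sdk", "aspnetcore", "runtime"]
platforms = ["linux-x64", "linux-arm32", "linux-arm64",
             "osx-x64", "osx-arm64", "win-x64", "win-arm64"]

matrix = list(product(versions, components, platforms))
print(len(matrix))  # 3 * 3 * 7 = 63 distinct build legs
```

Even with conservative guesses for each axis, the count lands in the dozens, which is why the article spends so much time on how these builds are orchestrated.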
3**4 = 81 builds - but aren’t all of those independent and thus parallelizable?
No, read the article. It needs to build some "sub" SDKs to build the final 'full' SDK packages. That's the whole point; they want to get to a state where they can do that.
Ctrl-F "Nix" lol