It's basically the opposite situation from 150 years ago.
Back then, we thought our theory was more or less complete while having experimental data which disproved it (the Michelson-Morley experiment, the anomalous precession of Mercury's perihelion; I'm sure there are others).
Right now, we know our theories are incomplete (since GR and QFT are incompatible) while having no experimental data which contradicts them.
What about under-explained cosmological epicycles like dark matter (invoked to explain long-standing divergences between gravitational theory and observation), or the Hubble tension?
This is your regular reminder that epicycles were not an incorrect addition to the theory until an alternative hypothesis could explain the same observations without requiring them.
Sure, but in that regard dark matter is even more unsatisfying than (contemporary) epicycles, because not only does it add extra complexity, it doesn't even characterize the source of that complexity beyond its gravitational effects.
Even better, there are the "nightmare" scenarios where dark matter can only interact gravitationally with Standard Model particles.
Personally—and this is where I expect to lose the materialists that I imagine predominate on HN—I think we are already in a nightmare scenario with regard to another area: the science of consciousness.
The following seem likely to me: (1) Consciousness exists, and is not an illusion that doesn't need explaining (a la Daniel Dennett), nor does it drop out of some magical part of physical theory we've somehow overlooked until now; (2) Mind-matter interactions do not exist, that is, purely physical phenomena can be perfectly explained by appeals to purely physical theories.
Such is the position of "naturalistic dualist" thinkers like David Chalmers. But if this is the case, it implies that the physics of matter and the physics of consciousness are orthogonal to each other. Much like it would be a nightmare to stipulate that dark matter is a purely gravitational interaction and that's that, it would be a nightmare to stipulate that consciousness and qualia arise noninteractionally from certain physical processes just because. And if there is at least one materially noninteracting orthogonal component to our universe, what if there are more that we can't even perceive?
Doesn't that imply our theories are "good enough" for all practical purposes? If they're impossible to empirically disprove?
Typically whenever you look closely at an object with complex behavior, there is a system inside made of smaller, simpler objects interacting to produce the complexity.
You'd expect that at the bottom, the smallest objects would be extremely simple and would follow some single physical law.
But the smallest objects we know of still have pretty complex behavior! So there's probably another layer underneath that we don't know about yet, maybe more than one.
The point is not to make better predictions of the things we already know how to predict. The point is to determine what abstractions link the things we don't presently understand--because these abstractions tend to open many new doors in other directions. This has been the story of physics over and over: relativity, quantum theory, etc., not only answered the questions they were designed to answer but opened thousands of new doors in other directions.
Maybe? We seem to be able to characterize all the stuff we have access to. That doesn't mean we couldn't, say, produce new and interesting materials with new knowledge. Before we knew about nuclear fission, we had no way to predict that anything would happen with a big chunk of uranium, let alone its useful applications. New physics might be quite subtle or specific but still useful.
Yes, for all practical purposes. This is the position of physicist Sean Carroll and probably others. We may not know what is happening in the middle of a black hole, or very close to the big bang, but here on Earth we do.
"in the specific regime covering the particles and forces that make up human beings and their environments, we have good reason to think that all of the ingredients and their dynamics are understood to extremely high precision"[0]
0: https://philpapers.org/archive/CARCAT-33
ER=EPR says something completely shocking about the nature of the universe. If there is anything to it, we have almost no clue about how it works or what its consequences are.
Sean Carroll's own favorite topics (emergent gravity, and the many worlds interpretation) are also things that we don't have any clue about.
Yes there is stuff we can calculate to very high precision. Being able to calculate it, and understanding it, are not necessarily the same thing.
The theories don't answer all the questions we can ask, namely questions about how gravity behaves at the quantum scale. (These questions pop up when exploring extremely dense regions of space - the very early universe and black holes).
There are still huge gaps in our understanding: quantum gravity, dark matter, what happens before the Planck time, the thermodynamics of life, and many others.
Part of the problem is that building bigger colliders, telescopes, and gravitational wave detectors requires huge resources and very powerful computers to store and crunch all the data.
We're cutting research instead of funding it right now and sending our brightest researchers to Europe and China...
I think the problem is that GR and QFT are at odds with each other? (I am not quite versed in the subject and this is my high-level understanding of the “problem”)
Absolutely not. Newtonian physics was 'good enough' until we disproved it. Imagine where we would be if all we had was Newtonian physics.
Newtonian physics is good enough for almost everything that humans do. It's not good for predicting the shit we see in telescopes, and apparently it's not good for GPS, although honestly I think without general relativity, GPS would still get made but there'd be a fudge factor that people just shrug about.
For just about anything else, Newton has us covered.
You would still make it to the moon (so I've heard). Maybe you wouldn't have GPS systems?
If I had to guess, we are at a pre-Copernicus level in particle physics.
We are finding local maxima (induction), but the establishment cannot handle deduction.
Everything is an overly complex bandaid. At some point someone will find something elegant that predicts 70% as well, and at some point we will realize: 'Oh, that's great, the sun actually is at the center of the solar system; Copernicus was just slightly wrong in thinking the planets move in circles. We just needed to use ellipses!'
But with particles.
I find the idea that reality might be quantized fascinating, so that all information that exists could be stored in a storage medium big enough.
It's also kind of interesting how causality allegedly has a speed limit and it's rather slow all things considered.
Anyway, in 150 years we have absolutely come a long way; we'll figure that out eventually, but as always, figuring it out might lead to even bigger questions and mysteries...
If reality is quantized, how can you store all the information out there without creating a real simulation? (Essentially cloning the environment you want stored)
Here is one fact that seems, to me, pretty convincing that there is another layer underneath what we know.
The charge of the electron is -1 and that of the proton is +1. They have been experimentally measured, out to 12 digits or so, to have the same magnitude, just opposite sign. However, there is no theory for why this is -- they are simply measured, and that is it.
It beggars belief that these just happen to be exactly (as far as we can measure) the same magnitude. There almost certainly is a lower level mechanism which explains why they are exactly the same but opposite.
Technically, the charge of a proton can be derived from its two constituent up quarks and one down quark, which have charges 2/3 and -1/3 respectively. I'm not aware of any deeper reason why these should be simple fractional ratios of the charge of the electron; however, I'm not sure there needs to be one. If you believe the stack of turtles ends somewhere, you have to accept there will eventually be (hopefully simple) coincidences between certain fundamental values, no?
There does appear to be a deeper reason, but it's really not well understood.
Consistent quantum field theories involving chiral fermions (such as the Standard Model) are relatively rare: the charges have to satisfy a set of polynomial relationships with the inspiring name "gauge anomaly cancellation conditions". If these conditions aren't satisfied, the mathematical model fails pretty spectacularly: it won't be unitary, can't couple consistently to gravity, won't allow high- and low-energy behavior to decouple, and so on.
For the Standard Model, the anomaly cancellation conditions imply that the sum of electric charges within a generation must vanish, which they do:
3 colors of quark * (up charge 2/3 + down charge -1/3) + electron charge -1 + neutrino charge 0 = 0.
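A quick numerical check of that per-generation sum, as a minimal Python sketch (the charges and color multiplicities are just the Standard Model values quoted above):

    from fractions import Fraction as F

    # Electric charges of one Standard Model generation, in units of e,
    # with multiplicities (quarks come in 3 colors).
    generation = [
        (F(2, 3), 3),    # up quark, 3 colors
        (F(-1, 3), 3),   # down quark, 3 colors
        (F(-1), 1),      # electron
        (F(0), 1),       # neutrino
    ]

    total = sum(q * n for q, n in generation)
    assert total == 0    # the per-generation electric charges cancel exactly
    print(total)         # 0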
So, there's something quite special about the charge assignments in the Standard Model. They're nowhere near as arbitrary as they could be a priori.
Historically, this has been taken as a hint that the Standard Model should come from a simpler "grand unified" model. Particle accelerators and cosmology have turned up at best circumstantial evidence for these so far. To me, it's one of the great mysteries.
I'm aware of the charge coming from quarks, but my point remains.
> you have to accept there will eventually be (hopefully simple) coincidences between certain fundamental values, no?
When the probability of coincidence is epsilon, then, no. Right now they are the same to 12 digits, but that undersells it, because those are just the trailing digits. There is nothing which says the leading digits must be the same; e.g., one could be 10^30 times bigger than the other. Are you still going to just shrug and say "coincidence"?
That there are 26 fundamental constants and these two just happen to be exactly the same is untenable.
If you imagine the universe is made of random real fundamental constants rather than random integer fundamental constants, then indeed there's no reason to expect such collisions. But if our universe starts from discrete foundations, then there may be no more satisfying explanation to this than there is to the question of, say, why the survival threshold and the reproduction threshold in Conway's Game of Life both involve the number 3. That's just how that universe is defined.
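To make the analogy concrete, here is the standard B3/S23 update rule as a minimal Python sketch (the function name is just illustrative); the only point is that both thresholds contain a 3 purely by definition:

    def next_state(alive: bool, live_neighbors: int) -> bool:
        # Conway's Game of Life, rule B3/S23: a dead cell is "born" with exactly
        # 3 live neighbors; a live cell survives with 2 or 3. The 3s aren't
        # derived from anything deeper -- they are part of the definition.
        if alive:
            return live_neighbors in (2, 3)
        return live_neighbors == 3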
Why do you assume the two have to be small integers? There is nothing currently in physics which would disallow the electron to be -1 and the proton to be +1234567891011213141516171819. The fact they are both of magnitude 1 is a huge coincidence.
I'm not assuming they have to be small integers—I'm saying that if the universe is built on discrete rather than continuous foundations, then small integers and coincidences at the bottom-turtle theory-of-everything become much less surprising. You're treating the space of possible charge values as if it's the reals, or at least some enormous range, but I consider that unlikely.
Consider: in every known case where we have found a deeper layer of explanation for a "coincidence" in physics, the explanation involved some symmetry or conservation law that constrained the values to a small discrete set. The quark model took seemingly arbitrary coincidences and revealed them as consequences of a restrictive structure. auntienomen's point about anomaly cancellation is also exactly this kind of thing. The smallness of the set in question isn't forced, but it is plausible.
But I actually think we're agreeing more than you realize. You're saying "this can't be a coincidence, there must be a deeper reason." I'm saying the deeper reason might bottom out at "the consistent discrete structures are sparse and this is one of them," which is a real explanation, but it might not have the form of yet another dynamical layer underneath.
> you have to accept there will eventually be (hopefully simple) coincidences between certain fundamental values, no?
No. It’s almost certainly not a coïncidence that these charges are symmetric like that (in stable particles that like to hang out together).
Whence your confidence? As they say in math, "There aren't enough small numbers to meet the many demands made of them." If we assume the turtle stack ends, and it ends simply (i.e. with small numbers), some of those numbers may wind up looking alike. Even more so if you find anthropic arguments convincing, or if you consider sampling bias (which may be what you mean by, "in stable particles that like to hang out together").
> coïncidence
Nïce
Shrugging and calling it a coincidence is generally not an end state when figuring out how something works.
One argument (while unsatisfying) is that there are trillions of possible configurations, but ours is the one that happened to work, which is why we're here to observe it. Changing any of them even a little bit would result in an empty universe.
There’s a name for that: the Anthropic principle. And it is deeply unsatisfying as an explanation.
And does it even apply here? If the charge on the electron differed from the charge on the proton at just the 12th decimal place, would that actually prevent complex life from forming? Citation needed for that one.
I agree with OP. The unexplained symmetry points to a deeper level.
If it wasn't the case then matter wouldn't be stable.
Agreed (well, assuming the delta is more than a small fraction of a percent or whatever). But this is begging the question. If they are really independent then the vast, overwhelming fraction of all possible universes simply wouldn't have matter. Ours does have matter, so it makes our universe exceedingly unlikely. I find it far more parsimonious to assume they are connected by an undiscovered (and perhaps never to be discovered) mechanism.
Some lean on the multiverse and the anthropic principle to explain it, but that is far less parsimonious.
Is that actually true, if the charges differed at the 12th decimal place only? That’s non-obvious to me.
For a given calculation on given hardware, the 100th digit of a floating point decimal can be replicated every time. But that digit is basically just noise, and has no influence on the 1st digit.
In other words: There can be multiple "layers" of linked states, but that doesn't necessarily mean the lower layers "create" the higher layers, or vice versa.
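As a small illustration of that point (a Python sketch; the digits shown are just the exact decimal expansion of the IEEE double nearest to 0.1):

    from decimal import Decimal

    # Expand the IEEE-754 double closest to 0.1 into its exact decimal form.
    print(Decimal(0.1))
    # 0.1000000000000000055511151231257827021181583404541015625
    # The long tail is reproduced identically on every run on the same hardware,
    # yet it is representation noise: it says nothing about the leading digit.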
This is "expected" from theory, because all particles seem to be just various aspects of the "same things" that obey a fairly simple algebra.
For example, pair production is:

photon + photon = electron + (-)electron

You can take that diagram, rotate it in spacetime, and you have the direct equivalent, which is electrons changing paths by exchanging a photon:

electron + photon = electron - photon

There are similar formulas for beta decay, which is:

neutron = proton + electron + (-)neutrino

You can also "rotate" this diagram, or any other Feynman diagram. This very, very strongly hints that the fundamental particles aren't actually fundamental in some sense.

The precise why of this algebra is the big question! People are chipping away at it, and there's been slow but steady progress.
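A minimal sketch of the bookkeeping in those formulas (Python; only electric charge is tracked here, and "rotating" is reduced to the rule that a particle moved across the equals sign becomes its antiparticle):

    # Electric charges, in units of e, of the particles in the formulas above.
    Q = {"photon": 0, "electron": -1, "positron": +1,
         "proton": +1, "neutron": 0, "antineutrino": 0}

    def charge(side):
        return sum(Q[p] for p in side)

    # Pair production: photon + photon = electron + (-)electron
    assert charge(["photon", "photon"]) == charge(["electron", "positron"])

    # Beta decay: neutron = proton + electron + (-)neutrino
    assert charge(["neutron"]) == charge(["proton", "electron", "antineutrino"])

    # Moving a particle to the other side as its antiparticle flips the sign of
    # its charge, so the balances above are preserved under such "rotations".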
One of the "best" approaches I've seen is "The Harari-Shupe preon model and nonrelativistic quantum phase space"[1] by Piotr Zenczykowski which makes the claim that just like how Schrodinger "solved" the quantum wave equation in 3D space by using complex numbers, it's possible to solve a slightly extended version of the same equation in 6D phase space, yielding matrices that have properties that match the Harari-Shupe preon model. The preon model claims that fundamental particles are further subdivided into preons, the "charges" of which neatly add up to the observed zoo of particle charges, and a simple additive algebra over these charges match Feyman diagrams. The preon model has issues with particle masses and binding energies, but Piotr's work neatly sidesteps that issue by claiming that the preons aren't "particles" as such, but just mathematical properties of these matrices.
I put "best" in quotes above because there isn't anything remotely like a widely accepted theory for this yet, just a few clever people throwing ideas at the wall to see what sticks.
[1] https://arxiv.org/abs/0803.0223
> This is "expected" from theory, because all particles seem to be just various aspects of the "same things" that obey a fairly simple algebra.
But again, this is just observation, and it is consistent with the charges we measure (again, just observation). It doesn't explain why these rules must behave as they do.
> This very, very strongly hints that the fundamental particles aren't actually fundamental in some sense.
This is exactly what I am suggesting in my original comment: this "coincidence" is not a coincidence but falls out from some deeper, shared mechanism.
> this is just observation
Sure, but that's fundamental to observing the universe from the inside. We can't ever be sure of anything other than our observations because we can't step outside our universe to look at its source code.
> It doesn't explain why these rules must behave as they do.
Not yet! Once we have a theory of everything (TOE), or just a better model of fundamental particles, we may have a satisfactory explanation.
For example, if the theory ends up being something vaguely like Wolfram's "Ruliad", then we may be able to point at some aspect of very trivial mathematical rules and say: "the electron and proton charges pop out of that naturally, it's the only way it can be, nothing else makes sense".
We can of course never be totally certain, but that type of answer may be both good enough and the best we can do.
Aren’t things like this usually explained by being the only viable configuration, or is that not the case here?
Or why the quarks that make up protons and neutrons have fractional charges, with +1 protons mixing two +2/3 up quarks and one -1/3 down quark, and the neutral neutron is one up quark and two down quarks. And where are all the other Quarks in all of this, busy tending bar?
They have fractional charges because that is how we happen to measure charge. If our unit of charge had been set when we knew about quarks, we would have chosen those as fundamental, and the charge of the electron would instead be -3.
Now, the ratios between these charges appear to be fundamental. But the presence of fractions is arbitrary.
Isn’t charge quantized? Observable isolated charges are quantized in units of e. You can call it -3 and +3 but that just changes the relative value for the quanta. The interesting question is still why the positive and neutral particles are nonelementary particles made up of quarks with a fraction of e, the math made possible only by including negatively charged ones (and yet electrons are elementary particles).
> If our unit of charge had been set when we knew about quarks, we would have chosen those as fundamental, and the charge of the electron would instead be -3.
Actually, I doubt it. Because of their color charge, quarks can never be found in an unbound state but instead in various kinds of hadrons. The ways that quarks combine cause all hadrons to end up with an integer charge, with the ⅔ and -⅓ charges on various quarks merely being ways to make them come out to resulting integer charges.
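A quick sketch of that arithmetic in Python (the quark content is the standard one; the "multiply by 3" line is just the hypothetical rescaled unit of e/3 that the parent comments are discussing):

    from fractions import Fraction as F

    quark_charge = {"u": F(2, 3), "d": F(-1, 3)}   # in units of e

    def hadron_charge(quarks):
        return sum(quark_charge[q] for q in quarks)

    print(hadron_charge("uud"))   # 1  -> proton
    print(hadron_charge("udd"))   # 0  -> neutron

    # In a hypothetical unit of e/3 the quark charges become the integers +2 and -1,
    # and the electron's charge becomes -3: same physics, different bookkeeping.
    print(hadron_charge("uud") * 3, hadron_charge("udd") * 3, -1 * 3)   # 3 0 -3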
There are layers science cannot access.
Experimental particle physicist here. It's just hard.
I measured the electron's vector coupling to the Z boson at SLAC in the late 1990s, and the answer from that measurement is: we don't know yet - and that's the point.
Thirty years later, the discrepancy between my experiment and LEP's hasn't been resolved.
It might be nothing. It might be the first whisper of dark matter or a new force. And the only way to find out is to build the next machine. That's not 'dead', that's science being hard.
My measurement is a thread that's been dangling for decades, waiting to be pulled.
I never liked that the physics community shifted from 'high energy' particle physics (the topic of the article) to referring to this branch as just 'particle physics', which I think leaves the impression that anything to do with 'particles' is now a dead end.
Nuclear physics (i.e., low/medium-energy physics) covers diverse topics, many with real-world applications - yet it travels with a lot of the same particles (i.e., quarks, gluons). Because it is so diverse, it is not dead/dying in the way HEP is today.
I am sure others will say it better, but the cat-in-the-box experiment is a shockingly bad metaphor for the idea behind quantum states and observer effect.
I will commit the first sin by declaring, without fear of contradiction, that the cat actually IS either alive or dead. It is not in a superposition of states. What is unknown is our knowledge of the state, and what collapses is that uncertainty.
If you shift this to the particle, not the cat, what changes? Because if very much changes, my first comment about the unsuitability of the metaphor is upheld, and if very little changes, my comment has been disproven.
It should be clear I am neither a physicist nor a logician.
Along similar lines, the double-slit experiment seems simple. Two slits let light through and you get bands where they constructively or destructively interfere, just like waves.
However, I still find it crazy that when you slow down the laser and one photon at a time goes through either slit, you still get the bands. Which raises the question: what exactly is it constructively or destructively interfering with?
Still seems like there's much to be learned about the quantum world, gravity, and things like dark energy vs MOND.
I had a conversation about this on HN some months back. It's a surprisingly modern experiment: it demands an ability to reliably emit single photons. Young's experiment may date to 1800, but single-photon emission is 1970-80.
(This is what I was told while exploring my belief that the fringes had always been seen in streams of photons, not emerging over repeated runs of single photons. I was wrong.)
The use of "AI" in particle physics is not new. In 1999 they were using neural nets to compute various results. Here's one from Measurement of the top quark pair production cross section in p¯p collisions using multijet final states [https://repository.ias.ac.in/36977/1/36977.pdf]
"The analysis has been optimized using neural networks to achieve the smallest expected fractional uncertainty on the t¯t production cross section"
I remember back in 1995 or so being in a professor's office at Indiana University and he was talking about trying to figure out how to use Neural Networks to automatically track particle trails in bubble chamber results. He was part of a project at CERN at the time. So, yeah, they've been using NNs for quite awhile. :-)
It's impossible to tell without opening the box the particle physics is in.
One interesting gap in the standard model is why neutrinos have mass: https://cerncourier.com/a/the-neutrino-mass-puzzle/
Isn't it the mathematics that is lagging? Amplituhedron? Higher dimensional models?
Fun fact: I got to read the thesis of one of my uncles, who was a young professor back in the '90s, right when they were discovering bosons. They were already modelling them as tensors back then. And probably multilinear transformations.
Now that I am grown I can understand a little more, I was about 10 years old back then. I had no idea he was studying and teaching the state of the art. xD
Tensors are pretty old in physics; they are a central concept in Einstein's General Relativity.
You can find tensors even in some niche stuff in macroeconomics.
Tensors are like 200 years old in mathematics. Gauss talked about Tensors.
I find the arguments from those who say there is no crisis convincing. Progress doesn't happen at a constant rate. We made incredible, unprecedented progress in the 20th century. The most likely scenario is that it slows down for a while. Perhaps for hundreds of years again! Nobody can know. We are still making enormous strides compared to most of scientific history.
Although we do have many more people working on these problems now than at any time in the past. That said, science progresses one dead scientist at a time, so it might still take generations for a new golden era.
Maybe it's time for physicists to switch to agile? Don't try to solve the theory of the Universe at once; that's the waterfall model. Try to come up with just a single new equation each sprint!
Information content of the article:
The discovery of the Higgs boson in 2012 completed the Standard Model of particle physics, but the field has since faced a "crisis" due to the lack of new discoveries. The Large Hadron Collider (LHC) has not found any particles or forces beyond the Standard Model, defying theoretical expectations that additional particles would appear to solve the "hierarchy problem"—the unnatural gap between the Higgs mass and the Planck scale. This absence of new physics challenged the "naturalness" argument that had long guided the field.
In 2012, physicist Adam Falkowski predicted the field would undergo a slow decay without new discoveries. Reviewing the state of the field in 2026, he maintains that experimental particle physics is indeed dying, citing a "brain drain" where talented postdocs are leaving the field for jobs in AI and data science. However, the LHC remains operational and is expected to run for at least another decade.
Artificial intelligence is now being integrated into the field to improve data handling. AI pattern recognizers are classifying collision debris more accurately than human-written algorithms, allowing for more precise measurements of "scattering amplitude" or interaction probabilities. Some physicists, like Matt Strassler, argue that new physics might not lie at higher energies but could be hidden in "unexplored territory" at lower energies, such as unstable dark matter particles that decay into muon-antimuon pairs.
CERN physicists have proposed a Future Circular Collider (FCC), a 91-kilometer tunnel that would triple the circumference of the LHC. The plan involves first colliding electrons to measure scattering amplitudes precisely, followed by proton collisions at energies roughly seven times higher than the LHC later in the century. Formal approval and funding for this project are not expected before 2028.
Meanwhile, U.S. physicists are pursuing a muon collider. Muons are elementary particles like electrons but are 200 times heavier, allowing for high-energy, clean collisions. The challenge is that muons are highly unstable and decay in microseconds, requiring rapid acceleration. A June 2025 national report endorsed the program, which is estimated to take about 30 years to develop and cost between $10 and $20 billion.
China has reportedly moved away from plans to build a massive supercollider. Instead, they are favoring a cheaper experiment costing hundreds of millions of dollars—a "super-tau-charm facility"—designed to produce tau particles and charm quarks at lower energies.
On the theoretical side, some researchers have shifted to "amplitudeology," the abstract mathematical study of scattering amplitudes, in hopes of reformulating particle physics equations to connect with quantum gravity. Additionally, Jared Kaplan, a former physicist and co-founder of the AI company Anthropic, suggests that AI progress is outpacing scientific experimentation, positing that future colliders or theoretical breakthroughs might eventually be designed or discovered by AI rather than humans.
Theoretical physics progresses via the anomalies it can't explain.
The problem is that we've mostly explained everything we have easy access to. We simply don't have that many anomalies left. Theoretical physicists were both happy and disappointed that the LHC simply verified everything--theories were correct, but there weren't really any pointers to where to go next.
Quantum gravity seems to be the big one, but that is not something we can penetrate easily. LIGO just came online, and could only really detect enormous events (like black hole mergers).
And while we don't always understand what things do as we scale up or in the aggregate, that doesn't require new physics to explain.
Please do not conflate the broad "theoretical physics" with the very specific "beyond the standard model" physics questions. There are many other areas of physics with countless unsolved problems/mysteries.
Neutrino mass is another anomaly, which is at least slightly easier to probe than quantum gravity: https://cerncourier.com/a/the-neutrino-mass-puzzle/
Is it more that even the most dedicated and passionate researchers have to frame their interests in a way that will get funding? Particle physics is not the thing those with the cash will fund right now. AI and QC are the focus.
Well, it's hard to make an argument for a $100 billion collider when your $10 billion collider didn't find anything revolutionary.
Scaling up particle colliders has arguably hit diminishing returns.
It's kind of legitimate, but it's kind of sad to see some of the smartest people in society just being like "maybe AI will just give me the answer," a phrase that has a lot of potential to be thought-terminating.
That's mentioned in the article too:
>Cari Cesarotti, a postdoctoral fellow in the theory group at CERN, is skeptical about that future. She notices chatbots’ mistakes, and how they’ve become too much of a crutch for physics students. “AI is making people worse at physics,” she said.
This. Deep understanding of physics involves building a mental model & intuition for how things work, and the process of building is what gives you the skill to deduce & predict. Using AI to just get the answers directly prevents building that "muscle" strength...
AI chatbots are also making people better at physics, by answering questions the textbook doesn't or the professor can't explain clearly, patiently. Critical thinking skills are critical. Students cheating with chatbots might not have put in the effort to learn without chatbots.
I'm quite happy that it might give me, with pre-existing skills, more time on the clock to stay relevant.