Fields aren't quantized; that's the problem, and that's why gravity can't be quantized. It's a field!
Immediately, dozens of people are about to hit the reply button and say something like: "But all of modern physics is based on quantization!"
The issue is that there are two ways to think of quantization, and for almost all practical experiments, there's no way to distinguish between the two.
Essentially, you can think of either the fields themselves being quantized, or the interactions with fields being quantized. Unfortunately, almost all experiments measure only interactions with fields, so there are few practical ways to distinguish between the two. The mathematics is largely equivalent as well, so physicists "picked one" of the two options and forgot about the other, equally valid option.
This is similar to the Veritasium video titled "Why No One Has Measured The Speed Of Light".[1] From inside the universe, it's basically impossible to measure the one-way speed of light; all experiments measure the two-way speed of light. So... we just assume it's the same speed both ways. Right now, that works well enough. If it stops working, then we need to revisit that assumption instead of devising ever more complex mathematics to explain away our faulty assumption.
One difference between the "fields are quantized or not" issue and the "one-way speed of light" issue is that the former is testable. It's just that most experiments don't happen to test it.
Any experiment that uses atomic orbitals conflates the quantization of atomic orbital energy levels with field quantization. There are experiments that don't involve orbitals, such as free-electron lasers or radio waves. All such experiments show no quantization of fields.
Unfortunately, those experimental results are cheerfully ignored and hand-waved away. But stop and think about it: how exactly would you model a five-kilometre-long radio wave quantum as a point? How could something like that be instantaneously absorbed? How would the "rest of the wave" know that the tiny detector had made it vanish? It's madness, clearly, but that hasn't stopped thousands of physicists from using this flawed model of quanta buzzing around in free space, because for short wavelengths like UV light you can make the mathematics work without paradoxes.
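To put a number on how diffuse such a quantum would be, here's a back-of-envelope sketch using the textbook relation E = hf = hc/λ. The constants are standard values; the 5 km wavelength is just the example above:

```python
# Energy the standard photon picture assigns to ONE quantum of a
# 5 km radio wave, via E = h*f = h*c/wavelength.
h = 6.626e-34    # Planck constant, J*s
c = 2.998e8      # speed of light, m/s
eV = 1.602e-19   # joules per electronvolt

wavelength = 5_000.0      # metres, the example above
f = c / wavelength        # ~6e4 Hz (60 kHz)
E = h * f                 # ~4e-29 J

print(f"frequency:     {f:.3g} Hz")
print(f"photon energy: {E:.3g} J = {E / eV:.3g} eV")
```

That comes out to roughly 2.5e-10 eV per quantum, about eight orders of magnitude below thermal noise at room temperature (kT ≈ 0.025 eV), which is part of why nobody detects individual radio photons the way they detect individual UV photons.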
The channel Huygens Optics has some great videos[2] on the topic.
Another aspect of quantization is that for any wave in a continuum, you can model its behaviour in several mathematically equivalent ways that all arrive at the same numerical result but "paint a different picture" in the imagination. For example, rendering something like light waves or sound waves bouncing around in a room can be done in two distinct ways: either with local simulations, where little oscillators at each volumetric point interact only with their neighbours, or with points moving around the space, bouncing like particles and carrying properties with them such as intensity, wavelength, and phase.
The former is the "wave pool" approach; the latter is the "Monte Carlo" approach.
All of modern computer raytracing graphics uses the latter, not the former. Why? Because it requires less memory. The former scales as x^3 as the side length 'x' of the volume goes up; the latter requires only x^2 memory to accumulate the rays on the surfaces of the simulated volume, ignoring the intermediate states in the middle.
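Here's a toy sketch of that memory argument in Python. The ray budget and the 8-floats-per-ray layout are made-up illustrative choices, not how any real renderer stores things:

```python
FLOAT = 4  # bytes per float32 field value

# "Wave pool": a field amplitude at every cell of an n*n*n room.
def wave_pool_bytes(n):
    return n**3 * FLOAT                  # O(n^3) memory

# "Monte Carlo": a fixed budget of rays (position, direction,
# intensity, phase = 8 floats each), plus accumulation buffers
# only on the six 2D wall faces we actually want rendered.
def monte_carlo_bytes(n, n_rays=100_000):
    return n_rays * 8 * FLOAT + 6 * n**2 * FLOAT   # fixed + O(n^2)

for n in (64, 256, 1024, 4096):
    print(f"n={n:5d}:  wave pool {wave_pool_bytes(n)/1e6:12.1f} MB"
          f"   monte carlo {monte_carlo_bytes(n)/1e6:8.1f} MB")
```

At n=64 the two are comparable; by n=4096 the wave pool needs hundreds of gigabytes while the ray version still fits comfortably in RAM. Nothing about the physics changed, only the bookkeeping.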
Richard Feynman's QED is famously successful, and in large part it is practical because it uses the more efficient Monte Carlo simulation approach, treating fields as little particles bouncing around. This has entrenched the "fields are made up of little quanta" picture in the minds of entire generations of physicists.
It's just a mathematical trick! An efficient way to do integration! It's not the One True Path, nor an insight into the truth of the universe.
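To make the "integration trick" point concrete with the most ordinary example possible (nothing QED-specific, just a 1D integral whose exact answer is 2):

```python
import math, random

# One integral, two "pictures": a deterministic grid sum that visits
# every slice of the domain, and a Monte Carlo estimate built from
# random samples thrown into it. Same number, different story.
f = math.sin
a, b = 0.0, math.pi   # integral of sin(x) over [0, pi] is exactly 2

n = 100_000
width = (b - a) / n

# "Field" picture: sum a value at every point of the domain.
grid = sum(f(a + (i + 0.5) * width) for i in range(n)) * width

# "Particle" picture: random samples carry the estimate.
random.seed(1)
mc = (b - a) * sum(f(random.uniform(a, b)) for _ in range(n)) / n

print(f"grid: {grid:.5f}   monte carlo: {mc:.5f}   exact: 2")
```

Both converge on 2. Neither computation tells you what the integrand "really is"; picking the sampling method for its efficiency says nothing about the nature of the thing being integrated.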
We all need to take a step back and revisit our assumptions, and try to get away from thinking of integration tricks as having explanatory power in and of themselves.
> Essentially, you can think of either the fields themselves being quantized, or the interactions with fields being quantized. Unfortunately, almost all experiments measure only interactions with fields, so there are few practical ways to distinguish between the two. The mathematics is largely equivalent as well, so physicists "picked one" of the two options and forgot about the other, equally valid option.
That's basically what Quantum Field Theory is, though. Fundamentally, everything is treated as a field, and we focus on the nature of interactions within that field, which results in particle-like behavior.
> Unfortunately, those experimental results are cheerfully ignored and hand-waved away. But stop and think about it: how exactly would you model a five-kilometre-long radio wave quantum as a point? How could something like that be instantaneously absorbed?
Well, there are many acausal things in the quantum world, right? So it would work the same as any other "spooky action at a distance". Somehow it turns out fine, because it can't be used to transmit information.
As for the rest of the comment, I'm not well-read enough to have an opinion.
[1] https://www.youtube.com/watch?v=pTn6Ewhb27k
[2] https://youtu.be/ExhSqq1jysg?t=283