I had a great conversation this evening with a marine biologist studying the recent collapse of a 100,000 sq km California kelp forest. The grants for the science of ecosystem stewardship are in the $10k-50k range.
Over the past few years, something killed over 5 billion giant starfish and “science” has no idea what did it. Without the starfish, the sea urchin population exploded, turning kelp forests into deserts.
I love the LHC. But we need to seriously grow the science pie and prioritize the science of ecosystem collapse and management. There simply aren’t enough trained scientists.
---
[..] we do not have a fixed pot of money that we sit down and say, “What is the best way we can spend this on research?”
It’s not like there’s a fixed number of dollars and we say, “Okay, biology gets so many and chemistry gets so many, and particle physics gets so many.” This is a lesson that has been beaten home over and over again when we’ve had an expensive science project that people have campaigned against on the theory that the money could be better spent elsewhere, and the project gets cancelled, and guess what, the money does not get spent on science at all ’cause there’s no rule that the amount of money spent on science has to be fixed
---
This is not accurate. There are no fixed annual amounts for anything the US Congress passes, notwithstanding the fact that the LHC is an international project.
What you're ignoring is the reasoning and the process behind that decision. Of course it matters what it is that needs funding and how effectively the people requesting the funds are making their case.
If it were a competition between English literature and Greek philosophy, you can be absolutely certain that the fixed amount would be a tiny fraction of what it is now.
Curing cancer, infinite amounts of cheap energy, staying competitive, winning against China both economically and militarily: do you really think the size of the pie would be the same without these things?
Interesting point. So, hopefully the science of environmental stewardship gets a name like “Environmental Engineering” or “Resilience Engineering” from the NSF and grows the pie.
Because the costs of not adapting to climate-related environmental change will be astronomical. I mean, would you say climate adaptation is on par with the importance and urgency of curing cancer? The arguments would be interesting.
No, all countries with large science programs use similar processes for total fund size and suballocations (dunno about China, but Japan and the EU are quite similar, apart from some unimportant details).
Ultimately there is a fixed amount of effort people can expend. However, the science budget isn't fixed, in the sense that projects like the LHC can generate the excitement etc. to enlarge the overall pie for science (taking away from something else, of course, though that could be time spent at home unemployed).
Most science is done through small incremental advances that add up over time. One of the problems with that is that it's rather hard for politicians to see or understand.
Big projects capture the attention. Though obviously big projects can also, if they fail, damage the reputation of experimental science as a whole.
So yes, the LHC probably isn't the best use of money; there are more pressing immediate concerns. On the other hand, everyone has heard of it: it attracts funding and people to science (and does good science, of course).
Like the NSF, a “Global Science Foundation” is a vision we should all promote. There is a big need for supporting global science. It would have so many follow-on benefits culturally, politically, economically, ecologically…
See the SSC as an example. Note also the NIH is a massive fund ($25+ billion in grants per year, all bio). The NSF is a better example because they have to fund many fields. The DOE has fixed funding for nuclear, plus soft funds for scientists in many fields (they funded the human genome long before the NIH).
I think about this with space exploration as well. What is the most efficient and practical way to tackle large exploration and science projects that may not see fiscal profit in the near term? I’m a fiscally conservative capitalist, but I want to find a way to do all the science and help the public have a higher risk tolerance. My family worked on Apollo, and I just worry our generation’s tolerance for risk and ambition of that sort is sapped. Or that a WWIII is a necessary trigger for an age of growth, as WWI and WWII were.
The amount of money that goes into these unusable results is just sad. This and astrophysics of remote galaxies.
Either we have a breakthrough that shakes physics and leads to brand new discoveries that further shake engineering, or it is a waste of time. Between ~1870 and ~1950 we had fireworks of discoveries that were shaking our view of the world. And explained transistors.
Today? Nothing special except for a bunch of people and PhD students.
We are just starting to scratch the surface of biology - this is where we need to invest. Solid state physics could be another one. But not particle physics with their crazy energies that are completely unreachable outside of an accelerator.
For context, I have a PhD done at CERN. If I were to choose again, I would go for bioinformatics or biophysics.
I have a PhD done at CERN, too, and I can state that your remark is idiotic.
Money was not spent in order to confirm that the Higgs boson existed: rather, it was invested in order to improve technology to such a degree that we could have an answer. That is what people paid for, and that is what humanity will enjoy.
Whether the Standard Model is validated or challenged, it matters as much as a grain of sand on top of the Himalayas.
> it was invested in order to improve technology to such a degree that we could have an answer. That is what people paid for, and that is what humanity will enjoy.
What part exactly is humanity going to enjoy? The 8 T magnet? Where do you envision using the technology to drive particles?
Or, to be fair, which technology from the LHC was ever used for humanity?
Anything you do in biology today can have a direct effect OTOH.
For what it’s worth, a very close friend of mine got his Physics PhD at CERN in the late 80s (high energy particle physics) where he witnessed the birth of HTTP.
This experience, coupled with the US government’s decision to not build an equivalent super collider in the 90s, led him to switch to computer science, where he had a bit of a “head start” due to his experience at CERN.
He always said the data analysis, measurement, and collection stuff was more advanced than anything else in the world at the time. The original “big data”. But in 2022, this is less of a distinguishing feature in his opinion. Also, Bell Labs probably faced similar issues.
I asked, and he agrees that he’d do molecular biology if he were starting over today. So you’re not at all crazy!
This is very similar to my trajectory (PhD in the mid to late 90's).
I made friends with some people working in the computer center (the famous 513 at the time, may have changed by now). I discovered UNIX, then Linux and I was sold. During my PhD I started to work for the industry in IT and never stopped since then.
CERN had an outstanding computer system, they invented the "grid". As for HTTP, it just happened to be invented at CERN when Tim Berners-Lee was working there (there was nothing special related to CERN). The hardware-software interface was bleeding edge technology (built into the detectors) and the data bandwidth capacities were ahead of their times.
All of this, however, is not particle physics, and if CERN did not exist there would not be many "holes" in today's computers (as opposed, say, to DARPA or Linus).
Very high field magnets are critical for MRIs, tokamak fusion reactors, mass spectrometers, and probably many other uses if we consider pure science as “useful”.
Actually particle accelerators somewhat funded the early investment in high field superconducting magnets which might now appear to yield improvements in fusion energy Q factors due to the strong dependence on field intensity [1]
Of course it has to be useful. Based on that we build our world (through engineering), and if we work on something that cannot be used then it does not make sense. I do not see any use for quantum foam or quasars any time soon.
As for magnets: this certainly helps, but has CERN made any breakthroughs that were used afterwards? They had at some point the world record for the density of a magnetic field. There is no practical use for that, neither in MRIs (which require a lower field and have been in use for many years already), nor in tokamaks (where the homogeneity of the field is paramount on larger scales compared to the ones at CERN).
I am not saying that the engineering work done at CERN is useless - it is just that the money poured there goes primarily into some "whose is bigger" contest that has exactly zero chance of being used.
> the only things you can know are the things how they actually are
Yes, but investing MM€ to know something that has no use does not make sense if there is much more important knowledge competing for the fund. I guess that knowing how to cure MS is more important than discovering a four-quark particle, right? In an ideal world we could do everything, but we have to choose wisely because the amount of funds is limited.
> A PhD from CERN should not confuse science with engineering.
> further shake engineering, or it is a waste of time. Between ~1870 and ~1950 we had a fireworks of discoveries....
... which started with a simple observation. 200 years ago, in 1820, Ørsted was giving a lecture when he noticed that a current-carrying wire deflects a compass needle. Of course, he had no idea what that would lead to, nor did anyone else. It was a completely 'pure', 'unusable' observation. But, as a student of Kant (what a waste of time!), he shared it in a four-page pamphlet.
Sure, that doesn't help. Most scientists don't do it for the money though, so that doesn't have to be an issue per se. The barrier to entry for becoming a scientist, however, is extremely high and not filled with equal opportunity.
One example is a female friend of mine with two Master's degrees who's currently not getting paid a lot in a government job. She would love to do a PhD and start doing research in the areas she's been educated in and worked in, but she hardly gets invited to interviews, and when she does, she never seems to get the job, because she lacks research experience. The longer she works outside of research, the less of a chance she has to actually get hired for a research position.
I understand that there have to be high barriers to entry, because quality needs to be high, but I feel like the balance currently might not be right.
Believe it or not, scientists need to eat, want to own homes, have vacation, and buy things like everyone else. There's a selection bias in "scientists don't do it for money"--the scientists that don't do it for the money tend to be the most successful because they're willing to undercut their competition. They're the ones who will invest significant amounts of their life towards an area of research, spend their weekends and nights pursuing some goal, sometimes even put family second. They are the entrepreneurs of science.
There's a lot of other scientists who are just as skilled, may have just as good ideas, as so on. They view their work as work, enjoyable for some or just a job for others (though these are very few as by this point they just find something less stressful that pays).
So it's not that the lack of money doesn't help; it basically shapes the entire labor pool of scientists, selecting for the most passionate, the ones who might not be making much more than a small restaurant manager once you normalize their time spent to comp. Skilled scientists who want a life and see other opportunities tend to leave science, as it's high risk and low reward, and apply their skill elsewhere. It's a big loss to science.
If you look at CS, the research community is almost starving, as most leave for big tech or others that can comp 3-5x what academia can provide. I suspect big pharma is hit the same way.
Funding needs to come first then extra scientists if budgets permit. The problem in immunology as far as I can see from a close onlooker perspective is there is less and less funding. Staff get by on meagre wages, hoping to hold onto their role as budgets get cut and grants get smaller.
Kinda sad when the government wastes millions in the funding of worthless R&D projects. I've never been more disappointed to see that such a huge portion of the whitepapers in my field aren't worth the paper they're printed on. They have also spent like $40M on certain rounds of funding to groups and it doesn't go anywhere as anyone in industry can clearly see that the projects are completely useless and don't solve any kind of real industry problem besides getting large amounts of cash.
To grow the science pie you need to shrink the military pie. To shrink the military pie you need to make profiting off of science much more lucrative than profiting off of slaughter.
I get where you're coming from, but honestly, I'm not sure there's even that much logic to it.
The US federal government seems to have a largely arbitrary budget. No real experience or rational principle restrains it. We have never seen a clear case of negative consequences related to spending too much, so no one is quite sure how much might be too much. Sometimes a politician sees political advantage in grandstanding about fiscal responsibility, but seconds later, that same politician will be found funneling some arbitrary number of billions of $$$ in pork to their district.
I suspect the truth is that politicians mostly just don't care that much about science because they aren't smart enough to understand its value, or decent enough to care.
We could and should 10x the science budget, and it would have vastly more impact than almost everything else the government spends money on. But we don't because politicians see it as some amusement to fritter a limited amount of money on. Occasionally Republicans will publish a list of ridiculous scientific projects and no one wants to be caught in the crosshairs of that -- even though tens of billions are probably being simultaneously spent on murdering people in a country you've never heard of, or flushing thousands of tons of soybeans down a toilet.
Defense-related science gets an exception to this, which is why a big war is often also a big leap forward for science.
No, politicians are very (street) smart. Science produces results in the long-term, but elections are every x years and a career is maybe y years, all of which are shorter than the time in which science will show definitive results. So politicians will just look out for themselves as much as they can. The reason militaries get funded is because it plays on the basest fear in people and the money makes it back into politicians' pockets fast without having to show any more results than a pile of rubble.
You're right however, that nowadays it's structural that scientists request funding through defense, like of course it needs that cover.
Building housing and infrastructure can be a jobs program. Science itself (e.g. Apollo program, even though that's dual use) can be a massive jobs program. Being the security apparatus of the world as a pure play though is a choice, and a rather unique one.
> I suspect the truth is that politicians mostly just don't care that much about science because they aren't smart enough to understand its value, or decent enough to care.
Most politicians understand that humanity is never going to leave earth and so all of this stuff is basically pointless to the actual business of governing. So we will miss out on the next MRI or microwave because we missed some new technology... so what? How does that make their lives, personally, worse in any practical way?
Humanity is going to melt down in the pretty near future here... if the Russia stuff doesn't escalate, and the US doesn't slide into fascism, there will be yet another wave of conflicts tomorrow due to extremity from climate change or something else. Water is going to become extremely scarce, habitable lands are going to become uninhabitable and undesirable tundra is going to become extremely desirable, and countries will fight for ownership of the reshuffled deck.
Humanity has passed the peak of this enlightenment cycle and is heading for a new dark age, maybe the Forever Dark Age given the depletion of most of the accessible reserves of energy and critical materials. The best-case scenario is that we wind down to some steady-state existence with some level of mechanization/etc, but the exponential growth thing mathematically cannot continue indefinitely. The Earth does have physical limits, and just because Malthus was wrong about it in 1798 doesn't mean you can grow at an exponential rate in a finite system forever.
Who cares about space? We're not going there. And their job is making themselves comfy and keeping things going mostly straight in the meantime. Science really doesn't matter in that context as long as someone else isn't drastically ahead of you such that it produces a military advantage, they don't care about the absolute advancement, only the relative to everyone else.
I'm not saying this is a good thing, I'm saying this is how it is.
> the exponential growth thing mathematically cannot continue indefinitely.
It's not continuing indefinitely. Since the 1960s, women have been drastically curtailing their reproduction when given education and access to contraceptives. Fertility is above replacement only in Sub-Saharan Africa and a couple other places. Global fertility is projected to drop below replacement in this decade, leading to a population peak and decline a few decades later.
People are constantly predicting doomsday because they think humanity deserves it. Maybe we do, but by my reckoning, there is no guarantee of our imminent demise just yet.
it's extremely humorous how "trend-chasing" the small arms market was at that time. First cartridges... then brass cartridges... then centerfire... then brass... then repeating weapons... then high-velocity cartridges... then spitzer point bullets... it's almost a Pentagon Wars style story chasing the latest technological developments and often trying to modify your existing inventory to fit the latest need.
The Krag didn't have spitzer-point bullets so it had to go, despite poor tactics really being more at play than the rifles themselves. And the Krag was banned from the service-rifle category of the influential National Match shooting competition after some humiliating upsets showed this.
Does the 10k-50k "grants for the science of ecosystem stewardship" refer to the sum of all grants to "the science of ecosystem stewardship", or just what that marine biologist got for his particular study? Is his the only team doing a study in this field, or are there other teams doing similar ones with separate grants?
I always wonder if the implicit telos of modern science is space colonization (leave this planet behind), and what the ramifications of that mindset is. Is this a dogma that can be challenged?
We've still barely scratched the surface of ecological sciences
Yes, and it's the same telos driving all life, which has spread to every possible corner of the globe, and now will spread throughout the galaxy. I don't think anything that fundamental should be challenged.
Such a thing sounds disastrous. Was a private campaign organized for donations? I think you'd have a better chance convincing private citizens at this point than any government.
What changed to make a slew of new discoveries possible? Is it pure chance like Bitcoin mining? If so, what's the chance of discovering nothing for years and then, like London buses, three all come along at once?

What are the implications of a new "particle zoo"? Can I do anything with these, like build new atoms, or use them to detect something?
The change is the gradual accumulation of statistics. These are relatively rare events. The LHC has been running, high-energy proton-proton collisions have been occurring, and the LHCb detector in this case has been measuring them. The statistics increase, and eventually the characteristic peaks of short-lived resonances can be identified above the noise of "background" collisions.
I think the goal of this work is to understand the nature of the strong force. Quantum chromodynamics (QCD) is pretty difficult as far as quantum field theories go; it's strongly coupled, meaning that making first-principles predictions of what to expect is really tough. It's a huge computational effort being run on some of the biggest computers on the planet (lattice QCD).
We observe that all the hadrons in experiment are "colour singlets" meaning that the colour charge of QCD is hidden. These are usually three-quark states (protons, neutrons, etc) or quark-antiquark states (pions, kaons, etc). There are many other ways of making "colour singlets". For example, these tetra and pentaquark combinations. There are also "hybrids" made of a gluon and some combination of quarks. There is some evidence on both experimental and theoretical sides for at least a few of these hybrids. Glueballs are also possible, states made entirely of gluons, but there is only really theoretical evidence for these so far in specific limits. We just don't know if they exist in reality.
Everything is made of this stuff. Most of the mass around us comes from the strong interactions. It's important to understand it.
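A toy sketch of that "peaks emerging from background" idea. All numbers here are invented for illustration (a flat background with a rare Gaussian bump, and a crude S/sqrt(B) estimate; real LHCb analyses fit full likelihoods), but it shows how the same tiny signal fraction goes from invisible to obvious as statistics accumulate:

```python
import math
import random

random.seed(42)

SPAN = (3800.0, 3950.0)    # invariant-mass window in MeV (made-up numbers)
PEAK, WIDTH = 3872.0, 5.0  # hypothetical resonance position and width

def simulate(n_events, signal_frac=0.01):
    """Toy spectrum: flat background plus a rare Gaussian bump."""
    masses = []
    for _ in range(n_events):
        if random.random() < signal_frac:
            masses.append(random.gauss(PEAK, WIDTH))
        else:
            masses.append(random.uniform(*SPAN))
    return masses

def significance(masses, signal_frac=0.01):
    """Crude S/sqrt(B): excess counts in a +-2 sigma window over expected background."""
    lo, hi = PEAK - 2 * WIDTH, PEAK + 2 * WIDTH
    n_win = sum(lo <= m <= hi for m in masses)
    # expected background counts in the window, from the known flat rate
    bg = len(masses) * (1 - signal_frac) * (hi - lo) / (SPAN[1] - SPAN[0])
    return (n_win - bg) / math.sqrt(bg)

# Significance grows roughly like sqrt(n_events).
for n in (1_000, 10_000, 100_000):
    print(n, "events ->", round(significance(simulate(n)), 1), "sigma")
```

With 1,000 events the bump is lost in the noise; with 100,000 it stands out clearly, which is the whole "accumulate statistics" game.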
> The change is the gradual accumulation of statistics. These are relatively rare events. The LHC has been running, high-energy proton-proton collisions have been occurring, and the LHCb detector in this case has been measuring them. The statistics increase, and eventually the characteristic peaks of short-lived resonances can be identified above the noise of "background" collisions.
I think people are a bit spoiled by the Higgs leak/announcement/discovery timeline. I'm sure those in the know have known about this discovery for some time but, like you said, it takes some time to gather enough data to be confident (and to qualify as the mathematical standard set for "discovery").
Right: after the bump reaches 3 sigma they have some confidence it deserves attention, and at 4 sigma they are fairly sure, but the rule is you don't publish until you have enough data for a 5 sigma result, which just takes a lot more data and a lot longer.
"Sigma" here refers to standard deviations from the Gaussian normal mean. Zero means completely random. In psychology they publish at 2 sigma (95%), which means roughly 20:1 odds against a spurious result, and they publish a lot of spurious results because you can generate an unlimited number of hypotheses. In physics, things are considered more deterministic, and an experiment doesn't need to recruit undergrads as data points, so you run your LHC for a few more months and avoid wasting people's attention.
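For concreteness, here is the arithmetic behind those thresholds, computed as one-sided tail probabilities of a standard normal (the usual particle-physics convention; the psychology "95%" figure corresponds to the two-sided version of 2 sigma):

```python
import math

def one_sided_p(sigma):
    """Probability of a standard-normal fluctuation at least `sigma` above the mean."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

for s in (2, 3, 5):
    p = one_sided_p(s)
    print(f"{s} sigma: p = {p:.2e} (about 1 in {1 / p:,.0f})")
```

The jump from 3 sigma (roughly 1 in 700) to the 5 sigma discovery standard (roughly 1 in 3.5 million) is why so much extra running time is needed.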
If you keep testing as the data accumulates, the chance of at some point falsely rejecting the null hypothesis increases. Put more simply, finding something that differs "significantly" from some distribution becomes easier as you gather more data. Imagine having only 3 psychology students in a study: the effect size has to be huge for the test to call it significant.
However, the approach taken by CERN is of course right. They find a result at a certain significance level and then collect more data to verify the result. As long as there aren’t thousands of simultaneous verifications running, this approach is sound. Obviously, yes, physicists know what they’re doing.
Having said that, please don’t read this comment as me approving of frequentist statistics. Bayesian methods or cross-validation are way easier to interpret where possible.
Thanks for trying to explain. It's all still largely beyond me TBH.
But more idiot's questions if you have any thoughts....
My understanding was that particle accelerators were being used to try and deconstruct matter, to do, for want of a better word, "fission" by smashing things together and seeing what smaller bits came out - by analogy to mass spectrometry.

What seems to be going on now is that we're trying to make new particles. Have we switched to a sort of "fusion" - to see if smashing things together will get them to stick in bigger configurations?

Have all the most fundamental bits (quarks?) been found now? Can we prove that those are irreducible?
We will never, in the lifetime of anybody who guesses we ever existed, be able to build an accelerator powerful enough to check whether it is right about gravity. So they potter about, trying to show that this or that family of variations (among the 10^500 imagined) does or doesn't contradict the details of the Standard Model we have the most confidence in.
it would require being able to generate a high enough energy beam
But using current accelerator technology it would require an accelerator many times the size of the earth, _many_.
I used to work at a particle accelerator, part time, when I was in college. Fun fact: I once confirmed Einstein's photoelectric effect using a high-energy x-ray beam, a copper target, and high voltage.
In our most basic theory, QFT, no particle is fundamental, because they're actually "fields" - but those fields are fundamental, so that doesn't answer your question. We just think they're fundamental because we don't have enough evidence to construct anything better than the Standard Model.
It's not possible to do better than that, though, because you can't prove a negative statement like "there are no more fundamental particles". Even if we understood the laws of physics completely, they could always change on us. It's all up to the guy who owns the universe simulator.
But doesn't fundamentality in fields imply something quite different than fundamentality in parts (particles)?
I sense that the next step in physics and ontology can only happen when we have created a new linguistic approach to capture the 'fundamental' idea here.
If information is physically fundamental, the fundamental "particle" would be some sort of bit. Planck's constant could be that bit.
All other particles would be derivative, their behavior caused by some base rules.
Physics will only be complete when fully explained in terms of information, regardless of the physical reality of information. The two aspects of explanation are 1. the rules and 2. what are the bits? Perhaps both those things are one.
Perhaps bits are not fundamental and quaternary bits are, but that would still implicate information as fundamental.
> We have no proof (and it's probably impossible to do so), that anything we've found is fundamental.
this entire discussion is fully outside of my knowledge wheelhouse but why should we believe that the universe is anything less than infinitely fractal at the micro scale? like you said, how would we even know if something is fundamental?
sheer intuition based upon the adage "you don't know what you don't know", repeated incorrect assumptions that we've finally discovered fundamental building blocks of reality, and lack of capacity for imagination (sorry—I tried!) for what the discovery of absolutely positively provably fundamental building blocks of reality could even potentially look like
> repeated incorrect assumptions that we've finally discovered fundamental building blocks of reality
Physics has had the other problem for a while now: they know the current theory is wrong, but they can't find any evidence to disprove it, and it's consuming generations of scientists and particle accelerators trying.
Probably also just time: time to run more experiments, time to improve analysis compute capability, and to analyze new data and re-analyze the data they already have. These experiments yield enormous amounts of data.
They do. The Tcs0 tetraquarks don't have quark-antiquark pairs, however: you can see from the article and figures that the quark content is charm + anti-strange + up + anti-down, and these can't annihilate because the quarks have different flavours. They can "annihilate" via the weak interaction though, which can connect quarks and antiquarks of different flavours. For example, the charm-antistrange part could decay via a W boson to a positron and a neutrino. This is a much slower process, however.
In the pentaquark, charm-anticharm annihilation can and will happen. The timescale for charm-anticharm annihilation is usually slower relative to light and strange hadronic interactions, though, in part because the strength of the strong interaction decreases at higher energies, and the charm quark is more massive, so the relevant energy scale for the decay is higher.
One charm-anticharm resonance, the J/psi(3097), is very long-lived even though the quarks can annihilate. In many theoretical models of these things, it's often treated as a stable particle.
Think of it like taking series of blurry photos of an unknown object. A single photo just looks like a blob, but accumulate enough of them and apply some algorithmic magic and eventually the picture sharpens.
When particles are collided and the result measured, there's probably lots of noise in the data. In a single picture, a single pixel (datapoint) tells you nothing. But capture enough results, and you can begin to filter the noise out, revealing patterns underneath.
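That averaging intuition is easy to demonstrate: the noise on the average shrinks like 1/sqrt(N) while the underlying signal stays put. This is a toy sketch with invented numbers, not real detector data:

```python
import random

random.seed(0)

TRUE_SIGNAL = [0, 0, 1, 4, 1, 0, 0]  # a small "bump", invented for illustration
NOISE_SIGMA = 5.0                    # per-frame noise much larger than the bump

def noisy_frame():
    """One blurry 'photo': the true signal buried in Gaussian noise."""
    return [s + random.gauss(0, NOISE_SIGMA) for s in TRUE_SIGNAL]

def average(n_frames):
    """Average n frames; the noise on the mean shrinks like 1/sqrt(n)."""
    acc = [0.0] * len(TRUE_SIGNAL)
    for _ in range(n_frames):
        for i, v in enumerate(noisy_frame()):
            acc[i] += v
    return [a / n_frames for a in acc]

for n in (1, 100, 10_000):
    print(n, "frames:", [round(v, 1) for v in average(n)])
```

A single frame is pure mush; by 10,000 frames the bump in the middle is unmistakable.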
They came together most likely because they were detected in the same/similar analysis.
There are probably hundreds of analyses performed in parallel on the LHCb data sets, by different sub-groups of the collaboration, looking at different reactions / channels etc. It could also be that they grouped several results together because they are similar. There is always an internal review process, and committee meetings (for the final "go ahead, publish") can impose a granularity on the timing of releases. It could also be that the paper has already been on the arXiv for days, and this is just the accompanying press release.
They would definitely not be publishing results of the power increase on the same day of the increase. These experiments take a lot more time than a day to perform and analyse.
> If so, what's the chance of discovering nothing for years and then, like London buses, three all come along at once?
First, there has been a pretty constant stream of these kinds of discoveries over the last few years. Most of them were even discussed here [1, 2, 3 and more].
Second, what you need to find these rare events are two things: enough statistics and someone actually doing a specialized analysis looking for them.
Enough statistics have been accumulating over the past years, and now someone stepped forward, developed the analysis, tested it on a small-scale dataset, got it approved by the collaboration, and ran it on the large dataset; then chances are you don't just find one but three very similar things.
Disclaimer: I am not part of LHCb and have no insight into the specific analysis of this discovery, just sitting on the same floor as the LHCb guys and knowing the general procedure.
Third, as noted in a sister comment, these quark states are expected from the Standard Model; they are just very rare and very short-lived, so hard to find. "Nothing new" normally refers to "no physics found incompatible with the Standard Model".
These newly discovered particles are composed of multiple quarks and were found (at least up to the measurement precision) to follow the expected rules for how you can combine the elementary particles into composite particles.
LHCb actually discovered a bunch of those in the last couple of years.
But these are expected from the standard model, just rare and thus not experimentally confirmed before.
"Nothing new" for LHC means no new physics. And if these quark states don't happen to be much more frequent or much more rare than predicted, there is no new physics here.
Is it possible, even in principle, that some of these exotic hadrons could be long-lived (let alone stable)?
They're probably interesting to study on their own, but the engineering instinct is to want to build something out of them, or use them as tools, which seems pretty hard if they disintegrate in a quintillionth of a second!
If a particle can decay into a lighter set of particles and still obey all of the conservation principles, it will. The heavier a particle is, the faster it tends to decay and the more "options" it has for decaying. An electron isn't going to do anything because there is nothing lighter than an electron that still carries charge, etc. Something much heavier, like a free neutron, will fall apart into a proton, an electron, and an anti-electron neutrino.
These particles have options galore as to what they can fall apart into being, and so they do, and with great haste.
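As a concrete example of that "can it fall apart and still conserve everything" bookkeeping, here is a minimal Python sketch checking that free-neutron decay (n → p + e⁻ + anti-neutrino) is energetically allowed, using approximate PDG rest masses:

```python
# Rest masses in MeV/c^2 (approximate PDG values)
m_neutron  = 939.565
m_proton   = 938.272
m_electron = 0.511
m_neutrino = 0.0  # effectively massless for this bookkeeping

# Q-value: the energy released by the decay.
# The decay is only allowed if Q > 0, i.e. the products are lighter.
q_value = m_neutron - (m_proton + m_electron + m_neutrino)

print(f"Q = {q_value:.3f} MeV")  # ~0.782 MeV, so a free neutron can decay
```

The small Q-value (under 1 MeV out of nearly 940 MeV of rest mass) is part of why the free neutron is comparatively long-lived; the exotic hadrons discussed here have vastly more energy to shed and vastly more channels to shed it through.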
I know nothing about physics as it is not my domain. So I don't know if what you're saying is true or not. But if it is, I find that notion somewhat poetic.
For the sake of my own edification, I'd like to follow this up with a few seemingly dumb questions if you don't mind:
Is it the case that a given particle is trying to settle into a "lowest energy state" possible? I am not using physics terms here. More like conceptually, are these particles, due to the number of options available to them, decaying into the lightest stable variant allowed by the laws of physics? If that is the case, then could we perhaps find ways to engineer structures within which these particles last a whole lot longer than they should (on a human timescale)? And what is stopping us from doing that? Is it the energy cost associated with such a structure/device, or is there a more fundamental reason we can't do that?
> Is it the case that a given particle is trying to settle into a "lowest energy state" possible?
Not exactly. Energy is conserved during these decays. In fact, energy is conserved during all physical processes, so the "lowest energy state possible" is a little bit of a white lie. What makes it a white lie is that it is a very good approximation to the truth for thermodynamic systems, i.e. systems consisting of large numbers of particles. But for quantum systems, it is no longer a good approximation. In quantum systems, what happens is that you have a wave function that describes all of the possible states a system can be in. The more mass the system contains, the more possible states there are in its wave function, and so the more likely it is to end up in some state other than the one it started out in.
It is even possible for the process of decay to reverse itself, and for the constituent particles to come back together and reconstruct the original, but for that to happen all the constituents have to be brought back together, so as a practical matter this never happens spontaneously in nature. In fact, that is the whole reason for building the LHC -- to make particles (protons) come together and make high-mass systems which then decay in interesting ways.
> are these particles, due to the number of options available to them, decaying into the lightest stable variant allowed by the laws of physics?
Not the lightest stable variant, just to one of the possibilities described by that particle's wave function. These will always be subject to the constraints of conservation laws, so the decay products will always be lighter than the original. But which particular set of possible decay products is actually produced in any given decay event is fundamentally random.
> if that is the case, then could we perhaps find ways to engineer structures within which these particles last for a whole lot longer than they should (on a human timescale)?
No. The wave functions for particles are fixed by nature. They are what give particles their identities. They cannot be engineered. The only thing that we can engineer is the arrangement of particles. Particles are like Lego bricks. You can stick them together in lots of different ways, but you can't change the shape of a given brick. Sometimes quantum Lego bricks fall apart spontaneously, but there is no way to control that.
I thought that in quantum field theory there are, at a fundamental level, no particles. They are excited states of a field. Hence all the possible states: it is not really a particle, which is already in a definite state, but a wave. The interaction of fields … wonder if we reframe the q as CSB we have or find new operators to …
Neutrons don't decay while being part of a stable atom, because the atom has actually less energy than the sum of the constituents -- the difference is the binding energy. Look at deuterium, for example. It has a mass of 2.0141 u. A proton alone is 1.0073 u, and a neutron is 1.0087 u. Deuterium is lighter than the mass of proton + neutron. It's also slightly lighter than two protons, so the neutron cannot decay without external energy input.
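Using the rounded masses quoted above, this bookkeeping can be checked in a couple of lines (a sketch; 1 u ≈ 931.494 MeV/c². Note that these rounded inputs give roughly 1.8 MeV, a bit below the precise deuteron binding energy of 2.22 MeV):

```python
U_TO_MEV = 931.494  # MeV/c^2 per atomic mass unit

m_proton    = 1.0073  # u
m_neutron   = 1.0087  # u
m_deuterium = 2.0141  # u

# Binding energy: how much lighter the bound system is than its parts.
binding = (m_proton + m_neutron - m_deuterium) * U_TO_MEV
print(f"binding energy ~ {binding:.2f} MeV")  # ~1.77 MeV with these rounded inputs

# The neutron inside deuterium cannot decay to make a second proton
# without external energy, because deuterium is lighter than two protons:
print(m_deuterium < 2 * m_proton)  # True
```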
This actually brings me to a physics question I've had for a while and if it is unrelated, please feel free to ignore it as I might be mixing concepts.
Both fusion and fission release energy when they occur. Which seems somewhat weird to me. Is it the cases that the reason a stable atom has less energy than the sum of its parts (as you pointed out) is because it gave off some energy during the fusion process?
Pretty much. To simplify a bit: the most stable nucleus is iron-56; anything lighter can be fused and anything heavier can be split to release energy. Essentially, beyond iron the forces that bind a nucleus together start to lose out to the repulsion between its constituents, which makes heavier atoms more and more unstable.
This is also why stars that start to fuse iron together will start to cool down.
Edit:
If a free neutron is likely to decay, why don't *neutrons* decay while being part of an atom?
(Wouldn't this be an example of a structure that prevents decaying?)
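The iron-56 point above can be illustrated with a few representative binding-energy-per-nucleon values (approximate rounded textbook numbers, my own selection; nickel-62 is actually marginally more bound than iron-56, but iron-56 is the commonly cited peak):

```python
# Approximate binding energy per nucleon, in MeV (rounded textbook values)
binding_per_nucleon = {
    "H-2":   1.1,
    "He-4":  7.1,
    "C-12":  7.7,
    "O-16":  8.0,
    "Fe-56": 8.8,
    "U-238": 7.6,
}

# The curve peaks near iron: fusing nuclei lighter than the peak releases
# energy, and splitting nuclei heavier than the peak releases energy.
most_bound = max(binding_per_nucleon, key=binding_per_nucleon.get)
print(most_bound)  # Fe-56
```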
As another not-physicist who is interested in physics, I've found the articles/explainers at Of Particular Significance very helpful. There are a few on particle decay, and I think that these two provide a longer answer to your questions:
> Is it the case that a given particle is trying to settle into a "lowest energy state" possible?
It's more accurate to think of the energy "spreading out" (remember that mass is a form of energy too, since E=mc^2). The energy can rearrange (subject to conservation laws), between being one massive particle, or several lighter ones (in fact there's a superposition of possibilities, because quantum).
In principle the probability of switching back-and-forth is equal, e.g. the probability of particle A decaying into a B+C pair, is identical to the probability of a B+C collision producing an A. However, most of the directions those light particles can take will result in them flying apart rather than colliding; that spreads out the energy, so it can no longer switch back into the massive particle configuration.
Note that this is essentially the first and second laws of thermodynamics (energy is conserved, and concentrations tend to "spread out" over time)
Sometimes they will have intermediates, which then decay, and then those products decay, and so on. That's quite common. Eventually they just ... fall apart. The more options, the faster. The greater the energy stepdown, the faster, by which I mean "can it release a gamma? Or fall apart into some much smaller things?"
However, it is independent of "nearby" structure, where nearby is any distance larger than the nucleus. So, no, we cannot contain these particles within anything to prevent their decay, it is like trying to build a bouncy castle around a hand grenade in hopes that it won't go off.
Note that there is an apparent delay in decay, from our perspective, when particles are moving very fast, like a relativistic muon lasting longer (although still a very brief period of time by our standards) than expected, simply due to special relativity. But here this also would not help.
Things fall apart, the center cannot hold, and so on.
It is the case that particles always try to settle into the lowest energy, and the more options they have the faster. We may be able to engineer places where they're stable, like in the example from my grandparent of a neutron. They are unstable since their mass is greater than the mass of a proton and an electron combined, but they're stable in all common elements we're used to. So much so that we think of radioactive elements as the exception, but (mostly) all that's happening there (in beta decay) is a neutron decaying.
I'm not an expert, but I'd imagine making a stable situation for a heavier particle is much harder than just making an atom, and the fine-grained control is even harder still.
>>trying to settle into a "lowest energy state" possible?
What if its actually the reverse: Its attempting and succeeding to be the most it can be given the eddy of forces around it - the particle is "becoming" - not "falling apart"
an interesting analogy: if you have a house of cards on a table, it's in a way "trying to settle" into the lowest energy state possible, which is post-collapse with all the cards on the table! But it's also not "trying" to do anything, it's just vibing / vibrating, and whatever happens to it happens to it :)
Depends entirely on the particle. A free neutron has a mean lifetime of around fifteen minutes. For these pentaquark particles, nanoseconds are too long a unit to describe them, by more than ten orders of magnitude.
Some of the heavy elements assembled in colliders are described as decaying so quickly that one side of the nucleus is coming together even as the other side is disintegrating, a sort of brief wave of existence traveling at nearly the speed of light across this thing that has been forced together and wants to fly apart.
An example of something considered to be "slow" is the muon. You could kind of think of it as a heavy electron (though that hand-waves away a lot). It has a mean lifetime of 2.2 μs - which is fairly slow.
Also note that they're not rare and there's a fair bit of neat science behind that too.
> About 10,000 muons reach every square meter of the earth's surface a minute
There's also neat stuff with time dilation and muons ( http://hyperphysics.phy-astr.gsu.edu/hbase/Relativ/muon.html ) - there should be far fewer observed muons at the surface if muons didn't experience time dilation from their relativistic speeds.
> The historical experiment upon which the model muon experiment is based was performed by Rossi and Hall in 1941. They measured the flux of muons at a location on Mt Washington in New Hampshire at about 2000 m altitude and also at the base of the mountain. They found the ratio of the muon flux was 1.4, whereas the ratio should have been about 22 even if the muons were traveling at the speed of light, using the muon half-life of 1.56 microseconds. When the time dilation relationship was applied, the result could be explained if the muons were traveling at 0.994 c.
(note: mean lifetime and half-life are different numbers)
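The Rossi-Hall arithmetic above can be reproduced in a few lines of Python (a sketch using the quoted numbers: a 2000 m descent, a 1.56 μs half-life, and v = 0.994c; with these inputs the no-dilation ratio comes out near 20 rather than the quoted 22, but the dilated ratio matches the measured 1.4):

```python
import math

C = 3.0e8            # m/s, speed of light
height = 2000.0      # m, Mt Washington drop
half_life = 1.56e-6  # s, muon half-life in its own rest frame
v = 0.994 * C

travel_time = height / v                 # lab-frame time of flight, ~6.7 us
gamma = 1 / math.sqrt(1 - (v / C) ** 2)  # Lorentz factor, ~9.1

# The surviving fraction halves once per elapsed half-life; the flux
# ratio (mountain top / base) is the inverse of that fraction.
ratio_naive   = 2 ** (travel_time / half_life)            # no time dilation
ratio_dilated = 2 ** (travel_time / (gamma * half_life))  # proper time elapses

print(f"no dilation:   flux ratio ~ {ratio_naive:.0f}")   # ~20
print(f"with dilation: flux ratio ~ {ratio_dilated:.1f}") # ~1.4, as measured
```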
The thing here is that 2.2 μs is slow, but even with something that is that fast (on a human scale), there's a lot of neat science that can be done with them. They've even made muonic atoms (where the electron is replaced by a muon) https://en.wikipedia.org/wiki/Exotic_atom ... and that leads to possibilities on lowering fusion temperature ( https://en.wikipedia.org/wiki/Muon-catalyzed_fusion ) because the muon is much closer to the nucleus in its ground state.
These particles are hadrons held together by the strong force. They all contain charm or strange quarks, which are relatively heavy, and via the weak interaction they can decay to hadrons containing light quarks. Because of the large mass difference between the charm and strange quarks and the light quarks, it's unlikely that any of these will be stable.
Lone neutrons are unstable, they decay in about 15 minutes to lighter particles. They have only a few MeV mass difference to the stable final state (the proton + electron + anti-neutrino).
For comparison, these particles are 2000+ MeV above their ground states so they decay pretty quickly.
Since neutrons are stable in a nucleus, is it possible that there is any atom-like structure that is stable and contains at least one 'abnormal' particle, like a boson, or any 'large' particle made of quarks that is not a proton or neutron?
Typically no, because higher energy collisions exist naturally with cosmic particles, so we would have probably observed some of these stable particles by now. But in practice they could be rare and hard to detect.
I'm not even sure if "typically no" is justified here. We don't know what dark matter is, but we do know that it appears that there's a lot of it out there. You could definitely do some math and maybe if you were really clever about it relate the quantity of it that exists to exclude some particular energy range of interactions, but I don't think that's been done.
> There exists no formal definition of a WIMP, but broadly, a WIMP is a new elementary particle which interacts via gravity and any other force (or forces), potentially not part of the Standard Model itself, which is as weak as or weaker than the weak nuclear force, but also non-vanishing in its strength.
That's not regular matter.
Its MACHOs that are made up of regular matter (well, brown dwarfs and black holes).
There are also theories that put an undetected form of neutrino as dark matter which would be a bit more regular.
No, a WIMP is the theoretical dark-matter equivalent of a particle.
A MACHO is a low-energy star or whatever that would explain the apparent presence of dark matter without actually requiring anything exotic like WIMPs. The idea is that these objects are (relatively) massive, numerous, and so low-energy that they are hard to detect and their combined mass would theoretically explain the effects we currently attribute to dark matter.
Or in other words, a WIMP would be like claiming that your fridge is disintegrating your cheese, and a MACHO would be the kid raiding the fridge for cheese at midnight when you're asleep.
The person who started off this comment thread made a sloppy reference to "normal matter (i.e. quarks)". That statement should be read in good faith as meaning the existing known elementary particles as "normal" matter. That implies that some particle which only interacts via gravity and some unknown force is not included in "normal" matter.
To twist that by claiming such a particle would still be viewed as matter just like all the rest of the matter that goes into the stress-energy tensor is where the pedantry started in this thread. The original statement is pretty clear in its intent. The pedantic reading that followed that comment results in "normal" matter just being all matter by definition and hence "normal" is redundant since there can't be abnormal matter. That clearly isn't what the first comment intended since they actually meant something by "normal".
It'd be cool if they could be created and combined in specific combinations and quantities to create new particles that wouldn't exist due to natural formation. Sounds like a good sci-fi mechanic, creating new particles without the Higgs Boson to facilitate FTL or at least some sort of anti-gravity.
Note that the Higgs Boson has no part in the mass (and presumably gravitational interaction) of elementary fermions like quarks or electrons, and even less to do with the mass of hadrons (whose mass is the mass of quarks + the mass of the extraordinary amount of energy sticking the quarks together, which dwarfs the mass of the quarks by ~100:1). Even if all elementary particles were massless, protons, nuclei, and atoms would still have most of the mass they have today (though of course many other interactions would be very different indeed).
What the Higgs mechanism does account for is the mass of the massive gauge bosons, the W and Z bosons.
Edit: there is an excellent in-depth (but math-light) explanation about the Higgs mechanism by Leonard Susskind. I highly recommend it if you are interested, it lasts about 1h (plus some Q&A) and is extremely approachable, while being presented by an established authority in the field.
Unless I'm wrong (I'm not a physicist), interaction with the Higgs field does give a non-zero amount of mass to quarks, but the vast majority of the mass of a proton (~99%) is due to the energy in the gluon field.
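That ~99% figure is easy to sanity-check with approximate PDG current-quark masses (these masses are scheme-dependent, so treat this as an order-of-magnitude sketch):

```python
# Approximate current quark masses in MeV/c^2 (rounded PDG values)
m_up, m_down = 2.2, 4.7
m_proton = 938.3  # MeV/c^2

# A proton is uud; sum the bare quark masses.
quark_mass_sum = 2 * m_up + m_down  # ~9.1 MeV

fraction_from_quarks = quark_mass_sum / m_proton
print(f"{fraction_from_quarks:.1%} of the proton mass")  # ~1.0%
```

The remaining ~99% is binding energy, consistent with the ~100:1 figure mentioned upthread.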
On the surface they might not seem useful, but these discoveries develop sub-atomic models, which help predict atomic, and in turn molecular, models, which helps materials research. There are countless materials we haven't discovered/invented yet. We don't know how far we can push it.
I'm not aware of any predictive power the standard model has over nuclear physics and atomic physics directly. In principle, yes the standard model should be able to predict things at those energy scales, in practice no one has a clue how.
To use a more relatable analogy, it's a bit like using quantum mechanics to build a skyscraper. In principle it should be possible; in practice it is incalculable. Newtonian physics does the job fine in that scenario.
You use the quantum mechanics to design the graphene conformation that yields the best loads, and then infuse the concrete in your skyscraper with the graphene. Everything needs abstraction layers, otherwise of course the complexity becomes mind-boggling.
As I understand it, this is not really how most practical materials research is done today. Bridging the "scale gap" between nano-scale research, micro-scale research, etc. up to something the size of a foundation for a skyscraper is very hard (read: almost impossible) right now. Those nano-scale research areas are pretty siloed and only in extreme cases like transistor manufacturing is there any meaningful overlap with production use cases.
In general, materials researchers for something like concrete are going to be better off exploring the (very large!) high dimensional space of possible formulations of existing concrete ingredients and pushing out the pareto frontier for the best possible concrete that way. Also, one probably shouldn't be using bleeding-edge concrete tech for a skyscraper foundation - in a safety critical application like that you just build it 1.2x bigger than you need and it'll still be much cheaper and safer than a process like what you just described.
Materials research is super interesting, though, even if it's not building up from quantum-particle scale research. And atomic / molecular features of inputs can yield interesting material candidates.
Source: I work (as a software dev, not a materials researcher) at Citrine Informatics, selling software to assist companies who are trying to do practical materials things like make better concrete.
Material sciences, condensed matter physics, chemistry and etc work up from the abstraction layer of "atoms". It's a quite well defined and relevant layer. So, until that work brings some different configuration¹ for atoms, they will have no impact at all.
1 - It doesn't need to be as new elements, but even for the resonance between the nucleus and electrosphere they didn't create anything new, and only things affecting the electrosphere matter. (Even then, they didn't create anything new on a nucleus either.)
If you wanted to build a skyscraper taking into account quantum mechanics then maybe you are hoping to induce a scaled quantum mechanic effect? Perhaps in the ultra-modern evolution of buildings the structure of the building itself will have a communication aspect associated with its natural largess and to accomplish this quantum mechanics is used to derive the appropriate building structure. Just because it sounds far-fetched or hard doesn't mean it won't be humdrum engineering decades or a century in the future. It all starts somewhere.
One of the few molecular level effects that depends on the virtual particles that are important in the standard model is the Lamb shift https://en.wikipedia.org/wiki/Lamb_shift
In this case the virtual particle is a virtual photon, not a virtual weird particle, so it's just scratching the standard model.
You reply in jest, but I'm not asking to be funny with a movie reference.
We see it day in and day out where science has developed something without slowing down to research the effects other than the one they are scoped in on while making what they are making. I'm specifically thinking of the new chemical sciences that have brought out some formulas that are great at a specific thing, but are absolutely tragic to nature in so many more ways. The science shows these chemicals to be tragically toxic, yet that info gets shoved in a drawer so inventors can make money.
Great, we made something, but we should be able to say thanks but no thanks. Let's put that in the column of good idea, good science/tech to achieve, but best left alone. Take that learning and try to achieve the same thing in a different manner so that it doesn't kill everything else.
>Shelving discoveries based on the perceived effect they (could) have (who would even evaluate that?) is a slippery slope if I ever seen one.
This is precisely what should happen though. We made ICE powered cars that used leaded gasoline because reasons, but the results of that were horrible for everything except the ICE. We shelved that tech because it was just bad.
We've shelved the widespread use of lead in paint. We've shelved the widespread use of asbestos in lots of things. There's nothing wrong with realizing the juice isn't worth the squeeze. We know that it is something that happens. Sometimes we make something that comes with a heavy cost. Obviously we don't have a way to know that until it exists. Then again, we should be able to start recognizing that particular chemical chains result in bad things, so we should be super careful with a new thing when it looks like something we've seen before. We can do this with viruses and whatnot. Why not with chemistry?
> There's nothing wrong with realizing the juice isn't worth the squeeze.
The LHC employs a lot of people working on smart things. CERN gave rise to the world wide web and there are many other innovations in computing, construction, and theory that come from the work being done there.
> The Large Hadron Collider took about a decade to construct, for a total cost of about $4.75 billion. [1]
> Since the opening of Mercedes-Benz Stadium in 2017, the Falcons organization has publicly pegged the cost of the building at $1.5 billion [2]
It's the same order of magnitude of cost as a sports stadium. It's a tiny slice of the worldwide economy.
We don't know where the key discoveries in "theory state space" are, so we continue to search. Finding the right evidence or surprises could lead to rapid changes in how we think and view the universe.
I'm sure some medieval people must have found scientific tinkerers wasteful as well.
Diversification of investment is good. It's not like all research dollars are going to high energy physics.
What does the price of a stadium or the LHC have to do with the price of tea in China? You've taken the conversation in a direction nobody else was discussing.
It was well known that leaded gasoline was bad at the time it was invented; we kind of just ignored the warnings.
The inventor of leaded gasoline (Thomas Midgley) also invented CFCs, but at least we didn't already know those were bad for the ozone layer at the time.
> The science shows these chemicals to be tragically toxic
I doubt that the science can show a compound to be tragically toxic any more than it could show a compound to be hilariously toxic, frightenly toxic or delightfully toxic.
Apart from an observer, who is typically human (though sometimes in our mind an anthropomorphized animal or superhuman deity) I'm not sure anything in nature can be tragic. It just is. No one mourns the trilobites.
You say that, but it should be part of creating something new. It should be studied to see what negative effects it has. We have enough collective knowledge to know that even when things are created to do good, some negative things sometimes occur. It's not beyond reasonable to have the new thing tested for these negative reactions as well.
Closer than business as usual. The way out of the current "long stasis" [1] is to find a new elementary particle, or an experiment that can't be explained with the current elementary particles and can be refined to discover a new elementary particle.
They discovered three new composite particles. There are hundreds of composite particles, so it's somewhat business as usual. Anyway, most composite particles have 2 or 3 quarks, but the new particles have 4 or 5 quarks. So they are weird new composite particles.
Making calculations of particles made of a few quarks is very difficult, borderline impossible, so it's interesting to find new particles and verify that the current approximations for particles made of a few quarks are good enough or fix them.
Also, the approximations for particles made of a few quarks use virtual particles that appear and disappear. And some of these virtual particles may be an unknown new particle. So if the calculation is too wrong, it may be an indirect way to discover a new elementary particle and escape the "long stasis". But I'd not be too optimistic about a groundbreaking discovery.
[1] I don't think it's a problem yet. The current "long stasis" is overrated IMHO.
What makes these kind of particles interesting and "exotic" is that they are not the kind of particles the Standard Model was originally developed to describe. Those particles, mesons and baryons, consist of two and three quarks, respectively, with some quantum numbers that must obey certain rules for the particles to exist, and we have found that (almost?) all of the two and three quark combinations allowed by the rules are in fact observable as particles in experiments.
But those rules for the quantum numbers can also be fulfilled with certain combinations of four or five quarks, and there is nothing in the Standard Model that either forbids or requires these combinations to exist as real particles. So it was new information when the first resonances that could be interpreted as those kind of particles were discovered and it is interesting that there are more of these. But it is not unexpected, either, the earliest paper on pentaquarks cited on the wikipedia page is from 1987.
So it is indeed close to business as usual. It is interesting, and new, but it is still filling out the corners of the Standard Model.
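One of the simplest of the quantum-number rules mentioned above is color neutrality: a hadron must combine quarks and antiquarks so that the net quark number is a multiple of three. Here is a toy check of just that rule (my own simplification; the rule is necessary but far from sufficient, and real spectroscopy involves much more):

```python
def can_be_color_neutral(n_quarks: int, n_antiquarks: int) -> bool:
    """Necessary (not sufficient) counting rule for a color-singlet hadron."""
    total = n_quarks + n_antiquarks
    return total > 0 and (n_quarks - n_antiquarks) % 3 == 0

print(can_be_color_neutral(1, 1))  # True  - meson (q qbar)
print(can_be_color_neutral(3, 0))  # True  - baryon (qqq)
print(can_be_color_neutral(2, 2))  # True  - tetraquark
print(can_be_color_neutral(4, 1))  # True  - pentaquark
print(can_be_color_neutral(2, 0))  # False - no bare-diquark hadron
```

This is why tetraquarks and pentaquarks were always "allowed" on paper: they pass the same counting rules as ordinary mesons and baryons, even though nothing required them to show up as actual resonances.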
I’m no particle physicist, but this doesn’t look like anything too fundamental — no new elementary particles, just some new and interesting combinations of existing elementary particles (in this case, quarks). It might have lots of further relevance, but it might not just as easily.
Discovers? Or invents? I've only studied particle physics at undergrad level but strikes me that these tetraquarks and pentaquarks could be combinations never created by any [other] natural process.
By my understanding, the LHC isn't doing anything different from what's happening in the upper atmosphere every microsecond when cosmic rays hit the Earth; except that the LHC is at much lower energy than some of those collisions.
Yep, there's a class of thing called UHECR (Ultra-High Energy Cosmic Rays). We still have barely a clue what generates them, but they hit our atmosphere with roughly ten million times the energy of what LHC can muster.
Could it simply be far-travelled gamma-ray bursts? Not sure how that works, or whether tiny bits of them headed towards Earth can survive very far without far more likely extinction events. Just trying to think of extremely energetic sources…
Of course, we aren’t entirely sure of what GRB’s come from either :D
Isn’t it more accurate to say we don’t know their primary sources? It’s extremely likely that they are generated in stellar processes and black hole ejecta, no?
Kind of... maybe... there are some interesting problems with various sources.
First, we're not sure about the process that generates them. Saying "they came from an active galactic nucleus" is OK... but how did they get accelerated to such energies?
Part of the problem is that we're not entirely sure what they're made of. Most theories have been working on the "they're protons" assumption, but other approaches with having them be heavier nuclei means that they don't need to travel as fast to have the same amount of energy (which also changes the equation for the GZK limit as that applies to protons).
Thanks, this makes sense. My underlying assumption was that a typical star has magnetic acceleration paths which have many orders of magnitude more energy than the LHC ("many" intentionally used ambiguously as I have not done the math).
I suppose given the energies involved, we would need to observationally ascertain where in the sky the cosmic rays come from in order to put bounds on how they were made and what they are made of.
Do you know of any efforts to observe cosmic ray sources or build a cosmic ray telescope?
> ... we would need to observationally ascertain where in the sky the cosmic rays come from in order to put bounds on how they were made and what they are made of.
This is part of the challenge - the map of where they arrive hints at some hot spots ( https://skyandtelescope.org/astronomy-news/cosmic-rays-hint-... ) but as these are charged particles (not light), the path that they follow isn't necessarily a "draw a straight line back to the source"
> Do you know of any efforts to observe cosmic ray sources or build a cosmic ray telescope?
We don't directly observe the cosmic rays, but rather the cascade of particles that they make as they crash through the atmosphere.
> But since these high energy particles have an estimated arrival rate of just 1 per km2 per century, the Auger Observatory has created a detection area of 3,000 km2 (1,200 sq mi)—the size of Rhode Island, or Luxembourg—in order to record a large number of these events. It is located in the western Mendoza Province, Argentina, near the Andes.
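The quoted rate and area make the Auger design trade-off easy to verify in one line:

```python
# ~1 ultra-high-energy event per square kilometre per century
rate_per_km2_per_year = 1 / 100.0
area_km2 = 3000.0  # Pierre Auger Observatory detection area

events_per_year = rate_per_km2_per_year * area_km2
print(events_per_year)  # 30.0 expected events per year over the whole array
```

A single square-kilometre detector would wait decades per event; spreading 3,000 km² of detectors yields a usable few dozen events per year.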
I mean, saying something in the universe is generated in a stellar process is borderline tautological, no? We are all stardust and we derive all our energy from the sun, after all.
I have to keep reminding myself of this! We’re here down on Earth looking for ways to push science further and reconcile the quantum world with the Standard Model, while the atmosphere may already produce things like gravitons or some exotic version of supersymmetry that demands higher energies than expected.
You joke, but before powerful particle accelerators, physicists really did just lift detectors higher into the atmosphere—either on planes or by carrying them up very tall mountains.
I know of some high altitude balloon and satellite experiments as well. I guess size and weight are a limiting factor to what one can detect with those.
It's not exactly the same - the energy of cosmic rays is higher, but the residual momentum is a problem. Most of the available energy is needed to conserve momentum in the final products, unlike the LHC, which collides two particles moving in opposite directions, leaving all the kinetic energy available.
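That difference is quantified by the centre-of-mass energy √s: for two equal beams colliding head-on, √s = 2·E_beam, while for a cosmic ray of energy E hitting a proton at rest, √s ≈ √(2·E·m_p·c²). A sketch (assuming the LHC Run 3 beam energy of 6.8 TeV per beam):

```python
import math

m_p = 0.938      # GeV, proton rest mass-energy
e_beam = 6800.0  # GeV per beam (LHC Run 3, assumed)

# Head-on collider: essentially all the energy is available for new particles.
sqrt_s_collider = 2 * e_beam  # 13600 GeV = 13.6 TeV

# Fixed target: most of the energy goes into moving the products forward,
# so sqrt(s) grows only with the square root of the cosmic-ray energy.
def sqrt_s_fixed_target(e_cosmic_gev: float) -> float:
    return math.sqrt(2 * e_cosmic_gev * m_p)

# Cosmic-ray energy needed to match the LHC's centre-of-mass energy:
e_needed = sqrt_s_collider ** 2 / (2 * m_p)
print(f"{e_needed:.2e} GeV")  # ~1e8 GeV, i.e. ~1e17 eV
```

So a cosmic-ray proton needs roughly 10^17 eV to match a 13.6 TeV collider collision, which is why only the rarest cosmic rays beat the LHC.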
There are cosmic rays of almost arbitrary energy, collisions like the ones at LHC are happening all the time in the universe. So surely these particles will have been created before somewhere else.
To quote Alan Watts, "There is no end to the minuteness that you can unveil through physical investigation. For the simple reason that the investigation itself is what is chopping things into tiny little pieces. And the sharper you can sharpen your knife, the finer you can cut it. And the knife of the intellect is very sharp indeed. And with the sophisticated instruments that we can now make, there’s probably no limit to it."
It's almost as if we're looking at a continuous function that can generate an infinite number of discrete segments.
Not surprising you're downvoted on HN, but your comment is the only one that makes sense in this entire thread. I wonder at what point our collective consciousness will wake up to the fact that we've completely hit a wall in our understanding of the universe, and that spending billions detecting more particles isn't magically going to explain reality.
I think you're both quite right that we're not likely to find much in this subdivision. It seems to me that the people who want to detect more particles are not driven by a desire to explain reality itself. No, I think the research is done in the hopes of cataloguing particles that could be useful, as other comments have said.
Is there a way to even begin to assess the likelihood that research on esoteric subatomic particles is going to yield useful results for energy, health care, or transportation? Or are we just digging away ad nauseam with zero idea whether the next rock contains some precious ore?
As with all physics, I think we ultimately hit the point of diminishing returns before we need to introduce a new category of thought to explore. That was supposed to be quantum and/or particle physics, but neither has taken us anywhere new and we no longer have new ground to explore.
Our fundamental understanding of the universe has barely advanced over the past half century, and progress continues to slow with every anticlimactic LHC collision.
The comment is incorrect because it confuses "each quantum field is apparently continuous" with "the number of quantum fields is a real number". That's two different dimensions.
> The eventual recognition of the muon as a simple "heavy electron", with no role at all in the nuclear interaction, seemed so incongruous and surprising at the time, that Nobel laureate I. I. Rabi famously quipped, "Who ordered that?"
It may need clarifying that "exotic hadron" simply and specifically means "hadron with more than three quarks." What's being reported is the discovery of new particles belonging to a family that already has several known members.
So if atoms are composed of electrons, protons, and neutrons, and protons in turn are composed of smaller particles, can we potentially drill down a bit deeper into these hadrons? Might they be composed of further sub-particles as yet unknown to science?
As far as we know, both leptons (electrons, positrons, muons, taus) and quarks are fundamental, i.e. not composed of other particles (also, they are point-like, i.e. they have no extent). That's not true for protons, which are made of quarks and gluons. (It's a little bit complicated because of vacuum polarization, but if you count a certain way, you'll find a proton is composed of two up quarks and one down quark.) Protons also have a measurable size.
These new particles are also made from quarks (four for the tetraquark and five for the pentaquark), and they also have a size.
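As a toy illustration of how the valence-quark content fixes a hadron's electric charge, here's a small sketch. The quark lists below are generic examples (a doubly-charmed tetraquark, a charmonium-like pentaquark), not necessarily the specific states reported in the article; an uppercase letter marks an antiquark.

```python
from fractions import Fraction

# Valence-quark electric charges in units of e (antiquarks flip the sign).
CHARGE = {"u": Fraction(2, 3), "d": Fraction(-1, 3),
          "s": Fraction(-1, 3), "c": Fraction(2, 3),
          "b": Fraction(-1, 3), "t": Fraction(2, 3)}

def charge(quarks):
    """Total electric charge of a hadron given its valence quarks.
    An uppercase letter denotes the corresponding antiquark."""
    return sum(-CHARGE[q.lower()] if q.isupper() else CHARGE[q]
               for q in quarks)

print(charge("uud"))    # proton: 1
print(charge("udd"))    # neutron: 0
print(charge("ccUD"))   # example doubly-charmed tetraquark: 1
print(charge("uudcC"))  # example charmonium pentaquark: 1
```

The fractional quark charges always sum to an integer for any allowed quark-antiquark combination, which is part of why these states are recognizably hadrons rather than something new.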
Doesn't have to be free to be studied. We can do deep inelastic scattering to measure the quark shape, the same way we can do quasi-elastic scattering to study the neutron shape. SM certainly assumes leptons and quarks to be point-like.
(of course, in general, we can not prove any theory, only disprove.)
Deep inelastic scattering of the nucleus did not find quarks. It only found nucleons.
The same thing here: it's not known whether quarks have an internal structure. And unless your probe's wavelength is smaller than the size of the object being measured, you certainly can't tell whether it's point-like.
No, that's quasi-elastic scattering.
DIS was done, for example, at HERA, an electron-proton collider. SLAC did it in 1968 and essentially discovered the quarks.
Of course there is always a measurement limit. Current limits for the size are "smaller than 1e-19m". That's why I wrote "as far as we know", as we have no evidence that they are not point-like, and we have rather strong limits.
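For scale, the uncertainty relation gives a rough conversion between the length scale you want to resolve and the probe energy you need, E ~ ħc/L. Plugging in the 1e-19 m limit mentioned above (an order-of-magnitude sketch, not a precise statement about any particular experiment):

```python
# Order-of-magnitude probe energy needed to resolve a length scale L,
# from the uncertainty relation E ~ hbar*c / L.
HBAR_C_MEV_FM = 197.327  # hbar*c in MeV * femtometres

def probe_energy_mev(length_fm):
    """Rough probe energy (MeV) needed to resolve structure at length_fm."""
    return HBAR_C_MEV_FM / length_fm

# Current quark-size limit ~1e-19 m = 1e-4 fm:
print(probe_energy_mev(1e-4))  # ~2e6 MeV, i.e. about 2 TeV
```

That's why pushing the size limit down by another order of magnitude requires tens of TeV of momentum transfer, right at the edge of what any machine can deliver.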
The color charge of quarks hints at non-trivial internal structure, and given the number of colors, quarks probably consist of three somethings, bound together in a complex, dynamically stable motion.
Alright, at high energy, the LHC manages to observe some particle configurations never seen before.
Anything to "milk" something interesting from those?
Given that all of these exotic particles that are apparently discovered are extremely short-lived (at the limits of human technology to even register them) is it possible that scientists are radically misinterpreting the meaning of these experiments and finding particles where there's some other force or phenomena being observed? I'm very skeptical that it's apparently just smaller and smaller particles all of the way down, and every time they get bigger and badder technology to collide atoms, they tell us that they coincidentally discover new particles. What if everything here about reality is being misinterpreted?
Here's an analogy that might not make sense to everybody, but to me this feels a bit like the famous meme from the Chernobyl series: "3.6 Roentgen. Not great, not terrible," where the real answer about the radiation exposure was radically different. Some of the people tasked with coming up with that 3.6 answer might not have had bad intent, but they were at the limits of their technology to provide an answer and radically misinterpreted what they were seeing partially because of that.
Currently the best known theory of physics at small scales is a quantum field theory known as the Standard Model.
"Particles" are phenomena which appear when fields are quantized. They aren't really balls of stuff. E.g. a photon is a quantum of a wave in the electromagnetic field.
These exotic particles are simply a confirmation of predictions made by the Standard Model, they are not surprising, it's just the first time there was enough data to test this particular prediction of the theory.
> I'm very skeptical that it's apparently just smaller and smaller particles all of the way down
The Standard Model was formulated in the 1970s, and no fundamentally new phenomena have been discovered since. So it's been going strong for ~50 years.
It's known to be incomplete, as it does not explain gravity, and physicists hope that better theories will be found in the future. But so far nobody has been able to formulate a theory that makes better predictions than the SM.
> What if everything here about reality is being misinterpreted?
The topic of physics is to build models which predict observed phenomena. Knowing what reality _is_ is a topic of philosophy and religion.
These aren't really "discoveries". These particles were already known to exist (or, to be precise, whose existence was predicted by the Standard Model, just like the Higgs boson). This is just the first time they have actually been observed. The headline is misleading, but the first sentence in the article gets it right:
"The international LHCb collaboration at the Large Hadron Collider (LHC) has observed three never-before-seen particles."
> These aren't really "discoveries". These particles were already known to exist (or, to be precise, whose existence was predicted by the Standard Model, just like the Higgs boson). This is just the first time they have actually been observed.
But doesn't that make it a discovery? Sure, a discovery of something already predicted by the Standard Model, but unless it has been observed, we just don't know for sure if it actually exists.
The Standard Model just has been very successful at predicting stuff :)
To my way of thinking one of the defining characteristics of a discovery is that it is unexpected, so a confirmation of a theoretical prediction doesn't count.
I think I would be very happy to be the first person to observe a new particle; its discovery is enough for me :) It does not need to upend all of known physics or be unexpected for us to call it a discovery.
I get what you're saying, but it's not like someone looked through a microscope and actually saw one of these things for the first time. It's computers crunching trillions of data points and spitting out a result. So it's not clear who "the first person to observe this" actually was.
Your analogy is poor because the "3.6 Roentgen" was a story the politicians were telling each other, while the actual scientists knew they were measuring the limit of their equipment. Perhaps it's uncharitable, but I feel like I could restate your question as: "I know nothing about particle physics - is it possible the entire field is just cranks deluding themselves?"
Would not be the first time humans deluded themselves with big ideas the masses could not falsify.
It’s as likely as anything else given how common it is for even STEM educated humans to be religious and read into their tea leaves.
There is no center of the universe or higher power; why should a handful of physicists be unequivocally “correct”. Deference to their figurative identity is all the untrained can conjure. Doesn’t mean we need to shower them in praise and empower their agency at the expense of one’s own.
Even if the science is right they’re still just one of seven billion.
The new particles in this case aren't "smaller" than the ones we already know about. They are short-lived because they are heavier. They quickly decay into something simpler.
Extraordinary claims need extraordinary evidence. This article doesn't really describe anything that unexpected. It's not that hard to believe.
Are these really new particles? It appears to me these are just known quarks in different combinations. I ask because this hadron collider machine needs to show amazing results, and this announcement seems kind of forced.
Space marketing is tough as it is with space station updates, rovers, Mars etc from NASA et al, but this is just a whole other struggle. Is the general public supposed to care about the LHC starting up again etc? Higgs Boson was a total 'meh' moment ten years ago after a bit of hype. I suppose particle physics etc is just too obscure.
So.. maybe it's a good time to pause the collider and see if we can use this discovery for any practical purpose. If not, that money could go towards clean energy research or lab grown meat research, anti cancer or anti aging research... something that might do people some good.
I understand the sentiment, but can't you imagine that advances in our understanding of fundamental physics could lead to advances in some of the areas that you mention? For example, new particles -> better models of chemistry -> better cancer drugs; or new particles -> better fusion models -> cleaner energy.
I guess so, but there are not 7.8B scientists that can develop the next battery or solar breakthrough, which is actually what is needed in a very big way right now.
whatever they did 10 years ago opened up pandora’s box . can we please return to the pre kony 2012 timeline ? Ever since Harambe died it’s been a total apocalypse
can someone explain where CERN gets the $50B+ to build and operate the LHC? its discoveries have dubious theoretical value and practically zero commercial application. there must be a hidden weapons / military application to justify a massive money hole
Since the first "high energy" particle accelerator in 1932, we have used particle accelerators to smash and study subatomic particles. First we found a whole zoo of them, but then in 1964 Gell-Mann's quark model explained how all those subatomic particles are not elementary particles, but made of various combinations of quarks.
Since 1975 we've had the Standard Model of particle physics, and with the 2012 experimental discovery of the Higgs boson, we've now found all the particles in the Standard Model. But the road up to 2012 was: keep building bigger accelerators, keep adding more energy, keep finding new particles. Why stop now?
Some skeptics say that we may now have found them all. And we don't have a good theory that predicts new particles, so maybe we won't find new elementary particles no matter how long we keep going. Maybe we should pause and reconsider, and work on new theories.
But we also have theories: supersymmetry and string theory. Supersymmetry predicts the existence of superpartners for all 17 elementary particles. Maybe their discovery is just around the corner, if we keep going? Or maybe supersymmetry is wrong, and the superpartners don't exist at all.
Timeline of discoveries of the elementary particles:
1800-1895 photon
1897 electron
1937 muon
1956 electron neutrino
1962 muon neutrino
1969 down quark
1969 strange quark
1969 up quark
1974 charm quark
1975 tau
1977 bottom quark
1979 gluon
1983 W boson
1983 Z boson
1995 top quark
2000 tau neutrino
2012 Higgs boson
It's the continuation of the same line of work that long ago gave us nuclear power, nuclear bombs, nuclear medicine (MRI and PET scans, cancer treatments). We don't know if this line of work has / will have practical value anymore. But we have been at this for such a long time, there is momentum and it's maybe difficult to stop and reorient.
Or rather, planning useful projects only gets you things you already knew to ask for. Doing basic research is a way to create things you didn't know to ask for.