Did the Particle Go Through the Two Slits, or Did the Wave Function?
by Tomte
According to modern QFT, there are no particles except as an approximation. There are no fields except as mathematical formalisms. There's no locality. There is instead some kind of interaction of graph nodes, representing quantum interactions, via "entanglement" and "decoherence".
In this model, there are no "split particle" paradoxes, because there are no entities that resemble the behavior of macroscopic bodies, with our intuitions about them.
Imagine a Fortran program, with some neat index-based FOR loops, and some per-element computations on a bunch of big arrays. When you look at its compiled form, you notice that the neat loops are now something weird, produced by automatic vectorization. If you try to find out how it runs, you notice that the CPU not only has several cores that run parts of the loop in parallel, but the very instructions in one core run out of order, while still preserving the data dependency invariants.
"But did the computation of X(I) run before or after the computation of X(I+1)?!", you ask in desperation. You cannot tell. It depends. The result is correct though, your program has no bugs and computes what it should. It's counter-intuitive, but the underlying hardware reality is counter-intuitive. It's not illogical or paradoxical though.
This is incorrect. There are particles. They are excitations in the field.
There still is the 'split particle paradox' because QFT does not solve the measurement problem.
The 'some kind of interaction of graph nodes', by which I am guessing you mean Feynman diagrams, is not of a fundamental nature. Feynman diagrams are an approximation known as 'perturbation theory'.
I think what they must be referring to is the fact that particles are only rigorously defined in the free theory. When coupling is introduced, how the free theory relates to the coupled theory depends on heuristic/formal assumptions.
We're leaving my area of understanding, but I believe Haag's theorem shows that the naïve approach, where the interacting and free theories share a Hilbert space, completely fails -- even stronger than that, _no_ Hilbert space could even support an interacting QFT (in the ways required by scattering theory). This is a pretty strong argument against the existence of particles except as asymptotic approximations.
Since we don't have consensus on a well-defined, non-perturbative gauge theory, mathematically speaking it's difficult to make any firm statements about what states "exist" in absolute. (I'm certain that people working on the various flavours of non-perturbative (but still heuristic) QFT -- like lattice QFT -- would have more insights about the internal structure of non-asymptotic interactions.)
This is a pretty strong argument against the existence of particles except as asymptotic approximations.
I think it's also a pretty strong argument against the mathematical well-definedness of typical (interacting) QFTs in the first place.
Though it doesn't resolve whether a "quantum" is a particle or a measurable convergence of waves, electrons and photons are observed with high-speed imaging.
"Quantum microscopy study makes electrons visible in slow motion" https://news.ycombinator.com/item?id=40981054
There exist single photon emitters and single photon detectors.
To qualify that: there are single photons if there are single-photon emitters:
Single-photon source: https://en.wikipedia.org/wiki/Single-photon_source
QFT is not yet reconciled with (n-body) [quantum] gravity, which it has 100% error in predicting; no better than random chance.
IIRC, QFT cannot explain why superfluid helium climbs the walls of a container against gravity, given the mass of each particle/wave of the superfluid and of the beaker, the earth, the sun, and the moon; though we say that gravity at any given point is the net sum of directional vectors acting upon that point, or actually gravitational waves with phase and amplitude.
You said "gauge theory",
"Topological gauge theory of vortices in type-III superconductors" https://news.ycombinator.com/item?id=41803662
From https://news.ycombinator.com/context?id=43081303 .. https://news.ycombinator.com/item?id=43310933 :
Probably not gauge symmetry there, then.
Perhaps a better way to say it is that particles are no longer small balls of dirt [1], but a mathematical construction that is useful to generate an infinite series [2] to calculate the results.
Since in some conditions these mathematical tricks behave very similar to small balls of dirt, we reused the word "particle" and even the names we used when we thought they were small balls of dirt.
[1] We probably never thought they were made of dirt, and in any case the magnetic moment is double the value predicted by the small-ball-of-dirt model.
[2] One that has so many infinities it would make a mathematician cry.
Note that particles are not just for perturbation theory. There is a particle whenever there exists a particle annihilation/creation field configuration. A proton is a particle so writing down its creation/annihilation field configuration is in theory possible, though maybe not in practice.
Another point is that infinities do not necessarily make mathematicians cry. Abraham Robinson would be quite pleased with them. It seems a possible hypothesis that at least some QFTs are mathematically well-defined using non-standard analysis, where 'some QFTs' means at least the renormalizable and perhaps also the asymptotically free ones. I don't know enough about it to know how Haag's theorem, mentioned in another comment, impacts this.
Another analogy (flawed as any of them). Sports teams "exist" in a sense. They meet one another in well-defined interactions, called matches, and such an interaction can be described as if teams were well-defined atomic entities, producing a score.
But a sports team is not atomic, not a "final reality" entity. A sports team can pass through one gate, or through several gates, when entering a stadium. From a doctor's perspective, the team "does not exist", a doctor only operates in terms of individual players' organisms.
Particles are an approximation to the actual behavior of the field, and are used in perturbation theory to calculate the more complicated field behavior.
This works well when interactions are weak. Electrons do not couple strongly to the electromagnetic field, so it makes sense to view electrons as particles. However, quarks couple very strongly to the strong force (hence the name), so the perturbative approach breaks down, and it makes less sense to view quarks as particles.
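To put rough numbers on 'weak' versus 'strong' (schematic, textbook-level; the coefficients a_n here are placeholders): perturbation theory expands an observable as a power series in the coupling,

    A \approx a_0 + a_1 \alpha + a_2 \alpha^2 + \cdots

With the QED coupling alpha ≈ 1/137, each successive order is suppressed by roughly two orders of magnitude, so a handful of terms (a handful of Feynman diagrams) already gives very precise answers and the electron-as-particle picture works well. With the low-energy strong coupling alpha_s of order 1, the higher orders are not small, the truncated series is unreliable, and the particle picture for individual quarks degrades accordingly.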
So in a non-perturbative QFT calculation which has a well defined particle-number operator, that's just "an approximation" within the theory? What is it approximating?
Energy capacity?
I'll bite: Explain yourself.
Also, for context, my question was posed because the idea of "particle number" as well as "quantum states of particles (which are countable) represented in a Fock space" and in general the idea of particles are, like, page 2 of any QFT textbook. It doesn't approximate anything in the theory. Creation and annihilation of particles (and hence the well-defined concept of a particle) is fundamental to the construction of the theory itself, perturbative or not.
False in multiple ways.
QFT doesn’t discard local fields and replace them with only nonlocal graph nodes.
Maybe this is coming from some speculative quantum gravity ideas.
There's no locality.
How so? QFT is Lorentz invariant. Even has such a thing as the norm flux.
My bad; QFT actually postulates locality. I was thinking about causal set theory, which strives to solve some of QFT's difficulties, and where locality is an emergent / statistical phenomenon rather than a postulated condition.
Lorentz invariance is also violated in QFT assuming non-zero temperature.
If you couple your system to a heat bath that is at rest wrt a specific Lorentz frame, you of course lose Lorentz invariance. On the other hand, the Lagrangian of the Standard Model itself is to my knowledge fully Lorentz invariant.
I don't know what they're talking about there, but it sounds like some kind of thermodynamic approximation is involved. Does thermodynamics survive Lorentz transformations?
s/FOR/DO/
just because QFT follows an internal logic, doesn't mean the jump from macro physics to quantum physics itself is logical. In my opinion we still don't have a logical explanation for why the model changes so dramatically from classical to quantum physics.
The Universe is fundamentally quantum in nature; if anything, we'd need a model that explains why classical physics works so well most of the time.
As a naïve fool with no understanding of quantum physics, I want to take a stab at this! Here’s my hypothesis:
Consider a world in which everything is “very quantum”, and there are no easy approximations which can generally be relied on. In such a world, our human pattern-matching behavior would be really useless, and “human intelligence” in the form we’re familiar with would have no evolutionary advantage. So the only setting in which we evolve to be confused by these phenomena is one where simple approximations do work at the scales we occupy.
Sincerely, I don’t think this argument is super good. But it’s fun to propose, and maybe slightly valid.
The main objection is: if there wasn't a classical limit, our brains would have evolved differently.
So yes, we can use the anthropic argument as evidence for the existence of the classical limit, but it doesn't have explanatory power for why there is a classical limit.
This is called the anthropic principle. I personally have objections to it, specifically that due to emergence it is hard to make definitive statements about what complex phenomena may emerge in alternate universes. However, it's taken seriously by many philosophers of physics and certainly has merit.
Isn't that an argument from ignorance? You can consider a class of physics similar enough to ours; that should give enough room for research.
My point is that it isn't possible to determine the emergent behaviour of a complex system from first principles. So arguments of the type "these physics don't result in atoms being produced, so life can't emerge" don't imply that other complex structures _like_ life don't emerge.
Then how do we make technology if we don't know the result?
Technology is made iteratively by repeated trial and then observed error in the physical structures we've created (i.e. we build machines and then watch them fail to work properly in a particular way).
Technology that works in a different universe without atoms, would require us to be able to experiment within that universe if we wanted to produce technology that works there with our current innovation techniques.
have no understanding of quantum physics
But you know about the Anthropic Principle :)
I'm a fool too, but two things I remember. One was a paper discussing the thermodynamics of groups of particles. When they have strong interactions with nearby particles, classical behavior emerges very quickly as the number of particles increases. And not n equals 1 million, or 1000, but more like two dozen.
And then there was Feynman asked to explain in layman's terms how magnets work. And he said I can't. Because if I taught you enough to understand you wouldn't be a layman. But he said it's just stuff you're familiar with but at a larger than usual scale. And he hinted even then one level down and you run out of why's again.
We do have a model. That’s statistical physics.
Any standard course goes over various derivations of classical physics laws (Newtonian dynamics) from quantum mechanics.
I did study physics, and our statistical physics lecture only derived thermodynamic laws.
We also had a somewhat shoddy derivation of Newton's Laws from the Schrödinger equation, but it wasn't really satisfactory either, because it doesn't really answer the question of when I can treat things classically.
What I'd really like (and haven't seen so far, but also haven't searched too hard) is the derivation of an error function that tells me how wrong I am to treat things classically, depending on some parameters (like number of particles, total mass, interaction strength, temperature, whatever is relevant).
(Another thing that drove me nuts in our QM classes was that "observations" were introduced as: a classical system couples to a quantum system. Which presupposes the existence of classical systems, without properly defining or delineating them. And here QM was supposed to be the more fundamental theory.)
What I'd really like (and haven't seen so far, but also haven't searched too hard) is the derivation of an error function that tells me how wrong I am to treat things classically, depending on some parameters (like number of particles, total mass, interaction strength, temperature, whatever is relevant).
There are plenty of ways to do this and things like Wigner functions literally calculate quantum corrections to classical systems.
But generally if you can't even measure a system before its quantum state decoheres, then its quantum status is pretty irrelevant.
I.e. the time it takes for a 1 micrometer wide piece of dust to decohere is ~10^-31 s, while it takes a photon ~3×10^-15 s just to cross its diameter. So it decoheres something like 10^16 times faster than a photon could even cross it.
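A quick back-of-the-envelope check of those numbers (using the ~10^-31 s decoherence figure quoted above):

    # Rough arithmetic check of the dust-grain numbers above.
    c = 3.0e8            # speed of light, m/s
    d = 1.0e-6           # grain diameter, m (1 micrometre)
    t_decohere = 1.0e-31 # decoherence time quoted above, s

    t_cross = d / c      # time for a photon to cross the grain
    print(f"light-crossing time ~ {t_cross:.1e} s")        # ~3.3e-15 s
    print(f"ratio ~ {t_cross / t_decohere:.1e}")           # ~3e16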
The error is usually taken as the ratio of wavelength to your desired precision, but in general it depends on your use case; sometimes you have full precision all the way down, sometimes you have insufficient precision on astronomical scales. Quantum physics doesn't have an absolute scale cutoff.
I started writing a response about how the human brain is designed to operate in an environment where classical physics is the norm, so we need to bridge the deviations from that if we are to really understand the world. But I don't know how much that's really true if you consider neural biology, and I won't claim to know where quantum stops and classical begins as it relates to brain function.
You need quantum physics to understand how chemistry works.
So, given that chemistry plays a huge role in how the human (or any) brain works, it would be quite a stretch to argue that the brain works with classical physics.
We are often sloppy and sort all the chemistry in with classical physics, but that's a very human-centric approach. In reality, the Universe doesn't have different "domains" with separate rules for chemistry and physics; it evolves according to the Schrödinger equation, and we use Chemistry as an abstraction to not have to deal with nasty mathematics to predict how certain reactions will work.
I do think there's something to this approach though - our sensory organs and processing ability are not abstract powers of understanding the universe - they developed exactly to give us enhanced survival chances. We should not expect to even be able to detect (let alone intuitively understand) aspects of reality that can't be used for survival.
I do understand the point you’re making but my counter argument to that would be that physics hasn’t relied on our sensory input for a hundred years or more.
It’s been almost entirely based on maths and careful measurements from machined instruments purpose built for observing phenomena.
So at this point you’d hope the limitations of our biological senses would have been long surpassed.
our [...] processing ability are not abstract powers of understanding the universe
Neural nets are called universal approximators for a reason. If what you guys are discussing is true, then a neural net would not be able to learn from a dataset about quantum experiments. I doubt this is the case. Also there is quantum cognition, and by that I mean the fact some researchers figured out a lot of puzzling results from experimental cognitive science seem to make more sense once analyzed from a quantum perspective.
Humans are affected by prejudice, so learning straight from a dataset is easier said than done, but possible, yes.
Yes, though our cells have machinery that does use quantum phenomena.
In my opinion we still don't have a logical explanation for why the model changes so dramatically from classical to quantum physics.
I think you have this backwards. QM IS the law of the universe and Classical Physics is just a high mass low energy approximation of it. In any case there doesn't need to be a logical explanation at all, the laws of physics are as they are. Why is the value of the fine structure constant what it is?
Observation is more important than model; if we take the model too seriously, we can be led astray. It's much like extending a metaphor too far.
We observe double-slit diffraction and model it with the wave-function. This doesn't preclude other models, and some of those models will be more intuitive than others. The model we use may only give us a slice of insight. We can model a roll of the dice with a function with 6 strong peaks and consider the state of the dice in superposition. The fact that the model is a continuous real function is an artifact of the model, a weakness not a strength. We are modeling a system whose concrete state is unknown between measurements (the dice is fundamentally "blurred"), and we keep expecting more from the model than it wants to give.
Programmers may have better models, actually. The world is a tree where the structure of a node births a certain number of discrete children at a certain probability, one to be determined "real" by some event (measurement), but it says little about "reality". The work of the scientist is to enumerate the children and their probabilities for ever more complex parent nodes. The foundations of quantum mechanics may be advanced by new experiments, but not, I think, by staring at the models hoping for inspiration.
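If it helps to see that tree picture in code, here's a toy sketch of my own (nothing standard, just the data structure described): each node carries discrete children with probabilities, and a "measurement" samples one of them to be the real branch.

    import random

    # Toy model (purely illustrative): a node's "structure" determines its
    # possible discrete children and their probabilities; a measurement samples
    # one child and makes it the "real" branch.
    class Node:
        def __init__(self, label, children=None):
            self.label = label
            self.children = children or []   # list of (probability, Node)

        def measure(self):
            """Pick one child at random according to its probability."""
            r, acc = random.random(), 0.0
            for p, child in self.children:
                acc += p
                if r < acc:
                    return child
            return self.children[-1][1] if self.children else self

    # A die modelled as a parent node with six equally likely children.
    die = Node("die", [(1 / 6, Node(str(face))) for face in range(1, 7)])
    print(die.measure().label)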
The models of quantum mechanics have already withstood experiments to a dozen decimal places. You aren't going to find departures just by banging around in your garage; you just can't generate enough precision.
The only way forward at this point is to start with the model and design experiments focusing on some specific element that strikes you as promising. Unless you're staring at the model you're just guessing, and it's practically impossible that you're going to guess right.
You aren't going to find departures just by banging around in your garage
This kind of rhetoric saddens me. Someone says "design an experiment" and you jump to the least charitable conclusion. That people do this is perhaps understandable, but to do it and not get pushback leads to it happening more and more, to the detriment of civil conversation.
No, the experiment I had in mind would take place near the Schwarzschild radius of a black hole. This would require an enormous effort, and (civilizational) luck to defy the expectations set by the Drake equation/Fermi paradox. It's something to look forward to, even if not in our lifetimes!
I mean you did just suggest that classical QM can be supplanted by your heavily underspecified finite(?)-state model for which you provide essentially no details, you must admit that's pretty crank-y behaviour.
No, the experiment I had in mind would take place near the Schwarzschild radius of a black hole
I think the GP was thinking of more practical experiments, not science fiction.
To be fair quantum mechanics was invented by guessing that energy might be quantized. It just happened to model the universe well.
Waves are quantized (one wave, two waves, ...), so energy transfers by waves are quantized too.
This is one of the reasons I believe science and technology as a whole are on an S-curve. This is obviously not a precise statement and more of a general observation, but each step on the path is a little harder than the last.
Whenever a physics theory gets replaced it becomes even harder to make an even better theory. In technology low hanging fruit continues to get picked and the next fruit is a little higher up. Of course there are lots of fruits and sometimes you miss one and a solution turns out to be easier than expected but overall every phase of technology is a little harder and more expensive.
This actually coincides with science. Technology is finding useful configurations of science, and practically speaking there are only so many useful configurations for a given level of science. So the technology S-curve is built on the science S-curve.
I don't think this is strictly true. Rather, it seems that the problem is that we, at some point, invariably assume the truth of something that is false, which then makes it really difficult to move beyond that because we're working off false premises; and relatively few people are going out of their way to go back in time and challenge/rework every single assumption, especially when those assumptions are supported by decades (if not centuries) of 'progress.'
An obvious example of this is the assumption of the geocentric universe. That rapidly leads to ever more mind-bogglingly complex phenomena like multitudes of epicycles, planets suddenly turning around mid-orbit, and much more. It turns out the actual physics is far simpler, but you have to get past that flawed assumption.
In more modern times relativity was similar. Once it became clear that the luminiferous aether was wrong, and that the universe was really friggin weird, all sorts of new doors opened for easy access. The rapid decline in progress in modern times would seem most likely to suggest that something we are taking as a fundamental assumption is probably wrong, rather than that the next door is just unimaginably difficult to open. This is probably even more true given the vast numbers of open questions for which we have de facto answers, yet they seem to defy every single test of their correctness.
---
All that said, I don't disagree that technology may be on an s curve, but simply because I think the constraints on 'things' will be far greater than the constraints on knowledge. The most sophisticated naval vessel of modern times would look impressive but otherwise familiar to a seaman of hundreds or perhaps even thousands of years ago. Even things like the engines wouldn't be particularly hard to explain because they would have known full well that a boiling pot of water can push off its top, which is basically 90% of the way to understanding how an engine works.
It's true that Ptolemaic cosmology stuck thinkers in a rut for a very long time; but what got us out of that rut was observation (and simplification). Copernicus saw that heliocentrism led to a simpler model that fit observation better (ironically he wanted to recover Ptolemy's perfectly circular orbits!). In turn, Kepler's perfectionism led him to ditch the circular orbit idea to yield the first accurate description of orbits as ellipses. Yes, transgression against long-held belief was necessary to move forward, but in every case the transgression explained observation. Transgression itself is undesirable. In fact, transgression unmotivated by observation is what powers the dark soul of the "crank", who is at best a time-waster and at worst a spreader of mental illness.
Even Einstein did not produce (e.g. special relativity) out of whole cloth. He provided a consistent conceptualization of Lorentz contraction, itself the result of observing discrepancies in the motion of Jupiter's moons. The same could be said of the photoelectric effect, the ultraviolet catastrophe, and QM.
All this to say that your statement "The rapid decline in progress in modern times would seem most likely to suggest that something we are taking as a fundamental assumption is probably wrong" is unsupported. Nothing could be more popular than questioning fundamental assumptions in science today!
It could very well be that, as Sean Carroll puts it, we really know how everything larger than the diameter of a neutron works! Moreover, we know that even if we find strangeness at tiny scales, our current theories WILL remain valid approximations, just like Newtonian mechanics are valid approximations of special and general relativity. The path to progress will not happen because a rogue genius finds something everyone missed and boldly questions assumptions long-held. Scientific revolution first requires an observation inconsistent with known models, but even the LHC hasn't given us even one of those. There is reason to think that GR, QM, and the standard model are all there is...until we do some experiments near a black hole!
Copernicus and Kepler did interpretation, not observation; they explained observations, but geocentrism explained observations too, so heliocentrism wasn't unquestionably superior.
Copernicus saw that heliocentrism led to a simpler model that fit observation better.
That's not true, he didn't.
The geocentric model of the time was a better fit to the data than the Copernican model. What the Copernican model had was simplicity (at some cost to observational data fidelity).
Making the heliocentric model approach (and breach) the accuracy obtained by the geocentric model took a lifetime of work by many people.
As a kinematic model (description of the geometry of motions) as observed from Earth's reference frame geocentric is still pretty darn accurate. There's a reason why it is so. Compositions of epicycles are a form of Fourier analysis -- they are universal approximators. They can fit any 'reasonably well behaved' function. The risk is, and it's the same risk with ML, deep neural nets, that one (i) could overfit and (ii) it could generate a model with high predictive accuracy without being a causal model that generalises.
The heliocentric model was proposed much, much earlier than Copernicus, but the counterarguments were non-ignorable. Reality, it turned out, was very surprising and unintuitive.
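To make the "epicycles are Fourier analysis" point concrete, here's a toy sketch of my own (made-up circular orbits, nothing fitted to real data): sample a crude geocentric trajectory as a complex number, take its discrete Fourier transform, and keep the k strongest circular components, i.e. a deferent plus epicycles. The fit improves as you add circles, regardless of whether the model is causally right.

    import numpy as np

    # Toy sketch: a crude geocentric trajectory decomposed into uniformly
    # rotating circles (a deferent plus epicycles) via the DFT.
    N = 1024
    t = np.linspace(0.0, 1.0, N, endpoint=False)          # one "year"
    earth = 1.0 * np.exp(2j * np.pi * t)                   # 1 AU, 1 cycle/yr
    planet = 1.5 * np.exp(2j * np.pi * t / 1.88)           # crude "Mars"
    z = planet - earth                                     # apparent position from Earth

    coeffs = np.fft.fft(z) / N                             # circle amplitudes/phases
    freqs = np.fft.fftfreq(N, d=1.0 / N)                   # circle frequencies (cycles/yr)

    for k in (1, 2, 3, 5, 10, 20):
        keep = np.argsort(np.abs(coeffs))[-k:]             # k strongest circles
        fit = sum(coeffs[i] * np.exp(2j * np.pi * freqs[i] * t) for i in keep)
        print(f"{k:2d} circles -> max error {np.max(np.abs(z - fit)):.4f} AU")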
Truth be told, I don't know much about Copernicus. He may indeed have been right but for the wrong reasons! If so, he's a very good example against my point that observation must precede successful revolution. It seems strange that the Catholic church took him so seriously if his claim was supported by his enthusiasm and not observation. It's definitely something I'd like to learn more about - any book recommendations?
This history is absolutely fascinating. Let me find a blog post by Baez that covers a lot of that history.
I don't think this history says anything against your point -- sometimes the time is just not right for the idea -- and even classical science can be very unintuitive and weird, so much so that common sense seems like very strong counter arguments against what eventually turn out to be better models.
I of course learned this over many books, but the mind blanks out over which one to suggest. I think biographies of Copernicus and Kepler would be good places to start.
Edit: you may find this interesting:
https://news.ycombinator.com/item?id=42347533
HN, do you know what happened to John Baez's blog that listed his multipart blog posts? They are a treasure trove that I do not want to lose. Azimuthproject too seems to have disappeared.
As a tangential hit on this issue, the relationship between the Catholic Church and science [1] is an interesting read. It's nowhere near as antagonistic as contemporary revisionist takes would suggest. The most famed example of this is Galileo (whose name is mentioned no fewer than 146 times on that fairly short page...), yet that was far more about interpersonal issues than about his concepts being an affront to theology. He wrote a book that, through a hardly veiled proxy, called the Pope (at the time very much one of his supporters) a simple-minded idiot. Burning bridges is bad enough, but burning one you're standing on is lunacy.
If one does genuinely believe in a God then the existence of science need not pose a threat to that, since there's nothing preventing one from believing that God also then created the sciences and rationality of the universe. The classical 'gotchas' like 'Can God create a stone so heavy that he could not lift it?' were trivial to answer by simply accepting that omnipotence does not extend to things which are logically impossible, like a square circle.
[1] - https://en.wikipedia.org/wiki/Science_and_the_Catholic_Churc...
LLM could have kept the geocentric theory alive for another hundred or more years! Awesome.
I especially like your last paragraph. Even if our fundamental assumptions are wrong, current theories still work very well within appropriate bounds. And those bounds basically contain all practical scenarios here on earth. That's a big reason why it's hard to make progress on string theory, because we can't create scenarios extreme enough here on earth to test it.
So even if our fundamental assumptions are wrong and some new theory is able to explain a bunch of new stuff, chances are it won't impact the stuff we can practically do here on earth, because scientists have already been doing the most extreme experiments they can, and so far progress is still stalled on fundamental physics.
Heliocentrism from its earliest formulation was pretty bad for many reasons, including, as you mentioned, the desire to maintain circular orbits, as well as uniform velocities, epicycles, and more. You could easily pick a million holes in heliocentrism to 'disprove' it. And the geocentric view, as convoluted as it was, was observably accurate and predictive, with 'holes' being plugged by simply having the entire dysfunctional model absorb them - e.g. by simply assuming retrograde motion as a natural phenomenon, and otherwise - just add more epicycles.
Heliocentrism was most fundamentally driven by somebody with extremely poor interpersonal skills (which, far more than the theory itself, is why he spent his final days under house arrest) moving forward on his own somewhat obsessive bias.
Similarly, with relativity. I have no idea what you mean by a 'consistent conceptualization' of Lorentz contraction, but length contraction was a completely ad hoc explanation for the Michelson-Morley experiment. Its correctness was/is more incidental than anything else. Einstein did not cite Lorentz (or anybody for that matter), and I do not think that was unfair or egotistical of him.
--
I'm also unsure of what you're referencing with Sean Carroll, but I'd offer a quote from Michelson of the Michelson-Morley experiment saying essentially the same, "The more important fundamental laws and facts of physical science have all been discovered, and these are now so firmly established that the possibility of their ever being supplanted in consequence of new discoveries is exceedingly remote.... Our future discoveries must be looked for in the sixth place of decimals."
So convinced was Michelson that the 'failure' of his experiment was just a measurement issue that he made that comment in 1894, nearly a decade after his experiment and shortly before physics and our understanding of the universe exploded thanks to a low-ranking patent inspector.
I have no idea what you mean by a 'consistent conceptualization' of Lorentz contraction, but length contraction was a completely ad hoc explanation for the Michelson-Morley experiment. Its correctness was/is more incidental than anything else. Einstein did not cite Lorentz (or anybody for that matter), and I do not think that was unfair or egotistical of him.
In "On the Electrodynamics of Moving Bodies"[1] Einstein checks his derivation against Lorentz contraction. It's on page 20 of the referenced English translation. Lorentz' model was ad hoc, E derived it with only 2 postulates (equivalence principle; c invariance). Lorentz was indeed cited, and the cite is useful to connect E's theory to real-world observation. This is true whether or not you want to get pedantic about the meaning of "cite" vs "reference".
1 - https://www.fourmilab.ch/etexts/einstein/specrel/specrel.pdf Originally "Zur Elektrodynamik bewegter Koerper"
Max Planck famously said, "A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it."
Now we know how to prevent it: popularize ideas like "physics is mathematics", "shut up and calculate", "it's useless philosophy not worth thinking about", "nobody can understand it, so it's useless to even try". Also a nice excuse for ignorance.
The rapid decline in progress in modern times would seem most likely to suggest that something we are taking as a fundamental assumption is probably wrong, rather than that the next door is just unimaginably difficult to open.
We actually know we have:
Bell’s inequality tells us that the universe is non-local or non-real. We originally preferred to retain locality (ie, Copenhagen interpretation) but were later forced to accept non-locality. But now we have a pedagogy and machinery built on this (incorrect) assumption — which people don’t personally benefit from re-writing.
Science appears trapped in something all too familiar to SDEs:
A technical design choice turned out to be wrong, but a re-write is too costly and risky for your career, so everyone just piles on more tech debt — or modern epicycles.
And perhaps that’s not a bad thing, in and of itself. Eg, geons were initially discarded because the math doesn’t work out — but with the huge asterisk that they might still be topologically stabilized. But the math there is hard and so it makes sense to continue piling onto the current model until enough advances in modeling (eg, 4D anyons) allow for exploring that idea again.
Similar to putting off moving tech stacks until someone else demonstrates it solves their problems.
But at least topological geons would explain one question: why does space look like geometry but particles look like algebra?
Because topological surgery looks like both!
- - - -
clear that the luminiferous aether was wrong
Another interpretation is that the aether exists, but we’re also made of aether stuff — so we squish when we move, rather than rigidly moving through it (as per the theory tested by Michelson-Morley). That squishing cancels out the expected measurement in MM. LIGO (a scaled MM experiment) then works because waves in the aether squish and stretch us in a detectable way.
Modern theories are effectively this: everything is fields, which we believe to be low-energy parts of some unified field.
It's an S-curve only so long as intelligence doesn't increase exponentially as well. What would the story look like if an ASI existed?
The first turn in an S-curve can easily look like an exponential. ASI has physical limitations, so I don’t see why it wouldn’t take an S-curve as well, although at a much different rate than human intelligence.
Exponential increases in intelligence don't imply that the universe is more complex to compensate.
It's just accelerated. AI is bound by physics just like everything else.
The S-curve is really about fundamental limits. Let's say ASI helps us make multiple big leaps ahead, I mean mind-blowing stuff. That still doesn't change the fact that there must be a limit somewhere. The idea that science and tech are infinite is pure science fiction.
One particular model: the electron g-factor.
Now go look up how precise a prediction the same model makes for the muon g-factor.
That is true for classical probability, but the idea that unknown quantities determine the outcomes in quantum mechanics has been disproven, provided the speed of light is a true limit on communication speed. This is known as Bell's theorem.
Bell's Theorem disproves local hidden variables.
Reality can be interpreted as non-local. There has been no conclusive proof it isn't.
c isn't a limit on the kind of non-locality that is required, because you can have a mechanism that appears to operate instantaneously - like wavefunction collapse in a huge region of space - but still doesn't allow useful FTL comms.
Bell's Theorem has no problem with this. Some of the Bohmian takes on non-locality have been experimentally disproven, but not all of them.
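For anyone who wants the numbers behind that: the standard CHSH version of Bell's argument can be sketched in a few lines (textbook values; my own toy code). The quantum singlet correlation at analyser angles a and b is E(a,b) = -cos(a-b), and the CHSH combination reaches 2*sqrt(2), above the bound of 2 that any local hidden-variable model must satisfy.

    import numpy as np

    # CHSH sketch (standard textbook numbers).  E(a, b) = -cos(a - b) is the
    # quantum prediction for spin-1/2 singlet pairs measured along angles a, b.
    def E(a, b):
        return -np.cos(a - b)

    # Canonical angle choices that maximize the violation.
    a, a2 = 0.0, np.pi / 2
    b, b2 = np.pi / 4, 3 * np.pi / 4

    S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
    print(f"CHSH |S| = {abs(S):.3f}  (any local hidden-variable model: |S| <= 2)")
    print(f"Tsirelson bound 2*sqrt(2) = {2 * np.sqrt(2):.3f}")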
The Copenhagen POV is that particles do not necessarily exist between observations. Only probabilities exist between observations.
So there has to be some accounting mechanism somewhere which manages the probabilities and makes sure that particle-events are encouraged to happen in certain places/times and discouraged in others, according to what we call the wavefunction.
This mechanism is effectively metaphysical at the moment. It has real consequences and was originally derived by analogy from classical field theory, with a few twists. But it is clearly not the same kind of "object" as either a classical field or particle.
There may be no conclusive proof, but it's a philosophically tough pill to swallow.
Non-locality means things synchronise instantly across the universe, can go back in time in some reference frames, and yet reality _just so happens_ to censor these secret unobservable wave function components, trading quantum for classical probability so that it is impossible for us to observe the difference between a collapsed and uncollapsed state. Is this really tenable?
Strip back the metaphysical baggage and consider the basic purpose of science. We want a theoretical machine that is supplied a description about what is happening now and gives you a description of what will happen in the future. The "state" of a system is just that description. A good _scientific_ theory's description of state is minimal: it has no redundancy, and it has no extraneous unobservables.
Why isn’t the accounting mechanism a quantum extension of the principle of least action?
De Broglie and Bohm would like to have a word…
And that word would be....?
De Broglie–Bohm theory is a hidden-variable theory but does not allow for FTL communication.
My understanding is that it is not that simple; pilot-wave theories are not traditional hidden-variable theories. While some setups look very simple in pilot-wave form compared to, say, the Schrödinger equation, other setups are as unintuitive in pilot-wave form as the Schrödinger equation is in the former.
My lightly held conclusion is that if it really were a fuller and more straightforward solution, it would dominate the conversation more than it does now. This opinion was formed from reading some primary sources but mostly reviews and comparisons of QM theories. Unlike with other formulations, I have never worked through a full QM example problem in pilot-wave theory.
I'm not sure what the point is you're trying to make. OP claimed
the idea that unknown quantities determine the outcomes in quantum mechanics has been disproven, provided the speed of light is a true limit on communication speed.
and I provided an immediate counterexample. Yes, Bell's Theorem and its exact assumptions are not entirely straightforward but let's please stop propagating those falsehoods that die-hard proponents of the Copenhagen interpretation commonly propagate.
Thank you! I need to read more in that area.
Let me throw in "Hydrodynamic Quantum Analogs" [1] as a fascinating review of how quantum effects emerge in experiments with bouncing oil drops on liquid. This is fully a pilot wave driven experiment and there has been a lot of academic work analyzing the system and trying to fit it into the de Broglie-Bohm formulations of quantum dynamics.
To quote section 10.2: "The [experimental] system represents a classical realization of wave–particle duality as envisaged by de Broglie, wherein a real object has both wave and particle components."
We've already got all those fields interacting in the real world, so I don't find it very far fetched that quantum mechanics emerges from their fully classically described interactions, probably expressed in some really gnarly 4D math.
[1] https://thales.mit.edu/bush/wp-content/uploads/2021/04/BushO...
The foundations of quantum mechanics may be advanced by new experiments, but not, I think, by staring at the models hoping for inspiration.
To come up with new experiments that might shed light it certainly helps to spend time exploring the models to come up with new predictions that they might make. Sure, one can also come up with new experiments based only on existing observations, but it's most interesting when we can make predictions, as testing those advances some theories and crushes others.
A model is supposed to be accurate. When it's inaccurate, you should understand where and how it's inaccurate and not just become agnostic.
The trouble with QM is with its interpretations, not with the accuracy of its predictions. The latter informs interest in the former. QM works, but the models imply that nature is neither "local" - e.g. entanglement experiments undermine hidden-variables, nor "real" - e.g. a particle does not have a momentum (or position) until you measure it. These physical properties are not just hidden, they are undefined. These implications fly in the face of basic macroscale intuitions about what "physical reality" means, which makes it interesting. Inconsistency is a signal that we have discoveries yet to make. Note that "Many worlds people" think there is no inconsistency - my sketch of a model is fully consistent with that interpretation, if you wish, by simply assigning a new universe to every child node in which the node is reached.
What you say doesn't quite correspond to quantum physics as it's known. Quantum physics is quantitative and precise, so it's difficult to say there's something undefined there. It doesn't suggest nonlocality; the absence of hidden variables means only the absence of hidden variables. It doesn't suggest antirealism, if only due to precision; you can say it doesn't work how you want, but at worst this makes it unintuitive. Conversely, the Dirac formalism works as if the quantum state exists in itself in precise form, which has good compatibility with basic macroscale intuitions about what "physical reality" means.
But quantum physics can't predict exactly where the individual dots on the detector will be, only their distribution. That does not sound totally quantitative and precise and defined. You would not accept such predictions for macroscopic objects :)
It can't do it, because you want it to make a classical prediction. It shouldn't be able to predict what doesn't happen.
I want any theory to predict what actually happens, which is individual dots at detector.
Or at least some clear statement how comes our reality is not like that.
Would you be satisfied if the theory clearly states: "At the time of measurement, the position of the photon interaction is determined by randomly sampling from the quantum distribution"?
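In code, that statement would look something like this (a toy far-field double-slit model of my own; all numbers are made up): the theory supplies a probability density |psi|^2 over screen positions, each detected dot is a single random draw from it, and only the histogram built from many dots is predicted.

    import numpy as np

    # Toy far-field double slit (idealized): slit separation d, slit width
    # slit_w, wavelength lam, screen position expressed as the angle theta.
    lam, d, slit_w = 500e-9, 20e-6, 4e-6
    theta = np.linspace(-0.1, 0.1, 2001)

    # Two-slit amplitude: interference (cos) times single-slit envelope (sinc).
    beta = np.pi * slit_w * np.sin(theta) / lam
    delta = np.pi * d * np.sin(theta) / lam
    psi = np.cos(delta) * np.sinc(beta / np.pi)
    p = np.abs(psi) ** 2
    p /= p.sum()                      # normalize into a discrete distribution

    # Born rule as sampling: each detected dot is one draw from p.
    rng = np.random.default_rng(0)
    dots = rng.choice(theta, size=20, p=p)
    print("first few dot positions (rad):", np.round(dots[:5], 4))

    # Only the distribution of many dots reproduces the interference fringes.
    hist, _ = np.histogram(rng.choice(theta, size=200_000, p=p), bins=50)
    print("histogram of 200k dots:", hist)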
Your 6-sided dice example sort of brings some focus to his argument that "it's not a real wave, it's a math wave". The result of a 6-sided dice roll exists more in our minds as "math dice", because for most people, if you rolled and the die fell in a sewer, got lost, etc., you wouldn't consider the roll complete until you grabbed a different die and rolled it. The result is more attached to the person rolling it and to what the number affects.
Finally! Too much of physics is obsessed with the map and not the territory.
This is how you get the tortured reasoning that views measurement and observation as somehow different. Even Einstein struggled.
Doesn't the difference between measurement and observation stem from an extension of the double slit experiment discussed in this article?
If you place a detector on one of the two slits in the prior experiment (so that you measure which slit each individual photon goes through), the interference pattern disappears.
If you leave the detector in place, but don't record the data that was measured, the interference pattern is back.
If you leave the detector in place, but don't record the data that was measured, the interference pattern is back.
This is not remotely true. It looks like you read an explanation of the quantum eraser experiment that was either flawed or very badly written, and you're now giving a mangled account of it.
I have heard similar things but this is THE most deeply weird result and I’ve never heard a good explanation for the setup.
A lot of people pose it as a question of pure information: do you record the data or not?
But what does that mean? The “detector” isn’t physically linked to anything else? Or we fully physically record the data and we look at it in one case vs deliberately not looking in the other? Or what if we construct a scenario where it is “recorded” but encrypted with keys we don’t have?
People are very quick to ascribe highly unintuitive, nearly mystical capabilities with respect to “information” to the experiment but exactly where in the setup they define “information” to begin to exist is unclear, although it should be plain to anyone who actually understands the math and experimental setup.
It's a little simpler than you're thinking: only fully matching configurations (of all particles etc) can interfere. If you have a setup where a particle can pass through one of two slits and then end up in the same location (with the same energy etc) afterward, so that all particles everywhere are in the same arrangement including the particle that passed through one of the slits, then these two configurations resulting from the possible paths can interfere. If anything is different between these two resulting configurations, such as a detector's particles differently jostled out of position, then the configurations won't be able to interfere with each other.
An interesting experiment to consider is the delayed-choice quantum eraser experiment, in which a special detector detects which path a particle went through, and then the full results of the detector are carefully fully stomped over so that the particles of the detector (and everything else) are in the same exact state no matter which path had been detected. The configurations are able to interfere once this erasure step happens and not if the erasure step isn't done.
Another fun consequence of this all is that we can basically check what configurations count as the same to reality by seeing if you still get interference patterns in the results. You can have a setup where two particles 1 and 2 of the same kind have a chance to end up in locations A and B respectively or in locations B and A, and then run it a bunch of times and see if you get the interference patterns in the results you'd expect if the configurations were able to interfere. Successful experiments like this have been done with many kinds of particles including photons, subatomic particles, and atoms of a given element and isotope, implying that the individual particles of these kinds have no unique internal structure or tracked identity and are basically fungible.
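A compact way to put numbers on the "fully matching configurations" rule (my own toy sketch, not the actual delayed-choice apparatus): give each slit path an amplitude and attach a detector state to each path. The interference term is then multiplied by the overlap of the two detector states, so identical (or perfectly erased) detector states give full interference, and orthogonal ones (which-path recorded) kill it.

    import numpy as np

    # Toy which-path model.  If the two paths leave a detector in states
    # |d1> and |d2>, the screen pattern is
    #   P(x) = |psi1|^2 + |psi2|^2 + 2 * Re(conj(psi1) * psi2 * <d2|d1>)
    # i.e. the interference term is scaled by the detector-state overlap.
    x = np.linspace(-1.0, 1.0, 9)
    psi1 = np.exp(+1j * 40 * x) / np.sqrt(2)   # crude path-1 amplitude
    psi2 = np.exp(-1j * 40 * x) / np.sqrt(2)   # crude path-2 amplitude

    def pattern(overlap):
        return (np.abs(psi1) ** 2 + np.abs(psi2) ** 2
                + 2 * np.real(np.conj(psi1) * psi2 * overlap))

    print("no which-path info (overlap 1):  ", np.round(pattern(1.0), 2))
    print("which-path recorded (overlap 0): ", np.round(pattern(0.0), 2))
    print("partially erased (overlap 0.5):  ", np.round(pattern(0.5), 2))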
The interference pattern also disappears when the detector registers the absence of a detection, which shouldn't change the properties of the particle.
If anything is different between the two resulting configurations of possibly affected particles, such as the state of the particles of the detector, then interference can't happen. It's not just about whether the individual particle going through one of the slits is in an identical location.
An important thing to realize is that interference is a thing that happens between whole configurations of affected particles, not just between alternate versions of a single particle going through the slit.
Do you have a reference for that last paragraph?
I'm not a physicist, but that doesn't really sound right. Might I ask you a reference or an explanation?
It is correct. There's SO MUCH weirdness surrounding the double slit.
https://en.wikipedia.org/wiki/Double-slit_experiment#Variati...
Hm, it says the observer-at-the-slit experiment hasn't been performed because it would absorb the photons. But it also says the experiment can be done with larger particles, so that shouldn't be a problem ...
I believe I first read about it in the book, Gödel, Escher, Bach.
The fact that the model is a continuous real function is an artifact of the model, a weakness not a strength.
The wave function is, loosely, the square root of a probability distribution: its squared modulus gives the probability density. The wavefunction is a continuous function of position because position is modeled as a continuous real variable. The idea of the wavefunction as a function of position is generally supported by the fact that it can be used to predict the measurement results of diffraction experiments like the double-slit experiment, and also practically the whole field of X-ray diffraction.
There is not just one experimental result that is explained by wavefunctions. There are widely used measurement techniques whose outcomes are calculated according to the quantum properties of matter — like X-ray diffraction and Raman scattering — which are widely considered to be extremely reliable. There is a good reason to explain the model of reality expressed by the equations as clearly as possible, because we want people to be able to use the equations.
Plenty of people (though certainly not all) expect quantum mechanics to be eventually modified to have a consistent theory of gravity. But physicists have experience with this. Special relativity and classical quantum mechanics were both more complex than Newtonian (classical) mechanics, and quantum field theory is more complicated than either. General relativity is substantially more involved than special relativity. It is likely that further extensions will continue to get worse.
The model of reality taught by Newtonian (classical) mechanics is also still widely discussed and used in introductory physics courses and many areas of physics (such as fluid dynamics) and engineering. This model also discusses position on the real line. Even though classical mechanics had to be modified, the use of Cartesian coordinates and real numbers turned out to be durable.
Usually the finitists will formally "rescue" countability by suggesting that the world could exist on the computable numbers, which are countable and invariant under computable rotations. But the computable numbers are a very unsatisfying model of reality, and have a lot of the same "weirdness" as the real numbers. Therefore they suggest that some other model must exist without giving a lot of specifics. Why this should be somehow helpful and not injurious to the pedagogy of physics is not clear.
"The Tao that can be told is not the eternal Tao."
The wave went through the two slits, for any normal everyday definition of "go through". Yes, you can say "the wave function is just a function that assigns an amplitude for the particle's presence at every (x, y, z, t) co-ordinate, it doesn't go anywhere". But that's no more valid than saying that a regular water wave is "just a function that assigns a height to the water at every (x, y, t) co-ordinate, it doesn't go anywhere".
There is a pattern to the wavefunction, where the amplitude at (x+delta, y, z, t+delta) is closely related to the amplitude at (x, y, z, t). (Specifically, it's that amplitude rotated by delta times the mass of the particle, in natural units.) Or, unless you're being wilfully obtuse, the wave packet moves from x to x+delta between t and t+delta, rotating as it goes as quantum mechanical waves do.
You can, if you really want, insist in Zeno's paradox fashion that nothing ever goes anywhere, that things just exist at given places and times, and in a certain sense that's true. But there's nothing QM-specific about that, and it's misleading to complicate a discussion of QM by claiming so. If we allow that things can move through space, and waves can move through space, then the wave moves through the two slits in the normal sense of all those concepts.
I wish people would stop going out of their way to make QM sound confusing/weird/"spooky". Most of it is just normal wave behaviour for which you can observe exactly the same thing with everyday classical waves.
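For the "the wave packet moves and rotates" picture above, here's a minimal numerical sketch of my own (free 1D Schrödinger evolution with hbar = m = 1, made-up packet parameters): every momentum component's amplitude just rotates in phase, and the net effect is a packet whose centre travels at p0/m, i.e. the wave really does go through space in the ordinary sense.

    import numpy as np

    # Free 1D Schrodinger evolution with hbar = m = 1.  Each momentum
    # component's amplitude only rotates in phase; the packet centre moves.
    m, p0 = 1.0, 5.0
    x = np.linspace(-40, 40, 4096)
    dx = x[1] - x[0]
    psi0 = np.exp(-x**2 / 4.0) * np.exp(1j * p0 * x)
    psi0 /= np.sqrt(np.sum(np.abs(psi0)**2) * dx)

    k = 2 * np.pi * np.fft.fftfreq(x.size, d=dx)   # wavenumbers

    def evolve(psi, t):
        # Rotate every momentum component by exp(-i E t), E = k^2 / (2 m).
        return np.fft.ifft(np.exp(-1j * (k**2 / (2 * m)) * t) * np.fft.fft(psi))

    for t in (0.0, 1.0, 2.0):
        psi = evolve(psi0, t)
        centre = np.sum(x * np.abs(psi)**2) * dx
        print(f"t={t:.1f}  <x>={centre:+.2f}  (expected {p0 / m * t:+.2f})")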
I agree things like the slit experiments are easily explained by classical waves, and thus seem to point to particles being waves; and I see no inherent reason to bring up probability distributions and such to answer that observation.
Where I'm struggling is that classical waves will always spread out spherically, and their energy must do so too. The issue here being that if a photon is a minimal quantum of energy, but is just a classical wave, what prevents it from spreading out and having sub-photon energy? Or if indeed it does, how does that sub-photon quantity get measured? -- if these experiments claim to be emitting a time-series of single photons, classical wave interference won't occur (again, being separated in time).
But that's no more valid than saying that a regular water wave is "just a function that assigns a height to the water at every (x, y, t) co-ordinate, it doesn't go anywhere".
I think this is a very important distinction actually. A wave amplitude represents an actual displacement in some medium, and waves interfere constructively/destructively because they both move the medium in the same/opposite direction at the same time at the same location. So when a water wave gets pushed through two slits, it breaks into two separate water waves, one coming from each slit, and those two waves push the water up and down at the same time at different locations.
But wavefunctions are very much not like that. A wavefunction amplitude does not represent a displacement in any kind of medium. They represent a measure of the probability that the system being described is in a particular state at a moment in time. That state need not even be a position, it might be a charge, or a spin, or a speed, or any combination of these. Basically quantum systems oscillate between their possible states, they don't oscillate in space-time like matter affected by a wave does.
This also makes it very hard to conceptualize what it means for these wavefunctions to interfere. So the simple picture of "wave A and wave B are pushing the water up at the same time in the same location, so the water rises higher when both waves are there" is much harder to apply to probability oscillations than a direct comparison makes it out to be.
An additional problem when comparing wavefunctions to waves in a medium is that there is no source of a wavefunction. Any system you're analyzing has a single wavefunction, that assigns probability amplitudes to every possible configuration of that system. You can decompose the system's wavefunction as a sum of multiple wavefunctions corresponding to certain measurables, but this is an arbitrary choice: any such decomposition is exactly as valid. In matter waves, if I drop two stones in water at different locations, the water surface's movements can be described as a single wave, but there is a natural decomposition into two interfering waves each caused by one of the stones. There is no similar natural decomposition that quantum mechanics would suggest for a similar quantum mechanical system.
when a water wave gets pushed through two slits, it breaks into two separate water waves, one coming from each slit, and those two waves push the water up and down at the same time at different locations.
What physically observable distinction are you drawing? The points of water on the far sides of the slits will have a certain height at each point at each time, forming the interference pattern you'd expect. You can decompose that function into a sum of two separate waves, if you want, but you don't have to. And exactly the same thing is true of the quantum mechanical wavefunction for a particle passing through a pair of slits.
A wavefunction amplitude does not represent a displacement in any kind of medium. They represent a measure of the probability that the system being described is in a particular state at a moment in time. That state need not even be a position, it might be a charge, or a spin, or a speed, or any combination of these. Basically quantum systems oscillate between their possible states, they don't oscillate in space-time like matter affected by a wave does.
I don't think that's a real distinction. Water height is a different dimension from (x,y) position and it behaves very differently; that the wave is moving across the surface and that the surface is moving up and down are orthogonal facts, the reason the former is movement isn't that the latter is movement. A classical electromagnetic wave moves even though it isn't in a medium that's moving (and so does e.g. a fir wave).
You can decompose the system's wavefunction as a sum of multiple wavefunctions corresponding to certain measurables, but this is an arbitrary choice: any such decomposition is exactly as valid. In matter waves, if I drop two stones in water at different locations, the water surface's movements can be described as a single wave, but there is a natural decomposition into two interfering waves each caused by one of the stones. There is no similar natural decomposition that quantum mechanics would suggest for a similar quantum mechanical system.
Again I don't think this is a real distinction. You have exactly that natural decomposition in the QM system - it's not the only valid decomposition, but it is a valid one and it has some properties that make it nice to work with. And similarly for dropping stones in the water, infinitely many other decompositions are possible and equally valid (e.g. decomposing as two copies of a wave where you dropped two half-sized stones into the water).
My personal interpretation is that elementary particles physically are those waves and never anything else. Those waves interact with each other with probabilistic events exchanging some energy and momentum and reshaping each other. They can get narrowed down if they exchange energy/momentum or get spread apart, for example through interactions on the edge of an object. What's completely virtual for me is the idea of pointlike particles occupying some specific location and having some specific momentum. Almost everything we know contradicts this idea, and yet we cling to it.
I think Craig Bohren wrote in one of his books, that to get anything calculated and done the waves are all you need. Particles are nice for some kind of visualization, but they don't really help getting things done. I liked that.
Instead of particles, I like to view the interactions like the formation of lightning in a thunderstorm. The energy field builds up, and at some point of contact the energy is released in a single lightning strike.
What I still wonder is, if the interaction really depletes the energy-field instantly in a single point, or if there is more going on (on different timescales - maybe with speeds not related to the speed of light).
I always feel that we are inclined to ask this question because we want to treat the wavefunction as if it were a probability distribution. While they share some properties, fundamentally they are not the same thing.
In typical probability, we deal with an ensemble of fixed states, or at least phenomena that can be simulated as such.
In quantum physics, the wavefunction is fundamental. The question "what was the exact path?" is meaningless. In particular, if we take the approach of Feynman path integrals, we find that particles take many paths - including circular paths through each slit - before arriving somewhere else where they interact (i.e., become entangled) with, say, an electron in the screen.
Sure, we may consider different experiments (e.g., quantum erasers, see https://lab.quantumflytrap.com/lab/quantum-eraser), but analogies with deterministic particles are whimsical - sometimes they work, sometimes not.
So I only have a B.S. in physics but my impression is that the weird parts of quantum mechanics are fundamentally a measurement problem. At the quantum level, we are very very limited in what we can use to measure properties of a quantum system - which is why we resort to probabilities. Wave functions are just a mathematical representation of a physical property that are (only?) ever operated on using quantum operators which result in a statistical distribution. Because they are so closely tied to probabilities I struggle with interpretations that try to say that perhaps these wave functions are something physical and based in reality (i.e. they are in superposition so particles must take on every possible state at once). An analogy I use is its like when we talk about sample sizes of a population of people, what is an ‘average person’? An average person is not something physical we can pick out, it exists in abstract. I’m curious if anyone with more experience in QM can shed light on how sound my thinking is here.
You're partially correct, but describing it in that way makes it sound like if you could "just look a little bit closer" the statistics would disappear, which doesn't happen. So it's more subtle than this. Fundamentally it's because QM doesn't use additive probabilities, but rather additive amplitudes which are complex numbers, and the probability is the square of the sum of these, so you can get interference between amplitudes. You can never get interference by adding probabilities.
In the dual slit experiment this is visible as you can't get the interference effects by summing the probabilities for "particle through slit 1" and "particle through slit 2" but rather you need to sum the amplitudes of the processes.
Working physicists have just done this for 100 years; there is no practical need to interpret it further, but it would be cool if someone could figure out some prediction/experiment mismatch that does indeed require tweaking this!
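As a concrete illustration of the amplitude-versus-probability point above, here is a small numerical sketch (my addition; the geometry and wavelength are arbitrary toy numbers, not anything from the article). Summing complex amplitudes and then squaring produces fringes; summing the per-slit probabilities never does.

```python
import numpy as np

# Toy double slit: treat the amplitude reaching screen position x from slit j
# as a spherical-wave factor exp(i*k*r_j)/r_j, with r_j the distance to slit j.
wavelength = 500e-9            # 500 nm light (illustrative)
k = 2 * np.pi / wavelength
slit_sep = 50e-6               # 50 micron slit separation (illustrative)
L = 1.0                        # 1 m from slits to screen

x = np.linspace(-0.02, 0.02, 2001)            # positions on the screen
r1 = np.sqrt(L**2 + (x - slit_sep / 2) ** 2)  # distance from slit 1
r2 = np.sqrt(L**2 + (x + slit_sep / 2) ** 2)  # distance from slit 2

psi1 = np.exp(1j * k * r1) / r1               # amplitude for "via slit 1"
psi2 = np.exp(1j * k * r2) / r2               # amplitude for "via slit 2"

fringes = np.abs(psi1 + psi2) ** 2                   # add amplitudes, then square
no_fringes = np.abs(psi1) ** 2 + np.abs(psi2) ** 2   # add probabilities

# The first curve oscillates because of the cross term 2*Re(psi1*conj(psi2));
# the second is smooth. At a bright fringe the ratio is roughly 2.
print(fringes.max() / no_fringes.max())
```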
Sounds like Einstein's hidden variables theory: below the wave function picture there's a more fundamental Newtonian reality that produces the higher-level wave function behavior, but is itself inaccessible due to insufficiently fine instruments, aka hidden variables. "God doesn't throw dice" is about that.
Wave functions are just a mathematical representation of a physical property that are (only?) ever operated on using quantum operators which result in a statistical distribution.
It is not correct— at least not unless you subscribe to the Copenhagen interpretation. Yet, while this interpretation is a simple heuristic for interaction with big systems (e.g., a photon hits a CCD array), none of the quantum physicists I know treat it seriously (for that matter, I have a PhD in quantum optics theory).
I mean, at a certain level, everything is "just a mathematical representation" - in the spirit of "all models are wrong but some are useful". But the wavefunction is more fundamental than measurement. Measurement, in turn, can be thought of as a particle entangling with a system so large that, for statistical reasons, the process becomes irreversible - because of chaos, not fundamental rules.
For further reading, I recommend the materials on decoherence by W. H. Zurek, e.g. https://arxiv.org/pdf/quant-ph/0105127. Some other references (here a shameless self ad) in https://www.spiedigitallibrary.org/journals/optical-engineer... - mostly in the introduction and, speaking about interpretations, section 3.7.
EDIT: or actually even simpler toy model of measurement, look at the Schrodinger cat in this one: https://arxiv.org/abs/2312.07840
The measurement, i.e. the Born rule, is just as fundamental as the wavefunction. The wavefunction doesn't mean anything on its own, it's not a measurable quantity that can be used to make any observable prediction whatsoever. If I claim that the wavefunction amplitude for some electron being at some location at some point in time is 1/2(1+i), how would you verify this prediction without invoking the Born rule?
You may say that nuclear fusion in stars does not mean anything; all that matters is that we see the light. At a certain level that is true, but to simulate a system we need to simulate its inner workings, not only the end effect.
The exact phase of a wavefunction does not matter - but it is an important phenomenon, giving rise to gauge invariance. The Born rule can be derived. In short, since we use unitary operators, length is preserved. For a derivation, see https://journals.aps.org/pra/abstract/10.1103/PhysRevA.71.05....
Also, to be nitpicky, we also never measure probabilities. Something (macroscopic) happened or not. It gives rise to quite fundamental and philosophical questions, including "what is (classical) probability" (I don't know an answer that fully satisfies me), many-worlds interpretations (maybe all possible things just are), and in general questions about indeterminism and free will.
I'm not a quantum physicist and so can't really comment on why, but it's clear that the paper you linked is not widely accepted, as the Born rule is still taught as a postulate of quantum mechanics, not a derived property of the wavefunction. I'd wager a guess that the paper ends up inventing some other postulate that is itself not derivable from the wavefunction, so it becomes at best a philosophical matter which postulate you actually prefer.
I also don't agree with your comparison of what I said to the nuclear reactions happening inside a star. The problem with the wavefunction without the Born rule is not that it's difficult to observe, it's that it's literally meaningless: knowing the value of the wavefunction for some state of a system doesn't tell you anything at all unless you apply the Born rule to this value.
And as for probabilities, certain kinds of probabilities at least have a very clear and simple definition (though they are rather narrow cases): if you repeat an experiment in exactly the same conditions N times, and an outcome O happens in p of those runs and doesn't happen in the other (N - p), then we define P(O), the probability of outcome O, as the value p/N. For systems where this applies, it is very much a measurable quantity (with some noise, of course, related to the fidelity with which you can reproduce the same experiment).
I do agree that this well-defined, measurable, concept of probability is rarely what we mean by "the probability of O", since (a) it's often hard or impossible to repeat (or even perform) the experiment, and (b) we often care about what will happen the next M times we repeat this experiment, and the measure P(O) I defined above does not tell us anything about future events.
The wavefunction doesn't mean anything on its own, it's not a measurable quantity
There was an experiment that measured and built a picture of electron orbitals in a water molecule.
It did that by checking whether an electron is found at a particular position relative to the nucleus lots and lots and lots of times, and building a heatmap of where the electron was actually found in each individual experiment; the heatmap of course corresponded to the wavefunction model. So the experiment found that the probability of finding the electron at a certain position exactly corresponds to the square of the amplitude of the wavefunction at that position, i.e. the Born rule.
What the experiment did NOT do is directly detect the wavefunction of the electron, because that is, again, not a physically meaningful quantity.
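A toy numerical version of that "detect many times and build a heatmap" procedure, with an arbitrary made-up 1D wavefunction standing in for the orbital (my addition, not the actual experiment):

```python
import numpy as np

# Sample detection positions from |psi|^2 and compare the histogram to |psi|^2.
# The wavefunction below (two box modes superposed) is an arbitrary toy choice.
x = np.linspace(0, 1, 1000)
psi = np.sin(np.pi * x) + 0.5 * np.sin(2 * np.pi * x)  # unnormalized amplitude
prob = np.abs(psi) ** 2
prob /= prob.sum()                                      # discrete Born-rule weights

rng = np.random.default_rng(0)
detections = rng.choice(x, size=100_000, p=prob)        # many single detections

hist, edges = np.histogram(detections, bins=50, density=True)
# The histogram converges to |psi|^2 (suitably normalized) as the number of
# detections grows; no individual run ever reveals the amplitude itself.
```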
Thank you, I will look into these resources
I’ve always wondered, has there ever been a definitive experiment where one photon hits a slit and on the other side two photons come out, but then when you add a photon observer, it immediately only comes out on one side? Or has the proof always been mathematical rather than a live experiment?
Edit: Thank you all for the responses, it has been very educational. It appears I was misunderstanding the most important aspect of the double slit experiment. A photon is a wave function when unobserved: it literally goes through both slits and creates an interference pattern, like waves in water would. However, when observed at the slit, or at the detector screen, the wave function collapses and only one photon (a billiard-like particle) will be detected.
The double slit experiment has been done and is totally reproducible even when photons/electrons are sent one at a time.
The "two photons come out" part makes no sense though. On the target side, there's always a single hit after a single photon/electron, but the distribution of these hits looks as if the electron went through both slits and interfered with itself.
P.S. the funny thing is - this works on any small thingy, measured up to 2000 atoms big, as if it's a property of the universe itself
The experiment working on clusters of atoms is news to me and I loved getting to know about that. But the thing that really breaks my mind is the experiment that proves that the behavior depends on the possibility of getting information from which slit the particle went through. So we can rule out the act of measurement itself interfering with the behavior of the particle.
They did it by splitting a beam of particles into a pair of entangled particles and then setting up a way to measure the polarity of one of them after the point in time where it even hits the final screen. If you measure the polarity then, after the other stream of particles from the beam had already had time to make the pattern, the pattern will be two clusters. If you don't, it goes back to an interference pattern.
That one really cemented the notion in my head that this is just how the Universe is and not some local weirdness with particles and measurements.
I think Sabine explained this social effect a few years ago. I know she's a little controversial, but the key thing in the video (as opposed to all other videos about DSE on the internet) was that you don't get "two clusters" actually. They are both statistical parts of a single [non-]interference pattern. "||" is a lie. I'm not in a physics rebel camp and don't prefer Sabine either, but after that I sort of lost trust in the interpretations that can't even get the resulting picture right. I even suspect that showing dumbed-down results amplifies the "wow" effect and monetizes better.
https://www.youtube.com/watch?v=RQv5CVELG3U
This is the video if you're interested. Again, I'm no physicist and don't know if explanations are legit or statistically correct. But that little || trick that all other popsci videos play on you, that's a true concern.
You are correct, the experiment I referred to does not yield two distinct clusters of collisions at all.
You can check it out here.
Summary https://www.stonybrook.edu/laser/_amarch/eraser/index.html
Paper https://www.stonybrook.edu/laser/_amarch/eraser/Walborn.pdf
My fascination with these experiments has never been due to neat clusters of impacts, although popsci depictions have clearly tainted my memory.
Hadn’t heard that; that one is wild
That version is called the quantum eraser experiment. It really is brain bending.
I would love to try this experiment with something basketball sized out in space. Like we build an enormous basketball detector behind a double slit inside an unobservable black box. If the basketball started acting like a wave I would be sooo freaked out
The largest double-slit projectile I know of is C-60, a soccer-ball shaped molecule of sixty atoms.
https://iopscience.iop.org/article/10.1088/2058-7058/12/11/4
Wiki, citation 19
The largest entities for which the double-slit experiment has been performed were molecules that each comprised 2000 atoms (whose total mass was 25,000 atomic mass units).[19]
That website's captcha is horrible!
Or amazing.
I got cats with sunglasses
Me too. And I had to click on bird legs to verify o_O
I don’t know what it’s called but I read about a proposed experiment to do it in space with macrosized glass beads.
If the slits are big enough, isn’t that just gravity?
And why can't gravity serve as an observer?!
The electron/proton entering a slit is affected by gravity!
The entanglement theory would imply that if you build a detector that turns the gravity interaction into a finite piece of data observable by an (amplified) system, then gravity will act as an observer and collapse the waveform when it reaches that point. That's my take on the whole thing... It's almost like information theory. If the information is lost to the sands of time (below noise floor if you will) then the entanglement can continue.
Are you referring to the double-slit experiment? If so, yes: It has always been an experiment. The experiment came before any theory explaining the behavior AFAIK. https://en.m.wikipedia.org/wiki/Double-slit_experiment
The double slit experiment has been replicated even with fairly hefty molecules.
https://www.nature.com/articles/s41567-019-0663-9
Here, we report interference of a molecular library of functionalized oligoporphyrins with masses beyond 25,000 Da and consisting of up to 2,000 atoms, by far the heaviest objects shown to exhibit matter-wave interference to date.
It would be awkward to say that the 2000 atom molecule comes out of both sides... but it does, until you look.
The double slit experiment is not a duplication cheat of reality... it's weirder than that.
Am I misunderstanding the significance of the double slit experiment?
I thought the takeaway wasn’t that the particle comes out both sides; the implication is that the behavior of a single particle is the same as the behavior of multiple particles - that is to say, it appears to be an interference pattern, even when there should be no other particles to interfere with the single one.
No you're understanding correctly (I think), the behaviour of a single detected particle depends on all possible paths it could take to get to the detection.
This is fundamental to 100 years of quantum mechanics and underlies most of physics including all semiconductors, materials science, chemistry, lasers, etc. The double slit experiment is just a very good illustration of the principle boiled down to its essentials, which is why it's everywhere in pop-sci. It makes for more accessible story than describing how a hydrogen atom works.
Live experiments have been done and they get really weird: https://en.wikipedia.org/wiki/Delayed-choice_quantum_eraser
I’ve always wondered, has there ever been a definitive experiment where one photon hits a slit and on the other side two photons come out, but then when you add a photon observer, it immediately only comes out on one side? Or has the proof always been mathematical rather than a live experiment?
Only one photon comes out, but it can interfere with itself if it had the possibility of going through either slit.
That nuance aside, the Quantum Eraser Experiment is a real physical experiment that covers what I think you're asking about. If you send photons through double slits in a setup where you can tell which slit the photon went through, you don't get an interference pattern. If you can't tell, you do get the interference pattern.
What does "one photon hits a slit and on the other side two photons come out" mean?
There is no photon multiplication happening on the double slit.
One photon hits the slit and one photon comes out. It is only if you repeat the experiment many times that you start to see a strange wave-like pattern in where the photons hit.
It is as if every photon that went through the slit is somehow aware of all other photons that did so too, so each photon can choose the (random) position where it hits on the wall behind the slit such that together they look as if a WAVE went through the slit.
That is (one reason) why they call it "Quantum Weirdness". God is playing dice with us
No. The same photon is aware of all alternative paths it can take, without creating or interacting with any other photon.
There's no photon multiplication, and no "all other photons" changing their path.
There is some inter-photon interaction because they are bosons. But it's not significant enough to impact the multi-slit experiment. And the experiment works exactly the same way if you send only one photon at a time.
It is as if every photon that went through the slit is somehow aware of all other photons that did so too
Why isn't it just that there's a probability density function that describes the aggregate outcomes of a large number of samples from a random process? Why is "memory" involved?
I think because instead of two clusters like you'd expect from random BBs being shot, you get multiple bands like you'd see with interfered waves. Even when shot one at a time.
I really don’t understand the topic much, but this Veritasium video is quite eye opening and goes into further depth than any layman explanation I’ve ever seen on the topic: https://youtu.be/qJZ1Ez28C-A?si=6gSQYcJPpaSIt1x1
The experiment came first, all the rest of it came after to try and figure out why the results are as weird as they are.
RE: your edit
You still do not understand what is happening; please READ the article. It shows that the wave function doesn't go through anything, and it certainly doesn't create the interference pattern.
Perhaps the closest thing would be some nuclear decays that spit out two gamma rays of equal energy in opposite directions. I'm struggling to remember which isotope does this.
It’s a cadmium isotope. Super cool technique, I think perturbed angular correlation.
https://en.m.wikipedia.org/wiki/Perturbed_angular_correlatio...
I haven’t used it for my research, but it’s an incredible local probe of electric and magnetic fields in materials. There’s no other technique that I’m aware of that smuggles information about the chemical structure of a single coordination sphere into such clean, distinct emissions. The brief excited state of the isotope after the first emission event and before the second is sensitive to practically everything. It all shows up in the deconvoluted spectra.
Shame nearly all the isotopes that work for this are not ones that are super interesting for modern quantum materials. Perhaps that will change out of necessity.
If you want to learn more about current theory and experiments I suggest you pick up “Waves in an impossible sea” by Matt Strassler
I think the problem is in insisting on referring to the photon as a particle.
In fact the photon may not actually exist, and I have questions as to what "single photon experiments" are actually measuring. Let me explain.
The EM field is not quantized, or at least not quantized at the level of a photon. What we call a photon is the interaction of the EM field with matter, or more precisely with the electron shell of matter. It is the sound of the wave breaking on the shore, not the wave.
Now none of this actually matters as the only method we have of interacting with the EM field is through matter(electrons really) so we can only measure it in photon sized increments.
Well, the EM field CAN be quantized. Just look up any textbook on quantum field theory. And the quantum of the EM field is called the photon.
But, to "solve" the wave /particle conundrum, I like to think of it as fields all the way down. A "particle" is then a localized and quantisized interaction of said field with another field.
If you think of particles as small billiard balls flying through space on some ballistic trajectory, you'll soon run into all kinds of trouble and the mental model breaks down.
The EM field is not quantized, or at least not quantized at the level of a photon, what we call a photon is the interaction of the EM field with matter, or more precisely with the electron shell of matter.
I don't agree with this. You can absolutely consider a classical (non-quantized) EM field interacting with quantized matter. This semi-classical model can describe the photoelectric effect, but it cannot describe other experimental observations such as sub-poissonian photo-detections / photon anti-bunching.
Just for sake of argument, when looking at it from this angle, EM particles could exist and we lack the ability to emit a single one? But then why would these "single photon" double slit problems not split the particle bunch further?
I honestly don't know, that is my question as well.
However note that we can only perturb the em field in photon sized energy levels, and we can only pick up disturbances of the em field in photon sized bunches as well. Not sure what this implies for how em field energy is accumulated on electrons in order for us to detect it.
I think the problem is in insisting on referring to the photon as a particle.
Or in insisting on referring to the electron as a particle.
“We begin by throwing an ultra-microscopic object — perhaps a photon, or an electron, or a neutrino”
It's always seemed to me that these types of question only exist because we're considering a choice between two imperfect models. If we had a better model of what a "particle" really is, then there would be no dueling models nor paradox.
Do we really have to choose between wave and particle? What does the "particle" model bring to the table that a localized (wavelength-sized) wave/vibration could not?
Being pedantic about the language, there is only one model, and effectively every physicist agrees on it.
What they differ about is the interpretation of that model. The equations are the same, but differ in what the variables refer to in the real world. It's really a matter of solving the equation for X vs Y, saying which one is independent and which is dependent.
The problem is the fact that none of the variables correspond directly to anything we have any experience with. The best we can hope for is to isolate part of it and say "this much is like this thing we understand, but there's an additional thing that we'll treat as a correction".
We can try to take the whole thing seriously, and just call it "a quantum thingy" which is not like anything else. This is sometimes called "shut up and calculate", but even that makes assumptions about what things are feasible to calculate and which are hard. That skews your understanding even if you're trying to let it speak for itself.
there is only one model
There is one set of observations, and many many models to describe them: Schrödinger equation formulation, matrix mechanics (Heisenberg, Born, and Jordan), path integral formulation (Feynman), phase space formulation, density matrix formulation, QFT or second quantization, variational formulation, pilot wave theory aka de Broglie-Bohm theory, Hamilton-Jacobi formulation, PT-symmetric quantum mechanics, Dirac equation formulation (well, not really independent, just for spin 1/2 particles).
They all give the same results, and are therefore mathematically equivalent, but different models tend to be associated with different interpretations:
Schrödinger Equation : Copenhagen, Bohmian Mechanics, Many-Worlds
Matrix Mechanics : Copenhagen
Path Integral : Many-Worlds, Stochastic
Density Matrix: Ensemble, Decoherence-based
Second Quantization : Many-Worlds
Pilot Wave Theory : Bohmian Mechanics
Consistent Histories : Decoherence-based
Relational QM : Relational Interpretation
Stochastic Models : Stochastic Interpretations, GRW (Ghirardi–Rimini–Weber) Collapse
We have a better model, but it's an equation so horrible that nobody wants to solve it.
Luckily, sometimes the exact solution can be very accurately approximated with a wave equation.
Luckily, sometimes the exact solution can be very accurately approximated with a particle equation.
(Sometimes, the exact solution can be approximated by saying that the lowest energy state is an eigenvector of the Schrödinger equation. Is that a wave? It's not localized, but not very wavy.)
But neither is the exact solution; both are just approximations that together cover 99% of the experiments.
It's difficult to explain, because to explain the details you need like two years of algebra and calculus and then another two years of physics, and by then you have a degree in physics.
It's possible to solve the difficult equation only in very simple cases like electron-electron collisions, if you allow some cheating and a tiny error. For more complicated systems like electron-muon there are some problems. And for more complicated systems still, you get more technical problems and more approximations.
What is the name of the better model which you are talking about?
I'm also not sure, but I'm thinking some variation of Quantum Field Theory.
Yes, in general https://en.wikipedia.org/wiki/Quantum_field_theory but also the list of details in https://en.wikipedia.org/wiki/Standard_Model
The photoelectric effect [0] can be explained if light behaves as discrete particles, but not when it's a wave since a higher amplitude does not imply a higher energy transfer.
You can explain the photoelectric effect with classical light (i.e. as EM waves) as long as you properly quantize the atomic energy levels. This is often called a semi-classical model.
However, photo-detections with sub-poissonian statistics cannot be explained under this semi-classical model, but it can be explained with properly quantized EM field (i.e. with photons).
For reference, see Mandel and Wolf's Quantum Optics textbook.
OK - a bit like the fairground game of trying to knock coconuts off a stand by throwing a wooden ball at them. It doesn't matter how many balls you are throwing per minute (total energy being delivered) if the energy of each ball doesn't cross the threshold to knock the coconut off.
OTOH, the energy of a photon is such an abstract concept (not like the kinetic energy of a ball) that I'm not sure it really helps explain it.
Well, there is a saying about spin of an electron. Imagine that you have a ball and it's spinning. Except, it's not a ball. And it's not really spinning.
Particles aren't necessary for it, any quantization is sufficient.
Sure, I'm using the word "particle" loosely.
If I emit a bass signal at a low amplitude, but then emit it at a higher amplitude, I can see the effect on a glass of water on the table. What’s happening here if amplitude does not carry power?
My understanding is that theoretically energy transfer is a function of wavelength.
The point here is that the total displacement of water caused by a sound wave depends on both the amplitude of the wave, and its frequency, with no limit: if the wave has high enough amplitude, it will displace water even if the wave is very low frequency.
However, this is not true for EM interactions. If you shine infrared light on a solar panel, you'll see 0 current from it, even with an extremely powerful source of light (at some point the material might heat up enough that it starts showing some thermo-electric effect, but that's a different thing). However, if you take even a very low intensity ultraviolet source, you'll see a measurable current right away. This is the unexpected behavior that quantized interactions have, which can't be reproduced with non-quantized waves like sound waves.
Sorry, my last sentence wasn't formulated well. Yes, a wave with higher amplitude (or one could say "intensity") has a higher energy. The photoelectric effect happens when you shine light with "enough" energy on some material such that the atoms of the material are ionized, i.e., electrons are freed. You need a minimal energy for this, and if you use dim light with a low frequency, you will not see the effect. Now, if you increase the frequency of the light, you can measure electrons. If, instead, you make the light brighter, that is, increase the amplitude of the wave (if it were a wave), you don't see electrons. So at least in this experiment, light does not function as a wave.
But once you've increased the light frequency (i.e. photon energy) above the required threshold, THEN making the light brighter (more photons) will increase the number of electrons emitted.
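A tiny numerical illustration of that threshold behaviour (my addition; the 2 eV work function is a made-up example value):

```python
H = 6.626e-34          # Planck constant, J*s
EV = 1.602e-19         # joules per electronvolt

def photoelectron_energy_ev(frequency_hz: float, work_function_ev: float = 2.0) -> float:
    """Kinetic energy of an emitted electron in eV, or 0.0 if below threshold."""
    photon_energy_ev = H * frequency_hz / EV
    return max(0.0, photon_energy_ev - work_function_ev)

# Infrared (~1e14 Hz): photon energy ~0.4 eV, below the 2 eV work function,
# so no photoelectrons regardless of how bright the beam is.
print(photoelectron_energy_ev(1e14))   # 0.0
# Ultraviolet (~1e15 Hz): ~4.1 eV photons free electrons right away;
# making the beam brighter then just increases how many come out.
print(photoelectron_energy_ev(1e15))   # ~2.1
```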
We do have a better model, but the mathematical model doesn’t analogize well to the real-life concepts we’re used to.
In video games that have procedural generation, there's often a seed function that predicts a continuous geometry.
But in order to track state changes from free agents, when you get close to that geometry the engine converts it to discrete units.
This duality of continuous foundation becoming discrete units around the point of observation/interaction is not the result of dueling models, but a unified system.
I sometimes wonder if we'd struggle with interpreting QM the same way if there wasn't a paradigm blindness with the interpretations all predating the advances in models in information systems.
Precisely. The question "is it a particle or a wave" is wrong. It's neither. It's a particle-wave. Something that behaves like a classical wave or particle depending on the situation, but it doesn't switch between them or anything like that. It's not a "particle that has interference" or a "wave with a location".
Classic labelling issue.
What does the "particle" model bring to the table that a localized (wavelength-sized) wave/vibration could not?
A lot of the article is about this. Start with the section "The Wave Function of Two Particles and a Single Door". The wave packet view can't explain why you don't for example see a "particle" (that is, a dot on a detector) show up simultaneously having gone through two different doors. You have to think about it in terms of a wave in the space of possible joint particle positions.
I also don't understand this. AFAIK "particle" in this context means quantized unit rather than contiguous solid object. And I see no reason why a quantized unit of a wave can't propagate through two slits simultaneously. But my level of understanding here is YouTube level so if you know more please correct me.
I trust the physics works out.
The problem in these discussions is how to build an intuition about the underlying physical model.
I fail to have an intuition of how a quantized unit of wave can propagate through both slits.
I know that the equations say that the probability of finding the particle at a given location is given by the amplitude squared of the wave function (Born rule).
The image that a "quantized unit of wave propagates through two slits simultaneously" doesn't help me build any further intuition.
Do the two parts going through the two different paths carry half the unit? Clearly that's not the case otherwise they wouldn't be quanta anymore. So does it mean that the entire wavefront is "one unit" no matter how spread out? But in that case, "one unit" of what?
If one just sends photons through a narrow single slit, then the pattern that builds up on the screen (if you send multiple photons, and record their positions) will be a banded diffraction pattern.
If you have two slits, with a detector to determine which slit the photon went thru, then it'll behave as if it only went thru one of the two slits, at random, and what'll build up on the screen will be the two (slit A + slit B) overlayed diffraction patterns.
Finally, if you have two slits with NO detector, then what will build up on the screen is the interference pattern as if the photon had gone thru both slits simultaneously and the two resulting banded diffraction patterns interfered with each other. So, what SEEMS to be happening in this case is that the quantum state of the system post-slit is that of the photon simultaneously having gone thru both slits, each slit having diverted it per diffraction, and then these diffraction patterns (probabilities) interfering. Wave collapse can only be happening after this interference (if it was before then there would only be one diffraction pattern and no interference), presumably when the quantum state interacts with the screen.
So, yeah, it seems that the "photon" does "go" through both slits, but this is a quantum representation, not a classical one.
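The three cases above can be sketched numerically with a standard far-field (Fraunhofer) toy model (my addition; slit sizes and wavelength are made-up numbers). The which-path case is an incoherent sum of two single-slit patterns; the no-which-path case puts interference fringes under the same envelope:

```python
import numpy as np

wavelength = 500e-9
a = 10e-6                                  # slit width (illustrative)
d = 50e-6                                  # slit separation (illustrative)
theta = np.linspace(-0.05, 0.05, 4001)     # angle on the screen

k = 2 * np.pi / wavelength
beta = k * a * np.sin(theta) / 2
alpha = k * d * np.sin(theta) / 2

single_slit = np.sinc(beta / np.pi) ** 2              # (sin(beta)/beta)^2 envelope
which_path = 2 * single_slit                          # incoherent: pattern A + pattern B
no_which_path = 4 * single_slit * np.cos(alpha) ** 2  # coherent: fringes under the envelope
```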
If you consider the photon (particle), a single photon does not go through both slits at once. It takes the path of least action. It's only when the experiment is repeated that the wave-like behavior emerges. The wave function is simply a probability approximation of the least action required for any set of start and end position.
So the question in the title doesn't make much sense.
Particles don't exist. We just perceive waves with high decoherence rates as particles. Things we call objects effectively have a 100% decoherence rate. Things we call waves like light have low decoherence rates.
But underneath it is all quantum mechanics.
What's a decoherence rate?
Decoherence is the process that makes it impractically difficult to design an experiment in which your observations act as the two interfering possibilities in some kind of double-slit experiment.
Interpreting this in the many-particle case is more difficult, but the basic idea is that due to single-particle uncertainty, you can't have a definite number of particles indexed by momentum and a definite number of particles indexed by position at the same time. If I had 100 particles that were definitely at x=0, in terms of momentum they'd be spread out over the range of possibilities unpredictably.
There is a difference between them actually having these momentum, and your knowledge about these attributes.
The Heisenberg uncertainty principle is not about particles. It’s about statistics and our knowledge about something.
Not exactly. The Heisenberg uncertainty principle doesn't apply to knowledge (actual observations), it applies to observables (things that could affect interactions in principle). That is, Heisenberg uncertainty is not merely a limit on how fine our measurement instruments could get, or even how much information about an interaction we could conceive of and store. It's a limit on how strongly those properties can affect an interaction at all.
That is, the future direction and momentum of an interaction between two particles can't depend very strongly on both the position where the interaction happened, and on the momentum the particles had before the interaction. If the interaction is a direct collision, so the position is heavily constrained, then the momentum the particles had before the collision will not really matter a lot for what happens after they collide.
If you were to "put yourself in the shoes of" one of the particles, you could say that, because it "knows" where the other particle is at the time of the collision with high precision, it can't "know" the momentum the other particle had with any precision, so it's future movement can't depend strongly on that. But this stretches the definition of "knowledge" far beyond the normal understanding of the word.
Yes, I apparently used the wrong word for this. Not a physicist.
My point is that it’s not something special about quantum mechanics or particles or even positions and momentum.
It’s inherent in Fourier transform, conjugate variables and covariance matrices.
It happens outside QM, and even outside physics. It’s not a physical attribute, it’s statistical.
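A small numerical illustration of that Fourier-transform point (my addition; pure signal processing, nothing quantum in it): squeeze a Gaussian in x and its transform spreads in k, with the product of the two spreads bounded below.

```python
import numpy as np

def spreads(sigma_x: float, n: int = 4096, length: float = 200.0):
    """Return (std in x, std in k) for a Gaussian of width sigma_x."""
    x = np.linspace(-length / 2, length / 2, n, endpoint=False)
    f = np.exp(-x**2 / (2 * sigma_x**2))
    F = np.fft.fftshift(np.fft.fft(f))
    kk = np.fft.fftshift(np.fft.fftfreq(n, d=x[1] - x[0])) * 2 * np.pi

    px = np.abs(f) ** 2 / np.sum(np.abs(f) ** 2)   # |f|^2 as weights in x
    pk = np.abs(F) ** 2 / np.sum(np.abs(F) ** 2)   # |F|^2 as weights in k
    dx = np.sqrt(np.sum(px * x**2) - np.sum(px * x) ** 2)
    dk = np.sqrt(np.sum(pk * kk**2) - np.sum(pk * kk) ** 2)
    return dx, dk

for s in (0.5, 1.0, 2.0):
    dx, dk = spreads(s)
    print(f"sigma_x={s}: dx*dk = {dx * dk:.3f}")   # about 0.5 for every Gaussian
```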
How frequently a wave would go through just 1 of the slits. If you threw a baseball at a wall with two baseball sized slits it would basically always go through just one of the slits. You would never see an interference pattern.
This is because a baseball is interacting with other matter on the way to the slit. A photon on the other hand might not interact with any matter and it stays as a wave and you can see an interference pattern on the other side.
https://m.youtube.com/watch?v=qJZ1Ez28C-A I had learned about the double slit experiment in school, but in my mind it was something of a theoretical construct. This Veritasium video demonstrates that quantum waves are very real and tangible. This is how physics should be.
I'm not sure very many people will actually be helped by reading the linked discussion, which appears both too technical to be clear for newcomers to Quantum mechanics while also not providing any interesting detail for the more experienced reader.
This seems to be entire argument:
But the wave function is a wave in the space of possibilities, and not in physical space.
Which is fair enough as an initial claim, but it doesn't really get motivated further, or at least not before I got bored reading and started skimming.
For a single particle they are easy to confuse. A wave function ψ(t,x) for a single particle gives a probability amplitude to find the particle at coordinate x at time t. In this case one can imagine an amplitude at each point in space and time, like a field. This interpretation however completely breaks down once you introduce a second particle: the wave function ψ(t,x1,x2) gives a probability amplitude to find particle 1 at x1 and particle 2 at x2 at time t. This no longer admits an interpretation of assigning some value to locations in space. Intuitively one might think you get one amplitude for each particle at some location but that's not how QM works, so we shouldn't think of the wave function as living in physical space.
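A minimal numerical sketch of that point, using arbitrary toy Gaussian states of my own choosing (not anything from the article): a two-particle wavefunction is one complex array indexed by (x1, x2), and in general it does not factor into two single-particle functions.

```python
import numpy as np

x = np.linspace(-5, 5, 200)

def normalized_gaussian(center: float) -> np.ndarray:
    g = np.exp(-(x - center) ** 2)
    return g / np.sqrt(np.sum(np.abs(g) ** 2))

phi_a, phi_b = normalized_gaussian(-2.0), normalized_gaussian(+2.0)

# Product state psi(x1, x2) = phi_a(x1) * phi_b(x2): this one does reduce to
# "an amplitude for particle 1" times "an amplitude for particle 2".
psi_product = np.outer(phi_a, phi_b)

# Non-product (entangled) superposition of (a,b) and (b,a): no pair of
# single-particle functions f(x1) * g(x2) reproduces it.
psi_entangled = (np.outer(phi_a, phi_b) + np.outer(phi_b, phi_a)) / np.sqrt(2)

# Quick check via the number of significant singular values (Schmidt rank):
print(np.sum(np.linalg.svd(psi_product, compute_uv=False) > 1e-10))    # 1
print(np.sum(np.linalg.svd(psi_entangled, compute_uv=False) > 1e-10))  # 2
```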
But if you aren't trying to map the wave function to physical space somehow you are essentially saying that the central construct of your theory has no direct relation to the actual physical processes happening "underneath".
This reduces to a kind of "shut up and calculate" attitude, so it seems poor starting point from which to write an interpretation text.
Space is a part of the wavefunction, as the article explains clearly. The wave function describes where the particles can be in physical space. And, the wave function has the same shape as the wave equations for traditional mechanical waves, like a sound wave or a sea wave.
However, while a classical three-dimensional wave equation describes how matter oscillates in three-dimensional physical space, a quantum wavefunction doesn't do that. Quantum particles don't oscillate in physical space like that. A three-dimensional wavefunction might describe three particles' positions along a one-dimensional line, and its oscillations are oscillations of probability, not position. The particles don't move, say, up and down. Their probability to be here or there on that 1-d line waxes and wanes.
This is what the article is trying to explain: the basic mathematics of quantum mechanics, the definition of the wavefunction. The value of a wavefunction for the position of three particles is not a position in space at a moment in time. It is a (complex) probability for the position of every particle at that moment.
This only seems confusing when looking at wavefunctions that describe positions. But wavefunctions often have many more observables, such as spin or polarization. A wavefunction for two electrons moving around on a plane will not be a two-dimensional wave. It will be a wave in a six-dimensional space, whose axes may be "particle 1 has spin up/down, particle 2 has spin up/down, particle 1 position along x axis, particle 2 position along x axis, particle 1 position along y axis, particle 2 position along y axis".
I'm honestly confused; it's fine to say the wave function lives in some high dimensional phase space and that it's not actually describing some vibration of spacetime. But I don't recall ever imagining the wave function being a vibration of spacetime, is that really something people think?
If I were to express some sort of wave-function-in-spacetime theory, I'd invoke lots of classical fields filling space and have those wiggle.
In any case, the whole bit about the proper two-particle wave function living in a higher dimensional space is somewhat spoiled by the fact that you can factorise it into normal 3-space pieces (so long as you don't have your particles interacting), it doesn't seem such an alien space to me.
I morally agree, but not quite: think of the wave function as not more than a bookkeeping device. It does get the job done, but be careful not to ascribe it too high an ontological status! The path integral formulation seems a lot more natural to me and it does not need a wave function; instead you can derive it and treat it as a bookkeeping device. The way I think about it is that it's an attempt to deterministically model non-deterministic behavior: you "pretend" that the system is deterministic by keeping track of all the possible ways it could have evolved in time. Sure enough, once you make a measurement this probability distribution "collapses" and you find out what is actually the case.
I think you are agreeing with my point that declaring the wave function to be mere bookkeeping is a poor foundation for writing about the interpretation of quantum mechanics?
Can't really get any other sense out of your reply, but I'm not entirely sure.
Also not sure I'm understanding you right :) My view is this: the wave function is mere bookkeeping and not anything ontologically fundamental. However the fact that such a seemingly bizarre concept lets you do quantum physics (even if it's not the only way) points to some fundamental questions about the nature of...well, nature.
Of course this is not the only valid view...just one that makes sense to me. Thinking about these sorts of questions is a very fun endeavour.
My view is that OP's text about whether the wavefunction goes through both slits is overly long if the premise is that the wavefunction is only for bookkeeping.
It's not just pretending; the deterministic model works, unlike a nondeterministic one.
That's true, but it's also true of the classical probability distribution p(t,x1,x2).
Yes, which is exactly the point. The main difference is that the wave function assigns complex amplitudes whose squared magnitudes sum to 1, while a probability distribution assigns non-negative real values that sum to 1.
I had the same reaction. If you make it to the end he concludes with:
The wave function’s pattern can travel across regions of possibility space that are associated with the slits.
Which to me conflicts with his emphatic “no” at the beginning of the article because this implies you can define some mapping between the physical and probability space. And of course you can because if you couldn’t the theory would not be physically predictive.
His point from the beginning is this: the particle described by the wavefunction can't be said to move through both slits at once, because ψ(t, x, y) has a single value for a particular x and y at a particular time. The particle has non-0 probability for both x, y1, t and for x, y2, t, of course - but that just means the particle has non-0 probability to pass through either slit.
And as for saying that the wave moves through both slits, that also doesn't make sense, by the very definition of the wave function - it's a wave in probability space, not in space, so it just doesn't move through space.
And as for saying that the wave moves through both slits, that also doesn't make sense, by the very definition of the wave function - it's a wave in probability space, not in space, so it just doesn't move through space.
I don't think that's a valid argument. Imagine a regular water wave, i.e. a wavefunction h = h(x, y, t) describing the height of the water at position (x, y) at time t. You could say "this is a wave in height space, not in space, so it just doesn't move through space" and in a certain sense that's true. But obviously there is something that does "move" through "space" to the extent that anything can ever be said to do so.
I’m with you on point 1, (I think this is also obvious from experiment because you will never measure a particle at both slits).
for point 2 it seems you can define a mapping from the physical space to probability space. Saying that the wave doesn’t “move through” space might be technically correct but also seems like semantics on the definition of the phrase “move through” ?
Considering a particle is an excitation of a quantum field, the space of possibilities could be seen as the only space there is. At least that’s what I think (but don’t know for sure) that the mathematical universe hypothesis people posit.
Here is my take on explaining this, written seven years ago:
https://blog.rongarret.info/2018/05/a-quantum-mechanics-puzz...
If you want to play with quantum wave packets, I built a Web-based simulation tool for non-relativistic quantum mechanics in 2016: https://quantum-simulation.de
The split-operator method for the numerical solution of the time-dependent Schrödinger equation is used to simulate the propagation of a Gaussian wave packet in arbitrarily adjustable potentials.
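For readers who want the flavour of that method without visiting the site, here is a minimal split-operator sketch of my own (units with hbar = m = 1; the grid, time step, and free-particle potential are arbitrary choices, not the site's actual parameters):

```python
import numpy as np

n, length = 1024, 200.0
x = np.linspace(-length / 2, length / 2, n, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
dt, steps = 0.01, 2000

V = np.zeros_like(x)                     # free particle; add a barrier to see scattering
psi = np.exp(-(x + 30) ** 2 / 4) * np.exp(1j * 2.0 * x)   # Gaussian packet moving right
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

half_potential = np.exp(-0.5j * V * dt)  # exp(-i V dt / 2) applied in position space
kinetic = np.exp(-0.5j * k ** 2 * dt)    # exp(-i k^2 dt / 2) applied in momentum space

for _ in range(steps):
    psi = half_potential * psi
    psi = np.fft.ifft(kinetic * np.fft.fft(psi))
    psi = half_potential * psi

# |psi|^2 is the position probability density; the packet has drifted right
# (mean velocity = initial wavenumber 2.0) and spread out along the way.
print(np.sum(x * np.abs(psi) ** 2) * dx)   # roughly -30 + 2.0 * (steps * dt) = 10
```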
Somewhat related recent Veritasium video - https://www.youtube.com/watch?v=qJZ1Ez28C-A
Strassler is usually more technical than your run-of-the-mill popsci writer. But his book "Waves in an impossible sea" is much more accessible.
I'm a bit confused by the argument posed here:
Figure 4: The wrong wave function! Even though it appears as though this wave function shows two particles, one trailing the other, similar to Fig. 3, it instead shows a single particle with definite speed but a superposition of two different locations (i.e. here OR there.)
I understand that if treat the act of adding two particles' wave functions as creating a new wave function for one particle, then we have this problem, essentially by definition. But it got me thinking - would it not make sense to treat the result as an expected value, such that we could then measure how many particles are likely to be to the right of the door at each point in time?
It isn't by definition, presuming the relationship between quantum mechanics and reality. You can have a _two particle_ state and a _one particle state_ with non-trivial probability of being in two places. They are distinct things. The key idea here (and really, in Quantum Mechanics generally) is that superpositions are important things in the theory. This is the statement that if you have a wave function for one situation and another wave function for another, then the sum of the two is also, necessarily, a valid wave function for a physically realizable system.
This is different from a classical probability. Suppose we simply don't know whether the baseball was fired from HERE or from THERE. In a classical situation, we can carry forward our understanding of the situation in time by simply calculating what the classical particles would do independently. In quantum mechanics the mechanics are of the wave function itself, not of the things we measure. We cannot get the right answer by imagining first that we measure the particle in one location and calculate forward and then by imagining we measure the particle in another and calculating forward and then adding the results. It isn't how the theory works. We must time evolve the wave function to predict the statistical behavior of measurement in the future.
Not a physicist - how is it possible to only send 1 photon at a time? How is this ensured?
Literally dim the light source until the probability of two showing up within the time scale of the measurement is low enough. This is not impractical. For instance there are many kinds of detectors that can be set up to discriminate single photons or particles.
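A rough back-of-the-envelope version of that "just dim it" argument (my addition), assuming the standard approximation that photon counts from an attenuated source in a fixed time window are Poisson distributed:

```python
import math

def p_two_or_more(mean_photons_per_window: float) -> float:
    """Probability of 2+ photons in a window whose Poisson mean is mu."""
    mu = mean_photons_per_window
    return 1.0 - math.exp(-mu) * (1.0 + mu)

for mu in (1.0, 0.1, 0.01):
    print(mu, p_two_or_more(mu))
# At mu = 0.01 the chance of a double is about 5e-5 per window, so essentially
# every click corresponds to a single photon. (Dedicated single-photon sources
# do better still.)
```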
I have a dumb question since school days: how do you build two "slits" as small as particle size? I assume there are tons of interacting forces when you zoom in to that scale. The "slits" are made of particles themselves, after all.
Is it really similar to the "slits" we see in daily life or something different going on here?
The slits are not made of particles, they are slits, so the absence of particles.
You could make a double slit experiment by shining a laser on a single strand of hair, this would create two wide slits on either side.
Or draw black sharpie on some glass, and scratching two openings on it with a needle.
Since effects are clearer when it matches wavelength, you can also buy pre-made ones if you don’t feel like making it yourself
The slits do not have to be particle sized. You just cut a couple on a sheet of metal.
This again.
The take is that possibility space does not have the same constraints as physical space. I cannot recall that ever being a real problem, but at least it's an opener for a discussion.
The question was and still is what influences individual particles to form an interference pattern over time. One interpretation is that something changes, or interferes with, the probability wave function of the particle's 'virtual' trajectory. The empirical evidence for this statistical anomaly is the pattern on the screen. Well of course, the double slit causes the interference pattern, but it's still not clear why. Best working guesstimate so far is that it behaves like a bullet in physical space and like a wave in possibility space at the same time. This guesstimate was enough to become the foundation of modern quantum physics, and so much more. But is it the whole story? Do we want to know?
I think the double slit experiment is just flawed, and it's a miracle that someone was able to derive a working model from its results.
The experimental result becomes much clearer once you add more slits, until you have infinite slits. The possibility of a light wave travels on all possible roads, but most cancel out by likelihood. The action goes all the way.
Is there a way to filter out all the people who watched the Veritasium video last week but didn’t read this article?
Because I would like to understand infinite slits in terms of the wave probability space described in the article, although the article author says that’s coming next week or so.
Made me think of this exchange with Heinz von Foerster about theory, particles and reality: https://www.youtube.com/watch?v=ev7e9sfWIJo
Isn't it one of the 'does it matter if you didn't interact with it?' questions? And keep in mind that 'observation' at quantum scales is to a good approximation synonymous with 'interaction'.
Approximation?
One can only measure by interacting, there is no other way.
There were a few lines on this, but I wish it were clearer that everything it said is also true classically about particles whose state we are uncertain about.
IMO so much writing about quantum mechanics gets harder to follow by trying to jump classic -> quantum and certain -> probabilistic at the same time. If one makes the certain -> probabilistic switch first, it cuts out the noise of the easier-to-understand part before tackling the quantum one.
I'm enjoying this tutorial so far. Every sentence was carefully considered which I think is important for my level of understanding of quantum mechanics. I'm reading very slowly and carefully. It was really helpful to define the wave function as not existing in the physical world such as a water wave but exists as description in probability space.
Simpleton's view:
Particles are just standing waves, so to speak. They are not just an amorphous clay-like lump of matter. They are made of smaller things and those things are churning around. That in-place churning becomes a wave when the particles move at speeds that approach a significant fraction of c.
"Nobody really understands quantum mechanics." — Richard Feynman
One thing I've thought about is whether observations in the present can influence past events. I'm thinking it must be so, though probably only on a microscopic level.
The chain isn't this:
Choice of how to measure -> History
it is,
Choice of how to measure + physical system -> Observations -> Interpretation of observations -> History
The choice of what and how to measure will influence the history you conclude, but that is true of actual "Caesar and Napoleon" history too, and in that case it's definitely not that past events are being changed, instead it is your knowledge of them. A really interesting principle is that any philosophical question that can be phrased without referring to ideas that only exist in quantum mechanics can usually be answered without referring to them.
Ugh, that's bad, the post is very handwavy.
Vibration of a charged particle creates EM waves. The particle goes through one slit. The wave goes through two slits.
One of the best ways of thinking about it is that it's "a quantum propagating with these properties".
Trying to collapse a quantum to either a particle or a wave loses some of the behaviour of the thing you're talking about, and is where some of the confusion comes from: trying to take one viewpoint to its wrong conclusion.
Has anyone tried this with non-magnetic particles? Something that's not a photon, not an electron and not a C60 electron-wrapped molecule?
It's been done with neutrons if that's what you're asking.
Yeah, that's right.
There are no particles, only waves. I do not know how long it will take people to accept this because I think it affects their very psyche, realizing that there is no mass outside of our observations.
The wave went through the slits, not the "wave function". There is no "quantum" because there is nothing to measure so there is no quantum physics.
The fact that we are quantifying things is the problem. When we look at everything as a whole, which is affected by waves, we will find the solution.
Probably it's because I'm not a quantum physicist, but the argument boiling down to "the wavefunction is an object of a probability space not of physical space" seems to make the whole article moot. Can the "wavefunction" be anything else than a _representation_ of the particle(/wave)?... but then who could ever think that a representation would actually travel in space?
But waves are just group dynamics of particles, so there is no waves, just particles.
But waves are just group dynamics of particles
No. Group movement of particles is one medium in which waves can occur, but the concept is more fundamental and general. The waves described in the article are not in particles.
Do particles change back into waves? No.
Yes, you can reintroduce another beam splitter at the end and "lose" the path information, and provided you aren't measuring anywhere along the path, you get wave interference at the end even if you split the beam along two paths in the beginning. Look up the quantum eraser experiment.
Quantum physics says: yes.
We are assuming the “particle” is just a particle and somehow not attached to a larger wave of other unknown (smaller) things.
We are looking at a bird's body and calling it a particle, but it has wings we don't see which affect the direction the particle flies.
This is a classic "midbrow dismissal" where you assume experts haven't thought of the first thing you thought of.
Yes.
Rather than viewing wave functions as abstract mathematical objects in possibility space, we might understand them as describing the probabilistic nature of fundamental spinning energy entities whose rotational states generate wave-like behavior in measurement outcomes. Strassler's "possibilities" could be reinterpreted as different rotational configurations of spinning energy, with interference patterns emerging not from physical objects passing through slits but from how these spinning states evolve when constrained by sequential measurements.
Two words, Bohmian Mechanics
Three words, pilot wave theory
To quote Cockshott, the Copenhagen Interpretation is an idealist recapitulation of Russian Machism/Bishop Berkeley. The statement "nothing /is/ until it is observed" is not necessarily a Weird Quantum formulation but just a solipsistic attitude applicable to all scientific observation in general.
If one tries to formulate QFT with Bohmian mechanics, the results are less than satisfying. Regular quantum mechanics in a Bohmian mode, in addition to failing to be Lorentz invariant, is also pretty paltry if pressed to really serve, primarily in that (for both theories) one appears to have quite a lot of freedom in what precisely lives with the particle and what lives with the pilot wave function.
In another sense Bohmian mechanics just kicks the can down the road - we may decide to associate the specific thing we observe with a particle situated on the pilot wave, but in fact, as far as the theory goes, the particle can live at any point in the pilot wave it wishes and nothing about the dynamics of the pilot wave changes at all. Thus we simply place the non-determinism in the past rather than in the present.
Furthermore, Bohmian mechanics seems to break Newton's third law, since the particle, as hinted above, is influenced by the pilot wave but not vice versa. The appeal of Bohmian mechanics is obvious, but superficial. It does not dispense with the can of worms, just opens it from the other side, in my opinion.
Bohmian mechanics is based on the idea that we perceive stuff to be in a certain position in a single reality because there is a correspondence to stuff being actually there. That's nice. If the particles are surfing a wave and not impacting it, so be it.
It is also rather nice to think of the particles as just being points in space with nothing else associated with them; an electron is just an electron because the portion of the wave function that is relevant and guiding it is the electron portion; see a paper from 2004 entitled "Are all particles identical?" [1] (I am a coauthor on that). If one thinks about it, we only know about particles through their motion so having things like mass and charge linked to the object guiding the particle seems perfectly reasonable. Points are not only not labelled by numbers (particle 1, 2, etc) but also not labelled by mass and charge.
The nondeterminism of not knowing the initial conditions is fine; the point was to have a theory with well-defined objects that give some plausible story and connection to our experiences, such as stuff existing and being somewhere. The fact that non-relativistic Bohmian mechanics happens to be deterministic is just happenstance for many of its supporters. In some QFT versions, the dynamics of creation is not deterministic and there is no reason for that to be a problem. But it is well-specified without having to invoke some special magic action called "observation".
As for QFT, the biggest problem for Bohmian mechanics is the need to have an actually well-defined evolution of the wave function. The idea of particles being created and annihilated is not particularly hard. And, in fact, recent work has shown that if one takes that seriously and respects probability leaking from the n-particle sector to the (n+1)- and (n-1)-particle sectors, then at least some of the divergence problems go away. See [2].
[1] https://arxiv.org/abs/quant-ph/0405039
[2] https://arxiv.org/abs/1809.10235
Bohmian mechanics is based on the idea that we perceive stuff to be in a certain position in a single reality because there is a correspondence to stuff being actually there. That's nice. If the particles are surfing a wave and not impacting it, so be it.
At that point it's very obviously a violation of Occam's razor though. It's like positing that the content of my field of vision is an objectively real thing, that the reason the universe looks like a video projection is that there really is a video projection going on, even though that video projection has no physical effect.
If one thinks about it, we only know about particles through their motion so having things like mass and charge linked to the object guiding the particle seems perfectly reasonable.
Indeed. But if one thinks a little more, what's the point of positing a particle at all, if all of the physics is in the pilot wave?
The evolution of the definite positions of your brain's particles is correlated with the positions of everything else. The other particles do have an effect on your evolution, and there is a "you" set of particles one can talk about. Remember the wave function is a function on configuration space, so evaluating its guiding effect on the particles requires knowing what point in configuration space they are at; this is actually the troubling bit and leads to the nonlocality concerns, but that problem is common to any quantum theory in which definite results happen.
The physics, therefore, is not all in the pilot wave. If you take as the point of a particle theory that there should be particles with positions changing in time, then that is what is being given in Bohmian mechanics.
Also, ask yourself, if the wave function is on configuration space, what constitutes a configuration? In Bohmian mechanics, it is clear, but if the wave function is all there is, then why are we talking about configuration space at all? It is just this abstract vector in Hilbert space evolving and many different representations can happen. Why do we not perceive reality in terms of these other representations?
If it helps, you can think of the wave function a bit like a dynamical law. In [1], the authors suggest thinking of log(psi) analogously to the Hamiltonian H on phase space in classical mechanics. There is no back action on H, and most of it is irrelevant to the evolution of a particular particle system in that framework, and yet everyone recognizes it as just a convenient way of describing the dynamics.
The difference is that psi evolves, but even that may only be true from a subsystem point of view. It is theoretically possible to have a static (time-independent) universal wave function which, when particular particle positions of the environment are plugged in, nonetheless gives evolving subsystem wave functions.
Occam's razor is difficult to apply here without a prejudice. If you want to minimize the number of equations, then sure, "the wave function is everything" works, but it comes at the cost of what could be considered an infinite number of "you"s and everything else, all slightly different, along with whole other expressions of the universe with no connection to us. If you want collapse somewhere, then you have to posit that mechanism.
On the other hand, by adding in particles and the guiding equation, one gets a singular "you" and everything that we experience is, more or less, definite and singular. So the "existing" stuff is dramatically reduced.
Which of these is truly simpler is a matter of taste, I would say. I think in terms of communicating with people, the Bohmian version of "there is this universal wave and the positions of stuff are guided by it" is pretty simple. The law itself is so trivially a part of the Schrodinger equation that it could easily have been derived before the Schrodinger equation itself. Contrast this with the other versions, which are "reality collapses to a definite state when we look at it" or "there are infinitely many different universes". Neither of those seems as simple.
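For what it's worth, here is a rough numerical sketch of the guiding law being discussed, v = (hbar/m) * Im( (dpsi/dx) / psi ). The specific one-dimensional wave function below is a made-up toy (two Gaussian packets with spreading ignored, standing in for two "slit" contributions), not anything from the papers cited above.

```python
import numpy as np

hbar, m = 1.0, 1.0

def psi(x, t):
    # Toy 1D wave function: superposition of two Gaussian packets moving toward
    # each other. Packet spreading is ignored for brevity, so this is not an
    # exact Schrodinger solution, just an illustration.
    def packet(x0, k, sigma=1.0):
        centre = x0 + hbar * k * t / m
        return np.exp(-(x - centre) ** 2 / (4 * sigma ** 2)
                      + 1j * (k * x - hbar * k ** 2 * t / (2 * m)))
    return packet(-5.0, +1.0) + packet(+5.0, -1.0)

def guiding_velocity(x, t, dx=1e-5):
    # Bohmian guiding equation: v = (hbar / m) * Im( (d psi / dx) / psi )
    dpsi = (psi(x + dx, t) - psi(x - dx, t)) / (2 * dx)
    return (hbar / m) * np.imag(dpsi / psi(x, t))

# One particle trajectory, integrated with a simple Euler step: the particle
# "surfs" the wave, its velocity fixed entirely by psi at its current position.
x, dt = -4.0, 0.01
for step in range(300):
    x += guiding_velocity(x, step * dt) * dt
print("position after t = 3:", round(x, 3))
```

The only point of the sketch is that the particle's velocity is read off from psi at its actual position, with no back-reaction on psi, which is exactly the asymmetry discussed above.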
there is a "you" set of particles one can talk about
We know that particles don't have identity though - exchange of identical particles is a symmetry and physics would be very different if it wasn't. I won't claim it's compelling, but to me that suggests that a particle is more like a pattern or a field excitation than a thing with its own concrete existence.
Why do we not perceive reality in terms of these other representations?
What would be different if we did? I mean obviously at a macroscopic level particles moving through space is a model that gives a good approximation and is easy to think in, but that doesn't mean they're any more physically real than e.g. temperature.
If only all these people who spend their lives and jobs studying this in rigor and have all heard of Bohmian Mechanics WOULD ONLY JUST LISTEN TO YOU.
I don't think bringing it up is quite as quackish as you might think: it has a somewhat fringe but not unsuccessful body of recent study.
https://pubmed.ncbi.nlm.nih.gov/26989784/
I tend to think (as some others do) that it's also a much better way to reason about quantum computation. Should the factorization of a large semiprime by Shor's algorithm be attributed to the semi-mystical power of The Observer collapsing the wave function (and who is that, by the way: the sensor, or the person reading the sensor?), or are we instead exploiting realism to do the work?
I’m tired of metaphorical discussions about this experiment.
Stop with the “wave particle duality”.
Stop with the “until it’s measured”.
Explain the experimental setup in great detail.
What do you mean by “a particle is emitted?”. What do you mean by “a particle is measured?”.
Even within the bounds of self described “double slit experiment”s there are numerous variations on how it is designed, constructed, and conducted.
Stop explaining the abstract notion of the experiment through a lens of your preconceived interpretation.
Show me data.
Show me numerical analysis.
Sure, here you go.
https://iopscience.iop.org/article/10.1088/1367-2630/15/3/03...