Is gravity just entropy rising? Long-shot idea gets another look (quantamagazine.org)
225 points by pseudolus 20 hours ago | 199 comments
abetusk 5 hours ago [-]
Entropic gravity is like the "Brazil nut effect" [0] [1]. The idea is that if you shake a glass full of different-sized nuts, the large ones will rise to the top.

From what I understand, this is because larger objects have more mass and so move more slowly when shaken. Since the larger Brazil nuts don't move as much relative to the smaller peanuts, and because of gravity, a cavity opens up under each Brazil nut that gets filled in by peanuts, so the big nuts ratchet upward.

For entropic gravity, the idea is that there's a base density of something (particles? sub-atomic particles?) hitting objects randomly from all directions. When two massive objects get near each other, the region between them has lower density, so each object gets hit less often from that side than from the outside and is pushed toward the other. They sort of cast a "shadow" on each other.

I'm no physicist, but last time I looked into it there were assumptions about the density of whatever particle was "hitting" larger massive objects, and that density was hard to justify. Would love to hear from someone more knowledgeable than myself who can correct or enlighten me.

As an aside, the Brazil nut effect is a very real effect. To get the raisins, you shake the raisin bran. To get the gifts left by your cat, you shake the kitty litter. It works surprisingly well.

[0] https://en.wikipedia.org/wiki/Granular_convection

[1] https://www.youtube.com/watch?v=Incnv2CfGGM

hellohello2 3 hours ago [-]
Not a physicist either, but this passage from the Feynman lectures seems related to what you are describing: https://www.feynmanlectures.caltech.edu/I_07.html

"Many mechanisms for gravitation have been suggested. It is interesting to consider one of these, which many people have thought of from time to time. At first, one is quite excited and happy when he “discovers” it, but he soon finds that it is not correct. It was first discovered about 1750. Suppose there were many particles moving in space at a very high speed in all directions and being only slightly absorbed in going through matter. When they are absorbed, they give an impulse to the earth. However, since there are as many going one way as another, the impulses all balance. But when the sun is nearby, the particles coming toward the earth through the sun are partially absorbed, so fewer of them are coming from the sun than are coming from the other side. Therefore, the earth feels a net impulse toward the sun and it does not take one long to see that it is inversely as the square of the distance—because of the variation of the solid angle that the sun subtends as we vary the distance. What is wrong with that machinery? It involves some new consequences which are not true. This particular idea has the following trouble: the earth, in moving around the sun, would impinge on more particles which are coming from its forward side than from its hind side (when you run in the rain, the rain in your face is stronger than that on the back of your head!). Therefore there would be more impulse given the earth from the front, and the earth would feel a resistance to motion and would be slowing up in its orbit. One can calculate how long it would take for the earth to stop as a result of this resistance, and it would not take long enough for the earth to still be in its orbit, so this mechanism does not work. No machinery has ever been invented that “explains” gravity without also predicting some other phenomenon that does not exist."

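The inverse-square step Feynman alludes to is just solid-angle geometry (my gloss, not part of the lecture): for a distance r much larger than the sun's radius R, the fraction of incoming particles the sun blocks scales with the solid angle it subtends,

  \Omega = 2\pi (1 - \cos\theta) \approx \pi R^2 / r^2  (for r >> R)

so the unbalanced push toward the sun falls off as 1/r^2 - which is exactly why the idea looks so good right up until the drag argument kills it.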
AnotherGoodName 1 hours ago [-]
It also doesn't account for time dilation in a gravity well. However, I still think the general idea has some merit if you think of it as mass being bombarded by massless ‘action potentials’ on all sides, with mass absorbing some of that field to enable translation in spacetime.

I get this is vague spitballing, but essentially an ‘action potential’ would allow mass to move. Higher-temperature mass interacts more, lower-temperature mass interacts less. Mass with momentum would be biased to absorb more from one side, so it travels in one direction in space more than others (the idea I'm getting at is that all movement in space only occurs through interaction with this field). This also counteracts the objection about moving mass interacting more on one side: the very bias of mass with momentum to absorb more on one side means that, from that mass's point of view, it sees the same action potentials on all sides. Mass shielded behind other mass receives fewer action potentials, so it experiences exactly the effect we call time dilation. Mass shielding other mass from action potentials also means that mass accelerates towards other mass.

Essentially it's the above, but instead of a massive particle hitting other mass from all sides, it's a field that allows mass to experience a unit of time.

FilosofumRex 2 hours ago [-]
This is a better YouTube video describing granular physics; it shows how the speed (amplitude) of vibrations can cause counterintuitive arrangements of particles.

At lower speeds you get something akin to Newtonian gravity, but at higher velocities you get something resembling MOND gravity, where galaxy clusters and large voids appear - no dark matter needed.

https://www.youtube.com/watch?v=HKvc5yDhy_4

abetusk 2 hours ago [-]
Thanks for the link, very interesting. I'll have to check out the paper, but just from watching the video it seems all these counterintuitive effects can be explained by the oscillations being related to the size of the chamber.

For example, if I were to roll the chamber at a very low frequency, I would expect the particles to clump on one side, then the other, and so on. That's not really so surprising, and the frequency at which it happens will depend on the chamber's dimensions.

Someone 3 hours ago [-]
> From what I understand, this is because larger objects have more mass and so move more slowly when shaken. Since the larger Brazil nuts don't move as much relative to the smaller peanuts

That doesn’t make sense to me. If larger objects move slower, don’t they move faster relative to the (accelerating) reference frame of the container?

Also, conventional wisdom has it that shaking (temporarily) creates empty spaces, and smaller objects ‘need’ smaller such spaces to fall down, and thus are more likely to fall down into such a space.

ndriscoll 4 hours ago [-]
Aren't more massive particles smaller though (in terms of de Broglie wavelength, at least), so they'd have a smaller "shadow"? Or do different forces have different cross-sections with different relationships to mass, so a particle's "size" is different for different interactions (and could be proportional to mass for gravity)?

Actually this is currently blowing my mind: does the (usual intro QM) wavefunction only describe the probability amplitude for the position of a particle when using photon interaction to measure, and actually a particle's "position" would be different if we used e.g. interaction with a Z boson to define "position measurement"?

bobbylarrybobby 3 hours ago [-]
The momentum wavefunction (or more properly, the wavefunction in the momentum basis) completely determines the position wavefunction (wavefunction in the position basis). And we can probe the momentum wavefunction with any particle at all, by setting up identical (say) electrons and seeing the momentum they impart on a variety of test particles. That is to say, the probability distribution of momentum of a particle does not depend on what we use to probe it.

As the position wavefunction is now completely determined in a probe-agnostic manner, it would be hard to justify calling a probe that didn't yield the corresponding probability distribution a “position measurement”.
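
For reference, the relation is just the standard Fourier transform between the two bases (ordinary non-relativistic QM, nothing exotic):

  \psi(x) = \frac{1}{\sqrt{2\pi\hbar}} \int \phi(p) e^{ipx/\hbar} dp

so specifying the state in the momentum basis fixes it in the position basis, and vice versa, independently of what you later scatter off it.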

hcarvalhoalves 4 hours ago [-]
In other words, gravity would be explainable by statistical mechanics (like heat)?
abetusk 2 hours ago [-]
That's the allure: that gravity is an effect derived from statistical mechanics. Hence the name 'entropic gravity'.
sim7c00 5 hours ago [-]
But these nuts move by gravity, do they not? And what in the universe is exactly up and down? And why would that matter?

Are all celestial bodies then a local "up", with "away from them" being down?

This analogy hurts my brain. Please tell me how to make the hurting stop.

franktankbank 4 hours ago [-]
No, you are right. You can't invoke gravity in an analogy trying to explain gravity.
cwmoore 2 hours ago [-]
Not without a bridge for the metaphor to generalize. What parameters map to the nut jar's selective force, size, and mass?
dudeinjapan 2 hours ago [-]
The same effect could be replicated in a zero-gravity environment using an alternative background force (centrifugal force, vacuum suction, electromagnetism, etc.)
jvanderbot 4 hours ago [-]
You need to reread the middle only. It's a kind of "vacuum" effect.
MathMonkeyMan 13 hours ago [-]
Entropic gravity is a compelling framework. I think that most physicists admit that it would be nice to believe that the yet-unknown theory of everything is microscopic and quantum mechanical, and that the global and exquisitely weak force of gravity emerges from that theory as a sort of accounting error.

But there are so many potential assumptions baked into these theories that it's hard to believe when they claim, "look, Einstein's field equations."

evanb 6 hours ago [-]
Jacobson showed that thermodynamics + special relativity = GR. Those are very very general assumptions, so general that it’s hard to even consider what else you might ask for.
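
Roughly, as I remember the argument (a sketch, not a quote from the paper): demand that the Clausius relation holds across every local Rindler horizon, with the Unruh temperature seen by the accelerated observer and an entropy proportional to horizon area,

  \delta Q = T \delta S, \qquad T = \frac{\hbar a}{2\pi c k_B}, \qquad S \propto A

and the Einstein field equations drop out as an equation of state, with the cosmological constant left over as an integration constant.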
cryptonector 5 hours ago [-]
Ooh, link?
evanb 5 hours ago [-]
https://arxiv.org/abs/gr-qc/9504004
cryptonector 5 hours ago [-]
Thank you!
mr_mitm 10 hours ago [-]
What are some of the most problematic assumptions in your opinion?
nathan_compton 6 hours ago [-]
I'm not an expert in this field, but I think reproducing realistic gravitational interactions seems to require a lot of fiddly setup with heat baths, etc.
layer8 2 hours ago [-]
From the article, they don't claim Einstein's field equations yet, just classical Newtonian gravity.
gus_massa 7 hours ago [-]
> I think that most physicists admit that it would be nice to believe that the yet-unknown theory of everything is microscopic and quantum mechanical,

I agree.

> and that the global and exquisitely weak force of gravity emerges from that theory as a sort of accounting error.

Nah, it's probably just another weird family of bosons, just like the other forces.

From the article:

> Entropic gravity is very much a minority view. But it’s one that won’t die, and even detractors are loath to dismiss it altogether.

pif 11 hours ago [-]
As an experimental physicist, I refuse to get excited about a new theory until the proponent gets to an observable phenomenon that can settle the question.
the__alchemist 5 hours ago [-]
This is why I'm skeptical of theories like Wolfram's: it feels like an overfit. It reproduces all sorts of known theories (special relativity, parts of QM, gravity, etc.), but doesn't make new testable predictions or offer new fundamentals. When I see 10 predictions emerge from a theory, and they all happen to be ones we already knew of... overfit.
ojo-rojo 37 minutes ago [-]
But that means we'd prefer whichever theory our species had landed on first. Basing our preference for a theory on that timing seems kind of arbitrary to me. If they're the same in other respects, I'd take a look at both sides to see if there are other compelling reasons to focus on one or the other, such as which is simpler. Of course if they make different predictions that'd be even better, time to get to testing :)
emtel 5 hours ago [-]
Jonathan Gorard goes through a handful of testable predictions for the hypergraph stuff here: https://www.youtube.com/watch?v=XLtxXkugd5w
lewdwig 11 hours ago [-]
The problem with emergent theories like this is that they _derive_ Newtonian gravity and General Relativity so it’s not clear there’s anything to test. If they are able to predict MOND without the need for an additional MOND field then they become falsifiable only insofar as MOND is.
dawnofdusk 6 hours ago [-]
Deriving existing theories of gravity is an important test of the theory, it's not a problem at all. It's only a problem if you can only do this with more free parameters than the existing theory and/or the generalized theory doesn't make any independent predictions. Seems like in the article the former may be true but not the latter.
cryptonector 5 hours ago [-]
If such a theory makes no new predictions but is simple / simpler than the alternative, then it is a better theory.
JPLeRouzic 10 hours ago [-]
Please, how is the article related to MOND's theories?
lewdwig 10 hours ago [-]
In general, they're not. But if the only thing emergent theories predict is Newtonian dynamics and General Relativity, then that's a big problem for falsifiability. But if they modify Newtonian dynamics in some way, then we do have something to test.
westurner 8 hours ago [-]
From https://news.ycombinator.com/item?id=43738580 :

> FWIU this Superfluid Quantum Gravity [SQG, or SQR Superfluid Quantum Relativity] rejects dark matter and/or negative mass in favor of a supervacuous supervacuum, but I don't think it attempts to predict other phases and interactions like Dark fluid theory?

From https://news.ycombinator.com/item?id=43310933 re: second sound:

> - [ ] Models fluidic attractor systems

> - [ ] Models superfluids [BEC: Bose-Einstein Condensates]

> - [ ] Models n-body gravity in fluidic systems

> - [ ] Models retrocausality

From https://news.ycombinator.com/context?id=38061551 :

> A unified model must: differ from classical mechanics where observational results don't match classical predictions, describe superfluid 3Helium in a beaker, describe gravity in Bose-Einstein condensate superfluids, describe conductivity in superconductors and dielectrics, not introduce unobserved "annihilation", explain how helicopters have lift, describe quantum locking, describe paths through fluids and gravity, predict n-body gravity experiments on earth in fluids with Bernoulli's and in space, [...]

> What else must a unified model of gravity and other forces predict with low error?

cryptonector 5 hours ago [-]
u/lewdwig's point was that if an emergent gravity theory made the sorts of predictions that MOND is meant to, then that would be a prediction that could be tested. The MOND thing is just an example of predictions that an emergent theory might make.
andrewflnr 6 hours ago [-]
They both have to do with very weak gravitational fields.
cantor_S_drug 6 hours ago [-]
Sometimes I wonder: imagine if our physics never allowed for black holes to exist. How would we know to stress-test our theories? Black holes are like standard candles in cosmology, which allow us to make theoretical progress.
mycatisblack 5 hours ago [-]
And each new type of candle becomes a source of fine-tuning or revision, giving us new ways to find the next candles - cosmological or microscopic.

Which kinda points to the fact that we're not smart enough to make these steps without "hints". It's quite possible that our way of working will lead to a theory of everything only in the asymptote, when everything has been observed.

nitwit005 2 hours ago [-]
But, think of all the fun math we get to do before someone shows it's an unworkable idea.
elyase 5 hours ago [-]
Between two models, the one with the shorter Minimum Description Length (MDL) is more likely to generalize better.
Caelus9 6 hours ago [-]
It's a fascinating idea that gravity could be an emergent result of how information works in the universe. I feel like we still don't have that clear piece of evidence where this model predicts something different from general relativity. For now it is one of those theories that are fun to explore but still hard to fully accept.
raindeer2 3 hours ago [-]
Wonder if this perspective is compatible with Wolfram's physics model based on hypergraphs?

Gravity, in that framework, is an emergent property arising from the statistical behavior of the hypergraph's evolution, suggesting that gravity is an "entropic force" arising from the tendency of the system to minimize its computational complexity.

meindnoch 13 hours ago [-]
I don't get it.

To me, entropy is not a physical thing, but a measure of our imperfect knowledge about a system. We can only measure the bulk properties of matter, so we've made up a number to quantify how imperfectly the bulk properties describe the true microscopic state of the system. But if we had the ability to zoom into the microscopic level, entropy would make no sense.

So I don't see how gravity or any other fundamental physical interaction could follow from entropy. It's a made-up thing by humans.

Ma8ee 22 minutes ago [-]
Entropy is certainly a physical “thing”, in the sense that it affects the development of the system. Your argument that it isn't physical because it doesn't exist on a microscopic scale applies equally well to temperature: temperature doesn't exist when you zoom in on single particles either.

There’s no reason to involve our knowledge of the system. Entropy is a measure of the number of possible micro states for a given system, and that number exists independently of us.

kgwgk 16 minutes ago [-]
> Entropy is a measure of the number of possible micro states for a given system, and that number exists independently of us.

That number also exists independently of the system! I can imagine any system and calculate the corresponding number.

(And does the “system” exist independently of us? What separates the “system” from anything else? Is every subset of the universe a “system”?)

antonvs 12 hours ago [-]
Your perspective is incorrect.

Physical entropy governs real physical processes. Simple example: why ice melts in a warm room. More subtle example: why cords get tangled up over time.

Our measures of entropy can be seen as a way of summarizing, at a macro level, the state of a system such as that warm room containing ice, or a tangle of cables, but the measure is not the same thing as the phenomenon it describes.

Boltzmann's approach to entropy makes the second law pretty intuitive: there are far more ways for a system to be disordered than ordered, so over time it tends towards higher entropy. That’s why ice melts in a warm room.

aeonik 10 hours ago [-]
My take, for what it's worth,

Entropy isn’t always the driver of physical change, sometimes it’s just a map.

Sometimes that map is highly isomorphic to the physical process, like in gas diffusion or smoke dispersion. In those cases, entropy doesn't just describe what happened, it predicts it. The microstates and the probabilities align tightly with what’s physically unfolding. Entropy is the engine.

But other times, like when ice melts, entropy is a summary, not a cause. The real drivers are bond energies and phase thresholds. Entropy increases, yes, but only because the system overcame physical constraints that entropy alone can’t explain. In this case, entropy is the receipt, not the mechanism.

So the key idea is this: entropy’s usefulness depends on how well it “sees” the real degrees of freedom that matter. When it aligns closely with the substrate, it feels like a law. When it doesn't, it’s more like coarse bookkeeping after the fact.

The second law of thermodynamics is most “real” when entropy is the process. Otherwise, it’s a statistical summary of deeper physical causes.

lumost 7 hours ago [-]
What makes entropy interesting is that you can describe many physical processes through analysis of the system's degrees of freedom. This pattern repeats regularly despite the systems being radically different.

So you can interpret entropy as being about as real as potential energy or Newton's laws. Very useful for calculation, subject to evolution laws which are common across all systems - but it potentially gives way as an approximation under a finer-grained view (although the finer-grained view is also subject to the same rules).

ludwik 12 hours ago [-]
> there are far more ways for a system to be disordered than ordered

I'm a complete layman when it comes to physics, so forgive me if this is naive — but aren't "ordered" and "disordered" concepts tied to human perception or cognition? It always seemed to me that we call something "ordered" when we can find a pattern in it, and "disordered" when we can't. Different people or cultures might be able to recognize patterns in different states. So while I agree that "there are more ways for a system to be disordered than ordered," I would have thought that's a property of how humans perceive the world, not necessarily a fundamental truth about the universe

mr_mitm 10 hours ago [-]
You only hear these terms in layman explanations. Physics has precise definitions for these things. When we say "ordered", we mean that a particular macrostate has only a few possible microstates.

Check this Wikipedia article for a quick overview: https://en.wikipedia.org/wiki/Microstate_(statistical_mechan...

Details can be found in any textbook on statistical mechanics.

Gravityloss 8 hours ago [-]
Exactly. The coin flipping example is a very nice way to put it. It works because the coins are interchangeable: you just count the number of heads or tails.

If the coins were of different color and you took that into account, then it wouldn't work.
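
To make the interchangeable-coin counting concrete, here's a toy sketch (nothing to do with gravity, just Boltzmann-style counting): the macrostate is the number of heads, the microstates are the specific sequences, and the near-50/50 macrostates utterly dominate.

  from math import comb, log

  n = 100  # coins
  for heads in (0, 10, 25, 50):
      omega = comb(n, heads)  # microstates (specific head/tail sequences) in this macrostate
      print(f"{heads:3d} heads: {omega:.3e} microstates, S = ln(omega) = {log(omega):.1f} (in units of k_B)")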

It's not intuitive to me what gravity has to do with entropy though, as it's classically just a force and completely reversible (unlike entropy)? Ie if you saw a video of undisturbed objects only affected by gravity, you couldn't tell if the video was reversed.

floxy 2 hours ago [-]
> Ie if you saw a video of undisturbed objects only affected by gravity, you couldn't tell if the video was reversed.

How does that work with things like black holes? If you saw an apple spiral out of a black hole, wouldn't you suspect that you were watching a reversed video? Even if you take into account the gravitational waves?

kgwgk 2 hours ago [-]
If you saw a comet coming from the sun, or a meteorite coming from the moon, etc. you would also find that suspicious.
immibis 1 hours ago [-]
That's the question of why time only goes forwards. It seems to be that the universe started in an extremely low-entropy state. It will go towards high entropy. In a high entropy state (e.g. heat death, or a static black hole), there's no meaningful difference between going forwards or backwards in time - if you reverse all the velocities of the particles, they still just whizz around randomly (in the heat death case) or the black hole stays a black hole.
hackinthebochs 11 hours ago [-]
Think minimum description length. Low entropy states require fewer terms to fully describe than high entropy states. This is an objective property of the system.
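
A crude way to see it, using compressed size as a stand-in for description length (an analogy only - it's compressor-dependent and not a physical entropy calculation - but it makes the point):

  import random, zlib

  random.seed(0)
  ordered = b"AB" * 50_000  # one repeating pattern, 100 kB
  disordered = bytes(random.getrandbits(8) for _ in range(100_000))  # a "typical" random 100 kB string

  # compressed size as a rough proxy for description length
  for name, data in (("ordered", ordered), ("disordered", disordered)):
      print(name, len(data), "bytes ->", len(zlib.compress(data, 9)), "bytes compressed")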
zmgsabst 9 hours ago [-]
“Number of terms” is a human language construct.
hackinthebochs 9 hours ago [-]
No, it's a representation construct, i.e. how to describe some system in a given basis. The basis can be mathematical. Fourier coefficients for example.
zmgsabst 7 hours ago [-]
Mathematics is a human language. It being a formal language doesn’t change that.

Further, it’s not objective: you’re choosing the basis which causes the complexity, but any particular structure can be made simple in some basis.

hackinthebochs 7 hours ago [-]
Mathematical notation is a human invention, but the structure that mathematics describes is objective. The choice of basis changes the absolute number of terms, but the relative magnitude of terms for a more or less disordered state is generally fixed outside of degenerate cases.
zmgsabst 6 hours ago [-]
The structure that most words describe is objective, so you haven’t distinguished math as a language. (Nor is mathematics entirely “objective”, eg, axiom of choice.) And the number of terms in your chosen language with your chosen basis isn’t objective: that’s an intrinsic fact to your frame.

The complexity of terms is not fixed — that’s simply wrong mathematically. They’re dependent on our chosen basis. Your definition is circular, in that you’re implicitly defining “non-degenerate” as those which make your claim true.

You can’t make the whole class simplified at once, but for any state, there exists a basis in which it is simple.

hackinthebochs 6 hours ago [-]
This is getting tedious. The point about mathematics was simply that it carries an objectivity that natural language does not. But the point about natural language was always a red herring; not sure why you introduced it.

>You can’t make the whole class simplified at once

Yes, this is literally my point. The further point is that the relative complexities of two systems will not switch orders regardless of basis, except perhaps in degenerate cases. There is no "absolute" complexity, so your other points aren't relevant.

amelius 11 hours ago [-]
In a deterministic system you can just use the time as a way to describe a state, if you started from a known state.
sat_solver 10 hours ago [-]
You're thinking of information entropy, which is not the same concept as entropy in physics. An ice cube in a warm room can be described using a minimum description length as "ice cube in a warm room" (or a crystal structure inside a fluid space), but if you wait until the heat death of the universe, you just have "a warm room" (a smooth fluid space), which will have an even shorter MDL. Von Neumann should never have repurposed the term entropy from physics. Entropy confuses a lot of people, including me.
hackinthebochs 9 hours ago [-]
Maxwell's demon thought experiment implies they are the same concept. Given a complete knowledge of every particle of gas you can in principle create unphysical low entropy distributions of the particles. This[1] goes into more detail.

[1] https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_...

bavell 8 hours ago [-]
A fun visual explanation: https://youtu.be/8Uilw9t-syQ?si=D9sR2YAm40SPFG3a
nick__m 9 hours ago [-]
And somewhat surprisingly the heat death of the universe is the maximal entropy state.

Because there are an infinite number of microstates (all the particles are interchangeable) that lead to the same macrostate: nothing happening for ever!

refactor_master 12 hours ago [-]
I think the original post is confused exactly because of "tangled cords" analogies. Something being "messy" in our daily lives can be a bit subjective, so using the same analogies for natural forces may actually seem a tad counterintuitive.

Maybe it would be more fitting to say that it just so happens that our human definition of “messy” aligns with entropy, and not that someone decided what messy atoms look like.

I’d say a bucket of water is more neat than a bucket of ice, macroscopically.

geon 9 hours ago [-]
It has been suggested that time too is derived from entropy. At least the single-directionality of it. That’d make entropy one of the most real phenomena in physics.
meindnoch 11 hours ago [-]
>Simple example: why ice melts in a warm room.

Ice melting is simply the water molecules gaining enough kinetic energy (from collisions with the surrounding air molecules) that they break the bonds that held them in the ice crystal lattice. But at the microscopic level it's still just water molecules acting according to Newton's laws of motion (forgetting about quantum effects of course).

Now, back on the topic of the article: consider a system of 2 particles separated by some distance. Do they experience gravity? Of course they do. They start falling towards their common center of mass. But where is entropy in this picture? How do you even define entropy for a system of 2 particles?

tsimionescu 10 hours ago [-]
> But where is entropy in this picture? How do you even define entropy for a system of 2 particles?

The answer is that this doesn't happen in a system with only 2 particles. The idea of gravity as an entropic phenomenon is that you introduce some other kind of particle that permeates spacetime, so there is no system that only contains 2 particles. You may use some idea like virtual particles from quantum field theory, or you may define "quanta of space time" as something that is not technically a particle but basically works like one in a handwavy sense.

But the basic point of these entropy-based theories is to explain gravity, and typically spacetime itself, as an emergent result of a collection of numerous objects of some kind. This necessarily means that they don't make sense if applied to idealized systems with very few objects - which is why they typically posit that such isolated systems simply can't exist in reality.

ccozan 11 hours ago [-]
Let me try to answer. Let's say the particles are experiencing gravity as a natural entropy phenomenon. They will attract until they become so close that they are now seen as a single particle. The new system has a lower entropy and a higher gravity than before.

Explanation seems very rudimentary but that is the gist of the theory.

From my point of view, I might add the layer of information density. Every quantum fluctuation is an event, and the more particles there are, the more information is produced in a defined volume of space. But there is no theory of information that is linked to the physics, so let me leave it at that :).

HelloNurse 12 hours ago [-]
But "disordered" and "ordered" states are just what we define them to be: for example, cords are "tangled" only because we would prefer arrangements of cords with fewer knots, and knots form because someone didn't handle the cords carefully.

Physical processes are "real", but entropy is a figment.

dekken_ 12 hours ago [-]
I believe you are correct.

Entropy is not a physical quantity, it is a measure of how far a system is from equilibrium.

Lots of people talk about order/disorder or macro and micro states, not realizing these are things we've invented and aren't physical in nature.

kgwgk 9 hours ago [-]
> Entropy is not a physical quantity, it is a measure of how far a system is from equilibrium.

That’s funny because the original thermodynamic entropy is defined only for systems in equilibrium.

dekken_ 5 hours ago [-]
From whom? Clausius?

It doesn't make a lot of sense to me, because a system at equilibrium cannot undergo any further diffusion, so there's no potential "entropy increase".

Maybe the issue is that, like an ideal gas, a perfect equilibrium just doesn't occur.

kgwgk 9 hours ago [-]
> Physical entropy governs real physical processes

> the measure is not the same thing as the phenomenon it describes.

There is some tension between those claims.

The latter seems to support the parent comment’s remark questioning whether a “fundamental physical interaction could follow from entropy”.

It seems more appropriate to say that entropy follows from the physical interaction - not to be confused with the measure used to describe it.

One may say that pressure is an entropic force and physical entropy governs the real physical process of gas expanding within a piston.

However, one may also say that it's the kinetic energy of the gas molecules that governs the physical process - which arguably is a more fundamental and satisfactory explanation.

willvarfar 12 hours ago [-]
The way we use the word 'entropy' in computer science is different from how it's used in physics. Here is a really good explanation in a great talk! https://youtu.be/Kr_S-vXdu_I?si=1uNF2g9OhtlMAS-G&t=2213
prof-dr-ir 11 hours ago [-]
Good question. You are absolutely right that entropy is always fundamentally a way to describe our lack of perfect knowledge of the system [0].

Nevertheless there is a distinct "reality" to entropic forces, in the sense that it is something that can actually be measured in the lab. If you are not convinced then you can look at:

https://en.wikipedia.org/wiki/Entropic_force

and in particular the example that is always used in a first class on this topic:

https://en.wikipedia.org/wiki/Ideal_chain
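
For anyone not clicking through, the standard result there (quoting from memory, so double-check the factors): a Gaussian chain of N segments of length b with end-to-end distance R has

  S(R) = const - \frac{3 k_B R^2}{2 N b^2}, \qquad f = -T \frac{\partial S}{\partial R} = \frac{3 k_B T R}{N b^2}

i.e. a measurable restoring force proportional to temperature, with no bond energies behind it at all.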

So when viewed in this way entropy is not just a "made-up thing", but an effective way to describe observed phenomena. That makes it useful for effective but not fundamental laws of physics. And indeed the wiki page says that entropic forces are an "emergent phenomenon".

Therefore, any reasonable person believing in entropic gravity will automatically call gravity an emergent phenomenon. They must conclude that there is a new, fundamental theory of gravity to be found, and this theory will "restore" the probabilistic interpretation of entropy.

The reason entropic gravity is exciting and exotic is that many other searches for this fundamental theory start with a (more or less) direct quantization of gravity, much like one can quantize classical mechanics to arrive at quantum mechanics. Entropic gravity posits that this is the wrong approach, in the same way that one does not try to directly quantize the ideal gas law.

[0] Let me stress this: there is no entropy without probability distributions, even in physics. Anyone claiming otherwise is stuck in the nineteenth century, perhaps because they learned only thermodynamics but not statistical mechanics.

meindnoch 10 hours ago [-]
Sure, I'm not denying that entropy exists as a concept that can be used to explain things macroscopically. But like you said, its origins are statistical. To me, temperature is also a similar "made up" concept. We can only talk about temperature because the velocities of a sufficiently large group of particles converge to a single-parameter distribution. A single particle in isolation doesn't have a temperature.

So if they say gravity might be an entropic effect, does that mean they assume there's something more fundamental "underneath" spacetime that - in the statistical limit - produces the emergent phenomenon of gravity? So it isn't the entropy of matter that they talk about, but the entropy of something else, like the grains of spacetime or whatever.

flufluflufluffy 8 hours ago [-]
Yes, exactly. The model is based on (in the first approach) a “lattice” of some type of undiscovered particle-like thing (what they refer to as “qubits” in the article, which is unfortunate because it is NOT the same “qubit” from quantum computing) permeating space time. Or maybe more aptly, it is that lattice from which spacetime emerges. And what we observe as the force of gravity emerges from the entropic forces happening in this lattice.
spacecadet 10 hours ago [-]
I'm an idiot, let's get that out of the way first. I think your temperature analogy answered your own question.

I guess my question in turn is: if we imagine a universe at the end of time(?), one maybe dominated by a few black holes and not much else, would an observer experience gravity if placed sufficiently far away? Or going even further, what if nothing is left in the universe at all? Assuming that doesn't cause a big crunch, rip, or whatever...

simiones 9 hours ago [-]
> You are absolutely right that entropy is always fundamentally a way to describe our lack of perfect knowledge of the system [0].

> [0] Let me stress this: there is no entropy without probability distributions, even in physics.

The second item doesn't entail the first. Probabilities can be seen as a measure of lack of knowledge about a system, but that isn't necessarily so. A phenomenon can also be inherently/fundamentally probabilistic. For example, wave function collapse is, to the best of our knowledge, an inherently non-deterministic process. This is very relevant to questions about the nature of entropy - especially since we have yet to determine whether it's even possible for a large system to be in a non-collapsed state.

If it turns out that there is some fundamental process that causes wave function collapse even in perfectly isolated quantum systems, then it would be quite likely that entropy is related to such a process, and that it may be more than a measure of our lack of knowledge about the internal state of a system, and instead a measurement of the objective "definiteness" of that state.

I am aware that objective collapse theories are both unpopular and have some significant hurdles to overcome - but I also think that from a practical perspective, the gap between the largest systems we have been able to observe in pure states versus the smallest systems we could consider measurement devices is still gigantic and leaves us quite a lot of room for speculation.

mjburgess 12 hours ago [-]
Even if we take that view, gravity is still basically a similar case. What we call "gravity" is really an apparent force, one that isn't a force at all when seen from a full 4D point of view.

Imagine sitting outside the whole universe, from t=0 to t=end, and observing it as one whole block. Then the trajectories of matter, unaffected by any force at all, are those we call gravitational.

From this pov, it makes a lot more sense to connect gravity with some orderly or disorderly features of these trajectories.

Inertia, on this view, is just a kind of hysteresis the matter distribution of the universe has -- i.e., a kind of remembered deformation that persists as the universe evolves.

tsimionescu 11 hours ago [-]
> From this pov, it makes a lot more sense to connect gravity with some orderly or disorderly features of these trajectories.

On the contrary, entropic gravity works pretty well for the Newtonian view of gravity as a force, and not the GR view of gravity as a deformation of space time and analogous to acceleration. Acceleration is a very elementary concept, one you find even in microscopic descriptions. Gravity being essentially the same thing makes it far more elementary than a concept like entropy, which only applies to large groups of particles.

So, if the GR picture is the right one, if gravity and acceleration are essentially the same thing, it's very hard to see how that aligns with gravity being an emergent phenomenon that only happens at large scales. However, if gravity is just a tendency for massive objects to come together, as in the Newtonian picture, that is perfectly easy to imagine as an entropic effect.

sixo 5 hours ago [-]
This comment thread is exhibit N-thousand that "nobody really understands entropy". My basic understanding goes like this:

In thermodynamics, you describe a system with a massive number of microstates/dynamical variables in terms of 2-3 measurable macrostate variables. (E.g. `N, V, E` for an ideal gas.)

If you work out the dynamics of those macrostate variables, you will find that (to first order, i.e. in the thermodynamic limit) they depend only on the form of the entropy function of the system `S(E, N, V)`, e.g. Maxwell relations.

If you measured a few more macrostate variables, e.g. the variance in energy `sigma^2(E)` and the center of mass `m`, or anything else, you would be able to write new dynamical relations that depend on a new "entropy" `S(E, N, V, sigma^2(E), m)`. You could add 1000 more variables, or a million - e.g. every pixel of an image - basically up until the point where the thermodynamic limit assumptions cease to hold.

The `S` function you'd get will capture the contribution of every-variable-you're-marginalizing-over to the relationships between the remaining variables. This is the sense in which it represents "imperfect knowledge". Entropy dependence arises mathematically in the relationships between macrostate variables - they can only couple to each other by way of this function, which summarizes all the variables you don't know/aren't measuring/aren't specifying.

That this works is rather surprising! It depends on some assumptions which I cannot remember (on convexity and factorizability and things like that), but which apply to most or maybe all equilibrium thermodynamic-scale systems.

For the ideal gas, say, the classical-mechanics, classical-probability, and quantum-mechanical descriptions of the system all reduce to the same `S(N, V, E)` function under this enormous marginalization - the most "zoomed-out" view of their underlying manifold structures turns out to be identical, which is why they all describe the same thing. (It is surprising that seemingly obvious things like the size of the particles would not matter. It turns out that the asymptotic dynamics depend only on the information theory of the available "slots" that energy can go into.)
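
To make that concrete with the monatomic ideal gas (a standard result, written from memory): the Sackur-Tetrode entropy is

  S(E, V, N) = N k_B [ \ln( \frac{V}{N} ( \frac{4 \pi m E}{3 N h^2} )^{3/2} ) + \frac{5}{2} ]

and the familiar thermodynamics falls out of its partial derivatives, e.g.

  \frac{1}{T} = (\frac{\partial S}{\partial E})_{V,N} = \frac{3 N k_B}{2 E}, \qquad \frac{p}{T} = (\frac{\partial S}{\partial V})_{E,N} = \frac{N k_B}{V}

which give E = (3/2) N k_B T and pV = N k_B T with no further input about the microscopic details.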

All of this appears as an artifact of the limiting procedure in the thermodynamic limit, but it may be the case that it's more "real" than this—some hard-to-characterize quantum decoherence may lead to this being not only true in an extraordinarily sharp first-order limit, but actually physically true. I haven't kept up with the field.

No idea how to apply this to gravity though.

IsTom 10 hours ago [-]
If you want to only have one possible past (i.e. can't destroy information), then when you end up in one branch of the quantum state you need to "store" enough information to separate you from the other branches, and you really do need multiple possible microstates to differentiate them. If you look post-factum, obviously you did end up in a specific state, but statistics do their work otherwise.
logicchains 12 hours ago [-]
Entropy isn't a function of imperfect knowledge. It's a function of the possible states of a system and their probability distributions. Quantum mechanics assumes, as the name implies, that reality at the smallest level can be quantised, so it's completely appropriate to apply entropy to describing things at the microscopic scale.
aurareturn 10 hours ago [-]
If we knew the exact state of all particles in an enclosed system, we could calculate exactly what future states will be. There would be no need to reason about possible states.
mensetmanusman 27 minutes ago [-]
Quantum uncertainty actually says no to this. There is an ‘error’ in any propagating probability field.
IAmBroom 8 hours ago [-]
Since that's not possible in any physical system of one or more particles, it's irrelevant.
kgwgk 10 hours ago [-]
> Entropy isn't a function of imperfect knowledge. It's a function of the possible states of a system and their probability distributions.

There are no probability distributions over possible states when there is perfect knowledge of the state.

> Quantum mechanics

Entropy is also zero for a pure quantum state. You won’t have entropy without imperfect knowledge.

whereismyacc 10 hours ago [-]
> There are no probability distributions over possible states when there is perfect knowledge of the state.

I know very little about physics, but I thought that the leading interpretations of quantum physics say that the probability distribution is all we can know about a system. The entropy is not due to a lack of information about the quantum state, but because the outcomes are inherently stochastic?

kgwgk 10 hours ago [-]
Entropy is about the state - not about “outcomes”.

“All we can know” is the precise state - at least in principle - and entropy is zero in that case.

mr_mitm 10 hours ago [-]
Just look at the definition of entropy. Knowledge about a system never enters the equation.

S := -k_B sum p_i ln (p_i)

ajkjk 7 hours ago [-]
As the other replier said, despite your dismissiveness, the knowledge about the system is in the probabilities, so it's right there in the equation.

Suppose you flip a coin. Before flipping the coin, your knowledge is "heads or tails". After flipping it, your knowledge becomes one of either heads or tails. The amount of information you gained by resolving your imperfect knowledge is the entropy of the distribution.

The same model works for physical entropy without much modification; the imperfect knowledge is the difference between knowing a macrostate versus the exact microstate.
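
To put numbers on that (Shannon entropy in bits rather than k_B, but the structure is identical), a minimal sketch:

  from math import log2

  def entropy_bits(probs):
      # missing information, in bits, for a distribution over outcomes
      return -sum(p * log2(p) for p in probs if p > 0)

  print(entropy_bits([0.5, 0.5]))  # before the flip: 1.0 bit unknown
  print(entropy_bits([1.0, 0.0]))  # after seeing the result: 0 bits (prints -0.0)
  print(entropy_bits([0.9, 0.1]))  # a coin you know is biased: ~0.47 bits

The p_i change when the description of the system changes; that is the only place "knowledge" has to enter.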

antonvs 3 hours ago [-]
You’re glossing over an important point: your knowledge of the future state of the system is “heads or tails”.

One of the things entropy tells us is how a system is likely to evolve in the future. But looking at it another way, entropy actually helps dictate how it will evolve. And we can prove that mathematically.

kgwgk 10 hours ago [-]
p_i

Edit to add lots of words:

In the definition of entropy

S := -k_B sum p_i ln (p_i)

knowledge about the system enters the equation in the p_i terms.

The other term is a constant so it’s not like there are many other choices to link the entropy to the system!

mr_mitm 10 hours ago [-]
Please communicate in full sentences with me.

I can only guess that your objection is something about probabilities. A microstate has a probability independent of my knowledge of the system just like the probability of having a royal flush doesn't change after drawing five cards. The probability of me ending the game with a royal flush might, but that is not what we mean by these probabilities.

kgwgk 9 hours ago [-]
The same microstate will have different probabilities depending on the constraints or measurements used in _your_ description of the system.

If you choose to describe the system using its microstate - and you know it - there are no probabilities anywhere.

You can of course know something and choose to ignore it - the entropy is still a reflection of the uncertainty (actual or for the sake of a lower-resolution model).

tsimionescu 9 hours ago [-]
But the point is that, regardless of how you choose to describe or even measure the system, it will need exactly as much heat to raise its temperature by 1 degree (or it will need as much kinetic energy to increase the average velocity of the constituents by the same amount, in the microstate framework). So there is some objective nature to entropy, it's not merely a function of subjective knowledge of a system. Or, to put it another way, two observers with different amounts of information on the microstate of a system will still measure it as having the same entropy.
kgwgk 7 hours ago [-]
There is some objective nature to the operational definition of entropy based on an experimental setup where you fix the volume and measure the temperature or whatever.

And this is related to the statistical mechanical definition of entropy based on the value of the corresponding state variables.

But it’s not a property of the microstate - it’s a property of the macrostate which makes sense only in the context of the experimental constraints and measurements.

If we relate entropy to work that can be extracted someone with a better understanding of the state of the system and operational access to additional degrees of freedom can extract additional work.

Thermodynamics assumes the state variables provide a complete description of the system. Statistical mechanics assumes the state variables provide an incomplete description of the system - and works out what that entails.

tsimionescu 6 hours ago [-]
> But it’s not a property of the microstate - it’s a property of the macrostate which makes sense only in the context of the experimental constraints and measurements.

The same can be said about the wavefunction then, right? You can't directly observe it, you can only use it to predict the statistics of a particular experimental setup. So, at worse, entropy is as real as wavefunction amplitudes.

> If we relate entropy to work that can be extracted someone with a better understanding of the state of the system and operational access to additional degrees of freedom can extract additional work.

Is this actually true? Per my understanding, if I give you three containers, two of which are filled with some kind of gas that you know nothing about, and the third with a mix of those same gases, you can measure their entropy using thermodynamic experiments and tell which of the three is a mix of the other two because it will have a higher entropy. So, you can extract more work from one of the boxes despite not knowing anything more about it.

kgwgk 6 hours ago [-]
> Per my understanding

What’s the source of that understanding? You cannot measure the entropy, only changes of entropy - which will be the same (for an ideal gas).

Edit: we already had this discussion, by the way: https://news.ycombinator.com/item?id=42434862

tsimionescu 4 hours ago [-]
> You cannot measure the entropy, only changes of entropy

You can measure the changes in entropy from a minimal state and integrate - and you'll get the "total" entropy.

And thanks for looking it up! I remembered a very similar conversation and was wondering if you were the same person, but was a bit lazy to search :)

kgwgk 3 hours ago [-]
> You can measure the changes in entropy from a minimal state and integrate - and you'll get the "total" entropy

That doesn’t help with the following (at least if you keep those kinds of gas in gas state):

> if I give you three containers […] you can measure their entropy using thermodynamic experiments and tell which of the three is a mix of the other two because it will have a higher entropy

But you can weigh them; it's much easier.

immibis 1 hours ago [-]
Entropy can be defined as the logarithm of the number of microstates in a macrostate. Since transition between microstates is reversible, and therefore one-to-one (can't converge on any particular microstate, can't go in cycles, have to be something like a random walk) we're more likely to end up in a macrostate that holds a larger number of microstates.

For example, there are many more ways your headphone cord can be tangled than untangled, so when you pull it out of your pocket, and it's in a random state, then it's very likely to be tangled.

If entropy causes gravity, that means there are somehow more microstates with all the mass in the universe smooshed together than microstates with all the mass in the universe spread apart.

whereismyacc 10 hours ago [-]
It sounds like you're talking about information entropy, which to my understanding is analogous to but not the same as entropy in physics?
ajkjk 7 hours ago [-]
It pretty much is the same, except that entropy in physics usually has a constant in front of it.
mjanx123 10 hours ago [-]
Entropy is the opposite of potential
echelon 6 hours ago [-]
Entropy is complicated beyond just a Rankine or Carnot cycle.

Biology thrives at the ebbs, flows, and eddies of entropy. Predation. Biochemical flux. There are arrows flowing every which way, and systems that keep it finely tuned.

This theory, based on my surface-level reading and understanding, is that the aggregate particle-level entropy within sub-light-speed systems creates gravity.

bmitc 9 hours ago [-]
> It's a made-up thing by humans.

All of physics is made up by humans.

colanderman 3 hours ago [-]
See also: emergent fox-treasure gravity in Skyrim: https://www.eurogamer.net/skyrims-myth-of-the-treasure-fox-f...

TLDR: areas around treasure have higher entropy by a measure relevant primarily to stochastic movement of foxes. Since there are thus on average more ways for a fox to walk toward treasure than away, they tend to gravitate toward treasure.

fourthark 5 hours ago [-]
> "The ontology of all of this is nebulous"
hoseja 13 hours ago [-]
Like some sort of buoyancy?
deadbabe 5 hours ago [-]
Couldn't it be a byproduct of frame dragging? Any massive object that spins pulls stuff into it by forcing things to rotate in some kind of spacetime whirlpool?

This would mean that if something massive doesn't spin, it would have no gravity - but isn't everything in space large enough to have gravity pretty much spinning anyway?

ang_cire 4 hours ago [-]
No, many asteroids have detectable, "functional" (e.g. they pull dust towards them) gravitational fields, but are not spinning like planets.
vkou 3 hours ago [-]
Given that we can experimentally observe the gravitational attraction between two non-spinning objects in a lab, I don't think that's the answer.
nathias 5 hours ago [-]
Descartes had a similar idea, but as far as I remember, it requires ether.
dist-epoch 13 hours ago [-]
We all know that life on Earth gets its energy from the Sun.

But we also know that's an approximation we tell kids: really, life gets low-entropy photons from the Sun, does its thing, and then emits high-entropy infrared waste heat. Energy is conserved, while entropy increases.

But where did the Sun get its low-entropy photons to start with? From gravity: empty uniform space has low entropy, which got "scooped up" as the Sun formed.

EDIT: not sure why this is downvoted, it's the explanation Nobel physics laureate Roger Penrose gives: https://g.co/gemini/share/bd9a55da02b6

dawnofdusk 6 hours ago [-]
This is just a question about the origins of inhomogeneity in the universe. The prevailing theory is cosmic inflation, I believe: in the early universe a quantum field existed in a high entropy state and then the rapid expansion of space magnified small spatial inhomogeneities in the field into large-scale structures. What we see as "low entropy" structures like stars are actually just high entropy, uniform structures at a higher scale but viewed from up close so that we can see finer-scale structure.
uncircle 13 hours ago [-]
Your question fascinated me. Googling "where did the Sun get its low entropy" I also came across these explanations:

"Solar energy at Earth is low-entropy because all of it comes from a region of the sky with a diameter of half a degree of arc."

also, from another reply:

"Sunlight is low entropy because the sun is very hot. Entropy is essentially a measure of how spread out energy is. If you consider two systems with the same amount of thermal energy, then the one where that energy is more concentrated (low entropy) will be hotter."

https://physics.stackexchange.com/questions/796434/why-does-...

Probably it's a bit of both. I'm not sure I understand your hypothesis about the Sun scooping up empty, low-entropy space. Wasn't it formed from dust and gases created by previous stellar explosions, i.e. the polar opposite of low entropy?

im3w1l 10 hours ago [-]
The universe was low entropy at the time of the big bang, and even though entropy is steadily rising, the universe is still pretty low entropy.
dist-epoch 13 hours ago [-]
I read the gravity explanation for the Sun's low entropy in the book "The Road to Reality" by Roger Penrose. I asked Gemini to summarize the argument (scroll to the end):

https://g.co/gemini/share/bd9a55da02b6

gattr 10 hours ago [-]
It's also in his previous book "The Emperor's New Mind: Concerning Computers, Minds and The Laws of Physics", along with a lot more. Strongly recommended (even though after reading a lot of Greg Egan, my views on consciousness somewhat shifted towards "classical computation can do it, too".)
eru 6 hours ago [-]
Yes, the main argument in the Emperor's New Mind seems to boil down to 'consciousness is weird, and quantum stuff is weird, so consciousness needs to be quantum'.

If you can look past that, there's some good other material inside.

mjanx123 10 hours ago [-]
The photons do not have entropy.

The photons from the Sun are hot, the space around the Sun is cold, so the system has low entropy.

If the space around the Sun were as hot as the photons, the entropy would be high.

aurareturn 10 hours ago [-]

  But where did the Sun get its low-entropy photons to start with? From gravity: empty uniform space has low entropy, which got "scooped up" as the Sun formed.
From the Big Bang originally. We don’t know what caused the Big Bang.
gavinray 8 hours ago [-]
The end of the previous Big Bang, a-la Big Bounce ;^)

"It's turtles all the way down."

jerf 5 hours ago [-]
One of the major challenges with the "Big Bounce" that media coverage of it tends to overlook is that it is not entirely clear how the previous universe, which is presumably high-entropy if it's supposed to be like ours, becomes the low-entropy feedstock for the next universe. There's still a "Here There Be Magic" step in there.

I'm not saying there's no solution; indeed, this is the sort of thing where the problem is that the profusion of proposed solutions is exactly the thing that shows there's a problem there. I think people tend to intuitively think that "lots and lots of possible solutions" is somehow better than "no solutions at all" but they're actually nearly the same thing.

layer8 2 hours ago [-]
You’d have to explain how the steadily increasing entropy in our universe would revert to a low-entropy state again.
riskable 5 hours ago [-]
Based on the theory of gravity in the article, it's actually "Archimedes' principle all the way down."

https://en.wikipedia.org/wiki/Archimedes%27_principle

omeysalvi 13 hours ago [-]
"There’s some kind of gas or some thermal system out there that we can’t see directly" - The Ether is back on the menu boys
ThinkBeat 2 hours ago [-]
95% of the Universe is made up of dark matter and dark energy. These are words astronomers have come up with to give a name to the mysterious, invisible side of the Universe.
cantor_S_drug 6 hours ago [-]
What Causes Gravitational Time Dilation? A Physical Explanation.

https://www.youtube.com/watch?v=DjwQsKMh2v8

I like the river model which helps with the intuition.

whycome 13 hours ago [-]
Caloric. Dark matter. Cosmological constant.

We like placeholders for the unknown.

killerstorm 12 hours ago [-]
Isn't that how equations get solved?

Pretty much everything we know entered through such a placeholder; it's just that some could be connected to the equations more easily than others.

It's not like the Higgs field is something you can directly observe.

Keyframe 10 hours ago [-]
Right, but you can push unknowns into tmp vars only so much before you have to introduce constraints; otherwise it's all downright underdetermined. You have to inject structure into the placeholder soup or you're just pushing ambiguity around with no real net gain. Which is also fun to play around with; the question is whether you'll get a paper out of it, or even get paid, if you play like that to no end.
jstanley 12 hours ago [-]
Maybe (I don't know), but it's easy to accidentally come up with a theory of "mysterious stuff" that appears to explain something but neither constrains your expectations nor provides predictions.

Phlogiston is the classic example. https://www.lesswrong.com/posts/RgkqLqkg8vLhsYpfh/fake-causa...

mr_toad 6 hours ago [-]
The luminiferous aether theory (the analogous case for light, rather than phlogiston, which concerned combustion) made one crucial prediction: that the speed of light would vary depending on the observer’s motion through the aether. That prediction turned out to be famously wrong.
FrustratedMonky 8 hours ago [-]
It's a process.

You find some un-identified variables.

Form some hypothesis, try to narrow it down.

Sometimes it is a discovery, new particle, and sometimes it is nothing.

But that is how science works.

At some point in time, everything was an unknown, and people had to work with unknowns.

This whole movement from the 'right' insisting that all science has to know the answers ahead of time in order to justify spending money is hindering progress. How can you know the results are worthwhile enough to justify funding before doing the research that produces them?

RGamma 8 hours ago [-]
Primordial black holes.
jstanley 12 hours ago [-]
Don't forget phlogiston.
holowoodman 12 hours ago [-]
Virtual Particles!
bandrami 12 hours ago [-]
Was that de Broglie's thing? I always thought it didn't get a fair shake
holowoodman 12 hours ago [-]
Virtual particles and related effects are actually widely accepted and experimentally proven (at least partially). Current physics wouldn't really work without them, or at least something that looks the same.

https://en.wikipedia.org/wiki/Casimir_effect

https://en.wikipedia.org/wiki/Zero-point_energy

https://en.wikipedia.org/wiki/Virtual_particle

https://en.wikipedia.org/wiki/Hawking_radiation

The gist of it is that quantum mechanics prevents the vacuum from really being empty. Any finite-size system, or any system with some kind of influence/force/anything, will have a lowest energy state that is not actually zero energy but slightly above it. Which means that this non-zero energy can fluctuate and occasionally pair-produce and pair-annihilate particles (with the probability falling as the pair energy rises).
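
The textbook illustration of that non-zero floor is a single harmonic oscillator mode, whose allowed energies are (a standard result, quoted here for reference):

    E_n = \hbar\omega\left(n + \tfrac{1}{2}\right), \qquad n = 0, 1, 2, \ldots

so even the ground state carries E_0 = ħω/2 rather than zero; summing that over field modes is where the "vacuum energy" in the links above comes from.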

And yes, this sounds like some kind of ether...

tsimionescu 10 hours ago [-]
The Wikipedia article that you quote is quite explicit that, while virtual particles are a widely accepted mathematical tool, their actual existence as elements of reality is very much not widely accepted, and definitely nowhere close to "experimentally verified". It's in fact considered impossible to verify experimentally, even in principle.

Note that there are many very widely used physical theories that include mathematical elements that are not necessarily assigned any physical meaning. The Poynting vector in classical electrodynamics, for example, carries no widely accepted physical meaning, even though it appears in many well verified and used calculations. This doesn't make the theory suspect or anything, I'm not trying to imply that - simply that virtual particles being "real" or not is a mostly philosophical question that has no widely accepted consensus.

holowoodman 9 hours ago [-]
Those particles are virtual in that they don't really exist, so you are right that proving them isn't actually possible, because they are simply not there, only virtually, in our mathematical imagination. In quantum mechanics[1], though, this isn't really a "doesn't exist" kind of thing; rather, it means that the wave function is there, leading to the (slim) possibility of existence through some kind of wave function collapse.

What is proven is that e.g. vacuum energy / zero-point energy exists (not in the Stargate sense of extractable energy, just that the lowest energy state of any physical system isn't zero), and that the Casimir effect exists. Vacuum energy directly leads to virtual particles through pair production (a mechanism proven at high energies; for low energies we suspect there isn't a cutoff either), and it also influences e.g. high-energy cosmic rays, leading to an observed high-energy cutoff (although there are other possible explanations for that cutoff and for the lack of very-high-energy cosmic rays). The Casimir effect is most easily explained by virtual particles and vacuum energy.

In Hawking radiation, the idea is that virtual particles, through interaction with the black hole's gravity, become real particles. The event horizon effectively makes those wave functions collapse such that real particles start to exist. Hawking radiation hasn't been observed yet, however.

[1] Non-Copenhagen QM has the same consequences; it's just even harder to explain.

griffzhowl 10 hours ago [-]
You're probably thinking of the de Broglie-Bohm pilot wave theory, where there are actual particles with determinate trajectories at all times, probabilistically guided by a wave. I think the main problem with this idea is that it can't be made relativistically invariant, so it can only be used for systems whose components have low relative velocities.

OTOH de Broglie was responsible for one of the central ideas in the development of quantum mechanics: he inverted Einstein's idea about photons, which had previously been thought of as waves until Einstein showed they come in particle-like quanta. De Broglie realised you could apply the same thinking to matter, which had previously been thought of as particles, and describe it using waves. The subsequent observation of wavelike dynamics (diffraction) of electrons in the Davisson-Germer experiment earned de Broglie the Nobel Prize.

grumbelbart2 10 hours ago [-]
It has been back for a while in the form of quantum fields.
metalman 12 hours ago [-]
gravity=time
evanb 6 hours ago [-]
whoa dude, that means gravity = money by transitivity. deep.
layer8 2 hours ago [-]
So can we make hoverboards by losing money?
the_sleaze_ 3 hours ago [-]
And that means gravity is the root of all evil. And what is a root? The bottom of something. Therefore gravity is the distillation of pure evil. This is proved by the question: could there be evil without gravity? Nope. And where do we find the most gravity? Yep. Black holes.

Gravity is high entropy evil and Black holes are entrances to Hell.

brador 11 hours ago [-]
Antimatter is created, repelled, and expelled, leaving a vacuum; things get sucked into that vacuum, creating the illusion of gravity. That's my novel theory.
IAmBroom 8 hours ago [-]
Vacuums don't suck; high pressure repels.

Similarly, umbrellas aren't places to stand under when it's not raining.

john_moscow 14 hours ago [-]
Space exists around things with mass. Also, above-absolute-zero temperatures cause particles to jump around randomly.

Now if there is "more space" around particle A, particle B will have a slightly higher statistical chance of randomly jumping closer to it than farther away.

Rinse-repeat. Gravity as we know it.
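
A toy Monte Carlo sketch of that intuition (my own illustration, not from any paper: the made-up space_density function simply grows toward a mass at x = 0, and each jump is otherwise a coin flip weighted by the space available at the two candidate targets):

    import random

    random.seed(1)

    def space_density(x):
        # Assumed (made-up) density of available positions: more "space"
        # the closer you are to the mass sitting at x = 0.
        return 1.0 / (abs(x) + 1.0)

    def walk(start=50.0, steps=5000):
        x = start
        for _ in range(steps):
            w_left = space_density(x - 1.0)
            w_right = space_density(x + 1.0)
            # Jump left or right, weighted by the "space" at each target.
            x += -1.0 if random.random() < w_left / (w_left + w_right) else 1.0
        return x

    finals = [abs(walk()) for _ in range(200)]
    print("mean distance from the mass after the walk:",
          sum(finals) / len(finals))

Starting 50 units away, the walkers end up on average within a handful of units of the mass: purely random jumps plus a gradient in available space produce a net drift, which is the attraction being described. Whether real gravity works anything like this is a separate question.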

meindnoch 12 hours ago [-]
>Also, above-absolute-zero temperatures cause particles to jump around randomly.

Does it? A single free particle won't "jump around randomly". Thermal motion is plain Newtonian motion with an extremely high rate of collisions. There's nothing random about it (let's put quantum things aside for now).

AlexandrB 3 hours ago [-]
This made me think of Norton's Dome[1] and how a particle would choose a direction to move when warmed from absolute zero to above absolute zero. Though I guess, "warming" in this context would mean a collision with another particle and that would determine the initial direction?

[1] https://en.wikipedia.org/wiki/Norton%27s_dome

JPLeRouzic 13 hours ago [-]
It sounds a bit like Le Sage's theory of gravity:

https://en.wikipedia.org/wiki/Georges-Louis_Le_Sage

strogonoff 11 hours ago [-]
If space existed around things with mass, then what would you call the emptiness that replaces space the further you go away from things with mass?
bravesoul2 13 hours ago [-]
> particle B will have a slightly higher statistical chance of randomly jumping closer to it,

Why?

Also, how do you explain acceleration due to gravity with that model? And how do you explain solid objects?

MaxikCZ 12 hours ago [-]
My guess would be that the answer is right in the part before the bit you quoted? If there's more "space" (imagine more possible space coordinates) on my left than on my right, then jumping to a random location would statistically move me left.

Repeating this results in movement; getting closer to the object intensifies the effect, which results in acceleration.

Solid objects are products of electric charge preventing atoms/particles from hitting each other; I don't think that has anything to do with gravity in this example?

bravesoul2 8 hours ago [-]
I don't understand the "more space" thing then. Is this more space due to spacetime curvature, or something else?

E.g. if we have earth and moon:

    O   o
Why is there more space from the moon towards earth than away?
jblezo 6 hours ago [-]
Spacetime curvature.

Like if you dropped the Earth on a giant sheet, it would stretch the sheet more than the Moon would.

enriquto 13 hours ago [-]
Sounds fun!

Would this imply that cold objects have weaker gravity?

psittacus 13 hours ago [-]
Isn't this something we already know from the mass–energy equivalence? In the same way that a nuclear reaction that produces heat must cost the object mass (and therefore gravitational pull)
Quarrel 12 hours ago [-]
It does, but because you have to divide the energy change by c^2, it is really, really hard to detect, and it's mostly overwhelmed by other effects of the heating/cooling.
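
For a sense of scale (the numbers below are round values I picked, not from the thread): cooling a kilogram of water from boiling to freezing sheds about 4×10^5 J, which corresponds to a mass change of only a few nanograms.

    # Rough scale of the thermal mass change being described (assumed values).
    c = 2.998e8                     # speed of light, m/s
    delta_E = 4184.0 * 100 * 1.0    # 1 kg of water cooled by 100 K: ~4.2e5 J
    delta_m = delta_E / c**2
    print(f"{delta_m:.2e} kg")      # ~4.7e-12 kg, i.e. roughly 5 nanograms

That's a few parts in 10^12 of the object's mass, hence "really, really hard to detect".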
enriquto 11 hours ago [-]
Why do the units matter here? Under this theory, would a body at absolute zero have no observable mass? No attractive field around it, no inertia if you try to move it?
Woansdei 13 hours ago [-]
Sounds more like the reverse to me: movement away from denser areas (less space), like water leaking out of a container.
amai 4 hours ago [-]
So entropy can curve spacetime? My fridge locally lowers entropy; does that mean gravity inside my fridge is lower than outside?
cwharris 8 hours ago [-]
This seems backwards. Entropy is a dispersive force — it favors distribution and disorder. But the universe clumps. Planets, stars, galaxies — all of them are low-entropy configurations.

So how did scattered dust particles form the planet we’re standing on… through entropy?

If gravity is just emergent from entropy, then it should be fighting against planet formation, not causing it. There’s a missing piece here — maybe coherence, resonance, or field attraction. But “just entropy”? That doesn’t explain formation. It explains dissolution.

jetrink 6 minutes ago [-]
> Entropy is a dispersive force — it favors distribution and disorder. But the universe clumps. Planets, stars, galaxies — all of them are low-entropy configurations.

Entropy does not always imply dispersion or mixing. Your intuition becomes incorrect in the presence of a field like gravity. Take, for example, a set of marbles balanced on the rim of a bowl. Spatially, it's disordered: they are spread out randomly around the rim. You tap them gently and they all roll to the center of the bowl. You've made the room more tidy! But in the process, the potential energy of the marbles has been released as heat. The entropy of that heat is greater than the entropy you've reduced by organizing the marbles.
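
Rough numbers for that marble picture (all values below are assumptions chosen just to show the scale): the configurational entropy you erase by tidying N marbles is at most of order N·k_B·ln(W), while the potential energy N·m·g·h released as heat into a room at temperature T carries enormously more entropy.

    import math

    k_B = 1.381e-23    # Boltzmann constant, J/K

    # Assumed toy numbers: 20 marbles of 5 g each, dropping 5 cm, room at 300 K.
    N, m, g, h, T = 20, 0.005, 9.81, 0.05, 300

    # Entropy removed by "tidying" the marbles: ~ N * k_B * ln(W),
    # with W a (generous) count of distinguishable positions per marble.
    W = 1000
    dS_config = N * k_B * math.log(W)

    # Entropy added when the released potential energy ends up as heat at T.
    dS_heat = N * m * g * h / T

    print(f"configurational entropy removed ~ {dS_config:.1e} J/K")
    print(f"entropy of the released heat    ~ {dS_heat:.1e} J/K")
    print(f"ratio ~ {dS_heat / dS_config:.0e}")   # heat wins by ~17 orders of magnitude

So the marbles rolling to the center reduce spatial disorder by a microscopically tiny amount while increasing total entropy by a huge one, which is the point of the example.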

heyjamesknight 7 hours ago [-]
Entropy isn't a force. It doesn't "favor" anything. It's a property of statistics, information, and distributions.

Also why does this have that particular ChatGPT social media post rhythm to it? Please, Lord, tell me we haven't reached the point where people are writing HN comments w/ AI.

cwharris 6 hours ago [-]
You’re right; the cadence is written by ChatGPT. I’m pretty terrible at keeping my thoughts cohesive, so I often use it as a post-processor. I’ll try not to do that.

Because you had the decency to respond, I’ll spend some more time thinking about this and see if I can come up with a more well-rounded response that incorporates more of the traditional language of physics. But to your point about entropy not being a "force", you're probably right. Someone got to choose what that word means, and I'm probably not using their definition. But let me ask you this… would you rather have a book that explains everything and not know how to read it, or trust your own eyes ears and hands, and not be able to share it?

prophesi 6 hours ago [-]
> would you rather have a book that explains everything and not know how to read it, or trust your own eyes ears and hands, and not be able to share it?

Maybe use AI to help you understand TFA instead of writing gut reactions to the title.

cwharris 5 hours ago [-]
I... Do... With quite a lot of articles... And I build semiconductors in my garage using what I learn.

I just don't see how this particular article would be beneficial, even if it _were_ correct.

prophesi 4 hours ago [-]
Are these the very same semiconductors writing your comments? NGL, a 9-year-old account with 0 posts that suddenly starts posting AI-assisted comments is very suspicious.
cwharris 3 hours ago [-]
The problems I wanted to solve couldn't be solved with software, so I started researching ceramic semiconductors, starting with positive-temperature-coefficient ceramics as fail-safe heating elements. The geometry I needed to manufacture for that project wasn't scalable in a way that solved the problem for enough people, so I switched to working on DC cold atmospheric plasma. Saw enough progress there to convince myself it's possible, but I wasn't happy with the current HV supplies on the market, so I'm working on making my own, based on converting compressed argon into a high-voltage source that self-regulates based on plasma generation, in an attempt not to exceed metastable argon charge levels, which would produce NOx and ozone at completely harmless levels (unless maybe you're using it during surgery) but are heavily regulated.

It's uhh... been a ride.

But yes, posting to Hacker News is a new thing. Because I'm seeing the limitations of the world we live in through the lens of someone who's been in industry long enough to know how slow controlled progress is, and I'm beginning to see what happens when you apply semiconductors to more than just microprocessors, on your own terms. The world is stagnating, not because we don't have what it takes to bring about the future... but because 1) we do, 2) the people in control don't care to make it happen, and 3) everyone has their hands tied up in corporate profits while we wait for someone to make a move that makes things better.

I'm just... done waiting.

bee_rider 6 hours ago [-]
We’ve definitely reached that point. I’ve seen responses that are essentially,

Well, here’s what ChatGPT has to say:

<begin massive quote>

If folks are doing that, then I assume they are also quoting it without citation—although, I have no idea about this case. It looks sort of rambling for ChatGPT, doesn’t it?

echelon 6 hours ago [-]
Because it has em dashes and ellipses that Chrome, Firefox, Mac, Linux, and Android text input controls do not natively produce.

I don't know about iPhone.

If you see these artifacts, it was authored in some other software (be it an editor or LLM) and pasted over.

trealira 6 hours ago [-]
Android does let you produce em dashes. I'm typing this with Google Keyboard right now.

If you hold the hyphen button, you get options for an underscore, an em dash (—), an en dash (–), and the dot symbol (·). The ellipsis (…) can be written by holding the period button.

But yeah, the commenter admitted it was authored by AI. But even if you converted all the em dashes to "--", it would still have a ChatGPT cadence.

> There’s a missing piece here — maybe coherence, resonance, or field attraction. But “just entropy”? That doesn’t explain formation. It explains dissolution.

Even ignoring the em dash, it just screams ChatGPT.

cwharris 5 hours ago [-]
I like how people are recognizing "OH THIS IS TOKEN OUTPUT", and that's like... the only thing you can come up with to refute the argument?

Like not the actual content or meaning of the text, which I chose to post, but the mere fact that it wasn't typed directly into a Hacker News text box, but rather pasted after using a tool to refine the verbiage.

Honestly? Great test for most posts. We live in a world surrounded by people who are just copy and pasting ChatGPT answers like a magic 8 ball, and I respect your instinct to try to avoid those.

But that's not how I use ChatGPT, because I understand how language models work and choose to use them intentionally to help me navigate ideas and learn new concepts (as well as write memos after the fact). Not just take a hollow "sounds good" response and post it to take up people's time.

:shrug:

cryptonector 6 hours ago [-]
iPhone definitely turns `--` into `—`, at least sometimes.
cwharris 5 hours ago [-]
Here's the non-ChatGPT rant that I was attempting to not spew all over the internet.

> “There’s some kind of gas or some thermal system out there that we can’t see directly,”

Posit that there’s something we don’t know about, and we’re supposing it’s gas-like. This is what I like to refer to as “imagination”, and it’s a great way to start thinking about problems. The fact that it’s showing up in an article suggests they didn’t get much further than imagination, but I’ll keep reading…

> “But it’s randomly interacting with masses in some way, such that on average you see all the normal gravity things that you know about: The Earth orbits the sun, and so forth.”

Cool. We’re back on everything being “random” again. Modern interpretations of quantum mechanics have really torn a hole in the idea of causality by replacing it with the idea that we can’t explain why things happen, but we CAN model them statistically, so we’ll assume the model is right and stop looking for causal relationships entirely. It’s lazy, pessimistic pseudo-science, and I don’t buy it. I don’t outright REFUTE it, but I’m not basing my understanding of nature on it just because a bunch of smart people decided to stop looking.

On the paper the article refers to:

> Consider a pair of massive pistons with a non-interacting gas between them, as in Fig. 1.

Cool. Happy to consider it. But I am curious… Are there existing examples of particles that do not interact with particles of like kind? Neutrinos and Photons come to mind. But has anyone proven that they don’t interact, or are we just assuming they don’t interact because we haven’t put the effort in to try and detect interactions? But sure, let’s consider the possibility.

> What this exercise demonstrates is that the two pistons feel an effective force between them, namely the pressure, which is mediated by the gas rather than some fundamental quantized field.

Honestly? I love this. I don’t care about “fields” at all, personally. I feel like it’s more intuitive to think of fields as reinforcement of particle interactions over time and space. An electron moves? So do all of the others. A lot of them move the same way? The others feel that combined movement at a distance, according to C. Magnetic flux? Interplay of electron inertia reinforcement delayed by the time it takes for the repulsive forces to make their way around a coil (or whatever other medium, according to its influence) and allow spin to align. Falsifiable? Yes. Relevant intuitive observation? Yes. Taken the time to write out the math myself in languages I don’t know? No.

> <… lots of math that proves individual hypothetical (sorry, theoretical) particle interactions can explain how gravity emerges…>

Cool. I’m almost certain that if I took the time to check their math, it would be meaningfully accurate and absolutely show that this is a way you can view gravity.

But let me ask you… Why the hell would anyone want to think about gravity like that, and why are we trying to explain things in terms of entropy when it clearly has no applications outside of “well, I guess everything is left up to chance, and there’s nothing left to be discovered.” I reject this hypothesis. I reject the idea that everything we see, feel, hear, and know was at one point non-existent, and somehow emerged at this level of complexity such that we are capable of not only cognition but also direct observation of physical phenomena while simultaneously being physical phenomena ourselves. There is something else. And no, it’s not “God”. But it sure as hell isn’t “everything’s just falling apart in interesting ways”. And I get that that’s not the “full idea” behind entropy, but it is entropy’s brand identity, and it is the implication behind entropy as the driving force of nature (sorry, I used force again. I forget we're not allowed to say that about the thing we're using to explain how all of the real forces emerge. My bad). Heat death of the universe as a result of entropy? I’m on board. Red shift? I get it. Entropy is a great “welp, I guess that’s the reason again”, but the mental gymnastics it takes to represent gravity as a result of this? Give me a freaking break.

There’s a simpler explanation for all of this that models well across domains, and nobody is willing to admit it because it doesn’t fit the narrative. Phase-lock. Waveforms that mesh together in torsional spacetime reinforce each other, sometimes purely locally through identity changes (fusion), and sometimes via interdependent standing waves (non-fundamental particles, atoms, molecules, etc., etc.). Entropy is just what happens when coherence fails to resolve locally and must resolve non-locally (chemical interactions, fission, dielectric breakdown, photoelectric effect). Most things can be modelled this way: as stable geometric configurations of quantum wave functions representing self-reinforcing torsional spacetime harmonics. And if you take a second to consider it, maybe this single paragraph _is_ a more intuitive explanation of gravity, too.

ajkjk 7 hours ago [-]
There is a whole article explaining it... if you don't read the article, how do you expect to know the idea?
cantor_S_drug 6 hours ago [-]
Actually Roger Penrose also had this line of thinking if my memory serves right.
cryptonector 6 hours ago [-]
You have it backwards. The lowest entropy state of the universe would be if there were no attractive forces, only repellent forces, as then all particles would be forced into something of an expanding lattice, but with all particles equidistant from all nearest neighbors (of the same type).

It is gravity which disrupts this and causes clumping, and that _increases_ entropy.

I know it's confusing because normally one would think of a cloud of gas as more disordered than the star it might collapse into, but that is not so. For one the star would be much hotter, and the motions of every particle in the star much more chaotic.

konschubert 7 hours ago [-]
This is not a philosophical discussion
cwharris 5 hours ago [-]
This paper is literally physical philosophy. To be science, it would require recursive experimentation, observation, and adjustment to hypothesis, until the model it proposes becomes a stable, reliable, and (most importantly) useful interpretation.

It does none of that, and so I have no responsibility to do so prior to discussing it.

neuroelectron 7 hours ago [-]
The speed of light is C, a constant. Mass is composed of these particles that are bound by C. Because they are vibrating, a lot of that speed is being wasted in brownian motion. So the denser it is, the more your average vector is going to be toward more dense brownian motion as the particles interact and induce more brownian motion. The gradient has a natural sorting effect.

Seems pretty intuitive to me. The question remains, though: what is this density made of, since gravity exists in a vacuum? Quantum fluctuations popping in and out of reality? Does this imply that quantum fluctuations are affected by mass as well? It would seem so, since in a Bose-Einstein condensate, what is "communicating" the state across the BEC if the particles are no longer interacting?

steamrolled 7 hours ago [-]
> Because they are vibrating, a lot of that energy is being wasted in brownian motion. So the denser it is, the more your average vector is going to be toward more dense brownian motion as the particles interact and induce more brownian motion ... Seems pretty intuitive to me.

So this is why warm objects weigh more?

Xcelerate 6 hours ago [-]
Warm objects actually do weigh more than their counterfactual cold versions, haha. The stress-energy tensor is the quantity to look at here.
neuroelectron 6 hours ago [-]
I didn't know this, thanks for sharing.

https://herebeanswers.com/things-weigh-heavier-or-lighter-wh...

patcon 6 hours ago [-]
I feel like you have somehow found the least authoritative source for the wonderful new information provided...

Why did you choose that one? Serious question, because I'm trying to understand your process (and yes, maybe gently challenging it, but I'm unsure if I should, because you are clearly knowledgeable in specific ways about this).

neuroelectron 6 hours ago [-]
Thanks, I appreciate it
parineum 6 hours ago [-]
This reads like a sarcastic quip, so sorry if it wasn't, but they do. Solve for m in E=mc^2 and see what happens when objects have more energy.
danparsonson 7 hours ago [-]
> Seems pretty intuitive to me

OK, but it's nonsense. Apart from whatever-you're-talking-about-with-C, quantum fluctuations are not Brownian motion; Brownian motion is the visible effect of a lot of invisible particles interacting kinetically with macroscopic particles like dust, making those macroscopic particles appear to vibrate of their own accord. Atoms that cannot be seen in a microscope flying around in straight lines and randomly bumping into dust particles that can be seen.

https://en.m.wikipedia.org/wiki/Brownian_motion

ajkjk 7 hours ago [-]
Doesn't sound intuitive at all really...
neuroelectron 6 hours ago [-]
You can see it in action with a simple random walk program. Allow the steps to decrease in size toward one side of the screen, and the walker will statistically end up sorted toward the shorter-step side.
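
A minimal version of that program might look like the following (my own sketch, purely to illustrate the sorting effect described, not gravity itself; the step_size function and the [0, 100] range are arbitrary choices):

    import random

    random.seed(0)

    def step_size(x):
        # Steps shrink toward x = 0 and grow toward x = 100 (arbitrary choice).
        return 1.0 + 0.1 * x

    x = 50.0
    time_in_lower_half = 0
    STEPS = 200_000
    for _ in range(STEPS):
        x += random.choice((-1.0, 1.0)) * step_size(x)
        # Reflect off the edges of the [0, 100] "screen".
        if x < 0:
            x = -x
        if x > 100:
            x = 200 - x
        if x < 50:
            time_in_lower_half += 1

    print("fraction of time on the small-step side:", time_in_lower_half / STEPS)

The fraction comes out well above one half: the walker piles up where the steps are short, which is the statistical "sorting" described above. Whether that has anything to do with gravity is, of course, the open question.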