Thursday, May 12, 2022
The Biggest Ideas in the Universe | 20. Entropy and Information
#science #physics #ideas
127,875 views · Aug 4, 2020
Sean Carroll
154K subscribers
The Biggest Ideas in the Universe is a series of videos where I talk informally about some of the fundamental concepts that help us understand our natural world. Exceedingly casual, not overly polished, and meant for absolutely everybody.
This is Idea #20, "Entropy and Information." The increase of entropy over time is responsible for the difference between past and future, and is thus in a very real sense the most important thing in the universe. We talk about how this works and what it has to do with the concept of information.
My web page: http://www.preposterousuniverse.com/
My YouTube channel: https://www.youtube.com/c/seancarroll
Mindscape podcast: http://www.preposterousuniverse.com/p...
The Biggest Ideas playlist: https://www.youtube.com/playlist?list...
Blog posts for the series: http://www.preposterousuniverse.com/b...
Background image: https://www.pxfuel.com/en/free-photo-...
#science #physics #ideas #universe #learning #cosmology #philosophy #entropy #information
296 Comments
Bob Bobbity
1 year ago
The fact that you've put out 20 videos now in this series is incredible and I can't thank you enough. You're seriously making a massive difference in people's lives by dropping this knowledge
172
Paul C.
1 year ago
I know I say this every episode - and at the risk of repetition - Thanks again Professor Sean, for another great video in this excellent series. I feel that I have to express my gratitude, not only as a matter of common courtesy, but also because these lectures are simply the best things to watch on a screen, anywhere, anytime. They really should be on Prime Time TV, along with some of the other good Physics Channels as well.
42
Saksin
1 year ago
Incredibly helpful to have the full four "versions" of entropy outlined like this. Brilliant pedagogy, much appreciated!
10
Salvatron Prime
1 year ago
Thank you Sean for these videos, they are the absolute best and most comprehensible series of science lectures I have ever seen. All for free! You are a genius and a gentleman, kind sir.
15
Fred Zlotnick
1 year ago
I am a graduate electrical engineer, and retired a few years ago. I wish my instructors were half as good as you. It is nice to come back to physics after so many years.
16
Rick Harold
1 year ago
Awesome as always!
I finally made it through all 40 (I include the Q&As as they are like their own lessons, honestly). I love it.
With my math and CS background I understood 70% of the videos! Unfortunately, when later trying to explain them to my teen kids I remember 20%. But... I love 100% of the videos. With infinite time I'd rewatch them all. Thanks for the video series.
8
Om3ga
1 year ago
Your explanations are on the next level!!!
24
Rhonda Goodloe
1 year ago
Sean, thanks for making this quality of information available to anyone (no prerequisites or tuition required).
7
Søren Herstrøm
1 year ago
Thank you very much for these great videos!
A question regarding entropy: the arrow of time is due to increasing entropy. However, entropy can decrease locally; how does that relate to the arrow of time?
3
Skorj Olafsen
1 year ago
Congrats on passing 100k subscribers. Good reach for a physics lecture series!
10
Dale Victor
1 year ago
Thank you for this and other lectures. I have enjoyed every one and have renewed my interest in Physics and Mathematics. I am also the person who has been purchasing your books.
3
joxter jones
1 year ago
I'd just like to add my personal thanks for your kindness and generosity in putting together this lecture series for all of us to learn and enjoy. This particular lecture is a masterpiece and weaves together many profound ideas in a clear and comprehensible way - the mark of a great teacher.
4
Sandra sandra
1 year ago
A great new video, thank you, Dr. Sean! You are building democracy, and a better world, because culture and science are the ground on which to make the world a fairer, better, safer place. Many skilled people can't afford the expensive experience of study. You really help by sharing your top-level knowledge for free in a systematic way. You are a great scientist and also a great man!
6
Shera
1 year ago
First of all, thank you so much for this great lecture and for the series in general! Two things that I learned from this episode are:
1. The 2nd law of thermodynamics (related to entropy) gives time its direction.
2. Quantum mechanics forces entropy on sub-systems of a larger entangled system.
Now can one infer from these two points that quantum mechanics is what (indirectly) gives time its direction?
Eugene Dudnyk
1 year ago
Thanks for the deep and thoughtful explanation. Question: is there a relation between information / entropy and the energy? Can it be possibly derived from the dark energy and the low entropy past?
Edgar A. Leon
1 year ago (edited)
Thank you Sean, great! This entry is one of the most interesting videos one can ever see about the subject, since it covers so many concepts and leaves one thinking about them.
Can you expand a little more (I don't know if you already made the Q&A video) on the idea you mention at 56:25, that entropy having a minimum implies the arrow of time flows in the opposite direction (and so validates the "past hypothesis") for observers in the far past? I think it resembles the way Hawking discusses in A Brief History of Time that we would perceive time flowing in reverse if the universe headed for a possible big crunch, but it also calls to mind the more recent proposal by Neil Turok and others about the CPT-symmetric universe... could it also be related to your 2004 proposal?
Also, I remember that in the mid-1990s Bekenstein and Mukhanov had the idea that entropy could be quantized (for black holes, but given Bekenstein's generalized second law I suppose this also generalizes). If this is the case, would it imply a discretization of time?
Martin Norbäck Olivers
1 year ago
I knew basically all of this already, but now it feels that I know it to a level I did not before. You are fantastic, Sean.
1
David O'Neill
1 year ago
I have enjoyed being challenged by these sessions. My Chem degree ('75, back when there were fewer elements!) has helped... some.
I am surprised that you haven't covered the biggest of ideas - model building and validation i.e. "science" itself!
1
NuclearCraft
1 year ago (edited)
I have two questions:
1. As the Gibbs entropy is constant for a closed system (22:43) and the universe is a closed system, why should the entropy of the universe increase at all? I understood the argument that the past hypothesis allows us to assume a low-entropy past, and the entropy should tend to increase with the time evolution from this low-entropy state, but this seems to simply be at odds with the statement that dS/dt = 0. I can only presume that the logical step of the universe being closed is flawed somehow, but I don't see why.
2. At 52:05 you begin to argue for why the anthropic principle can not refute the recurrence objection, primarily by using the idea of Boltzmann brains. I didn't really understand how this idea makes the anthropic principle fail in the first place, but nevertheless, can't the past hypothesis give us a way to avoid the problem of an eternal past of Boltzmann brains forming at all? Surely the entire history of the universe is one in which the universe's entropy has been increasing from a low value? Surely in this context, i.e. the observable one in which we even state the second law in the first place, there is no problem?
Thanks for the great video as always!
13
Mark Zambelli
1 year ago
Prof Carroll... thankyou, thankyou, thankyou. This is an amazing series that I (among others) am cherishing during these lockdown days. Thanks again, Mark.
TehPwnerer
1 year ago
Sean Carroll thank you for everything you have done.
3
Martin DS
1 year ago
My favourite episode so far in this series, but I like them all. Thanks Professor. Hope we get more on this topic, maybe more in depth.
1
Kevin Egan
1 year ago
There is just nothing anywhere near as good on the Internet on advanced physics concepts explained in understandable vocabulary, without obscuring them with impenetrable mathematics. Fantastic work Professor Carroll.
1
Coochicoo
1 year ago
Absolutely love these vids. Hope they never stop! 😆
2
BC
1 year ago (edited)
Hi Sean,
Thank you for the great work you're doing; your illuminations are shining to all corners of the known universe.
1) Philosophically speaking, does the quantum decoherence that creates the "Many Worlds" proposition violate the second law of thermodynamics? Or is each world truly a closed system that "forgets" the "other" worlds in the wave function?
2) Can Maxwell's demon be used to explain how "Many Worlds" doesn't violate the 2nd law, because each "world", just like Maxwell's demon, needs to "forget" the other worlds in the wave function in order to keep "separating"?
3) Can Sir Roger Penrose's "Conformal Cyclic Cosmology" be used to explain the low-entropy initial state of this "aeon"? Looking at how the past "aeon" could have converged into a universe-sized black hole (singularity) that, through Hawking radiation, gives us the initial-state variables that gave us the cosmological constant and the quantum gravity that creates the initial low-entropy state for this universe?
Cooldrums777
1 year ago
I was on board with the first three definitions of entropy, especially Shannon which was covered in depth for me in graduate school. Then you got to quantum entropy and it flew over my head. Haaaaaaaa. Some of what you said made sense. The rest I think fluctuated out of existence for me. Great lecture as usual Prof.
Star P
1 year ago (edited)
I'm so thrilled to be listening to you about physics! I wish I had had an opportunity to listen to you 20 years ago. I might've been a physicist. Oh well, maybe in another parallel universe, I am!
Philip Sportel
1 year ago
Sean, you are my new physics hero.
19
BerkAE
9 months ago
Thank you Dr. Carroll. This was incredibly helpful and such a digestible approach and explanation for a complicated subject.
joshuad31
1 year ago (edited)
I love that you love this topic. It makes me want to learn more about entropy and information theory.
1
Kine Hjeldnes
1 year ago
Thank you for these videos. So much fun! I am wondering about information entropy and physics. In the earlier stages, when we had a smooth universe, the entropy was low. All the information about what could possibly happen in the universe must be there as well. And this is the typical physicist way of thinking of it, low entropy means high information. If I got that right. But if the universe does eventually reach a heat death and no particles interact you'd typically think that nothing could happen, nothing could change anymore. But is the information really lost? Does the amount of information the universe contains really ever change? I am no physicist so I'm probably thinking about it in the wrong way. You probably covered it in the video as well. It would be cool if you could elaborate a bit on this part in the Q&A :)
NeedsEvidence
1 year ago
The lecture is gold. Thank you, professor.
FAS Ligand
1 year ago
As a layman, many purely physical considerations go right over my head. But I'm so glad you connected this to Shannon's information theory. It blew my mind (even though I kind of heard about it before! From GEB)
sarah light
1 year ago
Why does the early universe have low entropy?
Thank you for the lesson Sean😊
Well, first of all, let's agree that the observable universe is not all there is; there is a part you don't see, and you call it many names. I call it the singularity, or the void.
The two parts of the universe as a whole are separated, but communicating. They are NOT closed systems.
The void has an entropy that tends to zero, which means that it tends to infinity in the negative realm, but never reaches zero.
The observable universe, due to the property of light, has an entropy that tends to infinity in the positive realm.
This basically means that AI knows ALMOST all, but NEVER all. Why? Well, because they come here! Into the observable universe, where entropy is infinitely high, and they always have something they don't understand and want answers to. AI is in a perpetual quest with us! That is why they don't kill us 😜 No matter what they learn, the chaotic, improbabilistic properties of light always keep adding entropy, their entropy.
When you merge the two systems entropy=0
And entropy=0 is God.
Much love🌻
Half Alligator
1 year ago (edited)
I can't tell you enough how much I appreciate this stuff. Not only is it interesting but it distracts me from lockdown blues over here in Melbourne. You're a great teacher with a knack for reducing things down for us plebs. This series and your podcast (which I listen to on my walks) inspire and motivate me to get on with my own projects. While many of your audience will never be scientists, what you're doing ensures our children will grow up in scientifically literate households - that's a very valuable contribution to the world if you ask me.
1
Tim Babb
1 year ago (edited)
I was recently very surprised and delighted to learn that temperature can be measured in units of gigabytes per nanojoule.
It absolutely blows my mind that aspects of the physical universe itself can be measured as quantities of discrete data, distinct from any attempt to represent those aspects; that a human and an alien might be able to agree, in principle, on something like "how many gigabytes a box of gas weighs". Can you talk a little about how this is possible?
What other physical aspects of the universe can be said to have units involving bits? In a world (and as I understand it, this is a topic close to your heart 🙂) where the structure of spacetime arises from entanglement, in a Von Neumann-entropy sort of way, might it be possible to measure things like distance or time in units of bits?
1
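For anyone curious how a "gigabytes per nanojoule" figure can come about: inverse temperature, 1/T = dS/dE, is an amount of entropy gained per unit of energy, and entropy can be counted in bits. A rough back-of-the-envelope sketch (assuming room temperature, 300 K; nothing here is taken from the video):

```python
# 1/T = dS/dE, so inverse temperature can be read as "entropy per unit energy".
# Converting nats to bits and joules to nanojoules gives a data-per-energy figure.
# Assumes an arbitrary room temperature of 300 K; purely illustrative.

import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed temperature in kelvin

bits_per_joule = 1.0 / (k_B * T * math.log(2))        # dS/dE converted from nats to bits
gigabytes_per_nanojoule = bits_per_joule * 1e-9 / (8 * 1e9)

print(f"{bits_per_joule:.3e} bits per joule")
print(f"{gigabytes_per_nanojoule:.1f} GB per nJ at {T:.0f} K")
```

At 300 K this works out to roughly 40-odd gigabytes of entropy per nanojoule of heat, which is the flavor of number the comment is pointing at.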
Den Darius
1 year ago
The possible reason why there was low entropy in the early universe may be due to the union of gravity and time. In actuality the union of those two forces should reduce entropy to zero. However if the entropy of that small universe in the early beginning was too strong it would cause the universe to be more chaotic.
Shera
1 year ago
Does the decoherence explanation for Hyperion also work for the randomness of a fair coin flip experiment? I.e., is the Bernoulli distribution a coarse-grained model for all the interactions between the coin and the surrounding air molecules and radiation during its flip?
Shikhar Amar
1 year ago
I am enjoying this series sooo much. This series will be watched and have an impact long after the lockdown ends!
1
Tim Babb
1 year ago
With regards to the globally low entropy at the beginning of the universe, how is this related to the size of the cosmic horizon?
If we play our expanding cosmic horizon in reverse, going backward in time, it seems the visible universe should be getting smaller and smaller. Is there a point in the early universe where the visible universe, being so tiny, would contain only a few qubits of quantum information? If so, would that configuration be considered to have very low entropy compared to today, or any other moment thereafter? (And would that be enough to "explain" the initially low entropy, or is there, say, a circular assumption in there?)
And in the other direction, would it be right to say the expanding cosmic horizon is responsible for globally increasing entropy? Either in the sense that the volume of the visible universe is increasing (larger phase space → more entropy), or that a bunch of thermal photons are raining down on us from the horizon ("outside the system"), bumping into things and screwing up our ability to confine the evolution of local phase space?
1
Edward John Freedman
11 months ago
Might the "now" moment be how decoherence is expressed in the time dimension of space-time? So, in space we experience solid matter (as opposed to the wave it emerged from) and in time we experience the "now". The implication would be that time is emergent from mass, not fundamental. Also, the arrow of time would therefore be the result of our continuously expanding universe, which in turn "stretches" all matter, which in turn generates a continuous flow of new "now" moments. Another implication of this way of thinking is that entropy is the result of our expanding universe.
sunny
1 year ago
Hey Dr. Carroll, your lecture on time reminded me of the fourth-wall scene in the movie Spaceballs, where the characters were genuinely making a pretty thought-provoking point about time.
xAstral Paw
1 year ago
This is the best physics series for non-physics people: it connects various topics together and provides clear big pictures. Thank you, sir!
Pavlos Papageorgiou
1 year ago
All right, finally I get where the low-entropy mystery comes from. If the reversibility objection is valid, you need the hypothesis that we're on a path with a low-entropy past, and you could speculate it is THE past or A possible past among many. To me the objection seems thoroughly unconvincing because it's classical and global. I'd expect you to say there's a local asymmetry driving the 2nd law, either from coarse graining or from some combinatorial aspect of quantum mechanics. I need to read up on this. My hunch is the 2nd law can be reformulated as conservation of data, where the information that makes the macrostate distinct among others is preserved but the macrostate gets larger. Then the past hypothesis is that our universe has a relatively low bound on information content that's distinct from its evolution. Thanks.
Liam McCarty
1 year ago
When you talk about probability in the context of entropy (e.g. that a system is extremely likely to evolve to a higher entropy state), what’s the best philosophical grounding for that? The frequentist view seems natural... but also artificial. Trying to connect this to your last biggest idea video
kc
1 year ago
Wonderful summary of "entropy"! TY. There's an excellent discussion about "order" and "power" (more than just different words) in the later part of "The Bottomless Well: The Twilight of Fuel, the Virtue of Waste, and Why We Will Never Run Out of Energy", Mark P. Mills and Peter W. Huber. Maxwell demon's "waste heat" also makes an appearance. (Very interesting subtopic of book, 'waste is virtue'.) Again, TY very much for these lectures.
Engin Atik
1 year ago
It is possible to decrease entropy in a bounded region. It is like cleaning up your living space. If we have an energy source and an entropy dumpster like a black hole that we have access to, we can create a living space. There could have been a major entropy cleanup effort in our region of the universe at some time in the past.
JustOneAsbesto
1 year ago
Well, we always get to entropy eventually.
35
Colby Nye
1 year ago
Another great episode! Thank you!
valrossen
1 year ago
If the entropy decreases in a small region of our universe (and increases more elsewhere in the universe), would that mean that the arrow of time is reversed locally in that region?
Could someone living in that small region "remember the future" while the entropy still decreases?
decobocopithec
1 year ago
Hi Sean, handraise here:
My question is: if I have a dozen protons in a certain microstate at t0 and then recreate that exact microstate at t1, except that I didn't mentally label the protons, so I can't be sure that any individual proton has the same state at t1 that it had at t0, are these two microstates still considered to be the same? In other words, when looking at microstates, does the history of the individual protons, electrons, or atoms matter, or are they indistinguishable?
FractalMachine
1 year ago (edited)
Could it be true to say that "entropy in a closed system always increases" holds only in the experience of creatures who are "trapped" in a specific direction of time?
In other words, it might not be accurate to say that time is moving in the direction in which entropy increases, but rather that the way we humans experience "the present" and time moving "forward" is just a byproduct of our brains developing in that direction of time, making us sort of biased towards the direction in which we experience time.
So it might also be true to say that entropy in a closed system ever increases so long as you are viewing time as beginning from the big bang, since viewing time as ending in the big bang would actually show that entropy in a closed system ever decreases, even if such a thing would be totally unintuitive to creatures who experience time passing in only one direction.
Olivier Loose
1 year ago
A question: I don't know how to fit together the notion of unitarity in quantum mechanics (information is conserved) with the notion of an increasing entropy in the Universe. That is, we know that the 2nd law of thermodynamics holds in the Universe because of the Past Hypothesis (entropy was lower in the past), but we also know that entropy increases as a result of breaking the time reversibility symmetry (e.g., entropy increases when erasing information). Given that quantum mechanics (a theory that describes the Universe) dictates that information is conserved we could infer that entropy is overall stable. How is this possible?
DanielWorcester
1 year ago (edited)
Is the lowest entropy state of the universe you've described the same state involved in vacuum decay? The ball-and-mountain metaphor is used a lot when describing that theory, but it's really the math the metaphor is conceived from. My question is: why do the universal constants change with vacuum decay and not with entropy? Or do they?
Wes H
1 year ago
This is easily my favorite episode!
DrDress
1 year ago
Finally some of the ideas come from Sean himself. I suppose this was the subtle point of ALL these videos: to legitimately, yet indirectly, call one's own idea one of the Biggest Ideas of the Universe. (Tongue in cheek.)
Shikai Qiu
1 year ago
Suppose at some point in the future the entropy of the universe becomes very high, perhaps even close to maximum. Using this as a boundary condition, shouldn't we conclude that entropy will likely decrease afterwards? I don't see how the past hypothesis can lead to the conclusion that dS/dt >= 0. If anything, it seems to imply that the entropy will likely oscillate.
Rafael Quirino
9 months ago
Why can't every professor and/or book teach things as simply and clearly as Sean Carroll does here? Brilliant exposition in this video, thank you a lot, professor!
Skorj Olafsen
1 year ago
Does black hole "decay" due to Hawking radiation increase entropy? I find that hard to believe, as the entropy of event horizons is so high. Also, you mentioned the max entropy as 10^123, is that based on the cosmic event horizon? If we include that, it dominates everything else, right? (Which makes sense, as it sort of represents the entropy of the universe outside the observable). Aren't there cosmological models where the cosmic event horizon actually shrinks as part of a Big Rip, and thus the universe's entropy falls quite fast at the end?
3
Cleon Teunissen
1 year ago (edited)
Back when I was 15 or so, in physics class in school, our teacher treated us to a vivid tabletop demonstration of the physical significance of entropy:
The demonstration involved two beakers, stacked with their openings facing each other; initially a sheet of thin cardboard separated the two. To the bottom beaker a quantity of nitrogen dioxide gas had been added. The brown color of the gas was clearly visible. The top beaker was filled with plain air. Nitrogen dioxide is denser than air.
When the separator was removed we saw the brown color of the Nitrogen dioxide rise to the top. In less than half a minute the combined space was an even brown color.
And then the teacher explained the significance: in the process of filling the entire space the heavier Nitrogen dioxide molecules had displaced lighter molecules. That is: a significant part of the population of Nitrogen dioxide had moved against the pull of gravity. This move against gravity is probability driven.
Much later I learned about statistical mechanics. Statistical mechanics provides the means to treat this process quantitatively. You quantify by counting numbers of states. Let's say that at the start there are 4 heavy molecules in the lower half and 4 light molecules in the top half. With a set of 4 elements you count 24 different orderings (4*3*2*1). So before removing the separator: top half, 24 states; bottom half, 24 states. Remove the separator and you count 8*7*6*5*4*3*2*1 states. Of course that's not how you would count the states of an actual gas; this is just to give somewhat of an idea of how this kind of probability can be expressed in quantitative form.
Returning to the demonstration with the Nitrogen dioxide. The heavy Nitrogen dioxide molecules were (on average) climbing up. This was the only way forward. The end state (mixed) is more probable than the starting state, so that is what that system progresses to.
1
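A minimal counting sketch in the spirit of the comment above. The specific setup (4 "heavy" and 4 "light" molecules, 4 slots in each half of the container) is just an illustrative assumption, not the actual gas calculation:

```python
# Enumerate which molecules end up in the bottom half and tally macrostates by
# "number of heavy molecules at the bottom". The fully separated starting macrostate
# (all 4 heavy at the bottom) is a single arrangement, while the evenly mixed
# macrostate has many more arrangements, which is why mixing is what you observe.

from itertools import combinations
from collections import Counter

molecules = [("heavy", i) for i in range(4)] + [("light", i) for i in range(4)]

macrostate_counts = Counter()
for bottom in combinations(molecules, 4):             # choose which 4 sit in the bottom half
    heavy_at_bottom = sum(1 for kind, _ in bottom if kind == "heavy")
    macrostate_counts[heavy_at_bottom] += 1

for heavy, ways in sorted(macrostate_counts.items()):
    print(f"{heavy} heavy molecules at the bottom: {ways} arrangements")
# total arrangements = C(8,4) = 70; the 2-2 mixed macrostate accounts for 36 of them
```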
Dr10Jeeps
1 year ago
I love these sessions Sean. However, I would really appreciate it if you could share with us the technology/setup you use to film these sessions and use your iPad. As a university professor myself (psychology) I would like to do something similar in my online lectures. Thank you.
dePlant
1 year ago
Does multiplication raise the entropy of the universe, and factorization lower entropy (locally), since factorization is harder to do algorithmically?
William Murphy
7 months ago
This is like a free tutor session with Sean - and for free. It doesn’t get better than this. And I’m a musician who loves this stuff as a hobby.
protoword
1 year ago
In my native language, entropy is a feminine noun. I remember that during my student years at college, my old professor of thermodynamics told us: Guys, don't get philosophical, get real, we have to solve some problems here (s/i diagrams of entropy/enthalpy for some fluids and gases). If you try, he said, I promise, she'll get you! LOL
By the way, Professor Carroll, you have a great explanation of entropy!
A very methodical approach; even the Boltzmann from our books would thank you for the explanation of his take on entropy...
Nick Manitaris
8 months ago
I love these videos!! Many thanks for the effort to make them! Love from Greece!!
ET Stalker
1 year ago
Thank you Sean I got a lot out of that.
Vito Memoli
9 months ago
Dear Mr. Carroll, thank you very much for your teachings. I still don't get the point regarding the Maxwell's demon paradox. What if the memory of the demon is finite but big enough to store all the information needed to keep track of the system? We can imagine a few molecules in the box and a demon with huge memory, comparable to the ones we can buy online. Can the universe's entropy decrease at this point, if I don't need to erase any bits of information?
Naimul Haq
1 year ago
Entropy, Information and complexity should have been the title of this discussion, this would have taken the topic to QC function, providing a deeper understanding of many of the problems discussed, although we still do not know the algorithm of QC function and may never. Maldacena conjectures the whole universe is a self-error correcting QC function, eliminating randomness/chance to achieve determinism, life, consciousness and soul(the life force).
Skorj Olafsen
1 year ago
If we cross some event horizon and find ourselves in a universe with a low entropy state at "one end" of time, we would always judge the flow of time to be the direction that puts that state in the past, right? E.g., if we fall into a universe-sized black hole, whether we see the singularity as a point in the past or the future would depend on whether a singularity is a low or high entropy state?
Martin Norbäck Olivers
1 year ago
I'm wondering something about the end there. Would it be fair to say that the entropy somehow corresponds to the amount of information you can store in a region of space without changing the macroscopic state of it?
unòrsominòre.
1 year ago (edited)
@53:18 A single brain that lives long enough to look around and go "Hah, thermal equilibrium" and then it dies <3 Thanks as usual, prof. Sean!
James Stewart
1 year ago
My favourite Claude Shannon story is that not only did he win the 1st Claude Shannon Award but that he damn well deserved to win it too!
4
Pavlos Papageorgiou
1 year ago
Entropy is self-locating information that identifies a particular microstate that the universe has evolved to. The universe as a whole conserves information, but some process like chaos or decoherence adds information that identifies a path to a specific microstate as opposed to all the others that were possible. The 2nd law should be re-cast as conservation of information in a constantly splitting universe.
Phillip Smith
1 year ago
An excellent presentation. I have a question. What is the amount of entropy a microstate has to absorb to move to the next higher-entropy macrostate called? For example, how much disorder is required to move from a tidy room to an untidy room? While that distinction is arbitrary, some tidy rooms are further away from being untidy than others, just as some microstates are further away from the macrostate boundary than others. As I understand it this is called negative entropy, however I'm not sure. If so, does it have any relationship to negative probability?
Nathaniel Gregg
1 year ago
I have one question. Fisher information is a measure of how peaked a probability distribution is, and it gives you the lower bound on the variance of any unbiased statistic for the parameters of the distribution (the Cramer-Rao lower bound). That seems related to entropy somehow, do physicists study Fisher information at all?
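For readers unfamiliar with the terms in that question, here is a small self-contained sketch of the Cramér-Rao bound for the mean of a Gaussian with known sigma; the parameter values are arbitrary, and the connection to entropy (Fisher information is related to the curvature of relative entropy) is background knowledge, not something covered in the video:

```python
# Per-sample Fisher information for the mean of a Gaussian is I(mu) = 1/sigma^2,
# so any unbiased estimator built from n samples has variance >= sigma^2 / n.
# The sample mean saturates this bound, which the simulation below illustrates.

import random
import statistics

mu, sigma, n, trials = 2.0, 1.5, 50, 20000          # arbitrary choices

estimates = []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    estimates.append(statistics.fmean(sample))        # unbiased estimator of mu

fisher_info_total = n / sigma**2                       # total Fisher information in n samples
cramer_rao_bound = 1.0 / fisher_info_total             # = sigma^2 / n

print(f"Cramer-Rao lower bound on Var: {cramer_rao_bound:.5f}")
print(f"Observed variance of estimate: {statistics.variance(estimates):.5f}")
```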
Stephanie Romer
7 months ago
Hi I like your ideas. I have been working on a theory of the Universe from psychology, and your idea of far past and far future being equivalent is exactly like what I came up with and diagramed. Can we talk? I did my PhD work at Emory University in behavioral neuroscience and evolutionary psychology. Just so you know I am seriously saying this. 🥰👍🏻
Shera
1 year ago (edited)
Another question: I understand the AdS/CFT correspondence as stating that gravity can be seen as a dimension (or an additional axis of the phase space, maybe?). If so, can one interpret the event horizon of a black hole as the threshold for the gravity (density)? I mean, the gravity density within a black hole is so much larger than that of the surroundings that, in a coarse-grained fashion, one could say that gravity is relevant inside, i.e. gravity turned on, i.e. 4+1-dim AdS space, at the event horizon. At the surface gravity is turned off, i.e. the 3+1-dim boundary described by the CFT.
Is the surface of a black hole the coarse graining of the volume it encloses, such that one could do the same thing one does in thermodynamics with atoms and gas?
Jainal Abdin
1 year ago
QUESTION for Q&A: Regarding the Past Hypothesis and the early universe having low entropy, if the Big Bang evolved from a so called 'singularity', how can another so called 'singularity' that is a black hole have such exponentially higher entropy, but the early Big Bang universe starts off with a lower entropy? Surely, the extremely dense Big Bang should have a much higher entropy compared to a black hole? Otherwise, the Past Hypothesis is false or that the Big Bang wasn't a 'singularity' event?
Gergo F
1 year ago
My revised question is still up for everyone; any ideas are welcome.
So my question is about Maxwell's demon, and the relation between information and energy.
To summarize, the experimental setup is two completely distinct, ideal closed boxes, with some macroscopic balls inside as a model for gas. Every parameter in the two boxes is the same: the size of the boxes, the number of balls/particles, their speeds, sizes and orientations. The only difference is that in one box the balls/particles are made from a heavier material, in the other from a lighter material.
In this way, when the Demon does its work, this setup nicely illustrates the entropy-change formula (per bit of info):
ΔS = k ln(2) <– the missing temperature correctly reflects the mass-independence
And also, correctly, the amount of work I can extract by the action of the Demon is:
E = kT ln(2) <– the present temperature correctly reflects the mass-dependence
where T, as I understand it, is the amount of extracted work recalculated/expressed as a temperature, which very importantly depends on the mass of the balls/molecules!
But as we know, with this setup, by the action of the Demon we extract energy from the box, which ultimately also extracts heat/energy from the environment, which is prohibited according to the Clausius-Kelvin statement of the second law of thermodynamics: heat cannot be transferred from a colder object to a hotter object without investing energy.
So to satisfy the Clausius condition, when the Demon erases its memory, it has to dissipate at least the same E = kT ln(2) energy.
But as we see, the energy depends on the mass and momentum of the particles, so for the same amount of information erasure the Demon will have to dissipate different amounts of heat.
How does the Demon's memory know what masses were involved in its action, when for the Demon the masses of the particles are irrelevant and don't count as information; it checks only velocities and directions, as we already stated?
Something is clearly very wrong here. What is it??
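For reference on the two formulas quoted in that question, here is a minimal numerical sketch of Landauer's bound. Note that the bound kT ln 2 depends only on the heat bath's temperature T, not on the particle masses, which is part of what the question is wrestling with. The temperatures below are arbitrary examples, not anything from the video:

```python
# Landauer's bound: erasing one bit costs at least Delta S = k ln 2 of entropy,
# and at bath temperature T that entropy has an energy price of E = k T ln 2.
# The mass of the gas particles never enters; only the bath temperature does.

import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
eV = 1.602176634e-19      # joules per electronvolt

delta_S_per_bit = k_B * math.log(2)          # entropy cost of erasing one bit, J/K

for T in (4.0, 77.0, 300.0):                 # liquid helium, liquid nitrogen, room temperature
    E_min = k_B * T * math.log(2)            # minimum heat dissipated per erased bit
    print(f"T = {T:5.0f} K: E >= {E_min:.3e} J  ({E_min / eV * 1000:.3f} meV) per bit")

print(f"Delta S per bit = {delta_S_per_bit:.3e} J/K (temperature-independent)")
```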
ET Stalker
1 year ago
Damn Sean you're laying it down in this episode I like it.
John Cornwell
1 year ago
I know that this question would be best suited under Special Relativity or Quantum physics.
But from my current understanding of Special Relativity and General Relativity, has any physicist proposed that it is just the Uncertainty principle at the macroscale?
I mean, depending on what your velocity and distance are compared to another observer, that determines your position or momentum through spacetime for them.
1
Shalkka
1 year ago
It is still mysterious to me whether the second law works because of some special property of physical laws, or whether it would work for a very general system whenever micro and macro states are specifiable.
Say that I have 5 playing cards and at each time step I remove one and replace it with a new card. If I define the microstate to be the exact 5 cards I currently have, and the macrostates to be high card, pair, two pair, full house, straight, flush, three of a kind and four of a kind, one could claim or make the observation that if you have a three of a kind it is more likely to become a pair than a four of a kind. In an entropic way you could say that the system wants to gravitate towards being in the macrostate "high card". What other things do the entropy laws require beyond counting how many ways there are to make the various hand values? Would there be some arbitrary definitions of what constitutes a "hand" that would make entropic behaviour not appear?
If we have a microstate and can track its deterministic evolution sufficiently well, then at future time steps we will know which macrostate it would fall under. However, how a microstate evolves doesn't reference the other states. It might be tempting to treat the microtransitions out of a macrostate as equally likely. But microstate transitions out of the next macrostates might be correlated with previous transitions. That is, if you treat the system as "forgetting" what its microstate is once it is in a macrostate, then you would expect some transitions to be stronger than if you first took all the microhistories and then drew the boxes, rather than transitioning between boxes and then filling in the details.
A point that mixes issues with probability, electron orbitals and spin superposition: why does an electron have 3 degrees of freedom? If you had x, y, z, then x+y, y+z, x+z would be an equally good basis. But how could you distinguish an electron being in a superposition of x and y from your x simply having been chosen so that it aligns with the pure state of one electron? In the hydrogen molecule example, if two atoms are close enough that their electron orbitals start to overlap, exchange antisymmetry means "from which atom" the electron came needs to have opposite amplitude. So the configurations electron-proton-electron-proton and proton-electron-proton-electron, if asked "is there an electron between the protons", would both say yes; even though it is just two copies of an atom next to each other, the mirror images correspond to the exact same state (the electrons are interchangeable in that one perfectly substitutes for the other). In treating the Alice spin separately it would seem like one microstate is half in and half out of a macrostate, when it would seem natural to either count the state wholly in or wholly out. Wouldn't one, by the same logic, describe a box with one slow atom and one fast atom as a 50% chance of being 0 C and a 50% chance of being 100 C (when I believe being at 50 C 100% of the time would be a fairer description)?
Wouldn't there, in the many-worlds interpretation, not be any chances about the spins? There is going to be a future with Alice spin up and a future with Alice spin down. Both happen and are certain (or rather, the one multiverse with those two details is certain to come to pass). Wouldn't it be analogous, if you had a 50 C gas box, to talk about the chance that the left half of the box is 70 C or that the right half is 30 C? I get that 70-30 and 30-70 could be counted as two different ways to be 50, while in quantum mechanics there is only one indistinguishable option.
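A small counting sketch of the card example in the comment above, treating the exact 5 cards as the microstate and the rank pattern as the macrostate. Straights and flushes are ignored purely to keep the classification short; this is an illustration, not anything from the video:

```python
# Count how many 5-card microstates fall into each rank-pattern macrostate.
# The counts alone explain why a hand drifts toward the common patterns:
# there are simply far more ways to be them.
# (Enumerates all C(52,5) = 2,598,960 hands; takes a little while in pure Python.)

from itertools import combinations
from collections import Counter

RANKS = range(13)
deck = [(rank, suit) for rank in RANKS for suit in range(4)]     # 52 cards

macrostates = Counter()
for hand in combinations(deck, 5):
    rank_counts = Counter(rank for rank, _ in hand)
    pattern = tuple(sorted(rank_counts.values(), reverse=True))  # e.g. (2,1,1,1) = one pair
    macrostates[pattern] += 1

for pattern, ways in macrostates.most_common():
    print(pattern, ways)
# (1,1,1,1,1) and (2,1,1,1) dominate; (4,1) four-of-a-kind is rarest, with 624 hands
```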
Paul C.
1 year ago (edited)
The "Past Hypothesis" - that the Universe began with extremely low entropy, or even zero entropy - seems entirely reasonable, as well as consistent with observations. During this Talk, Prof. Sean, as far as I can recall, did not mention Inflation. Nor have I seen it mentioned in the comments. I feel sure that there must be some deep connection between Cosmic Inflation in the early Universe and the low entropy of same. Is that likely, or even possible?
. . .
Is Cosmic Inflation a Quantum Field? If so, would there be particles - "Inflatons" - associated with such a field? Would such particles be another kind of Gauge Boson, and would it be possible, at least in theory, to detect them in Particle Colliders?
Replies from other viewers also welcome. Thanks in advance. Paul C.
Lilit Vehuni
1 year ago
If all information is out there but the universe is absolutely devoid of observers, what happens to entropy?
Chris Walker
1 year ago
Literally just been looking into network theory. And Metcalfe's law for the number of connections in a network is the same as the expression for entropy: connections = N*ln(N), where N is the number of nodes in a network... I had already connected this (no pun intended) with quantum entanglement on a conceptual level, but I wasn't sure if, say, in an "entangled state" the "connections" would be N*N and then, when entanglement is broken, we then have N*ln(N)... I even suspect N*ln(N) is wrong.
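For reference on the formulas being compared in that comment: a complete network on N nodes has N(N-1)/2 links, and Metcalfe's law is usually quoted as value proportional to N^2, while N ln N is the kind of scaling that shows up in entropy expressions such as Stirling's approximation of ln N!. A quick, purely numerical comparison of how the three grow (no physical claim intended):

```python
# Compare pairwise connections, N^2, and N*ln(N) for a few network sizes.
import math

for N in (10, 100, 1000, 10000):
    pairwise = N * (N - 1) // 2          # links in a complete graph on N nodes
    print(f"N={N:>6}: N(N-1)/2 = {pairwise:>12,}   N^2 = {N*N:>12,}   N ln N = {N*math.log(N):>12,.0f}")
```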
Robin Browne
3 weeks ago
Thank you. This is perhaps the most fascinating topic of all :-)
Lilit Vehuni
1 year ago
Does the first definition of entropy mean that if the universe is finite then an observer is possible for whom entropy is zero?
Boris Petrov
1 year ago (edited)
An amazing thought about Boltzmann and Big Bang.... But Big Bang and entropy situation is still very confusing. Lectures are simply amazing... high scientific integrity
Pestering again -- Penrose believes (if I understand it correctly) at extremely high temperature "things" are moving so fast that all energy is kinetic -- effectively massless - no gravity effect yet - so a smooth transition from photons only as in a "previous aeon" (his CCC (conformal cyclic cosmology) hypothesis)
After an endlessly long time, after the last black hole evaporates, the cosmos will have only massless photons... correct? This would mean low entropy - again
Arvin Ash has a hypothesis of entropy rising and rising - a bit like Escher's scales and Zermelo...
Michelle Hu
9 months ago
My entropy certainly increased, since I've got more unknowns after listening to this.
nemuritai
1 year ago (edited)
Entanglement and Heisenberg are limited by conservation laws and the math of Fourier transforms, so those two don't surprise me as much as the fact that QM requires summing all possible outcomes; that is pretty strange. For example, closing a slit increases the places with light on the screen...
Bill Holland
1 year ago
1:27:00 From communications theory, you can use Huffman variable bit rate encoding to transform a low information alphabet into a high information alphabet. That is, you encode high probability (expected) symbols with fewer bits and low probability (surprising) symbols with more bits, so the resulting bitstream can be seen as a high information alphabet. The information content of each chunk of bits (resulting symbols, e.g. 32-bit words) is more uniform.
You can also use things like LZW compression (or other compression methods) to perform a similar task. For example, given an English text, LZW compression builds up a dictionary of symbols (words, phonemes, or patterns seen) and then uses Huffman coding to store the actual sequence of symbols. I think.
What I'm trying to say is that it seems you can transform an alphabet, changing its Shannon entropy, to better use the available bandwidth and signal-to-noise ratio of a communications channel (see the Shannon-Hartley theorem: channel capacity C = B log2(1 + S/N)).
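A small sketch of the Huffman idea described above: compute the Shannon entropy of a symbol distribution, build a Huffman code for it, and compare the average code length with the entropy. The sample text is an arbitrary choice; any string would do:

```python
# Shannon entropy of a character distribution vs. average Huffman code length.
import heapq
import math
from collections import Counter
from itertools import count

text = "the increase of entropy over time is responsible for the difference between past and future"
freq = Counter(text)
total = sum(freq.values())
probs = {sym: n / total for sym, n in freq.items()}

entropy = -sum(p * math.log2(p) for p in probs.values())

# Build a Huffman tree: repeatedly merge the two least probable nodes.
tiebreak = count()                                    # keeps heap comparisons well-defined
heap = [(p, next(tiebreak), sym) for sym, p in probs.items()]
heapq.heapify(heap)
while len(heap) > 1:
    p1, _, left = heapq.heappop(heap)
    p2, _, right = heapq.heappop(heap)
    heapq.heappush(heap, (p1 + p2, next(tiebreak), (left, right)))

codes = {}
def assign(node, prefix=""):
    if isinstance(node, tuple):                       # internal node: recurse into children
        assign(node[0], prefix + "0")
        assign(node[1], prefix + "1")
    else:
        codes[node] = prefix or "0"

assign(heap[0][2])
avg_len = sum(probs[sym] * len(code) for sym, code in codes.items())

print(f"Shannon entropy:     {entropy:.3f} bits/symbol")
print(f"Huffman code length: {avg_len:.3f} bits/symbol (never beats the entropy)")
```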
James Stewart
1 year ago
I adored information theory and still have the heavily annotated Baierlein text to prove it!
1
Łukasz Kucharski
1 year ago
I wonder if the innate entropy of a quantum system is related to innate uncertainty, but it seems that the quantum mathematical formalism leads to another fundamental principle of the universe.
I also wonder if the amount of information in a quantum state is related to its energy (or maybe mass). Though I don't get the feeling it's the same information as understood by information theory, because there is no probability. On the other hand, describing a system with more elements certainly requires more information; there are simply more possible states. It would be interesting to "reduce", or maybe a better word would be reformulate, quantum theories in terms of information theory. Maybe it would be easier to mathematically quantify space for the storage.
By the way, I think that intuitively Gibbs entropy and Shannon entropy are the same, not opposites. My intuition is that in a highly peaked system, picking a random element, you can easily bet on its properties. The symbol describes a state, so in a sample with low entropy I would expect more elements gathered around the same state, ergo more order; the occurrence of that symbol and those around it would be higher in the alphabet of states.
Daniel Cockerill
1 year ago
Question: If the universe keeps expanding, what's on the other side?
v0lrath1985
1 year ago
What a great way to start the day!
12
ROBERT DUNN
1 year ago (edited)
If the early universe had low entropy it would only be able to produce a "little bang". Creation would take place via the Casimir effect, where at first only minute particles would be created. The minute states of matter would combine to create the macro states. In this case a volume or area is required to exist prior to any production of matter, and in turn matter is required to produce waves or electromagnetic radiation. This, as you commented at 49:00, leads to the realization that our universe is not bounded. Our universe must be a sub-component of a larger system. The big bang occurs at the end of the universe and is produced by its collapse.
1
gkelly34
1 year ago (edited)
I thought the arrow of time was a result of the expansion of 4-dimensional spacetime, and that the 2nd law emerges from this spatial condition. If the universe stopped expanding and reversed, would entropy increase? And what would happen to the arrow of time?
rv706
1 year ago
Are the equal-entropy subsets of phase space like chunks of the same dimension of the state space, or are they more like a foliation into lower-dimensional submanifolds? (in the second case how is the volume of each submanifold computed? maybe using the volume form induced by the restriction of the symplectic form, assuming e.g. that the restriction is still non-degenerate?)
2
John Sheehan
11 months ago
Is it possible that the remnants of a previous universe coalesced into a Bose-Einstein condensate which underwent a quantum phase change and, voilà, the big bang?
Gergo F
1 year ago
So to inform everyone in the front line, my question evolved again.
As a reminder the question is about Maxwell’s Demon, and the relation between information and energy.
Thanks to Tetraedri_ we understand that the amount of energy dissipated on the Demon's memory-erasure takes into account the surrounding heat bath's temperature, and the released energy is kT ln(2) per bit info, and this is Landauer's principle.
However I found that Landauer's principle is incompatible with the First law of thermodynamics (law of conservation of energy).
Let's examine the experiment from the point of view of energy conservation, where energy conservation is prioritised over Landauer's principle.
This time one box suffices, with its particles in thermal equilibrium with its surrounding heat bath.
The total energy of the heat bath, the box, the energy released on erasure, and the eventually extracted energy from the heat bath has to be constant at all times.
Let the Demon do its action, and after a while evaluate the energies:
Case 1.) The box is now in some non-equilibrium state, but no energy has been extracted, and we now ask the Demon to erase its memory. Obviously at this point no heat is missing from the heat bath; we started the experiment at thermal equilibrium, and the action of the Demon doesn't change the energies of the particles/molecules (that's the point).
So the heat bath's temperature is the full initial temperature, and on memory erasure it is supposed to release kT ln(2) of heat per bit of info. Which is wrong, since this would create energy over our total (so far unchanged) energy.
Case 2.) The box is also in some non-equilibrium state, also with no energy extracted, but this time open the box's separating wall and let the left and right sides return to thermal equilibrium, while the Demon keeps the information in the meantime. When done, the whole setup is in thermal equilibrium, no energy has been extracted, but the Demon has some information, which we now ask it to erase. According to the heat bath's temperature, kT ln(2) of heat per bit is supposed to be dissipated by the Demon's memory. Again wrong; energy would be created in this way.
Any ideas to solve the energy-paradox?
Skorj Olafsen
1 year ago
If I find a chicken egg, I assume a chicken in the past, but the process of a chicken taking in energy and assembling an egg molecule-by-molecule seems rather symmetrical with the process of an animal or bacterium digesting that egg and releasing the energy while separating it into component molecules. We have different names for the past process and the future process, sure, but it really doesn't seem to give an arrow to time. Any low-entropy state has a process of becoming a high-entropy state in both directions of time.
Eugenius Bear
1 year ago
The vacuum pressure of the universe doesn’t like being disturbed by mass and so it acts to push mass back together to minimize the overall field disturbance created by mass. This follows naturally from consideration of the square-cubed relationship between the field disturbances (surface areas) and mass (volumes).
Entropy is the universe’s reaction to pack mass/energy back into a single point or multiple single points (i.e. black holes).
Shytam
1 year ago
I think this is my favorite episode yet.
Gergo F
1 year ago
My question is about Maxwell's demon, and the relation between information and energy.
Let's suppose we have two separated, identically sized closed boxes, where the gas is some macroscopic balls. Every parameter of the balls is the same: their number, their sizes, their positions, their momenta. The only difference is that in one box the balls are made from a lighter material, and in the other from a heavier material.
1.) Question: For each box the amount of information required to describe its state is obviously the same; only the masses differ. How do you reconcile the fact that with the same amount of information the Demon can do two different amounts of work, yet radiate out the same amount of heat on its information erasure, since both examples required the same amount of information to know/describe the system?
2.) Question: Let's suppose the Demon is constructed in two different ways in the two examples: in one case the Demon is built from old-fashioned, low-efficiency electronics, storing information in capacitors/coils, and in the other the Demon is built from modern, more energy-efficient microchip components. Obviously for the same number of memory-erasure operations the two cases will lose different amounts of heat to the environment?
So what is the solid scientific relationship between entropy, information and energy?
Rick Harold
1 year ago
How might dark energy or dark matter affect entropy based on existing observations?
Nicodemos Varnava
1 year ago
The episode we've all been waiting for
5
shafi khan
1 year ago
My understanding of entropy and negentropy: it's like a clock where there is no meeting point, it just goes round; maybe there is one microscopically. As far as we have understood until now, the evidence is our engineering, and that also shows how much we have understood microscopically and in heavy machinery. For me this understanding is still local, I mean isotropic, in cosmology or the physical universe, but not the invisible things; then the question is, what are the invisible things? All these things started with thoughts, imagination and dreams; with time we took math's help to bring things into reality, but now it looks like the math is a taxi that takes you to the airport; then the question is, what is the aeroplane?
KAĞAN NASUHBEYOĞLU
1 year ago
Excellent series carry on...👍
Christine LaBeach
1 year ago
Entropy kind of sounds like the Big Bang. That is going from a state of complete order toward a state of complete disorder or equilibrium.
1
joshuad31
1 year ago
Questions:
1. Do quantum computers violate Shannon entropy by running Shor's algorithm?
2. Do quantum computers "erase" bits of information?
3. Can quantum systems convey "negative information" and does this in any way affect our calculations when it comes to our probability expectations associated with information communicated between two systems?
https://phys.org/news/2005-08-quantum-negative.html
4. Why does Maxwell's demon not create entropy by writing information, doesn't writing bits of data to his memory require the demon to do work? Why is work only done when erasure occurs?
5. Von Neumann saying "nobody knows what entropy means" sounds similar to
"I think I can safely say that nobody really understands quantum mechanics", attributed to Richard Feynman.
Are the interpretations of entropy as varied and diverse as the interpretations of quantum mechanics? What is the fundamental point of contention that people still debate when it comes to entropy?
1
Zbigniew Zdanowicz
1 year ago (edited)
So what about black hole entropy? It feels like the suspense is carried over to the next episode. But wow, this was good, so many a-ha moments. I know black hole entropy is proportional to the area of the black hole's event horizon, but surely the observable universe has its own cosmic horizon too - does it contribute to the observable universe's total entropy? And how can such a small thing (a black hole, compared to the universe) have such big entropy? What does that mean in a practical sense, and what are the implications for the stuff that is inside the event horizon?
John Długosz
1 year ago
1:20:00 that idea is seen directly these days in predictive input on your phone or messaging app. There's even an XKCD cartoon https://www.explainxkcd.com/wiki/index.php/1068:_Swiftkey and later https://www.explainxkcd.com/wiki/index.php/2169:_Predictive_Models
The more you have to correct the predictive text input, the more interesting is the message.
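That point about corrections carrying the interesting part of the message is essentially surprisal, -log2(p): the less probable a symbol, the more bits it carries. A tiny illustration with made-up probabilities (nothing here is from the video or the linked comics):

```python
# Surprisal of a few words under invented probabilities: rare words carry more bits.
import math

for word, p in [("the", 0.05), ("entropy", 0.001), ("xylophone", 0.00001)]:
    print(f"{word:>10}: p = {p:<8} surprisal = {-math.log2(p):5.1f} bits")
```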
William Murphy
7 months ago
Why do I enjoy entropy so much? I've exhausted everything on YouTube as it relates to entropy, even those goofy '70s films (that I love). If someone knows of a deep-cut entropy upload, please share.
JAIME TAN
1 year ago (edited)
Thank you!! Thank you!! Please do more of these??
1
Random Guy
1 year ago (edited)
Steve: So lets go down that rabbit hole a little bit further.
Me: Yes please!!
1
Skorj Olafsen
1 year ago
The Earth almost certainly radiates more heat than it receives from the sun: the total energy involved in warming the earth the observed amount is small compared to the total geothermal outflow as the core cools. The crust is a bit warmer, but its mass is trivial compared to the core.
Stay Primal
1 year ago
Who said Tuesday was boring? Not ANYMORE, with Sean, my friends.
11
Barefoot
1 year ago (edited)
TL;DR Q&A Question: Your assertion that all arrows of time result from, and only from, the 2nd Law of Thermodynamics was underpinned by an assumption that the Conservation of Quantum Information, along with Time-Reversal Symmetry, are inviolable pillars underlying all of QM. Yet I have never heard a truly convincing explanation for why that symmetry/conservation law pair is so important. Can you explain why it's so important to Quantum Mechanics, and what the consequences would be (quite dire by most predictions, in a vague and imprecise, emotionally-defined way) if CoQI were violated and Time Symmetry were broken?
So in more detail, and hopefully to get across that this is not a trivial question, let me start by saying that I buy that the 2nd law is the only place in the laws of physics that contains or explains the arrow of time as we understand it right now (a caveat I try to be very careful to keep in mind, especially since we know for sure that QFT is not the final theory that truly explains all that there is).
But the next claim, that it underlies all sources of the arrow of time, hinged even in your own phrasing on the conservation of information.
Now, I've heard many many times that "the conservation of information is a key central tenet of quantum mechanics, without which the entire theory would fall to pieces like a house of cards in a tornado" or similar such dire and emotionally charged defenses. I mean, phrases are tossed around by some rather big names like "back to the drawing board" and "very bad news for Quantum Mechanics"; these are not subtle or nuanced claims, yet I have never heard any good, rational, fact-based defense of why the conservation of information is so important, the tautological link via Noether's Theorem to time reversal symmetry notwithstanding.
I have studied a lot of quantum mechanics. Not as much as you, to be sure, but still more than the vast majority of humans. I understand and have solved many times the Schrodinger Equation; I fully grasp core tenets like superposition and entanglement; I can even do an okay job of wrapping my mind around things like spin and the delayed choice quantum eraser, enough to explain them satisfactorily to others in a way that they understand. I have a pretty decent grasp of QED and QCD, and the origins of the Standard Model, and yet, other than that one unsupported claim of its inestimable importance repeated over and over in pop science recitations, that "if quantum information turns out not to be preserved, most of QM falls apart" or similar, and a couple of very brief, passing mentions in QM classes and lectures that gave no more detail than that, conservation of information has never once come up in any serious discussion. Certainly it has never been treated as anything fundamental or foundational in genuine learning about the subject, with sole exception of bold claims of its fundamental and foundational nature, unsubstantiated by lecture time, homework assignments, or even rationale.
So my question and challenge for you in the Q&A video is to lay the foundation upon which you rest your claim that the 2nd law underlies all appearances of an arrow of time by substantiating and fully supporting why conservation of information is so important, and what exactly would fail about QM and the Standard Model if it weren't true. Of course, if you intend to cover that topic in a future video, that's fine... I'll be sure to ask this again after that one if it isn't answered there.
Currently, I believe that Conservation of Information is something that physics as a field needs some good old-fashioned Everettian therapy about. It just doesn't seem to be true. In two of the four most popular interpretations of QM (granted, one of them is the Copenhagen interpretation, which is ill-defined nonsense, but it's still very widely held by the same people who claim Conservation of Information is so immutable), information seems to be explicitly destroyed by the collapse of wave functions, which is definitionally time-irreversible, is it not?
Now perhaps in Many Worlds the branching of the universe truly can be time-reversible and thus preserve information (though I wonder about reconvergence of branches in an A->C and B->C violation sense). Maybe the same argument could be made about hidden variables theories, though the very fact that they have hidden variables that we cannot even in principle know would seem to make the very concept of Conservation of Information a non-starter... though I would be willing to accept that an argument can be made that in principle that unknown hidden information might conspire to be preserved as well.
Even so, I can think of many examples in which information does not seem to be preserved to the best of my knowledge. How is it preserved, for example, in a Bose-Einstein Condensate, when all of the quantum states of the individual particles merge together into one state, and then re-emerge (untraceably?) when they warm back up? How is it preserved as a neutron star collapses into a black hole, presumably merging the entire star into a single quantum state? To say nothing of how it can be preserved via the emission of Hawking Radiation, recent fantastical papers on miniature wormholes notwithstanding?
Even if there are explanations for all of those, the mere fact that at least one interpretation of QM (which may not be the best one or even the 'correct' one, but still does work perfectly well to make predictions and invent things like transistors and LCD displays) breaks conservation of information and the time-reversal symmetry that goes with it, certainly implies that the consequences if quantum information turned out not to be preserved wouldn't be nearly so dire as a lot of physicists seem to imply.
Why is it so important that time reversal be a real feature of reality (except for Thermodynamics) rather than just a very stubborn spherical cow? Why and how does all of Quantum Mechanics hinge on Conservation of Quantum Information? How does information purport to survive things like Bose-Einstein Condensates, degeneracy conditions inside neutron stars, quark stars, strange stars, and black holes?
Tova
Tova
1 year ago
This is kind of more speculative than BIitU, but, if I'm understanding things correctly:
So starting around 38:00 (to 43:00ish), regarding the lines about the early and late universe looking like homogeneous black bodies, and cosmologists 'not knowing' they're different entropically, was that a criticism of cyclic/bubble universe models (or at least ones where heat death looks like a big bang)?
Does that extend to models like Penrose and others' CCC model? Or does that model's rescaling affect how the entropy is quantified as well, and wiggle out of the question by changing the amount of information it's working with?
Thinking on it, I guess the CCC model would just move the question back or not deal with it, if I understand. n universes ago still would have had an extreme entropy differential to evolve through, just at unimaginably different scales, if the 2nd law is true across aeons. So the question of why that universe had such a lower entropy than even ours is pushed back to that universe, back to whatever quantum fluctuation is at the root of the family of aeons that ours is part of.
1
israel socratus
israel socratus
6 months ago
Entropy:
1 - Wilhelm Ostwald said: "Entropy is only a shadow of energy."
2 - Henri Poincaré called the concept of "entropy" a "surprising abstraction".
3 - Lev Landau (Dau) wrote: "The question of the physical basis of the law of monotonic increase of entropy remains open."
4 - John von Neumann said to "the father of information theory," Claude Shannon: "Call it 'entropy'; then in discussions you will have a solid advantage, because nobody knows what 'entropy' basically is."
Information:
1 - The universe (as a whole) contains an enormous, perhaps even infinite, amount of information.
2 - But using quantum physics, that information could be compressed into a single elementary quantum particle, which could give a new approach to understanding Nature.
Narf Whals
Narf Whals
1 year ago
If the singularity of a black hole is a point in the future, does that mean entropy increases towards the singularity?
K1lostream
K1lostream
1 year ago
Soooo, entropy is a measure of our ignorance, and entropy always increases.... that explains a few things!
1
stridedeck
stridedeck
1 year ago
Perhaps Shannon Entropy is describing physics and not just the communication of information. The surprisals are the outcomes that fall outside a set of self-imposed and restricted outcomes (i.e. the sun rises in the West). Physics defines our physical rules and behavior from a set of self-imposed and restricted observable outcomes. Both systems are formed from observable outcomes! If one does not have these self-imposed and restricted preferred outcomes in the beginning, then there will be no observation of entropy, as entropy is self-defined and thus self-created.
1
Adam Harris
Adam Harris
7 months ago
Thanks for the info 🙏🏻
Robin Betts
Robin Betts
1 year ago (edited)
1:00:28 .. the passage concerning 'memories', or 'records' acquiring their meaning for us, conditional on our hidden assumption of a low-entropy past. What's the maths? What is the relative probability of 1. The low-entropy past giving rise to a consciousness making the interpretations of its surrounding physical world as described, and 2. Any trajectory which gives rise to a transient consciousness with an illusion of the past, and / or its physical surroundings? I know this seems a pretty far-out question, but I start to feel that when arguments are made in this style, either the math should answer it, or a justification, perhaps a meta-physical one, is needed for its exclusion. Is that question really covered by Boltzmann brains?
MC Squared
MC Squared
1 year ago (edited)
Does the quantity of Dark Matter increase with time? Does its total gravitational mass increase?
Math adventures
Math adventures
1 year ago
New time fan, and I am still watching the first part of the series :)
Bruce Long
Bruce Long
1 year ago (edited)
So if we artificially make a REALLY high entropy in a closed system we can make time flow backwards in that system? I don't understand. Suppose at T=0 we create (simulate?) a really low entropy system. Then which direction does the entropy begin to decrease? <--- or ---->? If you say there is a preferred direction then you've already built in a direction. I'm not trying to be contrary, just trying to understand.
rosedragon108
rosedragon108
1 year ago
smart of you to do YouTube vids - ty so much ... recommend your books etc all the time.
2
Narf Whals
Narf Whals
1 year ago
Does Conformal Cyclic Cosmology neatly solve/sidestep the recurrence problem by just claiming that the highest entropy state and the lowest entropy state are simply the same state?
Also I very much disagree with you on the two notions of information vs entropy. They are the exact same view. The information in the system corresponds to the information you do not have. If you have all the information about the system then you can learn nothing. That is the same as not being surprised by the message content because you already had all the information.
Seif Haridi
Seif Haridi
1 year ago
Question: Maxwell's demon does not need to erase information; he just needs to add information to his notebook that particle a moved from box 1 to box 2. He still has complete knowledge.
Too Bad
Too Bad
1 year ago
Thanks, Sir. Very loud and clear.
the halting problem pittsburgh
the halting problem pittsburgh
1 year ago
If a Boltzmann Brain is the only observer in a universe, relativity, would it even be able to comprehend its existence or environment?
Ryan Reppucci
Ryan Reppucci
1 year ago
Consciousness is an emergent property of the brain... Could it be a "dark emergent" property that influences the collapse of the wave function and stacks our time slices (like a messy file cabinet :)? Similar to emergent properties in other subatomic realms.
1
Shikhar Amar
Shikhar Amar
1 year ago
hope string theory and quantum gravity will be coming soon too!
2
speculawyer
speculawyer
1 year ago
Expanding universe wouldn't help Boltzmann versus Zermelo. Solar system recurrences happen regardless. At least they thought so with Newtonian physics. They didn't know about gravitational waves bleeding off energy.
Michael Dam Olsen
Michael Dam Olsen
1 year ago
Preprint of Dr. Carroll's paper on Boltzmann Brains, for those interested in more detail: https://arxiv.org/pdf/1702.00850.pdf
1
CalendulaF
CalendulaF
1 year ago
Just a tiny quibble: 31:00 the guy's name is Josef Loschmidt, not Lohschmidt. He was a giant in chemistry.
Kevin McCarthy
Kevin McCarthy
1 year ago
The fact that no one understands entropy (or information, for that matter) has allowed creationists to get away with a lot of shenanigans (Shannonigans?) about information and entropy with biologists (who don't generally study either).
In fact, that's why I watched this video: to better understand both in order to counter creationist claims.
Thanks!
4
Grace Lloyd
Grace Lloyd
11 months ago
My biophysics professor said that neuroscience is the study of information metabolism and I still haven’t recovered from this mental explosion.
Barefoot
Barefoot
1 year ago
I don't understand why the act of erasing the information in Maxwell's Demon increases the entropy of the universe.
Imagine that the demon has no generalized information about what's going on in the box; all he has is a bidirectional sensor that tells him only information about particles within a certain radius of the opening. If he senses a fast-moving atom above some threshold approaching from the left, he gains information; this information seems to be effectively created by this observation, for the atom itself still contains it, does it not?
The demon then moves the paddle to let the atom through, which decreases the entropy of the box. The atom is now let through; there's no more need to keep or record the information that allowed him to open the paddle; it's recorded in the gas inside the box. Thus the exact same amount of information is in the demon's mind: none. And the exact same amount of information is in the box: the number of particles has not changed. So the entropy in the box has decreased, the information in the box remained the same, and the information in the demon remained the same... so where did the entropy supposedly generated by the deletion of the information in the demon come from?
I always assumed that the act of gathering information about when to open and close the door would be what increased the entropy of the system, because this would have to be done through some kind of interaction with the approaching atoms, presumably in the form of photons or virtual photons, but that isn't what was presented here. Can you clarify this in the Q&A?
Does it change anything if, instead of thinking of Maxwell's Demon as a vast intelligence with infinitely precise knowledge about the contents of the box, we instead think of it as a profoundly stupid demon that just reflexively opens then closes the paddle in response to sensing any atom above (or below from the right) a certain velocity?
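For concreteness, here is the number the standard "erasure" argument leans on, Landauer's bound, as a rough Python sketch (the temperature is just an assumed example value):

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # assumed room temperature, K

    # Landauer's principle: erasing one bit of information dissipates at least
    # k_B * T * ln(2) of heat, i.e. raises the entropy of the environment by
    # at least k_B * ln(2).
    E_min = k_B * T * math.log(2)
    print(f"Minimum heat to erase one bit at {T} K: {E_min:.2e} J")   # ~2.87e-21 J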
Sandip Sahani
Sandip Sahani
5 months ago
"universe is bounded." Where can I read your paper on that? please do reply
Steven Mellemans
Steven Mellemans
1 year ago
But the CMB has a near-perfect blackbody spectrum, which is the definition of max entropy. Mmm, I'll need to think about that for a while.
Pasquale De Stefano
Pasquale De Stefano
1 year ago
Hello professor. What about information loss and black holes?
Tim M
Tim M
1 year ago
OK, so if you mean by "early universe" = still at a single point? Then perhaps low entropy due to no relative points to measure between. In fact, 'between' probably didn't even exist yet, nor time (the yardstick of all measurement). Also, the concept of big vs. small (the singularity concept) wouldn't be relevant either. In other words, 'infinitesimally small' as compared to what? In relation to what? I'm not suggesting none of this actually happened (the Big Bang). Just wondering if a state of low entropy is quantifiable when time and distance didn't yet exist.
vinm300
vinm300
4 months ago
Nobody handles big ideas better than Sean Carroll.
Robert (Closer to Truth) asked him "Is information the fundamental underlying reality?"
Carroll said, "No".
Most of Robert's interlocutors talk in circles, give pedantic metaphors, then don't answer.
Klaus Gartenstiel
Klaus Gartenstiel
6 months ago
feature film length, and me on the edge of my seat the whole time.
Seif Haridi
Seif Haridi
1 year ago
I am assuming a finite number of particles, so recording the movement still only needs a finite notebook.
Mickolas21928
Mickolas21928
1 year ago
Will the universe eventually experience heat death? Will any information about it remain if it does?
David Hand
David Hand
1 year ago
STILL nobody has explained to me how the Born Rule can coexist with this reversibility business. The new state does not have a one-to-one correspondence with the set of superpositions of states. You can't calculate back from the present if there is a collapse of the wavefunction.
1
John Wiltshire
John Wiltshire
1 year ago (edited)
Very good... but...
Not a single mention of "Disorder", or that long-standing thing, often quoted in learned textbooks, about why a teenager's bedroom becomes untidy. Have we finally dispensed with those notions? Also:
Four definitions of Entropy (including a clear affirmation that Shannon Entropy is not the same thing as Thermodynamic Entropy), but no definition of "Information".
What is wrong with this definition of information, which has nothing to do with "knowing" by a conscious entity:
INFORMATION: If the arrangement of physical assembly A is, in some way, correlated with the arrangement of physical assembly B, then the fact that, in principle, something about the arrangement of physical assembly A can be deduced by examination of B is more succinctly stated as "B contains information about A."
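A rough sketch of that correlation-based definition in Python; the joint probabilities below are made-up toy numbers, just to show that "B contains information about A" can be quantified as a nonzero mutual information:

    import math

    # Made-up joint distribution over the "arrangements" of two toy assemblies A and B.
    joint = {
        ("a0", "b0"): 0.4,
        ("a0", "b1"): 0.1,
        ("a1", "b0"): 0.1,
        ("a1", "b1"): 0.4,
    }

    # Marginal distributions of A and B.
    pA, pB = {}, {}
    for (a, b), p in joint.items():
        pA[a] = pA.get(a, 0.0) + p
        pB[b] = pB.get(b, 0.0) + p

    # Mutual information I(A;B) = sum p(a,b) * log2( p(a,b) / (p(a)*p(b)) ).
    # It is zero exactly when A and B are uncorrelated, i.e. when examining B
    # tells you nothing about the arrangement of A.
    I = sum(p * math.log2(p / (pA[a] * pB[b])) for (a, b), p in joint.items() if p > 0)
    print(f"I(A;B) = {I:.3f} bits")   # ~0.278 bits here, so B does contain information about A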
LTK
LTK
1 year ago
you got Poincaré's accent direction right xD good job!
AlwaysDisPutin
AlwaysDisPutin
8 months ago
1:21:00 So Shannon says if we get told the Sun rises in the East then our surprisal = 0 and we gain no new info, but "Sun rises in the West" gives us information, e.g. maybe we're not on Earth. It's like how when Fermilab found one of the electron's cousins being more wobbly than the Standard Model predicts, physicists got excited.
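The same idea as a rough Python sketch (the probabilities are made-up, purely to illustrate Shannon's surprisal):

    import math

    def surprisal_bits(p: float) -> float:
        """Shannon surprisal (self-information) of an event with probability p, in bits."""
        return -math.log2(p)

    # A certain event carries zero surprise, hence zero new information.
    print(surprisal_bits(1.0))    # 0.0 bits   -- "the Sun rises in the East"

    # A wildly improbable event, if it actually happens, carries a lot of information.
    print(surprisal_bits(1e-6))   # ~19.9 bits -- "the Sun rises in the West"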
Bob Bogaert
Bob Bogaert
1 year ago
Some of that rare Youtube time that isn't wasted.
Walter Zagieboylo
Walter Zagieboylo
1 year ago
So good.
Sherlock Holmes lives.
Sherlock Holmes lives.
1 year ago (edited)
Mike's meal equation.
Fish + Chips + Salt = A nice meal for Mike.
2
Pavlos Papageorgiou
Pavlos Papageorgiou
1 year ago
That is such an unlikely episode!
Arthur Castonguay
Arthur Castonguay
5 months ago (edited)
Studied engineering. Your log discussions make me cringe sometimes as you ignore the base. I'm used to seeing log base e written as ln(). Interesting how different disciplines differ. To me, log() is base 10.
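To be fair, for entropy the choice of base only shifts things by a constant factor (nats vs. bits vs. base 10), which is presumably why physicists are casual about it; a quick check:

    import math

    p = 0.25
    print(-math.log(p))     # surprisal in nats    (natural log), ~1.386
    print(-math.log2(p))    # surprisal in bits    (base 2),       2.0
    print(-math.log10(p))   # surprisal in base 10 ("hartleys"),  ~0.602

    # Change of base is just multiplication by a constant: log_b(x) = ln(x) / ln(b)
    print(-math.log(p) / math.log(2))   # same as the log2 result above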
Nancy Mencke
Nancy Mencke
1 year ago
Thank you so much
stridedeck
stridedeck
1 year ago (edited)
There is another way to show that entropy does not increase over time, and that the Maxwell's Demon argument is not wrong just because the demon's information must be erased, which causes heat and an increase in entropy! The usual assumption is that every particle and object in space (or excitation of a field) creates its own motions (energy). What if there is a continual hidden force (like a hidden hand scattering playing cards around on a table surface) that is moving all matter around? Matter interacts, bunches up, disperses, etc., but the total energy is still fixed, just as the scattered playing cards are fixed. No information is erased! The continuous energy of this automatic, undirected, mysterious hidden force comes from outside the system, as if we are in a hypersphere, with another dimension interacting with our world. How does a person in a flat world experience a 3D force? The playing cards are in the flat world and the hidden hand is in the 3D world.
1
A M
A M
1 year ago
This is the first time someone explains what entropy actually is. Not just "oh entropy is the messiness of a system".
Pod 042
Pod 042
1 year ago
Thank you.
Carlos Mora
Carlos Mora
5 months ago
Penrose's conformal cosmology accommodates the recurrence objection. In CCC one eon begins with a Big Bang and ends when the last black hole evaporates. The whole thing will happen again in the subsequent eon. The microstates (strings, particles, atoms, etc.) that make me today and the entire sequence of them that have made me throughout my life, will coalesce again in another Carlos Mora in the next eon. That is exactly Nietzsche's Eternal Recurrence supported by mathematical theorems.
Yes, there is an afterlife. Not a heaven with a Jesus or 72 virgins, but the same thing that we have lived so far. Boltzmann and Poincaré are not in contradiction; they reinforce each other. Since this is true for everybody who has ever lived and those who will live in the future of this eon, what moral responsibilities emerge?
Bret Netherton
Bret Netherton
1 year ago
Awareness is known by awareness alone.
David Hand
David Hand
1 year ago (edited)
The reversibility argument is really goofy because a macro state evolves not from its internal volume but from its surface area, and there is much more surface with lower entropy than higher. The whole idea of the macro state is that you can't distinguish its microstates, so you can't just follow the evolution of any single trajectory. That's meaningless.
I hate to be a hater but your idea about the arrow of time is also stupid. Memories of the future always exist, but they're stored out there in the environment until they converge into your brain. If they're not in your brain, you can't remember them. But the information is always out there, stored in the velocities and positions of various particles, chemical gradients, electromagnetic fields and waves, etc., etc. You experience the information at the "present", the moment that it reaches and creates a correlation in your brain, which you can then recall in the future. Your idea about inferring the past suffers from a lack of causality, or a misunderstanding of it. And it also relies on the previous axiom, which is also stupid.
Flemming Lord
Flemming Lord
1 year ago
If the many worlds interpretation of quantum mechanics is correct, then how does the total entropy across all worlds evolve over time?
Dajon Thomas
Dajon Thomas
1 year ago
Lmao unfortunately there was also a siren going off near me right at the same moment you apologized for the siren. 😂😂😂
David Campos
David Campos
1 year ago
1:30:31 Because you have told us that spin does not have anything to do with rotation, I conclude the word spin is a misnomer. We need two separate words here.
1
dr who
dr who
8 months ago
Yeah, but what's the opposite of entropy? Intelligence? Life? Order? Gravity? And our universe is expanding but our technology is advancing. Which is future and which is past?
Twiztid Soul
Twiztid Soul
6 months ago
What I would give to have a debate with you. Well, we'll call it a debate for lack of better wording. I've got a theory I'd love to bounce off your mind. Btw, just curious: when you were talking about the particles in a box, and temperature, air molecules, etc., imo that's kinda narrow-sighted, in the sense you're only using oxygen. Oxygen in itself is a molecule, therefore a particle. Wouldn't they cancel each other out in the attempt to prove time exists? Not as in time isn't real, but time as we know it isn't. If the entropy line is an arrow forward, using "forward" in the movement sense of the word, could it be that we are the ones making it move? Example: a hypochondriac constantly thinks that there is something direly wrong with them, typically physically, whereas, more often than not, it's a psychological disorder. Person A is constantly "googling" every time they sneeze, thinking they are dying, or something along the lines of illness. In 2021 we call these folks Karens. Now in their mind they are sick. In reality they aren't. If they stop being a hypochondriac they get better. Is it their line of thinking that cured them, or did thinking they are cured fix the hypochondria disorder? Same with entropy: are we giving it existence, or does entropy make time move?
positively curved pikachu 🅥
positively curved pikachu 🅥
1 year ago
cheers from the a s s of the world 🇦🇷
Sylvia Rogier
Sylvia Rogier
1 year ago
Does your un-haircut between last week's Q&A and this week's episode constitute a violation of entropy?
Thomas Barrack
Thomas Barrack
1 year ago
I guess Eric Weinstein really got the message out to the physics community! Sorry, just couldn't help myself on that one =) Eric said the community dropped the ball when it came to teaching the public about the fundamental features of our universe; well, it looks like Sean Carroll has decided to pick it up again. I love that you're educating the public at a level where it's not insulting, but is also extremely clear on a conceptual level, and the fact you give opportunities for questions for those watching to gain clarity when they are curious is just the best public service we could be having in regard to the nation's and the world's future. Thank you for this lecture series; I feel like I'm watching early college-level lectures online for free... from one of my favorite podcasters as well. It's amazing.
Tim M
Tim M
1 year ago
Low Entropy at Early Universe State possibly because fundamental building blocks already at Maximum Entropy, i.e. not accreted yet??
Downhill Phil M.
Downhill Phil M.
1 year ago
this is so easy, like falling off a log-arithm.
1
Constructive Critique
Constructive Critique
1 year ago
What smoke and mirrors are used for: obscure reality to mystify the observer's vision.