If entropy is increasing, does it mean the universe is non-deterministic?


Solution 1

Your question is an excellent one and I think your IT approach is spot on. The contradiction you rightly point out arises partly from subtle differences between the several Conditional Entropies (Conditional Informations) at play in this discussion.

Leaving aside Universes for the moment, let's instead think about a truly thermodynamically isolated system. If that system is deterministic then, as you say, one can always compute its state at any time given a full specification of its beginning state. The shortest lossless description of the system then measures the amount of "information" in it: this is its Kolmogorov Complexity, defined relative to the language you use to specify the compression algorithm. Be aware that an absolute measure of information is not definable in these terms, only a measure relative to a chosen language, but I should think that you appreciate that fact.
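
As a rough illustration of this compression picture (only a sketch: Python's standard-library zlib compressor stands in here for an ideal, Kolmogorov-optimal description, which is not actually computable), a highly patterned message compresses down to almost nothing, whereas output from a genuinely random source does not compress at all:

```python
import os
import zlib

# A highly regular message: a million repeated bytes. Its shortest description
# ("one million copies of 'a'") is tiny, and even a practical compressor gets close.
regular = b"a" * 1_000_000
print(len(zlib.compress(regular)))   # on the order of a thousand bytes

# A message drawn from a random source: no pattern to exploit, so the
# "compressed" form is essentially as long as the original.
noisy = os.urandom(1_000_000)
print(len(zlib.compress(noisy)))     # roughly a million bytes
```

This is your point exactly: a message that can be derived from what we already have adds essentially no new information.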

As you rightly point out, this measure of information does not change with time: it specifies once and for all the system's history.

If there are $\Omega$ possible microstates at some time, and if we can assume the Ergodic Hypothesis that all are equally likely, then this minimum information is $\log_2\Omega + \Delta$ bits: $\log_2\Omega$ bits to write down the integer in $1,\,2,\,\cdots,\,\Omega$ naming which microstate the system is in, plus a fixed-length "foreword" or "header" of $\Delta$ bits that describes how to decode that integer into the relevant microstate. As $\Omega$ gets mightily big, $\log_2\Omega$ overwhelms $\Delta$, so we can make the proportional error in the storage needed to define the microstate arbitrarily small by looking at bigger and bigger systems; we therefore often forget about $\Delta$. This justifies the somewhat sloppy, often-cited statement that the storage needed is simply $\log_2\Omega$ bits.
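
To make the $\log_2\Omega + \Delta$ bookkeeping concrete, here is a toy sketch under my own illustrative assumptions (a hypothetical fixed header of $\Delta = 1024$ bits and a system of $N$ two-level particles, so that $\Omega = 2^N$):

```python
import math

DELTA = 1024  # hypothetical fixed "foreword"/"header" length in bits, for illustration only

for n in (10, 100, 10_000, 1_000_000):
    omega = 2 ** n                   # number of microstates of n two-level particles
    name_bits = math.log2(omega)     # log2(Omega) bits to name which microstate we are in
    header_fraction = DELTA / (name_bits + DELTA)
    print(f"N={n:>9}  log2(Omega)={name_bits:>12.0f} bits  header fraction={header_fraction:.2e}")
```

As $N$ grows, the header's share of the total shrinks towards zero, which is why the $\Delta$ term is usually dropped.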

But $\log_2\Omega$ is not the same as the thermodynamic entropy, which is defined wholly in terms of macroscopic state variables such as temperature, pressure, volume, moles of this chemical reactant here and moles of that reactant there and so forth. This is the Conditional Entropy (Conditional Information): the entropy conditioned on knowledge of these macrostate parameters. Not all of the possible states of an isolated system are consistent with a given macrostate observation, so the number of states the system might be in, once we know its macrostate, is in general smaller than $\Omega$. The system's thermodynamic entropy is therefore in general smaller than $\log_2\Omega$, and moreover this conditional entropy can change with time as the system's microstate evolves.
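
A small worked example of this point, under my own toy assumptions (a "gas" of $N = 100$ two-level particles whose macrostate is simply the number $k$ of particles in the upper level): only $\binom{N}{k}$ of the $2^N$ microstates are consistent with a given $k$, so the conditional entropy $\log_2\binom{N}{k}$ is never more than the unconditional $\log_2\Omega = N$ bits, and it changes as $k$ does.

```python
import math

N = 100                    # toy system: 100 two-level particles
log2_omega = N             # log2 of all 2**N microstates: the unconditional information

for k in (0, 10, 50):      # k = observed macrostate: number of particles in the upper level
    consistent = math.comb(N, k)          # microstates compatible with this macrostate
    conditional_bits = math.log2(consistent)
    print(f"k={k:>3}  conditional entropy = {conditional_bits:6.1f} bits  <=  {log2_omega} bits")
```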

Now it can be shown, from the laws of large numbers alone, that for a system of $N$ statistically independent particles, the number of bits per particle $\frac{1}{N} \log_2 \Omega$ needed to specify the system's microstate is arbitrarily near to the Shannon entropy per particle of the a priori maximum likelihood microstate, with the two quantities equal in the limit as $N\to\infty$. What this means is that in any large system of particles there are states that look almost exactly like maximum a priori likelihood states, and almost nothing else. Microstates significantly different from the maximum likelihood one do of course exist, but they form a fantastically small proportion of the total, so that if the system for whatever reason finds itself in one of these unlikely states, any evolution of the system will almost certainly carry it nearer to the maximum likelihood one.

These unusual states might be extremely simple to specify: our $\log_2\Omega$ bits in computer memory set to all noughts, for example, or a pellet of 1 mole of native sodium in a bucket of water, so their conditional information - their thermodynamic entropy - will be very small. But the system, by the laws of large numbers, must, in any evolution, almost certainly increase its conditional entropy, simply because the maximum likelihood states are the overwhelming majority of all the microstates.
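
The concentration this argument leans on can be seen numerically in the same toy two-level system (again my own illustrative setup, not anything from the question): the fraction of all $2^N$ equally likely microstates whose proportion of "up" particles lies within a few percent of the maximum-likelihood value $\tfrac{1}{2}$ rushes towards one as $N$ grows, so almost every microstate "looks like" a maximum-likelihood one.

```python
import math

def near_typical_fraction(n, tolerance=0.05):
    """Fraction of the 2**n equally likely microstates whose fraction of 'up'
    particles lies within `tolerance` of the maximum-likelihood value 1/2."""
    low = math.ceil(n * (0.5 - tolerance))
    high = math.floor(n * (0.5 + tolerance))
    count = sum(math.comb(n, k) for k in range(low, high + 1))
    return count / 2 ** n

for n in (10, 100, 1_000, 10_000):
    print(f"N={n:>6}  fraction of microstates near maximum likelihood = {near_typical_fraction(n):.6f}")
```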

This is a sketch of a proof of a weak form of the second law of thermodynamics: that an isolated system's thermodynamic entropy will almost certainly increase with time, whilst its unconditional entropy $\log_2\Omega$ does not.

To read more about these ideas, please see my answer to the Physics SE question "Why are the laws of thermodynamics “supreme among the laws of Nature”?"

At the level of universes, though, I am not a cosmologist, so I don't know for sure what the general opinion is as to whether the Universe is isolated (in the thermodynamic sense) or not, whether it is infinite or finite, what its global topology is and so forth. But if the Universe is truly an isolated system and if, as many physicists believe, it can ultimately be thought of as a gigantic evolving quantum state, then its state at any time is defined by a unitary transformation of its state at any other time. "The World does not forget its history." So the same ideas as above would apply. We simply make the experimental observation that, somehow (and this somehow is quite a mystery to modern cosmology), the Universe found itself in an exquisitely low thermodynamic entropy state at the big bang and, through its unitary state evolution, has been moving further and further away from that state ever since.

Some references which you might find helpful and interesting are the following.

E. T. Jaynes, "Information Theory and Statistical Mechanics", Phys. Rev. 106, number 4, pp 620-630, 1965

E. T. Jaynes, "Gibbs vs Boltzmann Entropies", Am. J. Phys. 33, number 5, pp 391-398, 1965 as well as many other of his works in this field

The Gibbs entropy is the invariant one for an isolated system; the Boltzmann entropy is the one defined from the experimentally measured macrostate variables.

Solution 2

My two cents to add to the great response from wetsavanna: a short explanation would be the following. Technically, information never increases or decreases at the microscopic level. Entropy is a measure of how much a macroscopic observation leaves unknown about the microscopic level. But the relationship between entropy and information is the opposite of what you stated: the more entropy, the less information we have (about the microscopic level, from a macroscopic observation). Thus the "apparent" information is decreasing, not increasing, as entropy increases.
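
To spell out the sign of that relationship in standard information-theoretic notation (the labels are my own: $X$ for the microstate, $M$ for the macrostate):

$$H(X) = I(X;M) + H(X \mid M),$$

so if the total microstate information $H(X)$ is fixed, as in Solution 1's deterministic picture, then a growing conditional entropy $H(X \mid M)$ (the thermodynamic entropy) means a shrinking mutual information $I(X;M)$: the macrostate tells us less and less about which microstate the system is in.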


Comments

  • zduny, almost 3 years ago

    I watched some video where they said entropy can be considered as information. They also stated that the universe's entropy is always increasing...

    Now here comes the problem my IT mind can't understand: if entropy is increasing, in other words the universe is conveying more and more information, doesn't that mean it is ultimately non-deterministic?

    I mean, for new information to be introduced it must come from some random source, otherwise it's not new information - if it were derived, it could be compressed to what we already had - so no new information...

    So what am I missing?

    • Hypnosifl, almost 9 years ago
      Are you familiar with the difference between microstates and macrostates? The increase in information is essentially an increase in the number of bits required to specify the microstate if you are first given the macrostate.
    • nir, over 8 years ago
      Is this the video? youtube.com/watch?v=sMb00lz-IfE
    • zduny, over 8 years ago
      Yep. That's the video.
  • zduny, almost 9 years ago
    Could you explain what you mean by "more entropy the less information we have"? Because I believe that in IT higher entropy in fact means more information - the higher the disorder, in other words the fewer patterns you can observe in a given message (the more it resembles noise), the more information it contains (patterns can be described in a shorter way, noise cannot be compressed). The fact that noise is usually not useful information is another matter, but it doesn't change the fact that it is maximum information...
  • aveline de grandpre, almost 9 years ago
    No, the more random the system, the less information you have about which of the many possible microstates consistent with a given macrostate it is actually in. But if you do not believe me you should read Wikipedia; I am pretty sure it is much better explained there: en.wikipedia.org/wiki/Entropy_%28information_theory%29