Wednesday, October 10, 2012

Entropy and Einstein's turnover time

A failed attempt at explaining Entropy,
and one zombie, coming right up...

And so it was, on a nice evening, much like this one, that we had all sat around the table, and a question popped up...

The question was along the lines of "what is the Entropic principle?", and it was asked by my brother, a brilliant man, and a science-fiction aficionado, who unfortunately for the physicist community, never had the chance to dabble with physics, and so they have to find a poor substitute in the image of your humble servant here...

At first I asked if he meant the Anthropic principle, but he just wanted to understand Entropy.

Thus I found myself trying to explain Entropy, and the 2nd law of thermodynamics to the uninitiated, in layman terms, and, after a fashion, follow the Einstein grandmother rule - "You do not really understand something unless you can explain it to your grandmother." (A.Einstein).

Incidentally, you could probably calculate the period \(T\) of Einstein turning over in his grave whenever someone misquotes him, deifies him, or justifies a falsehood or plain ol' stupidity by attributing something to old Einei that he never would have meant in a million light-years.

We'll start with some observational data - I have around 400 people in my human network, and on average I get an Einstein quote that falls under the aforementioned category maybe once every two weeks.

Now, suppose only a third of the world's population leads a somewhat Western lifestyle (either connected to Facebook, Google+, Twitter, etc., or alternatively reads the paper or listens to the radio at least once a day) - that gives us about 2.3 billion people.

Let's be harsh and assume the 400-person sub-networks are completely disjoint, so we can neglect back-propagation; that puts a lower limit of
\[\frac{2.3\cdot 10^{9}}{400}=5.75\cdot 10^{6}\,\text{instances in 2 weeks}\]
Dividing by the number of seconds in a two-week period, we get:
\[\frac{5.75\cdot 10^{6}}{14\cdot 24\cdot 60\cdot 60}\approx 4.75\,\text{times per second}\]

And so the period of Einstein turning over in his grave would be \(T\approx 0.21\,\text{s}\).
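If you'd rather let a machine do the arithmetic, here's the same back-of-envelope calculation as a few lines of Python (the numbers are the assumptions from above; the variable names are mine):

```python
# Back-of-envelope: how often does Einstein turn in his grave?
population = 2.3e9                      # people leading a "western" lifestyle
network_size = 400                      # assumed size of each sub-network
two_weeks = 14 * 24 * 60 * 60           # seconds in a two-week period

instances = population / network_size   # misquote instances per two weeks
rate = instances / two_weeks            # misquotes per second
period = 1 / rate                       # seconds per turn: T

print(f"{instances:.2e} instances, {rate:.2f} per second, T = {period:.2f} s")
```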

So basically even the lower limit states that Einstein, by now, is a zombified Olympic athlete, even considering the initial rigor mortis....

By now, he would have a solid six-pack.

Anyway, I digress, I was going to explain Entropy and then a random rant stole my attention... sorry for that.

In a nutshell, Entropy is a measure of disorder, and I will explain.

<Failed attempt at an explanation : but is still worth a read>

Imagine a group of four coins, each with two sides - heads and tails - right? (We'll have none of that Two-Face shenanigans here!)
Now suppose every coin is perfectly balanced, so there's a fifty-fifty chance of getting heads or tails on each flip.

So, now, what are the chances of getting all 4 heads when you flip 4 coins?
If you do the experiment enough times, you'll find it happens on average 1 time in 16.
The state of all heads, or equivalently all tails, is the most "ordered" result. Why?
Because it is the most homogeneous result (and we humans like homogeneity, symmetry, and by the same token, order).
Now, what's the most probable result?
That's easy - the result where two coins are tails up and two coins are heads up (regardless of their locations), which ideally happens \(\frac{3}{8}\) of the times you flip (almost half the times you flip the coins, you'll get this result).

That is the least "ordered" result: since we don't care about locations, the coins show the most diversity in results.
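If you don't trust the fractions, you can brute-force all \(2^4=16\) outcomes in a few lines of Python (a sketch of the same count, nothing more):

```python
from itertools import product
from collections import Counter

# Brute-force all 2**4 = 16 outcomes of flipping 4 fair coins
# and tally how many heads each outcome contains.
counts = Counter(flips.count('H') for flips in product('HT', repeat=4))

print(counts[4] / 16)   # P(all heads) = 1/16
print(counts[2] / 16)   # P(2 heads, 2 tails) = 6/16 = 3/8
```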

Now, I won't go through the whole derivation, but if you try the same logic with 6 coins and then 8 coins, you'll get a breakdown of \[\frac{1}{64},\frac{6}{64},\frac{15}{64},\frac{20}{64},\frac{15}{64},\frac{6}{64},\frac{1}{64}\,\text{for six coins}\]
and a breakdown of \[\frac{1}{256},\frac{8}{256},\frac{28}{256},\frac{56}{256},\frac{70}{256},\frac{56}{256},\frac{28}{256},\frac{8}{256},\frac{1}{256}\,\text{for eight coins}\]
And so on and so forth. The reason I'm sticking to even numbers is that it LOOKS clearer that way, but really, it makes no difference. You could go on until kingdom come, and you'll find that the middle, most unorganized result is the most common.
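In case you want the breakdown for any number of coins without doing the derivation by hand: the counts are just binomial coefficients, which Python can produce directly (`math.comb` needs Python 3.8+):

```python
from math import comb  # Python 3.8+

# The breakdown for n coins: the number of ways to get k heads is C(n, k),
# out of 2**n equally likely outcomes.
def breakdown(n):
    return [comb(n, k) for k in range(n + 1)]

print(breakdown(6))   # [1, 6, 15, 20, 15, 6, 1]         (out of 64)
print(breakdown(8))   # [1, 8, 28, 56, 70, 56, 28, 8, 1]  (out of 256)
```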

It turns out that this kind of dynamic is best approximated by a Gaussian function called the \(g\) function (or the multiplicity function), and I'll spare you the details in favor of a graph:
[Graph: probable results - overlay of the relative probability curves]
So what you see here is basically an overlay of 4 graphs showing the relative probability of results as they stray from the middle "disorganized", most probable result.

What is interesting is that the bigger the experiment (say 100 or 1000 coins instead of 8), the sharper the peak: it's narrower and higher with respect to the other possible results. That means, by the way, that the most probable result is highly probable, and the others highly improbable. Now imagine an experiment of \(10^{23}\) coins: every result other than the most probable one and its immediate neighbors is SO improbable, it is virtually IMPOSSIBLE (in the sense that it would take a ludicrously impossible number of experiments to have a significant chance of ever getting such a result).
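You can see the peak sharpening numerically: the fraction of outcomes landing within some fixed percentage of the middle grows rapidly with the number of coins. A quick sketch (the 5% window is my arbitrary choice):

```python
from math import comb

def prob_within(n, frac):
    """Probability that the heads count lands within frac*n of the middle n/2."""
    lo, hi = n / 2 - frac * n, n / 2 + frac * n
    return sum(comb(n, k) for k in range(n + 1) if lo <= k <= hi) / 2 ** n

# The probability mass concentrates around the middle as n grows:
for n in (8, 100, 1000):
    print(n, prob_within(n, 0.05))
```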

A word of caution though - this is probability we're talking about, so in theory a highly organized result MIGHT happen, in actuality - yeah, not so much...

By the way, there are no more than roughly \(6\cdot 10^{14}\) coins in circulation in THE WORLD today, meaning that even if you took all the coins in the world, you couldn't perform such an experiment even once!

Incidentally, the ridiculously high number of participants in a single experiment makes all the difference between "hard sciences" (even when they are statistically oriented) and "soft sciences".

Even if we take all the people in the world and get them to participate in one of our experiments, the result will be some correlation that may or may not apply to a single participant.

In physics, while the same is true, a statistical result applies and is absolutely correct on a macro level (with deviations so small as to be insignificant for most purposes), and correct almost every time on a micro level as well!

So anyway, Entropy is defined as the logarithm of the multiplicity function.

The reason for taking the logarithm is to define a cumulative (additive) quantity, as opposed to a multiplicative one.
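As a tiny sanity check of that additivity claim - multiplicities of independent systems multiply, so their logarithms add (the helper name `entropy` is mine, and I'm dropping Boltzmann's constant):

```python
from math import comb, log, isclose

# Entropy as the log of the multiplicity g (up to Boltzmann's constant).
def entropy(g):
    return log(g)

# Multiplicity of the most probable (2H/2T) state of a 4-coin system:
g_single = comb(4, 2)            # 6 ways

# Two independent 4-coin systems: the multiplicities multiply...
g_both = g_single * g_single     # 36 ways

# ...so the entropies add - that's the whole point of the logarithm.
print(entropy(g_both), entropy(g_single) + entropy(g_single))
```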

< /Failed attempt at an explanation : but is still worth a read> 

So anyway, obviously I failed at this attempt, but let's try a simpler approach:

Entropy is a quantity that signifies how probable a result is.
By a fluke of chance - which isn't a fluke at all, more of a deep connection really - the most probable result is also the most diverse one, or put differently, the most disorganized.

Thus, Entropy becomes a measure of disorder of a system.

Entropy is a cumulative property in the sense that when two non-interacting experiments are done, the combined entropy is the sum of the individual entropies.
However, when the systems are allowed to interact, the combined entropy is typically larger than that sum.

It is in that sense that entropy tends to increase over time (and interactions).

<example of entropy increase>

Suppose, we have two systems, each of 4 coins.

The most probable state of a single experiment is 2 heads and 2 tails, right?
As explained in the failed attempt above, the chance of that happening is \(\frac{3}{8}\).

Now, what is the chance of each of the experiments reaching the most probable state independently? You guessed it - the product of the two independent probabilities, i.e. \(\frac{9}{64}\), right?

OK, but now let's put all the coins in a single experiment and flip all of them. The chance of hitting a 4/4 division of heads/tails is \(\frac{70}{256}=\frac{35}{128}\), which is almost double the product of the individual probabilities.

So what happened here, really?
In essence, the combined system has more places to choose from - more diverse scenarios leading to the same end result - so the combined system is more "disordered", which means a bigger probability, which leads to a bigger Entropy.
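Here's the whole example redone in Python with exact fractions, so you can see the factor of almost two for yourself (variable names are mine):

```python
from math import comb
from fractions import Fraction

# Most probable state of one 4-coin system: 2 heads, 2 tails.
p_single = Fraction(comb(4, 2), 2**4)      # 6/16 = 3/8

# Two separate 4-coin systems both hitting their most probable state:
p_independent = p_single * p_single        # 9/64

# One combined 8-coin system hitting the 4/4 heads/tails split:
p_combined = Fraction(comb(8, 4), 2**8)    # 70/256 = 35/128

print(p_independent, p_combined, float(p_combined / p_independent))
```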

</example of entropy increase>

OK, that wraps it up for this time.
Obviously I don't understand Entropy enough, since I feel I have failed at explaining it,
but I will try again ("if at first you don't succeed" etc...).

Oh, by the way, for all us \(\LaTeX\) geeks out there, isn't it cool that every time we use a \frac, we immediately get a reference to Battlestar Galactica?

Anyway, till next time...

P.S. Some quick ideas to utilize Einei's incredible turnover time of 0.21 seconds:
1. Attach him to a turbine and generate electricity.
2. Display him as the 8th world wonder - the quickest man on the planet (faster than Usain Bolt).
3. Use him as the engine for a horse-ride carousel for my kid.
4. Use him to refute the 2nd law of Thermodynamics, as he is both dead (so his Entropy is supposed to increase) and in the greatest shape of his life (which requires work and a decrease in local Entropy)...

Just sayin'...

