Entropy Explained, with Sheep (2016)

(engineersedge.com)

48 points | by Nevin1901 17 hours ago

19 comments

  • dr_dshiv 21 minutes ago

    I did a deep dive on entropy a couple years ago. I found the concept to be much harder to understand than I expected! Specifically, it was confusing to shift from the intuitive but wrong “entropy is disorder” to “entropy is about the number of possible microstates in a macrostate” (Boltzmann Entropy) https://en.wikipedia.org/wiki/Boltzmann%27s_entropy_formula

    I found that Legos provide a really nice example to illustrate entropy, so I’ll share that here.

    Consider a big pile of Legos, the detritus of many past projects. Intuitively, a pile of Legos is high entropy because it is disordered—but if we are trying to move beyond order/disorder, we need to relate it to microstates and macrostates.

    Therefore, a pile of Legos is high entropy because if you randomly swap the positions of the pieces, it is still the same macrostate, i.e. a big pile of Legos. Nevertheless, each Lego piece is still in a very specific position, and if we could snapshot all those positions, that would be the specific microstate. That means the macrostate of the pile has an astronomical number of possible microstates: there are many ways to rearrange the pieces that still look like a pile.

    On the other hand, consider a freshly built Lego Death Star. This is clearly low entropy. In terms of microstates, that is because very few pieces can be swapped or moved without it ceasing to be a Death Star. The entropy is low because very few microstates (specific Lego positions) correspond to the given macrostate (being a Death Star).

    This specific case helped me grok Boltzmann entropy. To extend it, consider a box with a small ice crystal in it: this has far fewer possible microstates than the same box filled with steam. In the steam, molecules can be swapped and moved pretty much anywhere and the macrostate is the same. With the crystal, if you start randomly swapping molecules into different positions, it quickly stops being an ice crystal. So an ice crystal is low entropy.
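
    To make the microstate counting concrete, here is a small Python sketch of my own (a toy model, not anything from the article): N labelled bricks split between the two halves of a box, where the macrostate only records how many bricks are in each half, and Boltzmann's S = k_B ln W counts the microstates compatible with it.

      import math

      k_B = 1.380649e-23  # Boltzmann constant, J/K

      def boltzmann_entropy(num_microstates):
          """S = k_B * ln(W), where W is the number of microstates in a macrostate."""
          return k_B * math.log(num_microstates)

      N = 100  # labelled bricks, each sitting in the left or right half of a box

      # Macrostate: how many bricks are in the left half.
      # Microstate: exactly which bricks are in the left half.
      for n_left in (0, 1, 50):
          W = math.comb(N, n_left)  # microstates compatible with this macrostate
          print(f"{n_left:>3} on the left: W = {W:.3e}, S = {boltzmann_entropy(W):.3e} J/K")

      # The evenly mixed 50/50 "pile" macrostate has astronomically more microstates
      # (and hence more entropy) than the perfectly sorted 0/100 macrostate, whose
      # single microstate gives S = k_B ln 1 = 0.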

  • Maro 4 hours ago

    I've written a few articles about entropy (I'm a physicist working in DS).

    Almost all of them have Python code to illustrate concepts; a minimal coin-toss sketch follows the list below.

    -

    1. Entropy of a fair coin toss - https://bytepawn.com/what-is-the-entropy-of-a-fair-coin-toss...

    2. Cross entropy, joint entropy, conditional entropy and relative entropy - https://bytepawn.com/cross-entropy-joint-entropy-conditional...

    3. Entropy in Data Science - https://bytepawn.com/entropy-in-data-science.html

    4. Entropy of a [monoatomic] ideal gas with coarse-graining - https://bytepawn.com/entropy-of-an-ideal-gas-with-coarse-gra...

    5. All entropy related posts - https://bytepawn.com/tag/entropy.html
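
    For anyone who wants the flavor of post 1 without clicking through, here is a minimal sketch (mine, not code from the posts) of the Shannon entropy of a coin toss:

      import math

      def shannon_entropy(probabilities):
          """H = -sum(p * log2(p)), in bits; terms with p = 0 contribute nothing."""
          return -sum(p * math.log2(p) for p in probabilities if p > 0)

      print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
      print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.469 bits
      print(shannon_entropy([1.0, 0.0]))  # certain outcome: 0.0 bits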

  • LaundroMat 4 hours ago

    So if I get this right, there is a vanishingly small possibility that a cracked egg returns to its initial state. Imagine that happening and being caught on video. We'd all believe we were living in a simulation and had witnessed a glitch.

    No-one would believe the scientists explaining that although highly improbable, the uncracked egg does make scientific sense.

    • speakeron 31 minutes ago

      It's not really whether it makes scientific sense or not, it's just that it's so very highly improbable (really, really improbable) that other explanations make more sense: the video's a fake, it's mass hysteria, or even that we're living in a simulation.

  • kaonwarb 9 hours ago

    I appreciate the explanation, but the very first example doesn't sit well with me. Water forming into ice cubes spontaneously looks weird simply because we’re not used to seeing it. Consider a time-lapse of an icicle forming as a sort of counter-example: https://m.youtube.com/watch?v=mmHQft7-iSU

    (Not refuting entropy as the arrow of time at all, just noting that a visual example is not great evidence.)

    • throwuxiytayq 8 hours ago

      Ah, but the icicle isn’t really equivalent to water undripping and refreezing back into a nice cube-shaped object at presumed room temperature (since our point of reference is the first gif). That would always be weird to see IRL, no matter what. You could watch that gif a million times and you’d still shit your pants if it happened to the glass of water on your desk.

  • NewsaHackO 12 hours ago

    Also, one should check out Tenet, it's a pretty authoritative resource about this as well.

    • ruthmarx 12 hours ago

      How so? That film is a narrative mess with some good action scenes and not much more.

      If you want to reference a relevant sci-fi, I'd say Asimov's The Last Question is a better fit.

  • xtrapol8 13 hours ago

    > entropy is just a fancy word for ‘number of possible arrangements’

    It isn’t though.

    Entropy is a fancy word for potential distribution over negative potential. Negative potential is the “surface area” over which potential may distribute. The “number of possible arrangements” casually fits into this, yet misses some unintuitive possibilities, like the resistive variance or other characteristics not preempted by whoever constructed the intellectual model.

    Idealists insist entropy is a scalar state resolve of delta probability in their model. They are self deceived. Entropy is the existential tendency for potential to distribute toward equilibrium.

    As long as boffins can throw away results that do not correlate, they can insist it is anything they like.

    • ninetyninenine 12 hours ago

      >Entropy is a fancy word for potential distribution over negative potential. Negative potential is the “surface area” over which potential may distribute.

      I don't understand this. Please elucidate.

      • IncreasePosts 11 hours ago

        Consider that if you Google for `"resistive variance" entropy`, the only hit is this HN comment.

        It doesn't make sense because what they wrote makes no sense. Probably some looney with their own definition of entropy.

        • kelnos 8 hours ago

          Your post is a good primer on bullshit detection: if you read something on the internet that sounds confidently authoritative, but you yourself are not well-versed in the subject, find what seems to be some key phrases and search the web for them. If you find lots of other hits on seemingly-reputable sources, then what you've read may be correct. If you only find the thing that you've just read, it's bullshit, with high probability.

          • xtrapol8 6 hours ago

            Entropy is not the number of possible states in a system. Entropy includes outcomes not predicted by the mental model. Convention teaches this incorrectly. I’m the black sheep, bhaaaa.

        • xtrapol8 10 hours ago

          Well, this looney likes to point out that the manifold surface area (which is not always uniform) determines the rate or density of distribution. All of this is accounted for in whatever math is selected for a given instance (“number of possible states” would count them), though a superior definition is the most general one that is technically correct without enumerating exceptions or extraneous clauses.

          Resistance controls the rate of flow of any potential. That you cannot parse English without an exact match of phrase is kind of ironic.

          I guess when arguing convention I shouldn’t be too casual with my own language.

          Entropy is the distribution of potential over negative potential. In every case. Period. All the other words we use describe how this happens in a specific context (thermal or information).

          Can you find an established definition that can be more succinctly regarded as this?

          Also I like insisting it is a phenomenon of existential reality, not a conceptual tool of the human mind.

          • IncreasePosts 8 hours ago

            I'm not interested in reading your word salad.

            • xtrapol8 6 hours ago

              Then neg it like anything else you refuse to believe.

      • xtrapol8 6 hours ago

        The inverse square law is an example of potential distribution, not number of states.

      • xtrapol8 10 hours ago

        Please refer to the comment of this comment!

      • lisper 10 hours ago

        > I don't understand this.

        That's because it's nonsense.