The Failure of the Thermodynamics of Computation (2010)

(sites.pitt.edu)

44 points | by nill0 3 days ago

10 comments

  • svnt an hour ago

    This was published right before people started experimentally validating the Landauer limit. I am not sure why it hasn’t been taken down at some point as the evidence has accumulated:

    2012 — Bérut et al. (Nature) — They used a single colloidal silica bead (2 μm) trapped in a double-well potential created by a focused laser. By modulating the potential to erase the bit, they showed that mean dissipated heat saturates at the Landauer bound (k_B T ln 2) in the limit of long erasure cycles.

    https://www.physics.rutgers.edu/~morozov/677_f2017/Physics_6...

    2014 — Jun et al. (PRL) — A higher-precision follow-up using 200 nm fluorescent particles in an electrokinetic feedback trap. Same basic physics, tighter error bars.

    https://pmc.ncbi.nlm.nih.gov/articles/PMC4795654/

    2016 — Hong et al. (Science Advances) — First test on actual digital memory hardware. Used arrays of sub-100 nm single-domain Permalloy nanomagnets and measured energy dissipation during adiabatic bit erasure using magneto-optic Kerr effect magnetometry. The measured dissipation was consistent with the Landauer limit within 2 standard deviations, in the medium that is the actual basis of magnetic storage.

    https://www.science.org/doi/10.1126/sciadv.1501492

    2018 — Gaudenzi et al. (Nature Physics) — Opens with:

    > The erasure of a bit of information is an irreversible operation whose minimal entropy production of k_B ln 2 is set by the Landauer limit. This limit has been verified in a variety of classical systems, including particles in traps and nanomagnets. Here, we extend it to the quantum realm by using a crystal of molecular nanomagnets as a quantum spin memory and showing that its erasure is still governed by the Landauer principle.

    https://www.nature.com/articles/s41567-018-0070-7

    The Landauer limit is not conjecture.
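    For a sense of scale, the bound these experiments probe is tiny. A quick back-of-the-envelope in Python (the constants are standard CODATA/SI values, not numbers taken from the papers; 300 K is an illustrative room temperature):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0           # illustrative room temperature, kelvin

# Landauer bound: minimum mean heat dissipated to erase one bit
E = k_B * T * math.log(2)

print(f"{E:.3e} J")                      # ~2.87e-21 J (a few zeptojoules)
print(f"{E / 1.602176634e-19:.4f} eV")   # ~0.0179 eV
```

    That is roughly a thousand times below the switching energy of today's CMOS transistors, which is why these measurements needed single beads and nanomagnets to resolve it.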

    • ogogmad an hour ago

      I'm not sure, but isn't 2 standard deviations a bit low? Especially for something that can be done in a lab. 2 SD seems like the minimum threshold for getting published. Can we be sure these results were properly reviewed?

      • spocchio an hour ago

        Could it be that you're confusing this with the number of standard deviations one needs to falsify something? To show that two things are different, we want them to be as many SD apart as possible. Here, on the other hand, the data agree _within_ 2 SD.
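        To make the distinction concrete, a small sketch with made-up numbers: "consistent within 2 SD" means the z-score of the measurement relative to the prediction is small, which is the opposite of the many-sigma separation you would want when claiming two quantities differ.

```python
def z_score(measured, predicted, sigma):
    """How many standard deviations the measurement sits from the prediction."""
    return abs(measured - predicted) / sigma

# Hypothetical erasure measurement vs. the Landauer prediction (arbitrary units)
z = z_score(measured=1.05, predicted=1.00, sigma=0.04)
print(z)        # 1.25
print(z < 2.0)  # True: consistent with the prediction within 2 SD
```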

  • svantana 4 hours ago

    It's an interesting article but I fail to see the point they are trying to make. I always thought of reversible computing as a sort of platonic ideal that cannot truly exist in real life, but the principle can still be used to reduce waste heat and energy use. For example, it will be interesting to see if the chips from Vaire ever become practically useful:

    https://vaire.co/

    https://spectrum.ieee.org/reversible-computing

    • cwillu 33 minutes ago

      > I always thought of reversible computing as a sort of platonic ideal that cannot truly exist in real life

      It's been experimentally demonstrated. Practical or not, the effect is real.

    • smitty1e 2 hours ago

      From the abstract, the idea is that we can continue to shrink: "...in a manner in which no thermodynamic entropy is created or passed to the surroundings."

      The objection seems to be to the "free lunch" assumptions being made about shrinkability.

      "What Is TANSTAAFL?" https://youtu.be/ZrZUe7R44eA?si=oK2H1L9ha1zQhDOh

  • debatem1 3 hours ago

    The author should write a followup article about how theory of computation has failed because nobody makes a Turing machine with enough tape.

  • oh_my_goodness 2 hours ago

    Very clear intro to this notoriously slippery area.

  • ogogmad 2 hours ago

    I think I arrived at the same suspicion independently. It came when I was trying to understand thermodynamic entropy as an instance of Shannon entropy, where the latter is defined abstractly as a property of probability distributions, which left me wondering where the thermodynamic probabilities came from: were they supposed to be subjective probabilities, or derived from ensembles? Then I recalled that entropy was originally defined non-probabilistically as dS = (1/T)δQ, and started reading about Boltzmann distributions as a bridge between Shannon's entropy and entropy in the earlier sense (Clausius entropy). I concluded that instead of thinking about bits and bytes, it was much easier to think about gases and machines doing work, like a 19th century engineer building, er, engines.

    I am pretty ignorant of this field.
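    The bridge between the two notions of entropy can be sketched directly: the Gibbs entropy S = -k_B Σ p_i ln p_i is Shannon's formula in thermodynamic units, and for a uniform distribution over Ω microstates it reduces to Boltzmann's S = k_B ln Ω. A minimal sketch (the two-level energy gap is a made-up illustrative value, not from any source in the thread):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_probs(energies, T):
    """Boltzmann distribution: p_i proportional to exp(-E_i / k_B T)."""
    weights = [math.exp(-E / (k_B * T)) for E in energies]
    Z = sum(weights)  # partition function
    return [w / Z for w in weights]

def gibbs_entropy(probs):
    """S = -k_B * sum(p ln p): Shannon entropy in thermodynamic units."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# Uniform case: Gibbs entropy reduces to Boltzmann's S = k_B ln(Omega)
omega = 4
S_uniform = gibbs_entropy([1 / omega] * omega)
print(abs(S_uniform - k_B * math.log(omega)) < 1e-30)  # True

# Two-level system at 300 K with a small (made-up) energy gap:
# entropy is close to, but strictly below, the maximum k_B ln 2
probs = boltzmann_probs([0.0, 1e-21], T=300.0)
print(gibbs_entropy(probs) < k_B * math.log(2))  # True
```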

    • cwillu 32 minutes ago

      The effect has since been experimentally demonstrated.