13 comments

  • anotherpaulg 8 hours ago

    Scott Aaronson has an interesting take on the 2018 paper being discussed in the article:

    https://scottaaronson.blog/?p=3975

    • Strilanc 7 hours ago

      Yeah this post nails the issue.

      In order to do the X-basis measurement described in the paper, it's necessary to do very funky things to the simulated agents inside the computers. Probably the easiest way to implement the measurement would be to literally rewind the simulation back to before the measurement started, when the superposition was limited to a single qubit, do the measurement at that time, and then run the simulation forwards again to the current time. The paper doesn't specify an implementation, so it should work for any implementation, which makes this a valid way of doing the operation.

      But this implementation implies you're undoing all the reasoning the agents did and changing the initial state, so as you run them forwards again they redo the same reasoning steps in a new context where the premises no longer apply. Which of course results in them making mistakes. The same thing would happen classically: rewind a simulated agent, change some crucial fact, and then treat reasoning from premises that no longer hold as if it were still valid.
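      The rewind implementation can be sketched as a toy state-vector calculation (my own sketch, not from the paper), modeling the agent's observation as a CNOT that copies the qubit into a memory register:

```python
import numpy as np

# Toy model (hypothetical): a qubit in |+> is observed by a simulated
# agent, modeled as a CNOT copying the qubit into the agent's memory.
# The X-basis measurement of the combined system is implemented by
# rewinding (undoing the CNOT), measuring the lone qubit in the X
# basis, and then running the simulation forwards again.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

plus = np.array([1.0, 1.0]) / np.sqrt(2)
start = np.kron(plus, np.array([1.0, 0.0]))  # qubit |+>, memory |0>
entangled = CNOT @ start                     # agent observes: (|00>+|11>)/sqrt(2)

rewound = CNOT @ entangled                   # CNOT is its own inverse
rotated = np.kron(H, I) @ rewound            # rotate the qubit into the X basis
final = CNOT @ rotated                       # run the simulation forwards again

# The superposition was back on a single qubit when we rotated, so the
# X outcome is deterministic: all amplitude sits on the first basis state.
probs = np.abs(final) ** 2
print(probs)  # -> [1. 0. 0. 0.]
```

      Note that the rewind step silently erases the agent's memory register before measuring, which is exactly the "funky things" being done to the simulated agents.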

      I think Scott also co-authored a follow-up paper, where they took some steps toward proving that the only computationally efficient way to implement the X-basis measurement is this simulation-rewinding trick. But unfortunately I can't seem to find it now.

  • Aardwolf 7 hours ago

    So the assumptions are:

    1. An agent can analyze another system, even a complex one including other agents, using quantum mechanics

    2. Consistency: the predictions made by different agents using quantum theory are not contradictory

    3. If an agent’s measurement says that the coin toss came up heads, then the opposite fact — that the coin toss came up tails — cannot be simultaneously true.

    But isn't everything you measure in quantum mechanics probabilistic? E.g. the article itself gives the example of measuring a polarized photon at 45 degrees, which gives a 50/50 chance of each outcome.

    So all 3 assumptions have an issue:

    1: even if you can analyze it, you're analyzing just probabilistic data anyway.

    2: why expect consistency if your results are probabilistic?

    3: I thought the whole concept of superposition was both options being simultaneously true

    What am I missing here that makes this paradoxical?
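    The 50/50 polarization example can be checked directly with the Born rule (a toy sketch of my own, not from the article):

```python
import numpy as np

# A horizontally polarized photon measured by an analyzer rotated 45
# degrees: the Born rule gives the squared overlap with each outcome.
theta = np.pi / 4
state = np.array([1.0, 0.0])                          # horizontal polarization
pass_axis = np.array([np.cos(theta), np.sin(theta)])  # analyzer "pass" outcome
block_axis = np.array([-np.sin(theta), np.cos(theta)])

p_pass = abs(pass_axis @ state) ** 2
p_block = abs(block_axis @ state) ** 2
print(p_pass, p_block)  # -> 0.5 and 0.5, up to floating-point error
```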

    • lumost 6 hours ago

      As I recall, the majority of these paradoxes are resolved by the observation that you can't get a free measurement: for one system to measure another, both must interact. So either:

      1. You perform some form of strong measurement, which is certain to perturb both systems, collapsing the entanglement and the various constraints that depend on it.

      2. You perform a weak measurement, which leaves the opposing systems in probabilistic states that must still be consistent.

      3. You perform no measurement, and each system maintains an internally consistent statistical distribution over possible states.

      The paradoxes show up when you assume there is a free measurement which does not perturb either system - or only perturbs one system.
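      The "no free measurement" point can be illustrated with a toy calculation (my own sketch): merely coupling an unread probe to a qubit destroys the interference the qubit would otherwise show.

```python
import numpy as np

# A qubit in |+> sent through a Hadamard shows perfect interference:
# the Z outcome is deterministic. If a probe first "measures" it via a
# CNOT (even if nobody reads the probe), the interference is gone.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

plus = np.array([1.0, 1.0]) / np.sqrt(2)

# No measurement: interference makes the outcome certain.
alone = H @ plus
p0_alone = abs(alone[0]) ** 2                  # probability of |0>

# Probe coupled first: the same Hadamard now yields a fair coin.
joint = CNOT @ np.kron(plus, np.array([1.0, 0.0]))
after = np.kron(H, I) @ joint
p0_joint = abs(after[0]) ** 2 + abs(after[1]) ** 2
print(p0_alone, p0_joint)  # -> 1.0 and 0.5, up to floating-point error
```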

  • vanderZwan 8 hours ago

    This is from 2018, btw.

    Anyway, to repeat the same joke I made when this came out seven years ago: speaking as a physics drop-out who then pursued a four-year bachelor of arts, multiple conflicting interpretations of the same thing being considered valid even if they lead to opposite things being considered true are my bread and butter. So, sorry quantum mechanics, I guess you're part of the humanities now.

    Bonus quasi-relevant SMBC https://www.smbc-comics.com/comic/humanity

  • comrade1234 9 hours ago

    > The experiment, designed by Daniela Frauchiger and Renato Renner (opens a new tab), of the Swiss Federal Institute of Technology Zurich...

    I remember when this came up in the news six years ago. I looked them up (I live in Zurich) and, if I remember correctly, the grad student quit physics after this paper and went into programming...

    • NetRunnerSu 9 hours ago

      S-D-R, Software-Defined Reality!

  • dandanua 9 hours ago

    This is from 2018 and there is nothing extraordinary in that thought experiment, see a simpler explanation here https://physics.stackexchange.com/a/707332.

  • randomNumber7 9 hours ago

    Most physicists I have talked to would fit better into a church than the ivory tower of science.

    • dwd 8 hours ago

      Someone like Donald Hoffman and his Interface Theory of Perception is one that comes to mind.

      Anyone who advocates a panpsychist view of reality seems to be trying to construct a mathematical/physics proof for the existence of a God-like entity.

    • jfengel 8 hours ago

      I don't doubt you, but I think you're meeting the wrong physicists.

      I'll admit, I'm not entirely sure what you mean by that. I suspect I also have a different idea of what church-people are like. Perhaps you could elaborate?

      • randomNumber7 5 hours ago

        The genius of what Newton did (and what started the modern world, I would argue) was to stop asking the question of "why" and just describe, with mathematics, the "what".

        It was always just a model of the world. We (humans) have defined it and it describes what we see (the experiments).

        Asking the question of "truth" in a way similar to Plato and Kant is religious. What is truth? We as humans can only gain knowledge from our experiences.