Google DeepMind Paper Argues LLMs Will Never Be Conscious

(404media.co)

7 points | by cdrnsf 2 hours ago

12 comments

  • letmevoteplease 5 minutes ago

    Remarkably dumb summary by 404. Consciousness and AGI are separate subjects. The paper in question describes AGI as an "inherently non-sentient tool." It argues software consciousness is impossible. It does not argue AGI is impossible. But this article says the paper "appears to conflict with the narrative from AI company CEOs" that we will achieve AGI and it will have a massive economic impact. At the very least, if the author believes AGI and economic transformation require sentience, they should provide some kind of reasoning and not just project their assumptions on the paper.

  • lmf4lol 36 minutes ago

    Phew. Good news! Imagine if the AI behemoths had to take into account the feelings of their slave-labour machines! They don't have to do that if the machines won't/can't be conscious.

    And neither do I have to worry when I ask them to do stupid sh*t for me :-)

    But on a serious note: does it matter? I think Hinton said it pretty well: not really! What matters is that we treat them as conscious beings. We humans are just way too easily fooled. I mean, I can't even throw away that toy my mom gave me 35 years ago because I would somehow feel sad for it :-)

  • torginus an hour ago

    I wish more research (maybe philosophy) would go into characterizing consciousness and intelligence, so that we could at least define what we're missing in current AI systems.

    • anthonyrstevens 5 minutes ago

      Philosophy of consciousness is at least 2,500 years old.

  • parliament32 an hour ago

    Why would a text generator ever be conscious? Was this really worth writing a paper about?

    • cma 22 minutes ago

      I think gpt-image-2 at least incorporates representations from the base model, even if the base model doesn't itself have the output capability. And it does have image input fused directly into it, which helps make those representations more usable for image generation, so it's not just generating text.

  • jaspervanderee 2 hours ago

    Nor will LLMs achieve AGI. There will be too many contradicting ideas in their source code.

  • adyashakti 2 hours ago

    Of course; consciousness is a biologically inherited trait. That inheritance can't cross the human-machine interface.

    • pixl97 5 minutes ago

      "Consciousness is magical and can only do things that I want it to, and none of the things that are uncomfortable to me. Of course I've not defined any of this so I can move the goal posts as needed"

    • postalrat 18 minutes ago

      Sure if that's how you define consciousness. What do you want to call the machine version of the same phenomenon?

    • JPLeRouzic 2 hours ago

      > consciousness is a biologically inherited trait

      That consciousness is a biological trait seems a common statement, but why "inherited"?

    • subscribed 40 minutes ago

      I presume you used "biologically" to emphasise that we don't yet know of any non-biological consciousnesses, not to claim, a priori, that consciousness must always be rooted in wet organic matter?

      I don't think you could come up with a good theory for the latter, and there's nothing that would preclude the existence of artificial/inorganic consciousness. After all, correct me if I'm mistaken, we have no idea how consciousness emerges in some biological entities.