11 comments

  • petra 2 days ago

    Nvidia has more money than God. Worst case they'll buy the competition.

  • pjmlp 2 days ago

    If you think programming a GPU is hard, try to learn how to do a factorial on one of those quantum emulators.

    Here is Microsoft's,

    https://learn.microsoft.com/en-us/azure/quantum/qdk-main-ove...
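    As a rough illustration of the point above (a minimal sketch in plain Python/NumPy, not Microsoft's QDK — all names here are made up for the example): in the gate model, even holding a number means manipulating a vector of 2^n complex amplitudes with unitary matrices, so ordinary arithmetic like a factorial has to be compiled into reversible circuits rather than written as a loop.

    ```python
    import numpy as np

    # Minimal statevector simulator: n qubits = a vector of 2**n complex
    # amplitudes. Even this machinery only lets us apply single gates;
    # classical arithmetic (e.g. a factorial) would still need to be
    # expressed as a reversible circuit on top of it.

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

    def apply_single(state, gate, target, n):
        """Apply a 1-qubit gate to qubit `target` of an n-qubit statevector."""
        op = np.array([[1.0]])
        for q in range(n):
            # Tensor the gate into position `target`, identity elsewhere
            op = np.kron(op, gate if q == target else np.eye(2))
        return op @ state

    n = 2
    state = np.zeros(2**n, dtype=complex)
    state[0] = 1.0                        # start in |00>
    state = apply_single(state, H, 0, n)  # superpose qubit 0
    probs = np.abs(state) ** 2            # measurement probabilities
    print(probs)  # 0.5 on |00>, 0.5 on |10>, 0 elsewhere
    ```

    Compare `math.factorial(5)` on a classical machine: one line, no circuit synthesis required — which is roughly the learning-curve gap being described.
    
    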

    • cyanydeez 2 days ago

      I thought LLMs were the bootstrap to singularity riches.

  • RoyTyrell 2 days ago

    *yawn* Maybe D-Wave should put up or shut up. QC companies and bro-advocates have been saying this for years, and there's been very little use outside of pure R&D labs.

    I don't believe that QC is going to have the ease of use, time to deployment, and relatively low cost that GPUs have any time soon - if ever.

    • cwillu 2 days ago

      QC could have all of those things and it would still not be a threat. Using a quantum computer for general computation is like using a front-end loader to go grocery shopping: it's a spectacular improvement for the task it's designed for, and utterly useless for the vast majority of other tasks.

  • Melatonic 2 days ago

    ...do quantum computers and GPUs have a lot of overlap in the types of tasks they compute? I was under the impression they solve quite different problems.

    • duskwuff 2 days ago

      Do present-day quantum computers compute any nontrivial tasks (i.e. beyond factoring the number 15)?

    • cwillu 2 days ago

      Correct, there's almost no overlap.

    • hank808 2 days ago

      No.

  • mgh2 2 days ago

    Sounds like another hype cycle coming...

    • duskwuff 2 days ago

      Quantum computing has been stuck in a sort of attempted hype cycle for the last five or ten years now, I think.