As a reminder, current estimates are that quantum cracking of a single 2048-bit RSA key would require a computer with 20 million qubits running in superposition for about eight hours. For context, quantum computers maxed out at 433 qubits in 2022 and 1,000 qubits last year. (A qubit is a basic unit of quantum computing, analogous to the binary bit in classical computing. Comparisons between qubits in true quantum systems and quantum annealers aren’t uniform.) So even when quantum computing matures sufficiently to break vulnerable algorithms, it could take decades or longer before the majority of keys are cracked.

The upshot of this latest episode is that while quantum computing will almost certainly topple many of the most widely used forms of encryption in use today, that calamitous event won’t happen anytime soon. It’s important that industries and researchers move swiftly to devise quantum-resistant algorithms and implement them widely. At the same time, people should take steps not to get steamrolled by the PQC hype train.

  • Mike1576218@lemmy.ml

    If qubits double every year, we’re at 20 million in 15 years. Changing crypto takes a very long time on some systems. If we’re at ~20,000 in 5 years, we’d better have usable post-quantum crypto in place to start mitigations.

    But I’m not convinced yet that we’ll have those numbers by then. Especially error-free qubits…
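
    A rough back-of-the-envelope check of that doubling math (purely a sketch: it assumes a starting point of ~1,000 physical qubits and a clean, uninterrupted doubling every year, both generous assumptions):

    ```python
    # Years until a hypothetical qubit count reaches the ~20 million cited
    # for cracking one 2048-bit RSA key, assuming a clean doubling per year.
    import math

    start_qubits = 1_000        # assumed starting point (today's largest chips)
    target_qubits = 20_000_000  # estimate cited in the article for RSA-2048

    years = math.ceil(math.log2(target_qubits / start_qubits))
    print(f"~{years} years of doubling from {start_qubits:,} to {target_qubits:,} qubits")
    # -> ~15 years, matching the estimate above

    print(f"after 5 years of doubling: {start_qubits * 2**5:,} qubits")
    # -> 32,000, i.e. the ~20,000+ ballpark mentioned above
    ```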

    • humblebun@sh.itjust.works

      If qubits double every year

      And then we need to increase coherence time, which is 50 ms for the current 433-qubit chip. Error correction might work, but it might not.

      • WolfLink@sh.itjust.works

        Error correction does fix that problem, but at the cost of increasing the number of qubits needed by a factor of roughly 10 to 100.

        • humblebun@sh.itjust.works

          But who guarantees that EC will overcome the decoherence introduced by that number of qubits? It’s not a trivial question, and nobody can answer it for certain.

          • WolfLink@sh.itjust.works

            I mean the known theory of quantum error correction already guarantees that as long as your physical qubits are of sufficient quality, you can overcome decoherence by trading quantity for quality.

            It’s true that we’re not yet at the point where we can mass produce qubits of sufficient quality, but claiming that EC is not known to work is a weird way to phrase it at best.

            • humblebun@sh.itjust.works

              It was shown this year for how many, 47 qubits? How can you be certain this will hold for millions and billions?

              • WolfLink@sh.itjust.works

                Because the math checks out.

                For a high-level description, QEC works a bit like this:

                10 qubits with a 1% error rate become 1 EC qubit with a 0.01% error rate.

                You can scale this in two ways. First, you can simply have more and more EC qubits working together. Second, you can nest the error-correcting codes.

                10 EC qubits with a 0.01% error rate become one double-EC qubit with a 0.0001% error rate.

                You can repeat this indefinitely. The math works out.

                The remaining difficulty is mass producing qubits with a sufficiently low error rate to get the EC party started.

                Meanwhile, research on error-correcting codes continues in search of more efficient codes.
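
                To make that arithmetic concrete, here’s a toy version of the concatenation step (illustrative only: it just encodes the “each level costs 10x the qubits and cuts the error rate by 100x” numbers from above, not any specific real code):

                ```python
                # Toy version of the concatenation arithmetic above: each level of
                # encoding multiplies the qubit count by 10 and (in this idealized
                # picture) divides the error rate by 100. Real codes and thresholds
                # are messier than this.
                def concatenate(physical_error, levels,
                                qubits_per_level=10, improvement=100):
                    error = physical_error / improvement ** levels
                    overhead = qubits_per_level ** levels
                    return error, overhead

                for level in range(4):
                    error, overhead = concatenate(1e-2, level)
                    print(f"level {level}: error ~{error:.6%}, "
                          f"{overhead:,} physical per logical qubit")
                # -> 1% error at 1 qubit, 0.01% at 10, 0.0001% at 100,
                #    0.000001% at 1,000 physical qubits per logical qubit
                ```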

                • humblebun@sh.itjust.works

                  While you describe the way error correction works, there are other factors you fail to account for.

                  It is widely known that each physical qubit’s T2 time decreases when you place it among others. The ultimate question here is: when you add qubits, can you overcome this added decoherence with EC or not?

                  Say you want to build a QC with 1,000 logical qubits, and you want to be sure the error rate doesn’t exceed 0.01% after 1 second. You assemble it, and it turns out you’re at 0.1%. You choose some simple code, say [[7,1]], and now you have to assemble a 7,000-qubit chip to run 1,000 qubits’ worth of logic. You assemble it again and the physical error rate is now higher (due to decoherence and crosstalk). But the question is: how much higher? If the increase is smaller than what EC wins back, you just add a few more qubits, use a [[15,2]] code, and you’re good to go. But what if not?
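
                  A toy calculation of exactly that worry, with every number made up purely for illustration: assume the per-qubit error rate grows linearly with chip size (crosstalk), and that each level of a [[7,1]]-style concatenated code squares the error rate relative to a fixed threshold.

                  ```python
                  # Hypothetical numbers only: does adding qubits for EC help,
                  # or does the extra crosstalk eat the gains?
                  P0 = 1e-3         # assumed per-qubit error rate on a small chip
                  P_TH = 1e-2       # assumed code threshold
                  CROSSTALK = 2e-7  # assumed extra error per additional physical qubit
                  LOGICAL = 1_000   # logical qubits we want

                  def logical_error(levels):
                      physical = LOGICAL * 7 ** levels          # [[7,1]]-style concatenation
                      p = P0 + CROSSTALK * physical             # error grows with chip size
                      if p >= P_TH:
                          return 1.0                            # above threshold: EC stops helping
                      return P_TH * (p / P_TH) ** (2 ** levels) # below-threshold scaling

                  for levels in range(4):
                      qubits = LOGICAL * 7 ** levels
                      print(f"{levels} levels, {qubits:>9,} physical qubits, "
                            f"logical error ~{logical_error(levels):.1e}")
                  # With these made-up constants the chip crosses the threshold at
                  # 2 levels of encoding and EC stops helping; with a smaller
                  # CROSSTALK value the error keeps shrinking instead.
                  ```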

  • MalReynolds@slrpnk.net

    And everyone thinks about the real-time implications, but what about the historical ones? Seems pretty likely that the NSA has been storing an appreciable fraction of the internet for a long damn while. Come Q-Day, all of that gets opened up and becomes searchable. What would Trump do?

    • azuth@sh.itjust.works

      Nothing, he will be dead. Anyone the NSA would bother to use its new and expensive quantum machines on will be an organization that should know better than to be compromised by decades-old secrets getting out.

      • MalReynolds@slrpnk.net

        You’re no fun. Orange turnip was merely an example of a bad actor getting control (going Reagan would be confusingly amusing), and it’s not about anyone in particular, more so the entire world’s dirty laundry hung out to dry.

    • Blue_Morpho@lemmy.world

      Just because you can break RSA doesn’t mean you instantly get access to all private databases.

      Encryption by itself isn’t the whole story. You know all those big-company data leaks that seem to happen every month? That data was very likely encrypted. But it doesn’t matter, because when you control a computer, you can see the encryption keys being used and decrypt whatever is stored.

  • Optional@lemmy.world

    Man, quantum computers have been about to break encryption since the 90s. The hype never ends; there’s just a new crop of people who hear it for the first time before figuring out it’s bullshit.

    • T156@lemmy.world

      Not to mention we already have quantum-computer-resistant cryptography.
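
      For instance, hash-based signatures (the idea behind schemes like SPHINCS+) lean only on hash functions, which quantum computers weaken but aren’t known to break. A minimal Lamport one-time signature sketch, just to show the principle (each key pair signs exactly one message; not production code):

      ```python
      # Minimal Lamport one-time signature: considered quantum-resistant because
      # its security rests only on the preimage resistance of the hash function.
      import hashlib, secrets

      H = lambda b: hashlib.sha256(b).digest()

      def keygen():
          # 256 pairs of random secrets; the public key is their hashes
          sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
          pk = [(H(a), H(b)) for a, b in sk]
          return sk, pk

      def bits_of(msg):
          d = H(msg)
          return [(d[i // 8] >> (i % 8)) & 1 for i in range(256)]

      def sign(msg, sk):
          return [sk[i][bit] for i, bit in enumerate(bits_of(msg))]  # reveal one secret per bit

      def verify(msg, sig, pk):
          return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(bits_of(msg)))

      sk, pk = keygen()
      sig = sign(b"post-quantum hello", sk)
      print(verify(b"post-quantum hello", sig, pk))  # True
      print(verify(b"tampered message", sig, pk))    # False
      ```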

        • Evotech@lemmy.world

          Here’s an idea for a crypto scheme. You send a message, another message (or 100,000) gets created, by AI I guess, and based on some predetermined hash the receiver has to work out which one is real; the lie/other messages get discarded.

          I’ll call it Never tell a lie, or NTL

    • Dave@lemmy.nz

      But isn’t the point that we just need to stay ahead of it? Surely encryption used in the 90s could be broken by a quantum computer today?

        • Dave@lemmy.nz

          It seems the RSA-155 (512-bit) encryption commonly used in the 90s was broken in 1999, no quantum computer needed (since it’s based on factoring primes).

          Though from what I can find, Reddit users from 10 years ago were confident a modern 128-bit algorithm (e.g. AES) could never be brute-forced, even by quantum computers.

          I dunno, sometimes I wonder if not everyone on the internet is an expert.
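
          For what it’s worth, the standard quantum attack on symmetric ciphers is Grover’s algorithm, which only gives a square-root speedup: AES-128 drops from ~2^128 to ~2^64 operations, which is why the usual advice is to move to AES-256 rather than panic. A rough sketch of the arithmetic, assuming a (very generous, hypothetical) rate of 10^9 Grover iterations per second:

          ```python
          # Grover's algorithm halves the effective key length, so brute-forcing
          # AES-128 takes ~2^64 quantum iterations instead of ~2^128 classical tries.
          SECONDS_PER_YEAR = 3600 * 24 * 365
          ops_per_second = 1e9             # assumed (optimistic) iteration rate

          grover_ops = 2 ** (128 // 2)     # ~1.8e19 iterations
          years = grover_ops / ops_per_second / SECONDS_PER_YEAR
          print(f"AES-128 via Grover: ~{years:.0f} years at {ops_per_second:.0e} ops/s")
          # -> ~585 years, and Grover's iterations are inherently sequential

          classical_ops = 2 ** 128
          print(f"AES-128 classically: ~{classical_ops / ops_per_second / SECONDS_PER_YEAR:.1e} years")
          ```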

      • Pappabosley@lemmy.world

        Didn’t you hear? They’ve almost succeeded at nuclear fusion, almost 90 whole seconds of stable fusion. Any day now.

    • humblebun@sh.itjust.works

      OK, I decided to dive into it again today, and look what I’ve found:

      1. They’re still demonstrating supremacy to each other, proving that their setups couldn’t be simulated classically. These 433- and 1,000-qubit processors are good for only one purpose: simulating themselves.

      2. Photonic QCs still estimate the hafnian billions of times faster; if only that mathematical structure had any practical meaning (a brute-force sketch of the definition follows after this list).

      3. They demonstrated that toric codes might be effective.
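
      For the curious, the hafnian in point 2 is just the sum, over all perfect matchings of a symmetric matrix, of the products of the matched entries; that’s the quantity Gaussian boson sampling machines are naturally suited to estimating. A brute-force sketch of the definition (exponential time, only sensible for tiny matrices):

      ```python
      # Brute-force hafnian of a symmetric 2n x 2n matrix: sum over all perfect
      # matchings of the products of matched entries. The number of matchings grows
      # exponentially, which is why estimating it classically gets expensive fast.
      import numpy as np

      def hafnian(A):
          n = A.shape[0]
          assert n % 2 == 0 and np.allclose(A, A.T), "need an even-sized symmetric matrix"

          def match(rest):
              if not rest:
                  return 1.0
              i, *others = rest
              total = 0.0
              for k, j in enumerate(others):   # pair index i with each remaining index
                  total += A[i, j] * match(others[:k] + others[k + 1:])
              return total

          return match(list(range(n)))

      print(hafnian(np.ones((4, 4))))  # 3.0: a 4x4 all-ones matrix has 3 perfect matchings
      ```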

  • SteelGeneral@lemmy.world

    Just parroting stuff I heard at Black Hat, but aside from all of the above, don’t we first need millions of logical qubits? I believe the numbers people advertise now are just physical qubits.