Google Quantum-Proofs HTTPS (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: Google on Friday unveiled its plan for its Chrome browser to secure HTTPS certificates against quantum computer attacks without breaking the Internet. The objective is a tall order. The quantum-resistant cryptographic data needed to transparently publish TLS certificates is roughly 40 times bigger than the classical cryptographic material used today. Today's X.509 certificates are about 64 bytes in size, and comprise six elliptic curve signatures and two EC public keys. This material can be cracked through the quantum-enabled Shor's algorithm. Certificates containing the equivalent quantum-resistant cryptographic material are roughly 2.5 kilobytes. All this data must be transmitted when a browser connects to a site.

To bypass the bottleneck, companies are turning to Merkle Trees, a data structure that uses cryptographic hashes and other math to verify the contents of large amounts of information using a small fraction of the material used in more traditional public key infrastructure verification. Merkle Tree Certificates "replace the heavy, serialized chain of signatures found in traditional PKI with compact Merkle Tree proofs," members of Google's Chrome Secure Web and Networking Team wrote Friday. "In this model, a Certification Authority (CA) signs a single 'Tree Head' representing potentially millions of certificates, and the 'certificate' sent to the browser is merely a lightweight proof of inclusion in that tree."
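The proof-of-inclusion idea can be sketched in a few lines. This is a hypothetical toy (SHA-256, duplicating the last node at odd levels), not Google's actual Merkle Tree Certificate format; the function names and `cert-N` leaves are invented for illustration:

```python
# Toy Merkle inclusion proof: the CA signs only the root ("tree head");
# each certificate ships a short proof (~log2(n) hashes) that its leaf
# is in the signed tree, instead of a full chain of signatures.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_levels(leaves):
    """Return every level of the tree: hashed leaves first, root last."""
    levels = [[h(leaf) for leaf in leaves]]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        if len(prev) % 2:                       # duplicate last node if odd
            prev = prev + [prev[-1]]
        levels.append([h(prev[i] + prev[i + 1]) for i in range(0, len(prev), 2)])
    return levels

def inclusion_proof(levels, index):
    """Sibling hashes from leaf to root for the leaf at `index`."""
    proof = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        proof.append((level[index ^ 1], index % 2))  # (sibling, am-I-right-child)
        index //= 2
    return proof

def verify(leaf, proof, root):
    node = h(leaf)
    for sibling, is_right in proof:
        node = h(sibling + node) if is_right else h(node + sibling)
    return node == root

leaves = [f"cert-{i}".encode() for i in range(1000)]  # stand-ins for certs
levels = build_levels(leaves)
root = levels[-1][0]                                  # the signed "tree head"
proof = inclusion_proof(levels, 42)
assert verify(leaves[42], proof, root)                # 10 hashes, not 1000 certs
```

One CA signature over the root covers all 1000 leaves; each site only transmits its handful of sibling hashes, which is the size advantage the article describes.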

[...] Google is [also] adding cryptographic material from quantum-resistant algorithms such as ML-DSA (PDF). This addition would allow forgeries only if an attacker were to break both classical and post-quantum encryption. The new regime is part of what Google is calling the quantum-resistant root store, which will complement the Chrome Root Store the company formed in 2022. The [Merkle Tree Certificates] MTCs use Merkle Trees to provide quantum-resistant assurances that a certificate has been published without having to add most of the lengthy keys and hashes. Using other techniques to reduce the data sizes, the MTCs will be roughly the same 64-byte length they are now [...]. The new system has already been implemented in Chrome.
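The hybrid "break both or forge nothing" composition mentioned above can be sketched as an AND of two verifiers. This is purely illustrative: real deployments would pair something like ECDSA with ML-DSA, whereas here plain HMAC tags stand in for both schemes, and all names and keys are invented:

```python
# Hypothetical sketch of hybrid signing: a forgery must defeat BOTH the
# classical and the post-quantum verifier, so breaking one alone gains nothing.
import hashlib
import hmac

CLASSICAL_KEY = b"classical-demo-key"   # stand-in for an EC key pair
PQ_KEY = b"post-quantum-demo-key"       # stand-in for an ML-DSA key pair

def sign_hybrid(msg: bytes):
    c = hmac.new(CLASSICAL_KEY, msg, hashlib.sha256).digest()
    q = hmac.new(PQ_KEY, msg, hashlib.sha256).digest()
    return c, q

def verify_hybrid(msg: bytes, sig) -> bool:
    c, q = sig
    classical_ok = hmac.compare_digest(
        c, hmac.new(CLASSICAL_KEY, msg, hashlib.sha256).digest())
    pq_ok = hmac.compare_digest(
        q, hmac.new(PQ_KEY, msg, hashlib.sha256).digest())
    return classical_ok and pq_ok       # forging requires breaking both

sig = sign_hybrid(b"tree head v1")
assert verify_hybrid(b"tree head v1", sig)
assert not verify_hybrid(b"tree head v1", (sig[0], b"\x00" * 32))  # PQ half forged
```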


Comments Filter:
  • by SlashbotAgent ( 6477336 ) on Saturday February 28, 2026 @08:11AM (#66015312)

    I didn't think that we had quantum computers already.

    How is it that we've had Shor's Algorithm (from 1994) all this time, yet Google is only now working on this?

    Is it still theoretical and prophylactic, or does this stuff exist today and have real world possibility now?

    • Google wants to be a leader in selling quantum computing, and in order to do that, they have to create the perception that it is useful.

    • by gweihir ( 88907 )

      The current actual QC factorization record is 28. And that is not with the general Shor's algorithm, because that would take a larger working QC than exists.

      There is a lot of lying and giving false impressions when it comes to what QCs can actually do. The reality is that it is almost nothing, and that is after 50 years of research. Anybody who expects great things here is not living in the real world.

    • Is it still theoretical and prophylactic, or does this stuff exist today and have real world possibility now?

      Theoretical and prophylactic are not the same thing. Quantum decryption is indeed still a future threat, but that doesn't mean you shouldn't protect yourself now. Various parties are already using [medium.com] a technique called "harvest now, decrypt later" [paloaltonetworks.com] to record web traffic today that they may be able to decrypt and read with quantum computers later. So using non-quantum-resistant encryption today is as risky as using plain text, just with a delay before the consequences appear.

      • by HiThere ( 15173 )

        No, it will probably always be more expensive to read encrypted text. Just not nearly as much so. But for reasonable probabilities this could probably be handled just by using a longer key.

      • by EvilSS ( 557649 )
        Excuse me sir but this is America, we don't pro-actively address potential issues, we wait until they are real then try to fix them in a panic.
      • by Junta ( 36770 )

        It's worth mentioning there are two concerns: digital signatures (authentication) and key agreement (confidentiality, i.e. protection against decrypt-later). The former you don't have to worry about with 'decrypt later'; the latter you do.

        So PQC key agreement may be prudent in the face of potentially viable scaled quantum computing, but the digital signature piece can afford to wait a bit, for all we know decades or forever.

    • I didn't think that we had quantum computers already. [...] Is it still theoretical and prophylactic, or does this stuff exist today and have real world possibility now?

      We do, actually, but they're far too small and far too unreliable to pose a current threat. That said, with something like the Internet, which takes decades to upgrade core components, if you wait until the problem exists before you try to solve it, you're gonna be in trouble.

      It's still possible that quantum computers will never be practical, but there's been significant progress over the last few years that makes it seem like it probably will happen. If you have anything that relies on asymmetric crypto

    • by tlhIngan ( 30335 )

      I didn't think that we had quantum computers already.

      How is it that we have Shor's Algorithm(from 1994) and have Google working on this?

      Is it still theoretical and prophylactic, or does this stuff exist today and have real world possibility now?

      We do have quantum computers of all kinds. The general-purpose kind are averaging around 100 qubits or so; the quantum annealing machines have far more.

      It's not purely theoretical in that it does work. And we do know governments and state actors have been archiving en

  • This material can be cracked through the quantum-enabled Shor's algorithm.

    Yes, but it requires a quantum computer with a lot of qubits [postquantum.com], and currently there is no known path to building one.

    • by gweihir ( 88907 )

      Or rather, for which it is unknown whether it is even possible, given the restrictions of this universe. Classical computers have scaled exponentially for a long time. Although scaling is basically over for CPUs, they still scale somewhat linearly for problems that can be subdivided into smaller pieces (quantum algorithms cannot be). Now, all evidence points to QCs scaling inversely exponentially, i.e. for one more bit or one more computation step, you need exponentially more effort. That is not a computing

      • by Junta ( 36770 )

        You are *probably* right, but security likes to guard against the potentials and with forward secrecy as a design principle, you have to take even remote possibilities pretty seriously.

        Of course, this can justify a reasonable sense of urgency in implementing PQC key agreement, but it's not super critical that certificates be solved now, since you can have PQC key agreement with traditional certificates. You would not be protected against an active quantum attack (which as far as anyone knows is still impos

    • Either that or a dog trained to bark three times [iacr.org].
  • by madbrain ( 11432 ) on Saturday February 28, 2026 @08:52AM (#66015340) Homepage Journal

    Some EC public keys might be, but certs contain other identity information, and the key was often not their largest component.

    • by Anonymous Coward
      They're definitely only counting the public key - ECDSA prime256v1 keys are indeed 64 bytes, and ML-DSA-44 public keys are 2560 bytes (which are 10 times the size of the RSA-2048 keys that ECDSA largely displaced), but the certificate is far more than just the public key (hell, Slashdot's own LetsEncrypt certificate is 949 bytes long, only 64 bytes of which is the ECDSA public key).
      • They're definitely only counting the public key - ECDSA prime256v1 keys are indeed 64 bytes

        Yes, although Ed25519 public keys are only 32 bytes.

        ML-DSA-44 public keys are 2560 bytes

        That's the size of the private key. The public key is 1312 bytes. A signature is 2420 bytes.

        which are 10 times the size of the RSA-2048 keys that ECDSA largely displaced

        An RSA 2048 bit public key and RSA 2048 signature are each 256 bytes, so 512 bytes total. An ML-DSA-44 pubkey and signature are 3732 bytes... so 7.2X the size.

        but the certificate is far more than just the public key

        Indeed. Though with ML-DSA, the size of the public key and signature will dominate the remaining stuff. With RSA-2048 it was more or less even amounts, with ECDSA and especially Ed25519 we've become accus
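The arithmetic in the corrections above can be checked directly against the published parameter sizes of each scheme (ECDSA P-256 shown as a raw 64-byte uncompressed point, without the prefix byte):

```python
# Public key + signature sizes (bytes) for the schemes debated above,
# and the ML-DSA-44 vs RSA-2048 ratio from the parent comment.
sizes = {
    "Ed25519":    {"pub": 32,   "sig": 64},
    "ECDSA-P256": {"pub": 64,   "sig": 64},
    "RSA-2048":   {"pub": 256,  "sig": 256},
    "ML-DSA-44":  {"pub": 1312, "sig": 2420},
}
totals = {name: s["pub"] + s["sig"] for name, s in sizes.items()}
ratio = totals["ML-DSA-44"] / totals["RSA-2048"]
print(totals["ML-DSA-44"], round(ratio, 1))  # prints: 3732 7.3
```

So an ML-DSA-44 pubkey plus signature is 3732 bytes, about 7.3 times the 512 bytes for RSA-2048 (the "7.2X" above truncates rather than rounds).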

    • Correct.

      X.509 certificates are usually closer to 2 KB (or larger) -- about 30 times bigger than this post erroneously claims.

  • Today's X.509 certificates are about 64 bytes in size, and comprise six elliptic curve signatures and two EC public keys.

    In today's X.509 certificates, 64 bytes is the size of one signature. One certificate contains a public key (32 bytes) and a signature (64 bytes), plus some additional stuff, so each cert is roughly 500 bytes. Also, I can't figure out what they mean by "six signatures and two public keys". Each certificate in a chain contains one pubkey and one signature, so a chain of six certificates would contain six pubkeys and six signatures.

    This statement is so confused I can't make head or tail of it.

    Some realistic certi

    • Still, if there's a more space-efficient Merkle tree-based representation, that's a good thing. Saving a few KB on every TLS session is good. The aggregate bandwidth savings across the whole Internet would be enormous.

      It's worth mentioning that there's another possible motivation: the possibility that ML-DSA turns out to be insecure and we have to fall back on the purely hash-based schemes, like SPHINCS+, which could increase certificate sizes by a factor of at least two above the ML-DSA-87 worst case, and possibly a lot more. SPHINCS+-256f certificates would be about 50 KB in size, so a chain of several of them could get really big.

      Having a very space-efficient alternative would be really beneficial in that case. A
