> Cloudflare, which hosts a considerable fraction of the Internet's web sites, reports that 33% of its connections are using post-quantum crypto as of January 2025.
DJB's narrative is a little selective here: Cloudflare has done some incredibly impressive things with post-quantum key agreement, which is arguably the "easy"[1] part of moving the Web PKI/TLS to a PQ setting. But key agreement doesn't tell the parties why they should trust each other; you need signatures and certificates for that, and those will need to be PQ-ready too.
That part is much harder, for both technical (larger certificates implied by most PQ signing schemes are much harder to reliably convey over packet networks) and political (the X.509 ecosystem moves very slowly, and penetration of new signature schemes takes years) reasons.
[1]: Nothing about it is easy.
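For readers wondering what the "easy" half concretely looks like: below is a minimal Python sketch of hybrid key agreement in the spirit of the X25519+ML-KEM exchanges Cloudflare deploys. This is not Cloudflare's implementation; the ML-KEM shared secret is simulated with random bytes, since the point is only how the classical and post-quantum secrets get combined.

```python
# Hedged sketch of hybrid (classical + post-quantum) key agreement.
# Only the combining step matters here; the ML-KEM half is a placeholder.
import os
import hashlib
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

# Classical half: ordinary X25519 Diffie-Hellman.
client_priv = X25519PrivateKey.generate()
server_priv = X25519PrivateKey.generate()
ecdh_secret = client_priv.exchange(server_priv.public_key())

# Post-quantum half: stand-in for the 32-byte shared secret both sides would
# obtain from an ML-KEM-768 encapsulation/decapsulation (simulated here).
mlkem_secret = os.urandom(32)

# Combine: hash the concatenation, so the session key stays secret as long as
# at least one of the two primitives remains unbroken.
hybrid_secret = hashlib.sha256(ecdh_secret + mlkem_secret).digest()
print(hybrid_secret.hex())
```

Note that the signature/certificate side has no equally clean drop-in: the server still has to present a chain the client can verify, and that is where the size and deployment problems mentioned above live.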
Wanted to provide the source for your posting about 33% of Cloudflare TLS traffic using post-quantum encryption as of Jan 2025 [1].
Also note that Google reports using Post-Quantum Encryption internally [2]
sources:
[1] https://radar.cloudflare.com/adoption-and-usage#post-quantum...
[2] https://cloud.google.com/blog/products/identity-security/clo...
In the Scott Aaronson talk he links at the bottom[1], Prof. Aaronson says
> A fourth reason why people didn’t take QC seriously is that, a century after the discovery of QM, some people still harbor doubts about quantum mechanics itself... Or they say things like, “complex Hilbert space in 2^n dimensions is a nice mathematical formalism, but mathematical formalism is not reality”—the kind of thing you say when you want to doubt, but not take full intellectual responsibility for your doubts.
How is that a failure to take intellectual responsibility? (Asking because it's basically what I think[2], but I promise not to argue with any explanation given here. :-)
[1] https://scottaaronson.blog/?p=8329
[2] https://news.ycombinator.com/item?id=42374112
There's an awful lot of handwaving in this blog post. I'm sorry, but I'm not convinced. The author mentions how some devices that can seemingly solve exponential time complexity problems also require exponentially high precision, but there doesn't seem to be a strong argument for why that doesn't apply to quantum computers. We haven't experimentally demonstrated quantum computing at sufficient scales to prove that the required number of physical qubits to perform error correction doesn't scale exponentially.
We don’t need exponentially more physical qubits because we have quantum error correction schemes that exponentially decrease the logical error rate with only a polynomial increase in the number of qubits. There are in fact many schemes for this (https://errorcorrectionzoo.org/) with the surface code mentioned in the blog being a leading approach.
Details for how this could work for factoring are here: https://arxiv.org/abs/1905.09749
There will be engineering challenges in scaling up these implementations, but in principle you shouldn’t need exponential resources (unless there is something wrong with quantum mechanics). This sort of error-correction scaling does not exist, for example, for analog computing.
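To make the "exponential suppression from polynomial overhead" point concrete, here is a rough back-of-envelope using the standard surface-code heuristic p_L ≈ A·(p/p_th)^((d+1)/2), with about 2d² physical qubits per logical qubit at code distance d. The constants below (A = 0.1, p_th = 1%, p = 0.1%) are illustrative assumptions, not measurements from any device.

```python
# Illustrative surface-code scaling only; all constants are assumptions.
A = 0.1        # assumed prefactor
p_th = 1e-2    # assumed threshold, ~1% as in the blog post
p = 1e-3       # assumed physical error rate, 10x below threshold

for d in (3, 7, 11, 15, 21, 27):
    logical_error = A * (p / p_th) ** ((d + 1) / 2)   # exponential in d
    physical_qubits = 2 * d * d                       # polynomial in d
    print(f"d={d:2d}  ~{physical_qubits:5d} physical qubits per logical qubit  "
          f"logical error ~{logical_error:.0e}")
```

The qubit count grows only quadratically with code distance while the logical error rate falls exponentially, which is why estimates like the factoring paper linked above land at millions of physical qubits rather than an exponentially large number.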
I got the impression that DJB was criticizing the arguments for why quantum computers won't work, not trying to demonstrate why they will work.
Note that you do not need error correction for quantum computing. You only need it for digital quantum computing. There's a separate branch, analog quantum computing, that is also very promising.
The author is DJB.
For those not familiar: https://en.m.wikipedia.org/wiki/Daniel_J._Bernstein
> Everything disintegrates for physical error rates around 1% or above
Last I heard, we were 1-2 orders of magnitude away from the error-correction break-even point for noise performance, i.e. the point at which it would take an infinite number of noisy qubits to break 2048-bit RSA. So does this mean that we are still at an error rate of something like 10%?
Several approaches are already better than the break-even point today, including Google's demonstration that error correction reduces logical errors: https://www.nature.com/articles/s41586-024-08449-y
There are more citations on gate-fidelity progress here: https://metriq.info/Task/38
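As a rough way to read those numbers: under the same heuristic as in the sketch above, the logical-error suppression gained per code-distance step (d to d+2) is roughly p_th/p, so "exceeding break-even" already implies physical error rates below the ~1% threshold, not 10%. A toy calculation, with every number purely illustrative:

```python
# Toy interpretation of "break-even" as p == p_th, using the same heuristic
# as the earlier sketch. All numbers are illustrative assumptions.
p_th = 1e-2                        # ~1% threshold quoted in the post
for p in (1e-1, 1e-2, 5e-3, 1e-3):
    suppression = p_th / p         # approximate gain per code-distance step
    if suppression < 1:
        verdict = "above threshold: larger codes make things worse"
    elif suppression == 1:
        verdict = "at break-even"
    else:
        verdict = "below threshold: larger codes help"
    print(f"physical error {p:.0e}: ~{suppression:.1f}x suppression per step ({verdict})")
```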
Did the Google experiment actually hit break-even? I thought it had only demonstrated that surface codes do what they were predicted to do. Is it really only a matter of building more hardware at this point?
Yes, it exceeded break-even, but no, you can't just copy-paste hardware yet. For example, some kind of chip-to-chip coupling is needed, since chips can't be arbitrarily large.
Great to see DJB's work posted here!