Quantum Attack Resource Estimates: Using Shor’s Algorithm to Break RSA vs DH/DSA vs ECC

Most security experts are by now aware of the threat that the rise of quantum computing poses to modern cryptography. Shor’s quantum algorithm, in particular, provides a superpolynomial speedup over the best known classical attacks against many public-key cryptosystems such as RSA and ECDSA. But how large, exactly, is the impact on security? What timeline should we expect? Which algorithms are more vulnerable than others?

In this post I am going to deep-dive into quantum attack resource estimates for public-key cryptography, based on recent advances on the topic. In particular, we will see the exact computational cost of running Shor’s algorithm against different key sizes for different public-key algorithms, we will put this in context with the current state of quantum computing, and we will see how RSA will be broken by June 25th 2027 (just kidding!).

Computational Assumptions

First of all we need to quickly recap, from a purely classical standpoint, the difference between various computational hardness assumptions used in modern cryptosystems. This is actually a very complex topic. For the sake of simplicity, in this blog post we only take into account the following three problems:

  • The integer factorization problem (IFP): given an integer N that is the product of two large primes p and q, find p and q.
  • The (finite field) discrete logarithm problem (DLP): given a generator g of a large multiplicative subgroup of a field \mathbb{F} and an element y of that subgroup, find x such that g^x = y.
  • The elliptic curve discrete logarithm problem (ECDLP): given a non-singular elliptic curve \mathcal{E} defined over a field \mathbb{F}, a point G that generates a large cyclic subgroup of the additive group of points of \mathcal{E}, and another point P in that subgroup, find an integer k such that P = kG.
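To make the definitions concrete, here is a toy Python sketch that solves the first two problems by brute force at insecure, illustrative sizes (ECDLP can be brute-forced analogously by repeatedly adding G until P is reached). The specific numbers are arbitrary toy parameters; real cryptographic instances are astronomically out of reach for this approach.

```python
# Toy illustrations of IFP and DLP at insecure sizes (brute force).

def factor(N):
    # IFP: recover p, q from N = p * q by trial division.
    d = 2
    while d * d <= N:
        if N % d == 0:
            return d, N // d
        d += 1
    return N, 1

def dlog(g, y, p):
    # DLP: find x such that g^x = y (mod p) by exhaustive search.
    x, acc = 0, 1
    while acc != y:
        acc = (acc * g) % p
        x += 1
    return x

print(factor(3233))    # (53, 61)
print(dlog(5, 8, 23))  # 6, since 5^6 = 8 (mod 23)
```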

Actually, taking these three problems into account in the context of quantum security is an oversimplification. In reality, and contrary to what is often claimed in popular literature, most existing cryptographic schemes are not directly based on the above assumptions. For example, the security of RSA is not based on IFP, but rather on a similar hardness assumption, called the “RSA assumption”, that is known to be reducible to IFP; the converse reduction is not known. This means that “breaking RSA” is at most as difficult as solving IFP, but it might be easier: so far, nobody has found an easier way. The same holds for most schemes “based” on discrete logarithm problems, such as Diffie-Hellman key exchange or ECDSA elliptic-curve signatures. However, from a practical standpoint, most modern cryptanalytic efforts against these schemes focus on solving the above math problems, so this is what will be relevant for us when looking at quantum attack resource estimates.

So, how large (again, from a non-quantum perspective) should, e.g., an RSA or ECDSA key be? It depends on two things:

  • The desired security parameter, and
  • The efficiency of the best known attack against the underlying problem.

For factorization and finite field discrete logarithms the situation is similar: the best known algorithms for solving these two problems in the cryptographic setting (the Number Field Sieve and index calculus methods, respectively) have a similar asymptotic subexponential complexity that can be roughly approximated as \mathcal{O}\left( 2^{9\sqrt[3]{n}}\right) for n-bit moduli. This means that targeting n bits of security for cryptographic schemes such as RSA and DH requires pumping up the key size quite a lot: 2048 bits for 112 bits of security, 3072 bits for 128 bits of security, 7680 bits for 192 bits of security, etc.

For ECDLP, instead, the best known general solving algorithm (Pollard’s rho) has an exponential complexity of roughly \mathcal{O}\left(2^\frac{n}{2}\right) for n-bit curve fields. The lack of known subexponential attack algorithms is basically what has made elliptic curve cryptography attractive so far: keys can be much smaller for the same level of bit security, with corresponding savings in bandwidth and gains in efficiency. This is summarized in the table below.

Classical bit-security required | Modulus size for RSA, DH, DSA, etc. | Field size for elliptic curves
112                             | 2048                                | 224
128                             | 3072                                | 256
192                             | 7680                                | 384
Table 1: Key sizes for equivalent bits of security
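As a sanity check, here is a minimal Python sketch that evaluates the two rough cost formulas above against the key sizes in Table 1. Keep in mind that \mathcal{O}\left(2^{9\sqrt[3]{n}}\right) is a crude fit that ignores logarithmic factors, so it drifts from the standardized equivalences at larger sizes.

```python
# Rough classical security levels implied by the approximations above.

def rsa_dh_security_bits(n):
    # NFS / index calculus cost ~ 2^(9 * n^(1/3)); crude, ignores log factors.
    return 9 * n ** (1 / 3)

def ec_security_bits(n):
    # Pollard's rho cost ~ 2^(n/2).
    return n / 2

for n in (2048, 3072, 7680):
    print(f"RSA/DH-{n}: ~{rsa_dh_security_bits(n):.0f} bits")
for n in (224, 256, 384):
    print(f"EC-{n}: ~{ec_security_bits(n):.0f} bits")
```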

However, as we will see, this is also what makes elliptic curve cryptography more vulnerable to quantum computing.

Shor’s Algorithm

Now we look at the quantum scenario, i.e., we consider a situation where a large, scalable quantum computer has been built and is able to run complex quantum algorithms. It is well known that Shor’s quantum algorithm can solve the integer factorization problem in polynomial time. Let’s dig a bit deeper into this claim.

First of all, Shor’s algorithm is actually composed of two parts: a quantum period-finding part, built around the quantum Fourier transform (QFT for short), and a purely classical pre- and post-processing phase. From a complexity standpoint, the quantum part has polynomial time complexity of roughly \mathcal{O}\left(n^3\right) for n-bit input integers. Given that the classical part has a similar complexity, we will only consider the quantum complexity as relevant.
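To see how little of Shor’s algorithm is actually quantum, here is a hedged Python sketch of the classical scaffolding, with a brute-force order-finding function standing in for the quantum period-finding step. It assumes N is an odd composite with at least two distinct prime factors.

```python
from math import gcd
from random import randrange

def order(a, N):
    # Classical (exponential-time) stand-in for the quantum period-finding step:
    # find the smallest r > 0 such that a^r = 1 (mod N).
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor(N):
    # Classical pre- and post-processing around the (simulated) quantum core.
    while True:
        a = randrange(2, N)
        g = gcd(a, N)
        if g > 1:
            return g, N // g              # lucky: a already shares a factor with N
        r = order(a, N)                   # the only step run on a quantum computer
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            p = gcd(pow(a, r // 2, N) - 1, N)
            if 1 < p < N:
                return p, N // p

print(shor(3233))  # (53, 61) or (61, 53)
```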

For factoring n-bit integers, a quantum computer running Shor’s algorithm will clearly need at least n (logical) qubits of quantum memory just to represent the integer, but it will also require additional working quantum registers that increase the total qubit count. Given that the qubit count can be seen, as a first approximation, as a limiting resource for the attack, circuit versions of the quantum part that minimize the number of required qubits have been proposed. The current state of the art of these circuits requires a number of qubits that is roughly double the bitsize of the input integer.

As in the classical case, the situation for IFP and DLP is similar in the quantum scenario: Shor’s algorithm generalizes to discrete logarithms, and QFT-based period finding can be used to solve the DLP over n-bit moduli using roughly 2n logical qubits, with the same polynomial time complexity of roughly \mathcal{O}\left(n^3\right).

The situation is slightly different in the case of elliptic curves. Again, the QFT can be used to efficiently solve the ECDLP in roughly cubic time as above, but because the circuit must implement elliptic-curve point arithmetic reversibly, which requires several ancillary registers, this time the number of qubits required is roughly nine times the bitsize of the curve field!

Now, you might be tempted to conclude that elliptic curve cryptography is more resilient to quantum attacks than RSA or DSA, because mounting an attack requires more qubits per key bit. But if you paid attention, you can already see why it is exactly the opposite!

In fact, remember that elliptic curve keys are small precisely because the best classical attacks against them are inefficient. This difference disappears in the quantum scenario, where the attacks all have similar time complexity. And since RSA and discrete log cryptosystems already needed large keys in order to resist classical attacks, the required qubit count is actually larger for them! This is summarized in the table below.

Equivalent (classical) bit-security | Minimum qubits needed to attack RSA, DSA, etc. | Minimum qubits needed to attack ECDSA and similar EC schemes
112                                 | 4098                                           | 2042
128                                 | 6146                                           | 2330
192                                 | 15362                                          | 3484
Table 2: Minimum logical qubit count for attacks
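The numbers in Table 2 are consistent with two simple formulas: roughly 2n + 2 logical qubits for n-bit factoring/DLP circuits, and roughly 9n + 2\lceil \log_2 n \rceil + 10 for n-bit ECDLP circuits. I inferred these constants from the table itself, so treat them as illustrative assumptions rather than definitive circuit costs; the sketch below reproduces the table from them.

```python
from math import ceil, log2

def qubits_rsa_dlp(n):
    # Assumed minimum for Shor factoring/DLP circuits: ~2n + 2 logical qubits.
    return 2 * n + 2

def qubits_ecdlp(n):
    # Assumed minimum for ECDLP circuits: ~9n + 2*ceil(log2(n)) + 10 logical qubits.
    return 9 * n + 2 * ceil(log2(n)) + 10

for rsa_n, ec_n in ((2048, 224), (3072, 256), (7680, 384)):
    print(f"RSA-{rsa_n}: {qubits_rsa_dlp(rsa_n)} qubits; "
          f"EC-{ec_n}: {qubits_ecdlp(ec_n)} qubits")
```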

As you can see, as the security parameter increases, the number of qubits necessary to attack RSA grows much faster than for an equivalent attack on elliptic curves of the same classical security level. We could almost say that classical attacks made RSA and related schemes “more resilient” to quantum attacks as well.

This should not be interpreted as saying that RSA is secure against quantum attacks: none of the schemes we are considering here are. However, so far it looks like EC-based cryptography is much more vulnerable than RSA to quantum attacks, and given the steady rise in the number of qubits available for quantum computation, elliptic curve cryptography will likely fall much earlier than other schemes.

Quantum Attack Resource Estimates

Things get a bit more complex than that. Ultimately, we are interested in the question: “how many quantum resources do we need to mount an attack against a certain scheme?” Answering this question will require us to dig a bit deeper.

We need to remember that a quantum computation is made of several “layers”:

  • The algorithm layer: more efficient algorithms mean more efficient attacks. So far, as explained above, we only consider Shor’s algorithm (QFT-based period finding).
  • The circuit layer: the same algorithm can be implemented in different ways at the circuit level, using different quantum gates, trading off width (number of qubits used) against depth (running time), etc. The component responsible for translating a (description of a) quantum algorithm into a quantum circuit is called a quantum compiler. Advances in quantum compiling can lead to more efficient attacks.
  • The logical layer: the error-corrected (“perfect, ideal, logical”) qubits and gates that the circuit operates on. A logical qubit is realized by “grouping” together a number of low-level, noisy, physical qubits in an error-correcting structure. The most widely used approach for this is the so-called “surface code”, a square grid of physical qubits that is error-corrected at regular intervals. One of these intervals is called a “surface code cycle”, and currently takes between 1 µs and 1 ms (with a target of 200 ns considered “reasonable” in the near future). Improvements in both the compactness of the quantum error-correcting code used and the speed of the correction cycle can lead to better attack performance (see the rough sketch after this list).
  • The physical layer: elementary physical units that exhibit quantum effects and are treated as “dirty, noisy” qubits. Improvements in the manipulation of these physical qubits and a reduction of their base noise will lead to better overall performance.
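Here is a rough back-of-envelope sketch of the logical layer, using the standard rule of thumb that a distance-d surface code patch suppresses errors as roughly A(p/p_{th})^{(d+1)/2} and costs on the order of 2d^2 physical qubits. The constants (A = 0.1, threshold p_{th} = 10^{-2}, and the 10^{-15} target error budget) are illustrative assumptions, not measured values.

```python
# Back-of-envelope surface code overhead (all constants are rough assumptions).

def logical_error_rate(p_phys, d, p_th=1e-2, A=0.1):
    # Per-cycle logical error rate ~ A * (p/p_th)^((d+1)/2) for code distance d.
    return A * (p_phys / p_th) ** ((d + 1) / 2)

def physical_per_logical(d):
    # A distance-d surface code patch uses on the order of 2*d^2 physical qubits.
    return 2 * d ** 2

for p in (1e-3, 1e-5):
    d = 3
    while logical_error_rate(p, d) > 1e-15:  # illustrative target error budget
        d += 2                               # surface code distances are odd
    print(f"p = {p}: distance {d}, ~{physical_per_logical(d)} physical/logical qubit")
```

Under these assumptions, dropping the physical error rate from 10^{-3} to 10^{-5} shrinks the per-logical-qubit overhead by almost an order of magnitude, which is why the two noise scenarios in the tables further below differ so much in cost.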

Clearly, an improvement at any of these layers will increase the efficiency of quantum attacks and lower the amount of quantum resources needed to mount them. Assessing the necessary amount of resources is therefore tricky, as it is a moving target. Regardless, something can be said about it, and a few recent works shed new light on the topic.

First of all, one has to consider that the number of qubits necessary to mount the attack might not be the limiting factor. The estimates provided in the previous section of this blog post are lower bounds that take into account the minimum number of logical qubits required, but this is not a universally accepted measure of hard resources. In fact, there are better ones: if we consider the full implementation stack from the bottom layer of physical qubits up to the algorithmic layer, we see that certain common elementary quantum gates are more expensive than others. Pauli gates are usually “cheap”, in the sense that they are easy and fast to implement, while T gates (π/8 phase gates; not to be confused with Toffoli gates, which are themselves built from several T gates) are extremely expensive. They are so expensive, in fact, that as a first approximation one can simply disregard every other quantum gate and only count the number of T gates as a measure of the complexity of running a quantum algorithm. The “T-count” complexity reflects exactly this: the higher the T-count of a quantum algorithm, the more difficult it is to build a quantum computer able to run it.

Given the above, quantum compilers can usually operate in a mode that produces a circuit representation optimizing not the number of logical qubits but the number of T gates. This usually has a side effect: a large increase in the number of physical qubits required (because fault-tolerant T gates are produced by dedicated “magic state” distillation circuitry, which consumes many physical qubits), and also an increase in the overall running time. It is also wasteful to dedicate this expensive machinery to a T gate that is used only once; it is better to reuse it as much as possible. For this reason, another very useful metric is the so-called “T-depth” complexity, which takes into account the fact that a circuit can be described in “layers” where many T gates can be applied in parallel on different qubits.
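A toy sketch makes the distinction concrete. If we model a circuit as a list of layers, where each layer is a set of gates applied in parallel, then the T-count is the total number of T gates, while the T-depth is the number of layers containing at least one T gate (the circuit below is an arbitrary made-up example, not a real attack circuit):

```python
# Toy T-count vs T-depth: a circuit as a list of parallel gate layers.
circuit = [
    ["H", "T", "T"],   # two T gates in parallel: +2 T-count, +1 T-depth
    ["CNOT", "T"],     # one more T gate: +1 T-count, +1 T-depth
    ["H", "H"],        # no T gates: contributes nothing to either metric
]

t_count = sum(layer.count("T") for layer in circuit)
t_depth = sum(1 for layer in circuit if "T" in layer)
print(t_count, t_depth)  # 3, 2
```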

Building circuits that optimize the T-count or T-depth might end up using more logical qubits than the strict minimum, but results in a more efficient attack overall, because it minimizes the real-world resources (time and cost of implementation). Recent works in quantum cryptanalysis adopt this approach, and new results have been published recently. The tables below show the state of the art in quantum attack resource estimates for two different scenarios: the current (realistic) case of a base noise error rate of 10^{-3} in the underlying physical qubits (achieved by current state-of-the-art superconducting qubits), and the more optimistic case of an error rate of 10^{-5}, which most experts seem to agree might be achievable in the short-term future.

Notice that the minimum number of logical qubits has increased compared to the previous table because, as mentioned before, recent results in quantum optimization aim at minimizing T-count and T-depth rather than circuit width. Moreover, the time necessary to mount an attack greatly depends on the failure rate of a single run of the underlying algorithm, which itself depends on the level of purity achieved in the error correction mechanism, which itself depends on the number of physical qubits used in the surface code.

As it turns out, the product of the number of physical qubits employed and the time necessary to run the attack is pretty constant in each of the scenarios below. For this reason, a new measure of quantum resources has been introduced: megaqubit-days. This is the number (expressed in millions) of physical qubits necessary to run the attack in 24 hours, given a surface code cycle (“quantum clock”) of 200 ns (5 MHz) and an error rate of 10^{-3} or 10^{-5} errors per measurement.

Cipher    | Minimum logical qubits required | Overall cost (megaqubit-days)
RSA-2048  | 6190                            | 1.17
RSA-3072  | 9288                            | 4.03
RSA-7680  | 23239                           | 86.5
ECDSA-256 | 2619                            | 7.43
ECDSA-384 | 3901                            | 10.0
ECDSA-512 | 5273                            | 15.6
Table 3: Quantum attack resource estimates for a noise rate of 10^{-3} in the physical qubit realization, at 200 ns per surface code cycle.
Cipher    | Minimum logical qubits required | Overall cost (megaqubit-days)
RSA-2048  | 6190                            | 0.34
RSA-3072  | 9288                            | 1.14
RSA-7680  | 23239                           | 18.9
ECDSA-256 | 2619                            | 0.89
ECDSA-384 | 3901                            | 1.00
ECDSA-512 | 5273                            | 1.56
Table 4: Quantum attack resource estimates for a noise rate of 10^{-5} in the physical qubit realization, at 200 ns per surface code cycle.
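Since megaqubit-days is just a space-time product, it is easy to convert between machine sizes and running times. Here is a minimal sketch, assuming the qubit-time product stays constant for a given attack (the machine sizes below are hypothetical examples, not published figures):

```python
def megaqubit_days(physical_qubits, runtime_hours):
    # Space-time cost: (qubits / 1e6) * (runtime in days).
    return (physical_qubits / 1e6) * (runtime_hours / 24)

# The 1.17 megaqubit-day RSA-2048 attack of Table 3 could equivalently run on
# ~1.17M physical qubits for one day, or ~0.585M physical qubits for two days.
print(megaqubit_days(1.17e6, 24))   # 1.17
print(megaqubit_days(0.585e6, 48))  # 1.17
```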

Notice the following interesting fact: for “low” security parameters (e.g., RSA-3072 and ECDSA-256), attacking RSA is actually less expensive than attacking elliptic curve cryptography. This is a consequence of recent results in the optimization of Shor’s algorithm for factorization. However, as the security parameter increases, we see a steady increase in the quantum resources necessary for attacking RSA, while attacking elliptic curve cryptography becomes relatively easier.

Conclusions

Recent results in quantum attack resource estimates have started to shed light on how vulnerable, exactly, conventional cryptography is to the ever-increasing performance of quantum computers. The traditional view that “elliptic curve cryptography is much more vulnerable to quantum computers than RSA and discrete log” still holds, sort of, but the cut-off point has moved to roughly 160 bits of classical security, while at 128 bits of security the difference is not so relevant. This is a moving target, driven mainly by recent optimizations of Shor’s algorithm for attacking IFP and DLP; similar optimizations for ECDLP might lower its real attack cost as well. Further results can surely change the current estimates. What is clear is that, with a large enough number of physical qubits, all these cryptosystems can be attacked in a matter of hours.

The situation for symmetric-key cryptography is radically different, but recent advances in quantum resource estimates have nonetheless helped to better frame the impact that quantum computers might have on the practical bit-security of primitives such as AES and SHA-3.
