Current Realizability of Shor’s Algorithm

Marco Russo (Politecnico di Torino)



While research in quantum computing increasingly focuses on finding new computational tasks that can exploit the nature of quantum mechanics to achieve an advantage over classical computing (the so-called “quantum supremacy”), Shor’s algorithm [1] is without doubt the oldest quantum algorithm offering an unmatched improvement over its classical counterpart. Its main result is the ability to find the prime factors of any integer in polynomial time, instead of the super-polynomial time that the best known classical algorithms take. Computer security today relies heavily on public-key cryptography algorithms such as RSA, whose security rests on the fact that the private key cannot be recovered by brute force: doing so requires factoring integers of 1024 to 4096 bits, and the super-polynomial cost of that task makes it unfeasible on a classical computer. Shor’s algorithm overcomes this barrier: in theory, it finds the prime factors with a number of operations that is polynomial in the number of bits, which would make it dramatically easier to break RSA and similar schemes.
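As a sketch of the number-theoretic core, the Python snippet below runs the classical reduction that Shor’s algorithm relies on: factoring N reduces to finding the multiplicative order r of a random base a modulo N, after which a factor falls out of a gcd computation. The order function here is a brute-force stand-in for the quantum part and is itself exponential in the bit length; only that step benefits from a quantum computer.

```python
from math import gcd

def order(a, n):
    # Find the multiplicative order r of a modulo n: the smallest
    # r > 0 with a^r = 1 (mod n). This is the step a quantum computer
    # performs in polynomial time; this brute-force loop does not.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    # Classical post-processing of Shor's algorithm: from the order r
    # of a base a, recover a nontrivial factor of n (or None, meaning
    # the procedure should be retried with another random base).
    g = gcd(a, n)
    if g > 1:
        return g            # lucky guess: a already shares a factor with n
    r = order(a, n)
    if r % 2 == 1:
        return None         # odd order: retry
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None         # trivial square root of 1: retry
    return gcd(y - 1, n)

print(shor_classical(15, 7))   # → 3, since 15 = 3 * 5
```

For n = 15 and a = 7 the order is r = 4, so the factor is gcd(7² − 1, 15) = 3; a different base may yield 5 instead, or require a retry.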

Physical realizability

Despite being an algorithm with polynomial complexity, the actual implementation requires a quantum circuit with a number of gates on the order of Õ(n^2), where the tilde in the Big-O notation hides logarithmic factors, i.e. O(f(n) log^k n) = Õ(f(n)), and where n is the number of bits of the integer. This means that breaking a 4096-bit RSA key takes on the order of 4096^2 = 2^24 = 16777216 gates, ignoring the logarithmic factors. The number of necessary qubits, instead, is O(n). Current technologies that realize quantum computers, of which the main ones are superconducting, photonic and neutral-atom qubits, have two problems:

  • They have a limited number of qubits: even in the best case they are on the order of thousands, and they are still subject to errors that require error-correction techniques. These techniques use many additional qubits to compensate for the errors, to the point where the number of qubits actually available for computation, called logical qubits, can be as low as 0.1% to 1% of the physical qubits. Although the number of qubits required by Shor’s algorithm is linear in n, this is still a problem: for 1024 bits we would need on the order of a million superconducting qubits, while IBM has yet to deliver its 1121-qubit processor;
  • Errors also limit the maximum size of the circuit that can be run. The main physical error source is decoherence: the state of a qubit progressively loses coherence after a certain number of manipulations, since each of them partly involves an interaction with the external environment, until in the end it no longer carries any information.
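The million-qubit figure above can be reproduced with back-of-the-envelope arithmetic. The snippet below assumes a space-efficient Shor circuit using roughly 2n + 3 logical qubits (an illustrative assumption; actual constructions vary) together with the 0.1%–1% logical-to-physical ratio mentioned above:

```python
def physical_qubits(n_bits, logical_ratio):
    # Logical qubits for a space-efficient Shor circuit: roughly 2n + 3
    # (an assumed construction, for illustration only).
    logical = 2 * n_bits + 3
    # Error correction leaves only a fraction logical_ratio of the
    # physical qubits usable as logical ones (0.1%-1% in the text).
    return logical / logical_ratio

# For a 1024-bit key at a 0.2% logical-to-physical ratio:
print(round(physical_qubits(1024, 0.002)))  # → 1025500, about a million
```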

Given the first point and the sheer number of gates required by Shor’s algorithm, it is evident that we are still far from being able to implement it at cryptographically relevant sizes. A recent paper [2] showed how the circuit size can be reduced to just Õ(n^(3/2)), but despite being a notable improvement, current technology still does not allow an actual implementation.
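To get a feel for the improvement, the snippet below compares the leading terms n^2 and n^(3/2) of the two gate counts at common RSA key sizes (constant and logarithmic factors ignored):

```python
# Leading-term comparison of the circuit size for the original
# algorithm (n^2) versus Regev's variant (n^(3/2)), with constant
# and logarithmic factors ignored.
for n in (1024, 2048, 4096):
    shor, regev = n ** 2, round(n ** 1.5)
    print(f"n={n}: n^2 = {shor:>10,}   n^(3/2) = {regev:>7,}")
```

At n = 4096 the leading term drops from about 16.8 million to about 262 thousand, though the real gate counts also carry the hidden logarithmic factors.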


Although we are still far from being able to break RSA and other cryptographic algorithms such as ECDSA, we should expect it to become possible within a decade. It therefore makes sense to start transitioning as soon as possible to types of cryptography that are immune to Shor’s algorithm (see the blog post New Problems Become Post-Quantum Solutions), for two main reasons: the transition to PQC is not easy, as it involves many layers, processes and standards in an a priori unknown cascade of dependencies; and the “store now, decrypt later” attack is a threat already today.


[1]    P.W. Shor. Algorithms for quantum computation: discrete logarithms and factoring. In Proceedings 35th Annual Symposium on Foundations of Computer Science, pages 124–134, 1994.

[2]    Oded Regev. An efficient quantum factoring algorithm. arXiv, 2023.
