Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer

Peter W. Shor

TL;DR

The paper addresses factoring integers and computing discrete logarithms, problems central to modern cryptography, and presents quantum algorithms that solve them in polynomial time on a quantum computer. It adopts a gate-array model of quantum computation, relies on reversible computation to carry out classical subroutines, and uses the quantum Fourier transform to extract information about hidden periods and orders. Key contributions include a reversible modular exponentiation construction that runs in $O(l^3)$ time and $O(l)$ space for $l$-bit inputs; a two-register quantum order-finding algorithm that, combined with a classical gcd step, factors $n$ with high probability; and a method that computes discrete logarithms with high probability using two modular exponentiations and two quantum Fourier transforms. The work also analyzes practical barriers such as precision and decoherence, discusses error correction and fault tolerance, and highlights cryptographic implications and directions for future work.
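The "classical gcd step" mentioned above can be illustrated on its own: the quantum subroutine's only job is to find the order $r$ of a random $x$ modulo $n$; once $r$ is known, a gcd computation extracts a factor. Below is a minimal sketch with a brute-force stand-in for the quantum order-finding (the function names are illustrative, not from the paper):

```python
from math import gcd

def order(x, n):
    """Brute-force the multiplicative order r of x mod n, i.e. the least
    r >= 1 with x^r = 1 (mod n). This stands in for the quantum subroutine,
    which finds r in polynomial time."""
    r, y = 1, x % n
    while y != 1:
        y = (y * x) % n
        r += 1
    return r

def factor_from_order(x, n):
    """Classical post-processing: if r is even and x^(r/2) is not
    congruent to -1 mod n, then gcd(x^(r/2) - 1, n) is a nontrivial
    factor of n. Returns None when x was a bad random choice."""
    r = order(x, n)
    if r % 2 != 0:
        return None  # odd order: retry with another random x
    half = pow(x, r // 2, n)  # x^(r/2) mod n by repeated squaring
    if half == n - 1:
        return None  # x^(r/2) = -1 (mod n): another bad choice
    d = gcd(half - 1, n)
    return d if 1 < d < n else None

# Example: for n = 15 and x = 7, the order of 7 mod 15 is 4,
# 7^2 = 49 = 4 (mod 15), and gcd(4 - 1, 15) = 3, a factor of 15.
print(factor_from_order(7, 15))  # → 3
```

The brute-force `order` takes exponential time in the bit length of `n`; replacing it with the paper's quantum order-finding routine is precisely what makes the whole procedure polynomial-time.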

Abstract

A digital computer is generally believed to be an efficient universal computing device; that is, it is believed able to simulate any physical computing device with an increase in computation time of at most a polynomial factor. This may not be true when quantum mechanics is taken into consideration. This paper considers factoring integers and finding discrete logarithms, two problems which are generally thought to be hard on a classical computer and have been used as the basis of several proposed cryptosystems. Efficient randomized algorithms are given for these two problems on a hypothetical quantum computer. These algorithms take a number of steps polynomial in the input size, e.g., the number of digits of the integer to be factored.

Paper Structure

This paper contains 7 sections, 48 equations, 1 figure, 2 tables.

Figures (1)

  • Figure 5.1: The probability $\rm P$ of observing values of $c$ between $0$ and $255$, given $q=256$ and $r=10$.