With all the hype around quantum computing, it can be hard to pin down what modern quantum computers can actually do and when they will be practically useful. Much of the answer comes down to multiple interacting sources of error and the aggregate error rate they produce. Among the largest sources are crosstalk, decoherence, and gate infidelity. Together these yield a quantum system that can run operations for a while, but past a certain depth the errors compound and the results become unreliable. Hence the current push toward fault-tolerant quantum computing (FTQC), where errors are corrected continuously as the computation runs.
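To make the compounding concrete, here is a minimal back-of-envelope sketch. It assumes errors are independent and identically distributed across gates, which real hardware violates (crosstalk and decoherence are correlated effects), but it illustrates why circuit depth is the enemy:

```python
# Sketch: how per-gate error compounds with circuit depth.
# Assumes independent, uncorrelated gate errors -- a simplification,
# since crosstalk and decoherence are correlated in real hardware.

def survival_probability(p_gate: float, depth: int) -> float:
    """Probability that `depth` sequential gates all execute error-free."""
    return (1 - p_gate) ** depth

p = 1e-3  # ~99.9% gate fidelity, in line with current superconducting devices
for depth in (10, 100, 1_000, 10_000):
    print(f"depth {depth:>6}: P(no error) = {survival_probability(p, depth):.4f}")

# depth     10: P(no error) = 0.9900
# depth    100: P(no error) = 0.9048
# depth   1000: P(no error) = 0.3677
# depth  10000: P(no error) = 0.0000
```

At a 0.1% per-gate error rate, a 10,000-gate circuit essentially never finishes cleanly, which is why the field is pushing for error correction rather than relying on better gates alone.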
A question that may be asked: how much does this matter if we have error-correction schemes such as quantum low-density parity-check (qLDPC) codes and other quantum error correction (QEC) codes? You would be right to think that error correction improves overall performance, but it is fallible. QEC comes into play when a qubit suffers a bit flip, a phase flip, or both, and the code attempts to rectify the error. It has limits, though: once enough errors accumulate and propagate, there is a point of no return beyond which the system must be reinitialized. For example, if your 5-qubit state was supposed to be |00000> but, due to qubit errors or unwanted interactions with the environment, it becomes |10101>, no decoder can reliably restore it to the intended state.
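To see why, consider a deliberately simplified classical analogy: a 5-bit repetition code decoded by majority vote. Real QEC codes such as surface or qLDPC codes work through stabilizer measurements rather than reading data qubits directly, but the failure mode carries over; with enough flips, the decoder confidently restores the wrong codeword:

```python
# Toy analogy: a 5-bit repetition code decoded by majority vote.
# Not how stabilizer codes operate internally, but the failure mode
# is analogous: past the code's correction radius, the decoder
# "corrects" toward the wrong codeword.

from collections import Counter

def majority_decode(bits: str) -> str:
    """Restore a repetition-encoded block to its majority value."""
    majority_bit = Counter(bits).most_common(1)[0][0]
    return majority_bit * len(bits)

print(majority_decode("00100"))  # -> '00000': one flip, corrected
print(majority_decode("10101"))  # -> '11111': three flips, decoder is fooled
```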
Another aspect of quantum computing’s competitiveness is efficiency, pursued through quantum characterization, verification, and validation (QCVV) methods. These methods are intended to optimize resource utilization and assess the overall health of the quantum system. The purpose is, in effect, to reduce, reuse, and recycle qubits so that they stay actively employed and the qubit requirements of a computation are minimized. One application is reusing a qubit later in the computation (as sketched below) rather than having it perform one operation and then sit idle, like a vestigial organ, for the remainder. Put simply, QCVV aims to proactively optimize the quantum computer, while QEC aims to reactively fix faults in the computation.
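As a concrete illustration of qubit reuse, here is a minimal sketch using Qiskit's mid-circuit measurement and reset; the circuit itself is a contrived example, not an actual QCVV procedure:

```python
# Minimal sketch of qubit reuse via mid-circuit measure-and-reset (Qiskit).
# A contrived circuit: the point is only that qubit 0 performs two
# separate jobs instead of sitting idle after its first one.

from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)

# First job: entangle qubit 0 with qubit 1 and record its outcome.
qc.h(0)
qc.cx(0, 1)
qc.measure(0, 0)

# Instead of leaving qubit 0 idle ("vestigial") for the rest of the
# computation, reset it to |0> and assign it a second job.
qc.reset(0)
qc.h(0)
qc.measure(0, 1)

print(qc.draw())
```

Mid-circuit measure-and-reset is the mechanism that lets one physical qubit play several roles, which is exactly the kind of resource saving that QCVV-informed scheduling tries to exploit.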
Now, circling back to the issue at hand: how far out are practical quantum attacks on cryptography? The previous paragraphs were meant to convey some of the active areas of research for optimizing quantum computers, what those efforts enable, and where they still fall short. So suppose we had varying degrees of error mitigation and error correction: how many qubits, logical or physical, would we need to crack ECC 256?
Figure 1. Logical error rate vs. physical error rate (top left); attack success rate vs. physical error rate (top right); hardware requirements in physical qubits vs. physical error rate (bottom left); qubits needed to break an ECC key (bottom right).
| Parameter | Value |
|---|---|
| Physical Error Rate | 1e−10 |
| Logical Error Rate | 1e−15 |
| Gate Operation Time | 1e−9 s |
| Decoherence Time | 10 s |
| T1 Relaxation Time | 20 s |
| Error Correction Cycle Time | 1e−6 s |
| Surface Code Distance | 3 |
| Parallel Operations Possible | 1000 |
The parameters here are decidedly futuristic and not currently feasible: the error rates are orders of magnitude better than contemporary metrics, and the timing figures are not yet realizable. Naturally, changing parameters such as the surface code distance or the number of parallel operations possible shifts the outcomes considerably, with less stringent thresholds needed to reach a Q-Day event; a more realistic set of parameters would render most applications infeasible. For context, superconducting architectures currently sit at error rates around 0.001, i.e., roughly 99.9% gate fidelity. That gap traces back to the factors mentioned previously, such as environmental noise, quasiparticle poisoning, and crosstalk, along with leakage of qubits into non-computational states, all of which make the system less error resistant.
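As a rough sketch of how these parameters translate into hardware requirements, the snippet below combines the commonly cited surface-code scaling heuristic p_L ≈ A·(p/p_th)^((d+1)/2) with the ~2330 logical qubits estimated by Roetteler et al. (2017) for attacking 256-bit ECC. The threshold p_th, prefactor A, and the ~2d² physical-per-logical overhead are illustrative assumptions, not measured values:

```python
# Back-of-envelope resource estimate under the surface-code heuristic
#   p_L ~ A * (p / p_th)^((d+1)/2)
# with assumed threshold p_th ~ 1e-2, prefactor A ~ 0.1, ~2*d^2 physical
# qubits per logical qubit, and ~2330 logical qubits for 256-bit ECC
# (Roetteler et al., 2017). Illustrative constants, not measured values.

def required_distance(p_phys: float, p_logical_target: float,
                      p_th: float = 1e-2, a: float = 0.1) -> int:
    """Smallest odd surface code distance d meeting the logical target."""
    d = 3
    while a * (p_phys / p_th) ** ((d + 1) / 2) > p_logical_target:
        d += 2
    return d

LOGICAL_QUBITS = 2330  # Roetteler et al. estimate for 256-bit ECC

for p_phys in (1e-3, 1e-4, 1e-10):
    d = required_distance(p_phys, p_logical_target=1e-15)
    physical = LOGICAL_QUBITS * 2 * d ** 2
    print(f"p = {p_phys:.0e}: distance {d}, ~{physical:,} physical qubits")
```

Under this heuristic, the table's 1e−10 physical error rate lets distance 3 reach the 1e−15 logical target, consistent with the parameters above, while today's ~1e−3 error rates push the required distance, and with it the physical-qubit count, far higher.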
In conclusion, there is a plausible future in which quantum attacks on today's cryptography become not just possible but practical, making post-quantum cryptography something that must actually be deployed. As this writeup has shown, capability will grow as the technology advances, with many hardware roadmaps approaching the required parameters by 2030. The question, then, is when Q-Day will occur, and how long until the parameters needed to achieve it are available.
