The development of large-scale quantum computers will threaten the security of computer networks and services that depend on public key encryption, including Smart Ledgers. This paper from the Distributed Futures research programme looks at how real the threat is and how and when we should start preparing for it.
You can read the full report here:
The post-quantum cryptography (“PQC”) problem will threaten the security of the world’s computer networks when large-scale quantum computers become available. The problem exists because such quantum computers would be able to break the security of widely-used public key cryptography, which allows remote parties to communicate securely and authenticate transactions and data without sharing a secret key in advance. It is uncertain when (and if) such quantum computers will become available - the nearest estimates are 10 to 15 years.
Fortunately, there are good solutions to the PQC problem, and better ones are emerging. The hard questions for individual computer system operators involve when and how to address the PQC problem, given its uncertain timing and the evolving solutions.
Defining the PQC Problem – Background and Key Concepts
Understanding the PQC problem requires background on three technical issues:
- how modern digital cryptography works, including its key building blocks;
- the progress of quantum computing, and the basics of how it works;
- how quantum computing threatens the viability of certain types of cryptography, including the possible effect of these vulnerabilities on Smart Ledgers and other applications.
The paper goes on to describe how cryptography works with specific descriptions of symmetric encryption and public key cryptography. For distributed ledgers, the most common uses of public key cryptography are for digital signatures. These are used to authenticate the right of a user to make transactions or to access data or assets stored on a ledger.
Two purely mathematical hard problems underpin the most widely used public key cryptography algorithms: integer factorisation and the calculation of discrete logarithms. A working large-scale quantum computer would substantially reduce the difficulty of solving both.
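The dependence on integer factorisation can be illustrated with a toy RSA example. This is a sketch only, with deliberately tiny primes that any computer can factor instantly; real keys use primes thousands of bits long, and real implementations add padding and other safeguards.

```python
# Toy RSA sketch (illustration only -- real keys use primes of
# thousands of bits; these small numbers are trivially factored).
p, q = 61, 53                  # secret primes
n = p * q                      # public modulus (3233)
phi = (p - 1) * (q - 1)        # Euler's totient; computing it requires p and q
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

message = 42
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
assert recovered == message

# An attacker who can factor n back into p and q can recompute phi and d.
# That factoring step is exactly what Shor's algorithm would make feasible
# on a large-scale quantum computer.
```

The whole scheme rests on the attacker being unable to recover `p` and `q` from the public modulus `n`; once factoring becomes easy, the private key falls out immediately.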
In addition to symmetric algorithms and public key algorithms, modern cryptography uses a third technique called a ‘hash algorithm’, which reduces data of arbitrary length (e.g. a password) to a fixed-length ‘digest’ (typically 256 bits, or 32 bytes). Hash algorithms are crucial to distributed ledgers, because they guarantee the authenticity of a blockchain by including the digest of each block within the next block. Fortunately, hash algorithms are not as vulnerable to the PQC problem as are public key algorithms.
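The digest-chaining idea can be sketched in a few lines of Python. The block structure below is hypothetical and heavily simplified; the point is only that each block commits to its predecessor's digest, so tampering anywhere breaks every link that follows.

```python
import hashlib
import json

def block_digest(block: dict) -> str:
    """SHA-256 digest of a block's canonical JSON encoding."""
    encoded = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(encoded).hexdigest()

# Each block records the digest of its predecessor, so altering any
# earlier block changes every digest downstream of it.
genesis = {"index": 0, "data": "genesis", "prev": "0" * 64}
block1 = {"index": 1, "data": "tx: A pays B", "prev": block_digest(genesis)}
block2 = {"index": 2, "data": "tx: B pays C", "prev": block_digest(block1)}

# Tampering with the genesis block breaks the chain: its new digest no
# longer matches the value recorded in block1.
tampered = dict(genesis, data="genesis (edited)")
assert block_digest(tampered) != block1["prev"]
```

Because forging a chain requires finding hash collisions or preimages, and Grover's algorithm only modestly weakens hash functions, this part of a ledger's design is comparatively robust against quantum attack.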
Next, the paper looks at the progress of quantum computing. Quantum mechanics is one of the key discoveries in 20th century physics and describes how atoms and sub-atomic particles behave at very small scales. Richard Feynman first presented the idea of a quantum computer in 1981 at MIT. Since then, there has been significant progress, notably the first demonstration of a quantum computer, which occurred at Oxford University in 1998.
Quantum computers depend on two phenomena of quantum mechanics:
Superposition: means that a particle can exist in multiple states at the same time. In a classical digital computer, each bit of data is either 0 or 1. By contrast, a quantum bit (known as a ‘qubit’) holds both values 0 and 1 at once, with a probability distribution determining which value is seen when it is observed.
Entanglement (between two particles): means that the quantum state of each particle cannot be described independently of the state of the other particle, even though the two particles are separated by a (potentially large) distance.
Together, superposition and entanglement mean that a quantum computer behaves very differently from a classical digital computer. In a classical computer, eight bits of memory (or one ‘byte’, often corresponding to one character) can hold any of 2⁸ = 256 different values. In a quantum computer, eight entangled qubits hold all 256 values at the same time, and a program running on the computer could theoretically determine in a single step which of the 256 states is most likely.
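A classical computer can still *simulate* this picture, at exponential cost: an n-qubit state needs 2ⁿ amplitudes. The sketch below (a toy simulation, not a quantum program) builds the uniform superposition over all 256 basis states of eight qubits.

```python
import math

def uniform_superposition(n_qubits: int) -> list:
    """State vector after putting each of n qubits into an equal
    superposition: all 2**n basis states share amplitude 1/sqrt(2**n)."""
    dim = 2 ** n_qubits
    return [1 / math.sqrt(dim)] * dim

state = uniform_superposition(8)
assert len(state) == 256  # 8 qubits span 2**8 = 256 basis states

# Measurement probabilities are the squared amplitudes; they sum to 1,
# and here every one of the 256 outcomes is equally likely.
probs = [a * a for a in state]
assert abs(sum(probs) - 1.0) < 1e-9
```

The exponential size of this state vector is precisely why classical simulation of quantum computers breaks down beyond a few dozen qubits, and why a real quantum computer could explore such spaces natively.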
Currently, quantum computers are unable to solve any problem more efficiently than a classical computer. This is largely due to a huge engineering challenge called decoherence: qubits fall out of the state of quantum superposition when they interact with the surrounding environment. The industry record is a prototype 50-qubit quantum computer, developed by IBM, with a record-high decoherence time of 90 microseconds, less than one ten-thousandth of a second. The paper goes into further detail on the methods scientists are using to build a quantum computer with stable qubits, as well as the companies around the world involved in this field.
So how do quantum computers attack cryptography? The research paper lays out the two key mathematical algorithms known to provide useful attacks on current cryptography using quantum computers: Shor’s algorithm and Grover’s algorithm. And how do these threaten Smart Ledgers and other applications? The research considers threats to the distributed ledger architecture (the blockchain itself), threats to distributed ledger transactions, and threats to data and software stored on ledgers, as well as general threats beyond Smart Ledgers.
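The practical difference between the two algorithms is worth a back-of-the-envelope calculation. Shor's algorithm breaks factoring and discrete-log schemes outright, whereas Grover's algorithm "only" gives a quadratic speedup for brute-force search, which roughly halves the effective length of a symmetric key:

```python
def grover_effective_bits(key_bits: int) -> float:
    """Grover's algorithm searches N items in about sqrt(N) steps, so a
    brute-force search over 2**k keys takes roughly 2**(k/2) steps --
    the effective security level is halved."""
    return key_bits / 2

# A 128-bit key drops to ~64-bit effective security against a quantum
# attacker, while a 256-bit key retains a comfortable ~128-bit level.
assert grover_effective_bits(128) == 64
assert grover_effective_bits(256) == 128
```

This is why the standard advice for symmetric cryptography is simply to double key lengths, while public key algorithms vulnerable to Shor's algorithm must be replaced entirely.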
Changing Cryptography to Address the PQC Problem
The fortunate aspect of the PQC problem is that good solutions are known and even better solutions are being developed. Securing encryption systems has long been an arms race between cryptography designers and attackers, and that arms race will certainly continue. In the research paper you will find security details for: symmetric algorithms, hash algorithms, and public key algorithms and their quantum-resistant counterparts.
The paper includes detailed explanations of quantum-resistant public key cryptography algorithms. Note that this section of the paper is rather technical.
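One of the oldest quantum-resistant families is hash-based signatures, whose security rests only on the hash function itself. As a flavour of how they work, here is a minimal sketch of Lamport's one-time signature scheme, assuming SHA-256 (this is an illustration, not the specific construction the paper recommends, and a Lamport key must never sign more than one message):

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # Private key: two random 32-byte secrets per bit of a 256-bit digest.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    # Public key: the hash of each secret.
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def digest_bits(message: bytes):
    d = H(message)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(message: bytes, sk):
    # Reveal one secret per bit of the message digest (one-time use only!).
    return [sk[i][bit] for i, bit in enumerate(digest_bits(message))]

def verify(message: bytes, sig, pk) -> bool:
    return all(H(sig[i]) == pk[i][bit]
               for i, bit in enumerate(digest_bits(message)))

sk, pk = keygen()
sig = sign(b"transfer funds", sk)
assert verify(b"transfer funds", sig, pk)
assert not verify(b"different message", sig, pk)
```

A quantum attacker gains no shortcut here beyond Grover's modest speedup against the hash function, which is why hash-based schemes are a leading candidate for quantum-resistant signatures.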
The final solution raised by this research paper is Quantum Key Distribution (QKD), which uses specialised hardware exploiting quantum entanglement to transmit symmetric encryption keys between distant parties, replacing key exchange based on public key cryptography. However, QKD has substantial limitations of range, security and cost, and does not address the PQC problem for public key digital signatures. It is therefore not considered a serious contender for addressing the PQC problem for blockchains and distributed ledgers.
Deciding the Time for Action
There are two key strategic questions for any system potentially threatened with the PQC problem:
- When is it appropriate, or necessary, to take action?
- What action should be taken?
The right answer depends heavily on the nature of the system, including the hardware, software and processes used, the sensitivity of the data being stored, and the possible migration paths to a quantum-resistant system. The key decision drivers are uncertainty, cost and risk.
As discussed, there is substantial uncertainty as to when, and whether, large-scale quantum computers will be available. Cost cuts both ways: changing a typical system to resist the PQC problem is likely to be expensive, but failure to act carries the potentially devastating cost of running a vulnerable system if action is taken too late, or the remediation costs of implementing a solution close to the ultimate deadline. Finally, risk has both objective and subjective components.
One clear implication is that organisations implementing new systems now should strongly favour quantum-resistant solutions — in order to minimise uncertainty and risk, without the cost of changing or abandoning a legacy system.
The paper also examines in depth the Mosca Inequality, a framework developed for assessing the timing of the transition to PQC. It involves three time periods:
- X = the desired duration of security of data on the system;
- Y = the time required to replace the cryptography in a system with quantum-resistant cryptography;
- Z = the time until quantum computing sufficient to break current encryption becomes available.
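The inequality itself is simple: trouble looms whenever X + Y > Z, because data that must stay secret will still be in flight (or freshly migrated) when the quantum threat arrives. A minimal sketch, with purely hypothetical numbers:

```python
def must_act_now(x_years: float, y_years: float, z_years: float) -> bool:
    """Mosca's inequality: if data must stay secure for x years and
    migrating the system takes y years, an organisation is already in
    trouble whenever x + y > z, where z is the time until a
    cryptographically relevant quantum computer exists."""
    return x_years + y_years > z_years

# Hypothetical example: data must stay secret for 7 years, migration
# takes 5 years, quantum computers estimated 10 years away: 7 + 5 > 10,
# so the transition should already be under way.
assert must_act_now(7, 5, 10)
assert not must_act_now(2, 3, 10)
```

The uncomfortable implication is that even a distant-sounding Z can demand action today once realistic values of X and Y are added up.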
And what is Cardano doing about this problem?
IOHK is already pursuing a long-term research agenda to gradually harden all the algorithms used in Cardano’s protocol stack, protecting the protocol against an adversary who may possess a quantum computer. This is not the general case across the blockchain market: many cryptographic tokens being launched today use the ECDSA signature algorithm, which is vulnerable to Shor’s algorithm. You can follow Cardano’s progress on the roadmap, which shows workstreams such as quantum-resistant signatures.
The Cardano Foundation partnered with Z/Yen’s Distributed Futures to sponsor a research programme investigating applications of blockchain technology. This research on Smart Ledgers and new technologies will not only help the development of the Cardano protocol but also help shape and influence the wider blockchain industry. The programme is structured around the themes of society, technology, economics and politics, and has four direct outcomes: to expand frontiers, change systems, deliver services and build communities. The overarching aim is to work with many stakeholders to learn and build the vital infrastructure needed to make Smart Ledgers a success.
For more information on this research programme, click here.