Quantum computers are the fastest computers humanity has not quite seen. That is to say, we have seen them, but we are still some way from quantum computers solving real-world problems beyond the reach of today's classical hardware. Let's call that 'quantum advantage', although the term means different things to different people and is sometimes used interchangeably with 'quantum supremacy', usually taken to mean solving a specific problem faster than any conventional device could.
'Big tech' businesses including IBM, Microsoft, Google and Amazon Web Services are working to develop quantum computing. In October 2019, Google announced it had achieved quantum supremacy, claiming its 54-qubit Sycamore quantum processor performed a specific task in 200 seconds that would have taken the world's most powerful supercomputer 10,000 years to complete (a claim later challenged by IBM). At the end of 2021, IBM announced it had developed a 127-qubit processor, and it is looking to produce a 1,121-qubit chip by 2023. QuEra announced shortly after that it had built a 256-qubit device using a different technology.
These developments, while impressive, do not provide quantum advantage, even if they do demonstrate quantum supremacy. More recently, however, Australian scientists announced that silicon quantum computers can now be operated with better than 99% accuracy. This is a huge step forward because the instability of qubits has been a fundamental stumbling block to the accuracy and scalability of quantum computing. Better than 99% accuracy is thought to be the threshold at which quantum processors can be scaled up into full computers.
Estimates as to when quantum computing will go mainstream vary widely and IBM's optimism may yet prove misplaced. Many predict we still have decades to go, but most agree that QC is coming – it's a matter of when, not if. And it's not just big tech which has an interest. Governments around the world, notably China, India, the UK and the US, are competing to get ahead in a field which really does have potential to change the world.
Is the quantum apocalypse coming?
The (currently largely hypothetical) ability of QC to process enormous quantities of data at great speed (as discussed here) has a vast range of applications, from life sciences to financial services to saving the environment. What is currently gaining the most attention, however, is its ability to crunch through encryption.
Imagine a computer with the processing power to break open every piece of information held about you anywhere online. In 1994, mathematician Peter Shor discovered an algorithm for efficiently finding the prime factors of an integer. Shor's algorithm is proven to be able to break asymmetric (public key) cryptography, but factoring the large integers involved would take today's classical computers billions of years. Quantum computers with a sufficient number of qubits would, however, be a different matter.
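To see why factoring matters, here is a toy sketch of the idea in Python. It is not real RSA (the primes are tiny, the textbook numbers are illustrative), but it shows the core point: anyone who can factor the public modulus can rebuild the private key. A classical attacker must grind through trial division; Shor's algorithm performs the equivalent step exponentially faster on a quantum machine.

```python
# Toy illustration only - real RSA uses primes hundreds of digits long.

def toy_rsa_keys(p, q, e=17):
    """Build a textbook RSA key pair from two small primes."""
    n = p * q
    phi = (p - 1) * (q - 1)            # Euler's totient of n
    d = pow(e, -1, phi)                # private exponent: e^-1 mod phi
    return (n, e), d

def factor_by_trial_division(n):
    """The classical attacker's job; Shor's algorithm does this
    exponentially faster on a quantum computer."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    raise ValueError("n is prime")

(n, e), _private = toy_rsa_keys(61, 53)
cipher = pow(42, e, n)                 # encrypt 42 with the public key

# An attacker who factors n can reconstruct the private key:
p, q = factor_by_trial_division(n)
d = pow(e, -1, (p - 1) * (q - 1))
recovered = pow(cipher, d, n)          # recovers the original 42
```

With a 2048-bit modulus, the `while` loop above would not finish before the heat death of the universe, which is exactly the asymmetry quantum computers threaten to remove.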
This could mean that data you gave up on assurances that it would be held securely, would be accessible to anyone with the right technology. Global financial systems, infrastructure, nuclear and military facilities, all vulnerable. This doomsday scenario is known as the quantum apocalypse and the race is on to head it off.
We may not know how soon quantum advantage will be achieved, but according to Mark Webber at the University of Sussex (as reported in the New Scientist), quantum computers would need between 317 million and 1.9 billion qubits to break bitcoin's encryption during the 10-60 minute window each day in which a bitcoin transaction key is vulnerable. Even if that window extended to a full day, 13 million qubits would still be needed.
Having said that, other systems are more vulnerable and would require less processing power to decrypt. For example, breaking a standard 2048-bit RSA encryption key, the minimum size recommended by the US National Institute of Standards and Technology (NIST), would take a quantum computer with 20 million qubits around eight hours.
Quantum ethics - can the law help?
Is it possible for us to regulate the ethics of quantum computing in order to maximise the benefits while minimising the risks, effectively heading them off before we get to an apocalyptic future?
When the GDPR hit the statute books in 2016, it was hailed as a 'future-proof' data protection law. Claims made around cybersecurity legislation have been less bold, but the EU is close to agreeing its second Network and Information Systems Directive less than four years after the first came into effect, which shows just how challenging it is to legislate effectively in such a rapidly developing area.
Just as the Data Protection Directive 1995 failed to protect personal data against the full force of the internet, the GDPR and other current data protection and cybersecurity legislation may look rather pointless in the face of quantum computing. The GDPR and its UK equivalent are already struggling to protect EU personal data from access by intelligence authorities in third countries. What sort of protection could these laws provide against a rogue actor with enormous quantum computing power?
We are already witnessing governments trying to legislate to regulate AI and prevent harm to individuals caused by algorithms, but as we have seen time and again when it comes to the law and technology, it's very difficult to write meaningful legislation to regulate hypotheticals.
And of course, laws have borders while technology does not. This is a geopolitical issue of potentially monumental importance, but global agreement on an ethical approach seems unlikely – after all, look how little we've achieved on the environment.
Quantum proofing
All is not yet lost, because this is not a one-track race. This is a potential problem caused by technology, and the solution is likely to be found, at least initially, in technology. Alongside the push for quantum advantage is a parallel focus on post-quantum cryptography.
Encryption uses symmetric (private) and/or asymmetric (public) keys, sometimes along with other techniques such as hashing. Symmetric key ciphers use the same key to encrypt and decrypt a message or file, while asymmetric ones use a linked pair: the public key is shared and used to encrypt, and the private key is kept secret and used to decrypt. Asymmetric encryption is more vulnerable to quantum processors than symmetric encryption.
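The symmetric model can be sketched in a few lines. The XOR cipher below is a stand-in for a real symmetric cipher such as AES (it offers no actual security); the point is simply that one shared secret both encrypts and decrypts. Symmetric ciphers fare better against quantum attack because Grover's algorithm gives only a quadratic speedup, which doubling the key length largely neutralises, whereas Shor's algorithm breaks the mathematics underlying public-key schemes outright.

```python
# Minimal sketch of symmetric encryption: the SAME key encrypts and
# decrypts. XOR is illustrative only - never use this for real security.
import itertools

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so one function does both directions
    return bytes(b ^ k for b, k in zip(data, itertools.cycle(key)))

shared_key = b"secret"
ciphertext = xor_cipher(b"hello quantum", shared_key)
plaintext = xor_cipher(ciphertext, shared_key)   # same key decrypts
```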
There are a number of techniques which may help extend the life of current encryption. One suggestion is longer encryption keys, but the longer they are, the slower and more expensive they are, and ultimately, quantum processors will still be able to crack them.
Creating new types of quantum-resistant encryption algorithms is the longer term goal. NIST launched a competition in 2016 which aims to produce quantum-proof algorithms by 2024. The winner is supposed to be announced this year and NIST recently confirmed there were 15 contenders left in the running from an initial 69. Most of the finalists focus on lattice-based cryptography.
Long thought to be one of the most promising post-quantum solutions, lattice-based cryptography uses grids with billions of individual points across thousands of dimensions. To break the code, you need to get from one specific point to another, which (for some, if not all, types of lattice) is virtually impossible without the keys. Roughly speaking, the private key corresponds to the hidden lattice point itself, while the public key reveals only an approximate, obscured location of that point.
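The hardness assumption underpinning several of the lattice-based NIST finalists is "learning with errors" (LWE): recovering a secret vector from noisy linear equations. The sketch below is a deliberately tiny, insecure toy (the modulus, dimensions and noise range are illustrative assumptions) showing the mechanic: small noise hides the secret from an attacker but is small enough for the legitimate key holder to strip away.

```python
# Toy learning-with-errors (LWE) sketch - parameters far too small
# to be secure; purely to illustrate the mechanism.
import random

random.seed(0)
q, n, m = 97, 4, 8                       # modulus, secret dim, samples

s = [random.randrange(q) for _ in range(n)]          # secret vector
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice([-1, 0, 1]) for _ in range(m)]    # small noise
b = [(sum(ai * si for ai, si in zip(row, s)) + ei) % q
     for row, ei in zip(A, e)]           # public key: the pair (A, b)

def encrypt(bit):
    rows = random.sample(range(m), m // 2)           # random row subset
    u = [sum(A[i][j] for i in rows) % q for j in range(n)]
    v = (sum(b[i] for i in rows) + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    # Knowing s cancels the u.s term, leaving bit*(q//2) plus small noise
    d = (v - sum(uj * sj for uj, sj in zip(u, s))) % q
    return 0 if min(d, q - d) < q // 4 else 1
```

Because the accumulated noise is always smaller than q/4, decryption with the secret is exact, while an attacker without `s` faces the (believed quantum-resistant) problem of untangling the noisy equations.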
Another focus is on quantum key distribution (QKD), which aims to provide quantum-safe key exchange. Put simply, it relies on quantum physics, sending photons individually through fibre-optic cables so that any interference with the keys can be detected. China has reportedly gone beyond this approach by using a quantum communication satellite to send keys.
There are disadvantages to QKD including equipment requirements, but also vulnerabilities created in the equipment relay chain by implementation flaws. It remains to be seen whether it is the answer.
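The best-known QKD scheme, BB84, can be sketched with a purely classical simulation (illustrative only; real QKD involves actual photon polarisation). Sender and receiver each choose random measurement bases; where the bases match, the bit transfers faithfully, and those positions become the shared key. An eavesdropper forced to guess bases would corrupt roughly a quarter of the matched bits, which is what makes interception detectable.

```python
# Toy BB84 sketch: matching bases yield shared key bits; a wrong-basis
# measurement randomises the result. Classical simulation only.
import random

random.seed(1)
N = 64
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.randint(0, 1) for _ in range(N)]  # 0=rect, 1=diag
bob_bases   = [random.randint(0, 1) for _ in range(N)]

def measure(bit, send_basis, recv_basis):
    # Same basis: bit read correctly; otherwise the outcome is random,
    # which is also how an eavesdropper's guessing betrays itself.
    return bit if send_basis == recv_basis else random.randint(0, 1)

bob_bits = [measure(b, sb, rb)
            for b, sb, rb in zip(alice_bits, alice_bases, bob_bases)]

# Bases are compared publicly; only matching positions are kept.
sifted = [(a, b) for a, b, sa, sb in
          zip(alice_bits, bob_bits, alice_bases, bob_bases) if sa == sb]
key_alice = [a for a, _ in sifted]
key_bob   = [b for _, b in sifted]
```

With no eavesdropper, the sifted keys agree exactly; the implementation flaws mentioned above arise in the real-world hardware, not in this underlying logic.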
What can we do now?
Quantum codebreaking may not be with us yet, but 'harvest now, decrypt later' attacks are already a reality. Hackers are believed to be accumulating huge quantities of encrypted data in the hope that they will be able to decrypt it with quantum technology at some point in the future.
For most businesses, the focus at the moment should be on robust cybersecurity, regularly reviewed to ensure it remains up to date. However, breaches can't always be avoided, even with the best cybersecurity practices and technologies, not least because they are often a result of human error. This means it's important to minimise the amount of data likely to be accessed through any one vulnerability, for example by using micro-segmentation and rotating encryption keys for each data classification.
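The per-classification key rotation mentioned above can be sketched as follows. The class names, rotation interval and in-memory store are all illustrative assumptions (a production system would use an HSM or managed key service); the point is simply that each data class gets its own key, rotated on its own schedule, so one compromised key exposes one slice of data for one window.

```python
# Minimal sketch of per-classification key rotation - illustrative
# only; real systems delegate this to a key management service.
import os
import time

class KeyStore:
    def __init__(self, rotation_seconds):
        self.rotation_seconds = rotation_seconds
        self.keys = {}                 # classification -> (key, issued_at)

    def key_for(self, classification):
        key, issued = self.keys.get(classification, (None, 0.0))
        if key is None or time.time() - issued > self.rotation_seconds:
            key = os.urandom(32)       # fresh 256-bit key per class
            self.keys[classification] = (key, time.time())
        return key

store = KeyStore(rotation_seconds=3600)
k_financial = store.key_for("financial")
k_hr = store.key_for("hr")             # distinct classes, distinct keys
```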
As always, the more data rich the organisation and the more sensitive the data, the more important getting cybersecurity right will be, both now, and in the quantum future.