Quantum computing has the potential to change everything – it's just a matter of taming qubits. We investigate the latest scientific advances in a field still full of unknowns.
It was the stuff of science fiction a decade ago; of increasingly pointed research five years ago; and of cautious and growing optimism now. Yet while it's still a long way from your local computer shop's shelves, quantum computing is making huge strides as researchers notch up small wins in their battle to wrest useful productivity tools from ill-behaved particles that have long resisted taming.
Based on the strange world of the qubit – the basic unit of quantum information, often physically realised as a single phosphorus atom embedded in silicon – quantum computing has become a unifying purpose for forward-thinking physicists of all stripes. After years of work under a previous Centre of Excellence, last year researchers at seven Australian universities joined forces with international bodies, a dozen commercial entities and each other to jointly pursue the state of the art.
The ARC Centre of Excellence for Quantum Computation and Communication Technology (CQCCT) has received $24.5m in government funding over seven years and now has over 150 scientists working across three main areas: quantum communication, silicon quantum computation and optical quantum computation. Their common goal: to harness quantum information to build a new generation of computing devices that crunch numbers orders of magnitude faster than our current best efforts.
Because a qubit can exist in a superposition of 0 and 1 at once – encoded in properties such as the spin of an electron or atomic nucleus – a register of qubits can effectively represent a far greater range of numbers at any given time than current binary systems, in which each bit is either 0 or 1. This will let quantum computers churn through mind-numbingly complex modelling and mathematical brainteasers like a knife through proverbial butter if – and it's a very big if – scientists can develop methods to reliably control and measure the behaviour of individual qubits.
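For readers who think in code, the bookkeeping behind that claim can be sketched classically: an n-qubit register is described by 2^n complex amplitudes, where n ordinary bits hold just one of 2^n values. This is an illustrative simulation only – reading out a real quantum register collapses it to a single n-bit result.

```python
import math

def uniform_superposition(n_qubits):
    """State vector after placing n qubits in an equal superposition
    (as a Hadamard gate on each qubit would): 2**n amplitudes at once."""
    dim = 2 ** n_qubits
    amplitude = 1 / math.sqrt(dim)    # equal weight on every basis state
    return [amplitude] * dim

state = uniform_superposition(3)      # 3 qubits span 8 basis states...
print(len(state))                     # 8
print(round(sum(a * a for a in state), 6))   # 1.0 – probabilities sum to one
# ...while 300 qubits would span 2**300 states – more than there are atoms
# in the observable universe.
```

The exponential cost of tracking those amplitudes on a classical machine is precisely the headroom a quantum computer hopes to exploit.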
Asserting this control in a programmatic way will allow completely new approaches to mathematical operations – but the process becomes exponentially more complicated as more qubits are involved. This is partly because of a quantum characteristic known as 'entanglement', in which the states of two qubits become linked: measure one, and the outcome on its partner is instantly determined too. Effective quantum computers will need to manage the position of qubits so that entanglement can be monitored and controlled.
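Those strange correlations can be illustrated with a toy classical simulation of the simplest entangled state, the Bell pair – purely a sketch of the statistics, not a model of the CQCCT hardware.

```python
import random

# Amplitudes of the two-qubit Bell state (|00> + |11>)/sqrt(2),
# indexed by the basis states 00, 01, 10, 11.
bell = [2 ** -0.5, 0.0, 0.0, 2 ** -0.5]

def measure(state):
    """Sample one measurement outcome with probability |amplitude|**2."""
    r = random.random()
    cumulative = 0.0
    for index, amplitude in enumerate(state):
        cumulative += amplitude * amplitude
        if r < cumulative:
            return format(index, "02b")       # e.g. "00" or "11"
    return format(len(state) - 1, "02b")      # guard against rounding

outcomes = [measure(bell) for _ in range(1000)]
# Each qubit on its own looks like a fair coin flip, yet the pair always
# agrees – the correlation at the heart of entanglement.
assert all(o[0] == o[1] for o in outcomes)
```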
Implemented correctly, quantum computers will deliver effectively unbreakable encryption via totally secure exchange of encryption keys (made possible because trying to snoop on a qubit-based key changes its state in a way that can be detected). They'll also make short work of classical CPU-busters such as finding the prime factors of an arbitrarily large number.
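The snooping-detection idea can be sketched with a toy simulation in the style of the BB84 quantum key distribution protocol: an eavesdropper who intercepts photons and measures them in the wrong basis randomises them, leaving a tell-tale error rate of about 25% in the sifted key. The two-basis scheme and intercept-resend attack below are simplified assumptions for illustration.

```python
import random

def bb84_round(eavesdrop):
    """One photon of a toy BB84-style exchange (bases labelled 0 and 1).
    Measuring in the wrong basis yields a random bit – the disturbance
    that exposes an eavesdropper."""
    alice_bit = random.randint(0, 1)
    alice_basis = random.randint(0, 1)
    photon_bit, photon_basis = alice_bit, alice_basis
    if eavesdrop:                             # intercept-resend attack
        eve_basis = random.randint(0, 1)
        if eve_basis != photon_basis:
            photon_bit = random.randint(0, 1)
        photon_basis = eve_basis              # Eve resends in her own basis
    bob_basis = random.randint(0, 1)
    bob_bit = photon_bit if bob_basis == photon_basis else random.randint(0, 1)
    return alice_bit, alice_basis, bob_bit, bob_basis

def error_rate(eavesdrop, rounds=20000):
    """Fraction of disagreements in the sifted key (matching-basis rounds)."""
    errors = kept = 0
    for _ in range(rounds):
        a_bit, a_basis, b_bit, b_basis = bb84_round(eavesdrop)
        if a_basis == b_basis:
            kept += 1
            errors += a_bit != b_bit
    return errors / kept

print(error_rate(eavesdrop=False))  # 0.0: a clean channel agrees perfectly
print(error_rate(eavesdrop=True))   # ~0.25: snooping leaves a fingerprint
```

Comparing a random sample of the sifted key over an ordinary channel is all the two parties need to spot that quarter of mismatches and discard the compromised key.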
Scientists have already developed quantum algorithms for these and other tasks – Grover's well-known database-search algorithm, for example, was published in 1996, Shor's quantum factoring algorithm dates back to 1994, and quantum key distribution has driven rudimentary key-exchange systems for years. The bigger challenge lies in building quantum computers capable of running more complex algorithms – and delivering results that can be reliably taken to be correct.
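Grover's algorithm itself is simple enough to simulate classically for small cases: repeatedly flip the sign of the marked item's amplitude (the 'oracle'), then reflect every amplitude about the mean. The sketch below tracks the full state vector on an ordinary computer – a real quantum machine would need only about √N oracle queries to find the marked item.

```python
import math

def grover_search(n_items, marked):
    """Classical state-vector simulation of Grover's search over n_items
    entries (n_items a power of two). Returns measurement probabilities."""
    state = [1 / math.sqrt(n_items)] * n_items          # uniform superposition
    iterations = int(math.pi / 4 * math.sqrt(n_items))  # ~sqrt(N) rounds
    for _ in range(iterations):
        state[marked] = -state[marked]                  # oracle marks the item
        mean = sum(state) / n_items
        state = [2 * mean - a for a in state]           # inversion about mean
    return [a * a for a in state]

probs = grover_search(8, marked=5)
print(max(range(8), key=probs.__getitem__))   # 5: the marked item dominates
print(round(probs[5], 3))                     # 0.945 after just 2 iterations
```

Note that the simulation's memory grows as 2^n with qubit count – exactly the blow-up a genuine quantum register sidesteps.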
Reliably managing behaviour in an environment that is inherently random is not a task for the faint-hearted. Extensive research has focused on creating physical environments conducive to controlling the position and spacing of quantum elements; this is because, like a class of prep children on an excursion, qubits tend to spin around and go their own way unless they're continually pointed in the right direction.
The CQCCT team recently addressed this by figuring out how to use a nanoscale lattice to insert phosphorus atoms in particular configurations – ideal spacing appears to be around 20nm between qubits – to an accuracy of just one atom in either direction. In our little metaphor, it's the equivalent of getting those preppies to queue in a line and hold hands while they walk at a steady pace.
Such a lattice adds predictability to qubit positioning and helps researchers extend coherence time – the length of time a qubit can hold its quantum state before environmental noise scrambles it. Just a few years ago, coherence times peaked in the order of milliseconds – but these days researchers have extended coherence time out to several seconds. That's long enough for algorithms to compute and make use of qubit-generated data before natural processes take over – and it's accelerating hopes that usable quantum computers may be available within the next decade.
"If you look at the field of international computation, we're still at the level of systems with a few qubits," says Professor Michelle Simmons, a researcher at UNSW who's also director of CQCCT. "To be able to put scale in place, you need to be able to put individual atoms in place with atomic precision.
"We've been able to do this by making a very thin mask on a silicon surface, opening up a hole and bringing the phosphorus atom qubit into that with atomic precision. Then we encapsulate the whole thing in silicon so it's robust. We've been able to keep the phosphorus atoms in place and measure its electronic fingerprint; now we're beginning to scale from one qubit to two, three, four and beyond. Eventually we'll want to take it from a few tens to hundreds of qubits."
At scales of hundreds of qubits, quantum computers will outperform current supercomputers on computationally intensive tasks such as modelling protein folding, nuclear explosions and other complex physical and chemical processes. Yet there are other challenges, since existing designs are built to run just one specific task; this is why it will be decades before you're buying desktop quantum computers.
Early binary systems had a similar limitation until researchers separated the computer's memory and support infrastructure from its programmable CPU. This allowed the processing of volumes of data larger than what could be stored in the computer at any given time.
Replicating this model in the quantum world has proved complicated, but a University of California Santa Barbara team recently made a breakthrough, using superconducting circuits to build a system with two qubit registers and two entangled memories that could operate as a three-qubit logic gate – a sort of quantum transistor, if you will.
Coherence time in this 'Resonator/zero-Qubit architecture' (RezQu) is much reduced – in the order of 4 microseconds. Its ability to complete hundreds of operations during that time shows great promise, although lingering control issues mean its results are only 98% accurate – far too error-prone for general usage.
Research continues, however, and the sheer level of investment suggests the quantum nut will be cracked sooner rather than later. "The biggest unknown of any implementation of quantum computing is nature," Professor Simmons explains. "Nature is either going to let us scale the system and maintain coherence across a lot of qubits, or it's not. That's the big unknown, and that's why it's so exciting. Technologically, we'll keep pushing the boundaries and see how far we can go."