Earlier this month, IBM presented some of the work it is doing to further its goal of developing a 1,000-qubit quantum computer.
Among the quantum computing work IBM showcased is a new technique that allows researchers to test the preliminary results of a quantum algorithm and use them to refine the algorithm continuously. This technique makes quantum software more precise and easier to fine-tune, which means quantum programmers will be able to develop increasingly sophisticated algorithms.
In 2020, IBM laid out a roadmap in which it hopes to deliver systems with 1,121 qubits by 2023, for running quantum applications in the natural sciences.
IBM has a multi-pronged plan to get the world of IT thinking about quantum computing. It is a phenomenon that promises so much, but is so far removed from what has come before, that many struggle to get their heads around the terminology. IBM has published a roadmap that illustrates its path to getting value from quantum computing.
Jerry Chow, director, quantum hardware system development at IBM Quantum, said the company is trying hard to attract a broad developer community to quantum computing. “We want frictionless development that does not need to be more specialised than classic computing,” he said.
But from a conversation Computer Weekly had with Chow recently, it is apparent that there is a gap in understanding between the level of abstraction offered in modern classical computing and what is required to get to grips with quantum computing.
From a software roadmap perspective, Chow said some of this work involves building out the fundamental base layer to control how the device works, which is a bit like the application programming interfaces (APIs) a kernel developer uses in classical computing. In the world of quantum, this means concepts such as rotating qubits and building a seamless network between quantum computing and classical computing.
For 2021, the IBM roadmap has a milestone to deliver Qiskit, which provides the tools and runtime software to manage this.
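As a rough illustration of what "rotating a qubit" looks like in practice, the sketch below uses IBM's open-source Qiskit SDK to apply a single rotation gate and read the qubit out; the gate choice and angle are illustrative, not taken from IBM's roadmap.

# Minimal Qiskit sketch (illustrative): rotate one qubit and measure it
from math import pi
from qiskit import QuantumCircuit

qc = QuantumCircuit(1, 1)   # one qubit, one classical bit
qc.ry(pi / 2, 0)            # rotate qubit 0 about the Y axis by 90 degrees
qc.measure(0, 0)            # read the qubit out into the classical bit
print(qc.draw())            # show the circuit as text

Measuring after such a rotation returns 0 or 1 with roughly equal probability – behaviour that has no direct counterpart in classical bits.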
Then there is the idea of pulse control. “Pulse control is equivalent to the hardware abstraction layer in an operating system,” said Chow.
In effect, the developer drives pulses that control qubits. Conceptually, Chow describes this as akin to programming a microprocessor in assembler language, where the assembler allows a programmer to send machine code instructions to the processor in order to manipulate logical bits on a classical computer.
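To give a flavour of what this looks like, the sketch below uses Qiskit's pulse module as it existed at the time; the pulse shape, duration and amplitude are arbitrary illustrative values rather than real IBM calibration data.

# Sketch of pulse-level control with Qiskit Pulse (values are illustrative)
from qiskit import pulse
from qiskit.pulse.library import Gaussian

with pulse.build(name="drive_qubit_0") as prog:
    # Play a Gaussian-shaped microwave pulse on qubit 0's drive channel –
    # roughly the pulse-level equivalent of issuing one machine instruction
    pulse.play(Gaussian(duration=160, amp=0.1, sigma=40),
               pulse.DriveChannel(0))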
He said a quantum circuit is analogous to digital gate operations, such as the binary logic “And”, “Or” and “Nand” (Not And) operations used to manipulate zero and one bits in classical computing. But unlike these simple binary operations, which are effectively encoded as digital circuits in a microprocessor, quantum circuits can run far more complex operations than is achievable in binary logic, said Chow.
The QASM tool in Qiskit effectively performs the same function as an assembler, but it sends instructions to manipulate qubits on a quantum computer, rather than instructions to manipulate logical bits on a classical computer.
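As a hedged illustration, the sketch below builds a small two-qubit circuit in Qiskit and exports it as OpenQASM text – the assembler-like layer Chow describes; the export call is shown as it appeared in the Qiskit API at the time.

# Build a small circuit and export it as OpenQASM text (illustrative)
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)                     # put qubit 0 into superposition
qc.cx(0, 1)                 # entangle qubit 0 with qubit 1
qc.measure([0, 1], [0, 1])

print(qc.qasm())            # emits OPENQASM 2.0 text: h q[0]; cx q[0],q[1]; ...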
“There are provable gaps, where quantum circuits have an advantage over classic circuits,” said Chow. By 2022, IBM plans to be able to run dynamic circuits on its quantum computers, he said, and these expand the variety of quantum algorithms that can be run.
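Dynamic circuits combine mid-circuit measurement with classical feed-forward, so that a later gate can depend on an earlier measurement result. The sketch below illustrates the idea using the classically conditioned operation available in Qiskit at the time; it is a generic example rather than one of IBM's planned workloads.

# Sketch of a dynamic circuit: a mid-circuit measurement steers a later gate
from qiskit import QuantumCircuit, QuantumRegister, ClassicalRegister

qr = QuantumRegister(2)
cr = ClassicalRegister(1)
qc = QuantumCircuit(qr, cr)

qc.h(qr[0])
qc.measure(qr[0], cr[0])    # measure qubit 0 part-way through the circuit
qc.x(qr[1]).c_if(cr, 1)     # flip qubit 1 only if the measured bit was 1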
Up to this point on its roadmap for quantum computing, IBM has targeted the kernel developers who build quantum circuits that talk directly to the hardware, and the algorithm developers who consume these circuits as they build out quantum applications.
During 2023, according to its roadmap, IBM is planning to deliver libraries of pre-built quantum circuits, and raise the level of abstraction with pre-built quantum runtime software. As is the case with an operating system running on a classical computer, this should enable developers to start creating quantum applications in high-level programming languages without having to understand the intricacies and vagaries of qubits.
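Qiskit's existing circuit library gives a sense of what such pre-built building blocks look like: ready-made circuits, such as a quantum Fourier transform, that an application developer can compose without wiring up individual gates. The example below is illustrative rather than taken from IBM's 2023 deliverables.

# Compose a pre-built quantum Fourier transform from Qiskit's circuit library
from qiskit import QuantumCircuit
from qiskit.circuit.library import QFT

qc = QuantumCircuit(3)
qc.compose(QFT(num_qubits=3), inplace=True)   # append the 3-qubit QFT
print(qc.decompose().draw())                  # show the underlying gates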
Error handling
One of those quirks is that quantum computing is error-prone – in quantum terms, such systems are regarded as noisy. Just as memory chips need error correction, work is under way to detect and fix errors in quantum computing.
One of the biggest issues in scaling quantum computers is eliminating the errors that naturally occur in two-qubit gates – the building blocks of a quantum computer. Among the ideas IBM presented recently is a new method to reduce errors, which will make it easier to achieve higher quantum volume devices in the years to come.
One example IBM is trying, said Chow, is a small error-detection scheme, in which code can be developed at a small scale to ensure integrity. He said the company is looking to drive new ideas for running large error correction codes.
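As a generic illustration of small-scale error detection – not IBM's specific scheme – the sketch below encodes one logical qubit across three physical qubits with a bit-flip repetition code and uses two ancilla qubits to extract the parity checks that locate a single error.

# Generic 3-qubit bit-flip repetition code sketch (illustrative only)
from qiskit import QuantumCircuit

qc = QuantumCircuit(5, 2)        # three data qubits plus two ancillas
qc.cx(0, 1)                      # spread qubit 0's basis state onto qubit 1
qc.cx(0, 2)                      # ... and onto qubit 2 to complete the encoding
# ... a bit-flip error may strike any data qubit here ...
qc.cx(0, 3); qc.cx(1, 3)         # ancilla 3 records the parity of qubits 0 and 1
qc.cx(1, 4); qc.cx(2, 4)         # ancilla 4 records the parity of qubits 1 and 2
qc.measure([3, 4], [0, 1])       # the two syndrome bits pinpoint a single error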
“Quantum-inspired” is one of the terms the industry has coined as an interim step towards mainstream quantum computing. In effect, a computer can be developed in a way that simulates some aspects of quantum computing. In some cases, such quantum-inspired algorithms can solve problems more efficiently than other approaches running on a classical computer.
Chow said IBM has done a lot of work on simulating its Q System to give developers a way to test quantum algorithms before running them on a real machine. Clearly, there will be a limit to the complexity of algorithms that can be simulated on a classical computer and, as error correction evolves, it may become increasingly difficult to check that a quantum computer is giving the same results as a simulation running on a classical computing architecture.
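As a simple illustration of that workflow, the sketch below runs a small circuit on Qiskit's local simulator before it would ever be submitted to real hardware; the backend name reflects the Qiskit Aer release of the time.

# Run a small circuit on Qiskit's local simulator (backend name of the era)
from qiskit import QuantumCircuit, Aer, execute

qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

backend = Aer.get_backend("qasm_simulator")
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)                    # expect roughly equal counts of '00' and '11'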
A question of architecture
Quantum computing does appear to be a radical departure from previous models of computing. But in the last few years, the industry has adopted different computing paradigms. Just as the GPU (graphics processing unit) offered software developers a route into writing code that could run in parallel across multiple processing cores, Chow believes quantum computing will ultimately become another computer resource at their disposal.
“It is another model of computation,” he said. “Some applications may run on high-performance computing in the cloud; others may go on cloud-based quantum computers.”