Quantum computing is here… with one small caveat
The government is backing 30 projects to kickstart quantum technologies, even though the hardware isn’t ready. Nicole Kobie meets one company putting quantum to work against the effects of climate change to find out why
Quantum computers are only just edging into useful existence, but the world isn’t willing to wait for the technology to mature. They’re already being put to good use in finance, energy production and manufacturing – and soon, if all goes well, quantum systems will predict potential flood damage across the UK to help plan mitigations against the impacts of climate change.
Multiverse Computing is leading one of 30 projects backed by the UK government as part of efforts to develop quantum technologies for public sector applications. Alongside its partners, Oxford Quantum Circuits and Moody’s Analytics, Multiverse is developing an algorithm to optimise neural network outputs for more detailed flood modelling.
This is part of the government’s Quantum Catalyst Fund, a £15 million pot of cash to encourage quantum technologies to be developed for public use. Multiverse’s project was handed a slice of that money for a three-month feasibility study as phase one of the fund, with contracts doled out for any promising ideas as part of phase two, in which they’ll be asked to make a prototype or product demonstration.
That £15 million is just the start. The fund is part of a wider National Quantum Strategy, published in March 2023, that will see £2.5 billion invested over the next ten years. The belief is that quantum technologies could offer solutions in healthcare and energy infrastructure. And – through quantum clocks and communications – they could help railways, emergency services and telcos step away from satellites to a more secure alternative.
But given quantum computers remain very much a work in progress, how can anything get done? Enter hybrid quantum computing.
What is quantum computing?
Let’s step back to basics. Classical or traditional computers make use of on/off transistors, with data stored in bits that are either a one or a zero. Quantum computers differ by taking advantage of quirks of physics, in particular quantum superposition and entanglement. A qubit – a quantum bit – can be on or off, but it can also be in a blend of both states at once; this is superposition, and it lets a quantum machine explore many possibilities in parallel. The second property is entanglement, which links the state of one qubit to another, and this enables more advanced algorithms.
This means quantum computing could enable modelling or maths beyond what today’s machines can manage. But there are challenges. Measuring a qubit collapses its superposition back to a plain one or zero, so extracting a useful answer requires careful algorithm design and further classical processing to unpick the result – that means more computing power, not to mention software to do the work. Another challenge is decoherence: quantum computers interact with their environment, and that can disrupt data and calculations, corrupting results. Mitigating that requires complex error correction.
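The measurement problem described above can be caricatured in a few lines of Python. This is a toy illustration of one qubit, not a real quantum SDK: the state is just a pair of amplitudes, and “measurement” collapses it to a plain one or zero with the probabilities quantum mechanics dictates.

```python
import math
import random

# Toy single-qubit illustration (not a real quantum SDK).
# A qubit's state is a pair of amplitudes (a, b) for the states |0> and |1>,
# with a^2 + b^2 = 1. Measuring it collapses the state to 0 or 1.

def equal_superposition():
    """Amplitudes for the state (|0> + |1>) / sqrt(2): equally 'both'."""
    a = 1 / math.sqrt(2)
    return (a, a)

def measure(state):
    """Collapse the state: returns 0 with probability a^2, else 1."""
    a, _b = state
    return 0 if random.random() < a * a else 1

# Repeatedly measuring the same superposition never reveals "both" -
# each readout is a one or a zero, roughly 50/50 over many runs.
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(equal_superposition())] += 1
print(counts)  # each count close to 5,000
```

This is why, as the article notes, the final result of a quantum computation needs further processing: a single readout gives only one classical bit per qubit, so useful answers must be engineered out of the statistics.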
There’s a further challenge: actually building a quantum computer. They do exist now, but with a limited number of qubits. The IBM Q System One was the first such device to be made available for commercial use, but it only has 20 qubits. Back in 2019, Google claimed it had achieved quantum supremacy – performing a calculation that a supercomputer couldn’t complete on a human timescale – using its Sycamore device, which had 53 qubits. And in December 2023, IBM revealed its Condor quantum processor with more than 1,000 qubits.
We don’t really know how many qubits are required to do the work we’d like quantum machines to do, as it depends on the task, but thousands or even tens of thousands could be required for the applications being dreamed up. It also depends on the type of quantum computer: there are different designs, from the superconducting machines made by Google and IBM to photonic systems, as well as machines built around neutral atoms, trapped ions and quantum dots.
“Each of those architectures has several benefits and drawbacks, so right now it’s not clear which of those architectures is going to be the one that scales,” Victor Gaspar, head of business development at Multiverse, told PC Pro. (For this project, Multiverse Computing is working with Oxford Quantum Circuits, which uses a patented three-dimensional scalable design called a “coaxmon”.)
Quantum computers aren’t designed to take over from the laptop on your desk or the smartphone in your pocket. Instead, they’re a specific tool that works well for some applications, be it maths problems, cryptography or simulations. Making use of quantum hardware requires algorithms written specifically for such computers – and that’s part of the reason hybrid systems have appeal. We need to develop quantum computers and the software and algorithms to run on them in concert.
The key takeaway on all of this is that quantum computing isn’t quite here yet and it’s not easy to build.
Hybrid today
Though we as yet lack quantum computers at a practical level – they exist, but largely in labs, and not in the form required to do everyday work – we can make use of quantum ideas and more limited quantum hardware by combining them with classical algorithms and computers.
“A quantum algorithm is like a classical one but uses several phenomena that are quantum that you don’t have in a classical computer,” Gaspar said. “For example, entanglement and superposition... We are currently building machines that can make use of those effects.”
To an extent, it follows the development pattern of traditional computing, Gaspar says. At first, algorithms ran sequentially on a single CPU; then a second CPU allowed for parallel computing. “Now we are developing a technology that has several properties that are not in classical [computers] that you can make use of for developing new algorithms,” Gaspar said.
And that takes time, he notes – after all, many of the most famous algorithms in use today were developed in the 1980s, and we’re still coming up with new ways to get the most out of classical computers.
That’s one reason why it makes sense to start developing software and algorithms for quantum computers when the hardware isn’t quite ready yet, as it’s going to take years or decades to really get to grips with these weird new machines. Multiverse Computing has an algorithm that will work with the project in question, but one of the first steps will be to optimise it for the hardware being supplied by project partner Oxford Quantum Circuits.
“A quantum algorithm is like a classical one but uses several phenomena that are quantum that you don’t have in a classical computer”
Floods of data
This particular hybrid quantum project seeks to better model the
potential for flooding across the UK, especially as the climate crisis exacerbates the risks. The aim is to understand where and when to expect floods, and better predict their impact on surrounding areas, be it homes, transport networks or infrastructure. However, computational fluid dynamics are notoriously complicated, and the challenge is exacerbated by the need to pull in a lot of data to improve the accuracy and granularity of a model.
Multiverse Computing and its partners will use shallow water equations, a subset of computational fluid dynamics, to model bodies of water including rivers and the ocean. “In classical computers this has a really high computational cost for simulation, especially for large areas in a high resolution,” said Gaspar. “If you want to model a huge mass of land that intertwines with a huge mass of water, and you want high resolution with buildings and coastal features and all that, it’s highly complex.”
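For readers curious what the shallow water equations look like, here is the standard textbook form in one spatial dimension – the general technique Gaspar names, not necessarily the exact formulation the project uses:

```latex
% One-dimensional shallow water equations (standard textbook form)
% h = water depth, u = horizontal velocity, g = gravitational acceleration
\begin{aligned}
\frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} &= 0
  &&\text{(conservation of mass)} \\[4pt]
\frac{\partial (hu)}{\partial t}
  + \frac{\partial}{\partial x}\!\left(hu^{2} + \tfrac{1}{2}g h^{2}\right) &= 0
  &&\text{(conservation of momentum)}
\end{aligned}
```

Real flood models solve the two-dimensional version over millions of grid cells, with extra terms for terrain and friction – which is where the “really high computational cost” Gaspar describes comes from.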
For example, a coarse model might ignore how windows affect the flow of floodwater into buildings, where a more detailed simulation would include them. “We want to proceed in precise detail,” Gaspar explained.
Multiverse Computing is going to help address that computational challenge by adding a quantum circuit into the neural network architecture to optimise the system and improve training performance while also reducing memory consumption. That will run on Oxford Quantum Circuits’ 32-qubit machine.
The system should also increase the expressivity of the neural network – that is, how rich a range of functions the deep learning system can approximate, and so how good its predictions can be. And, when more qubits are available, the system will be able to scale up to boost the neural network’s accuracy further.
Practically, the neural network output feeds into the variational quantum circuit. That means the neural network must be designed in the right way for the output to match the quantum gates. That quantum circuit offers a result that is measured, and that measurement is then input back into the neural network.
“Essentially what we’re trying to solve is an optimisation problem at the end of the day, to simulate the effects of these floods,” said Sam Mugel, chief technology officer at Multiverse. “And for this optimisation problem, we’re going to solve it on a quantum computer using these variational quantum algorithms.”
The challenge is that the quantum computer is small, with just 32 qubits, but the problem is very large, with millions of variables. “The trick we’re going to do is we’re going to use a neural network architecture to compress the information,” Mugel said. “We have input, we’re going to compress it, run this on the QPU (the quantum computer), run the optimisation problem on the QPU, and then run it back through [the] neural network to decompress it.”
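The compress-run-decompress loop Mugel describes can be sketched in miniature. Everything below is a stand-in: the “neural network” is a trivial averaging encoder, and the “QPU” is a single-qubit rotation simulated classically with one cosine. Multiverse’s actual system uses trained networks and real quantum hardware; this only shows the shape of the pipeline.

```python
import math

# A caricature of the hybrid loop: compress the input, run a tiny
# variational quantum circuit (simulated classically here), then
# decompress the measured result. Every component is a stand-in.

def encode(data):
    """'Neural network' encoder stand-in: compress many values to one angle."""
    return sum(data) / len(data)

def variational_circuit(theta):
    """One-qubit 'QPU' stand-in: rotate |0> by theta and measure.
    For the state cos(theta/2)|0> + sin(theta/2)|1>, the expectation
    value of a Z measurement is cos(theta)."""
    return math.cos(theta)

def decode(expectation, n):
    """'Neural network' decoder stand-in: expand one value back out to n."""
    return [expectation] * n

data = [0.1, 0.2, 0.3, 0.4]       # input, e.g. features of one flood grid cell
theta = encode(data)              # compress millions of variables -> few qubits
z = variational_circuit(theta)    # run the optimisation step on the "QPU"
output = decode(z, len(data))     # decompress back to the full problem size
print(output)
```

The point of the real architecture is the same as this sketch’s: the classical network squeezes a problem with millions of variables down to something a 32-qubit machine can process, then expands the answer back out.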
There’s plenty of work to do first. In the initial phase of government-funded work, Multiverse Computing is proving the project can succeed – it’s effectively a feasibility study. That requires understanding the nature and structure of flood data and how it needs to be processed so it can be used to train the neural network, but also showing how the hardware and software will work together.
“What we want to show in phase one is that... larger quantum circuits will be able to solve problems that can’t be solved with classical computing,” Gaspar said. If approved by DEFRA for phase two, Multiverse Computing and its partners will shift to implementation and building a working prototype.
Long road ahead
So if a 32-qubit circuit isn’t enough to run an entire algorithm, how many qubits do we need? Gaspar says we simply don’t know, and nor do we know how long it will take to build large enough quantum computers.
It’s a chicken and egg situation: we need software to show what the hardware can do, but it’s difficult to develop software when the hardware doesn’t exist yet. And that’s why hybrid quantum makes sense: it lets us see the value of quantum computing now, expand our simulation and modelling without waiting for larger quantum machines, and allows us to start developing the associated technologies such as software and algorithms so we’re ready to go when the hardware can be scaled up.
“If you think about it, the semiconductor industry has been around for 70 years,” said Gaspar. “I’m not going to say quantum computers are going to take 70 years, but we need the technology to develop. And right now we are at the stage where we need to be clever in how we design those algorithms and do more hardware-software co-design to make the most out of these scarce resources.”
Hybrid quantum means we can get some results now. But full quantum computing will first need serious technological breakthroughs – and serious cash. “Before we can justify that level of investment, we need to be able to say that we know when we arrive at this level, we’ll be able to solve this type of problem better,” said Mugel.
Like Gaspar, he points to the long history of chips. “The semiconductor industry has had trillions poured into it, but before we went ahead and poured all that money into it we first started with transistors,” he said. “Applications with very, very few transistors showed that there was value. Once we showed the initial disruptive use case, from there we were able to justify all the investment. For us, I think this project is one of several where we really are seeking to show that for quantum computers.”
In other words, this project isn’t just about water flow simulation, though anyone living in an area prone to floods will welcome better predictions. Instead, it’s a way to test if quantum computing is worth all the effort – and to spark investment in a technology that could be the next revolution in computing.
“We need software to show what the hardware can do, but it’s difficult to develop software when the hardware doesn’t exist yet”