Today, quantum computing company D-Wave announced the availability of its next-generation quantum annealer, a specialized processor that uses quantum effects to solve optimization and minimization problems. The hardware itself isn't much of a surprise, as D-Wave discussed its details months ago, but the company spoke to Ars about the challenges of building a chip with over a million individual quantum devices. D-Wave is also pairing the new hardware with a new software stack that works a bit like middleware between the quantum hardware and classical computers.
Quantum computers built by companies like Google and IBM are universal, gate-based machines. They can solve any problem and should provide dramatic acceleration for certain classes of problems. Or they will, once the number of qubits is high enough. Currently, these quantum computers are limited to a few dozen qubits and have no error correction. Scaling them up to the required level poses a number of difficult technical challenges.
D-Wave's machine is not meant to be general purpose; it's technically a quantum annealer, not a quantum computer. It performs calculations that find low-energy states for different configurations of the hardware's quantum devices. As such, it only works if a computing problem can be translated into an energy-minimization problem in one of the chip's possible configurations. That's not as limiting as it may sound, since many forms of optimization translate into energy-minimization problems, including complicated scheduling problems and protein structures.
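To make that translation concrete, here is a minimal sketch in plain Python (illustrative only, not D-Wave's actual tooling) of how one classic optimization problem, max-cut on a three-node graph, can be rewritten as an energy-minimization problem over binary variables — the QUBO form an annealer accepts:

```python
from itertools import product

# Toy example: express max-cut on a triangle graph as a QUBO.
# Energy E(x) = sum over Q of Q[i, j] * x[i] * x[j], x binary.
edges = [(0, 1), (1, 2), (0, 2)]
Q = {}
for i, j in edges:
    # Putting i and j on opposite sides ("cutting" the edge)
    # lowers the energy by 1; same-side assignments do not.
    Q[(i, i)] = Q.get((i, i), 0) - 1
    Q[(j, j)] = Q.get((j, j), 0) - 1
    Q[(i, j)] = Q.get((i, j), 0) + 2

def energy(x):
    return sum(q * x[i] * x[j] for (i, j), q in Q.items())

# Brute-force the lowest-energy assignment (the annealer's job).
best = min(product([0, 1], repeat=3), key=energy)
print(best, energy(best))  # lowest energy is -2: two edges cut
```

The annealer's job is to find the lowest-energy assignment; here the problem is small enough to brute-force, and the minimum-energy assignments are exactly the ones that cut two of the triangle's three edges, the best any two-way split of a triangle can do.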
It is easiest to think of these configurations as a landscape with a series of peaks and valleys. Solving the problem corresponds to searching the landscape for the lowest valley. The more quantum devices there are on the D-Wave chip, the more thoroughly it can scan the landscape. Increasing the qubit count is therefore absolutely critical to the usefulness of a quantum annealer.
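The classical algorithm that lends quantum annealing its name, simulated annealing, searches the same kind of landscape in software. A small sketch shows the idea (the landscape function and all parameters here are made up for illustration):

```python
import math
import random

# Classical simulated annealing on a toy 1-D "landscape" with
# several valleys; quantum annealing attacks this kind of search
# in hardware rather than in software.
def landscape(x):
    # A bumpy function: a broad bowl with ripples carved into it.
    return (x - 2) ** 2 + 1.5 * math.sin(5 * x)

random.seed(0)
x = random.uniform(-4, 8)  # start somewhere random
temperature = 5.0
while temperature > 1e-3:
    candidate = x + random.gauss(0, 0.5)
    delta = landscape(candidate) - landscape(x)
    # Always accept downhill moves; accept uphill moves with a
    # probability that shrinks as the temperature falls.
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate
    temperature *= 0.995
print(round(x, 2), round(landscape(x), 2))
```

The "temperature" lets the search climb uphill early on, so it can escape shallow valleys before settling into a deep one; the quantum version instead relies on quantum effects to explore the landscape.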
This plays to D-Wave's strengths, as it is much easier to add qubits to a quantum annealer; the company's current offering already includes 2,000 of them. There is also the question of fault tolerance. While errors on a gate-based quantum computer typically produce useless output, errors on a D-Wave machine typically mean the answer it returns is low in energy, but not the lowest. And for many problems, a reasonably optimized solution can be good enough.
It is less clear whether the approach offers real advantages over algorithms run on classical computers. For gate-based quantum computers, researchers have already worked out the math demonstrating the potential for quantum supremacy. That's not the case for quantum annealing. And over the past several years, there have been a number of instances where D-Wave's hardware appeared to hold a clear edge over classical computers, only to see a combination of algorithm and hardware improvements on the classical side erase the difference.
D-Wave hopes its new system, which it calls Advantage, can open a more durable performance gap. To date, D-Wave has offered a 2,000-qubit quantum optimizer; the Advantage system scales that number to 5,000. Just as critically, these qubits are interconnected in additional ways. As mentioned above, problems are structured as a specific configuration of connections among the machine's qubits. If a direct connection between two qubits that need to communicate isn't available, some of the other qubits must be used to make the connection, and are therefore unavailable for problem solving.
The 2,000-qubit machine had a total of 6,000 possible connections between its qubits, an average of three per qubit. The new machine raises that total to 35,000, an average of seven connections per qubit. This naturally allows many more problems to be configured without sacrificing qubits to make connections. A white paper released by D-Wave indicates that it works as expected: larger problems fit into the hardware, and fewer qubits have to serve as bridges connecting other qubits.
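The per-qubit averages quoted above follow directly from the totals, as a trivial check confirms (figures taken from this article, counting each coupler once per pair):

```python
# Coupler-per-qubit figures for the older 2,000-qubit chip and
# the new 5,000-qubit Advantage chip, as quoted above.
old_qubits, old_couplers = 2_000, 6_000
new_qubits, new_couplers = 5_000, 35_000

print(old_couplers / old_qubits)  # → 3.0 connections per qubit
print(new_couplers / new_qubits)  # → 7.0 connections per qubit
```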
Each qubit on the chip takes the form of a loop of superconducting wire incorporating devices called Josephson junctions. But there are well over 5,000 Josephson junctions on the chip. "The lion's share of them goes into superconducting control circuits," D-Wave's processor director Mark Johnson told Ars. "They're basically like digital-to-analog converters with memory that we can use to program a particular problem."
To provide the necessary level of control, the new chip has a total of over a million Josephson junctions. "Let's put that in perspective," Johnson said. "My iPhone has a processor with billions of transistors. In that sense, it's not much. But if you're familiar with superconducting integrated circuit technology, this is way off the curve." Connecting everything also required over 100 meters of superconducting wire, all on a chip roughly the size of a thumbnail.
While all of this is built on silicon using standard fabrication tools, the silicon is just a convenient substrate; there are no semiconductor devices on the chip. Johnson couldn't go into the details of the manufacturing process, but he was willing to talk more generally about how these chips are made.
This is no TSMC
One of the big differences between this process and standard chip manufacturing is volume. Most of D-Wave's chips sit in its own facility and are accessed by customers via a cloud service; only a handful are bought and installed elsewhere. That means the company doesn't have to make very many chips.
When asked how many it makes, Johnson laughed and said, "I'm going to end up being the person who predicted the world would never need more than five computers," before continuing, "I think we can meet our business goals with a dozen or fewer of them."
If the company were making standard semiconductor devices, that would mean producing a single wafer and calling it a day. But D-Wave doesn't believe its process has reached the point where a working device comes out of every wafer. "We're constantly pushing past the comfort zone you might have with a TSMC or an Intel, where you're looking at how many nines I can get in my yield," Johnson told Ars. "If we had that kind of yield, we probably didn't push hard enough."
Much of that pushing came in the years leading up to this new processor. Johnson told Ars that the higher connectivity required new process technology. "[It's] the first time in about 10 years that we've made a significant change to the technology node," he said. "Our fab cross-section is much more intricate. It has more materials, more layers, more types of equipment, and many more steps."
Aside from the complexity of fabricating the device itself, the fact that it operates at temperatures in the millikelvin range adds to the design challenges. As Johnson noted, any wire running into the chip from the outside is a potential conduit for heat that has to be minimized, again a problem most chipmakers don't face.