    IBM pushes qubit count over 400 with new processor

    Milestone is important for the company's road map, less critical for performance.


    Today, IBM announced the latest generation of its family of avian-themed quantum processors, the Osprey. With more than three times the qubit count of its previous-generation Eagle processor, Osprey is the first to offer more than 400 qubits, which indicates the company remains on track to release the first 1,000-qubit processor next year.

     

    Despite the high qubit count, there's no need to rush out and re-encrypt all your sensitive data just yet. While the error rates of IBM's qubits have steadily improved, they've still not reached the point where all 433 qubits in Osprey can be used in a single algorithm without a very high probability of an error. For now, IBM is emphasizing that Osprey is an indication that the company can stick to its aggressive road map for quantum computing, and that the work needed to make it useful is in progress.

    On the road

    To understand IBM's announcement, it helps to understand the quantum computing market as a whole. There are now a lot of companies in the quantum computing market, from startups to large, established companies like IBM, Google, and Intel. They've bet on a variety of technologies, from trapped atoms to spare electrons to superconducting loops. Pretty much all of them agree that to reach quantum computing's full potential, we need to get to where qubit counts are in the tens of thousands, and error rates on each individual qubit are low enough that these can be linked together into a smaller number of error-correcting qubits.

     

    There's also a general consensus that quantum computing can be useful for some specific problems much sooner. If qubit counts are sufficiently high and error rates get low enough, it's possible that re-running specific calculations enough times to get an error-free result will still yield answers to problems that are difficult or impossible to solve on conventional computers.

     

    The question is what to do while we're working to get the error rate down. Since the probability of errors largely scales with qubit counts, adding more qubits to a calculation increases the likelihood that calculations will fail. I've had one executive at a trapped-ion qubit company tell me that it would be trivial for them to trap more ions and have a higher qubit count, but they don't see the point—the increase in errors would make it difficult to complete any calculations. Or, to put it differently, to have a good probability of getting a result from a calculation, you'd have to use fewer qubits than are available.
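    A rough back-of-the-envelope calculation shows why. If each two-qubit gate fails independently with some small probability, a circuit's chance of finishing error-free shrinks exponentially with the total number of gates, and therefore with both qubit count and depth. The 1 percent error rate and the "one gate per qubit per layer" counting in the sketch below are illustrative assumptions, not any vendor's published numbers:

```python
# Back-of-the-envelope sketch with assumed numbers (not IBM's published specs):
# if each two-qubit gate fails independently with probability `gate_error`,
# a circuit's chance of running error-free shrinks exponentially with gate count.

def success_probability(gate_error: float, num_qubits: int, depth: int) -> float:
    """Chance that no gate fails, counting roughly one gate per qubit per layer."""
    num_gates = num_qubits * depth
    return (1 - gate_error) ** num_gates

for qubits in (27, 127, 433):  # roughly Falcon-, Eagle-, and Osprey-sized circuits
    p = success_probability(gate_error=0.01, num_qubits=qubits, depth=10)
    print(f"{qubits:3d} qubits, depth 10: ~{p:.2%} chance of an error-free run")
```

    With these assumed numbers, a 27-qubit circuit at depth 10 finishes cleanly less than 10 percent of the time, and a 433-qubit circuit essentially never does, which is the trade-off the trapped-ion executive is describing.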

     

    Osprey doesn't fundamentally change any of that. While IBM wouldn't directly acknowledge it (and we asked, twice), it's unlikely that any single calculation could use all 433 qubits without encountering an error. But, as Jerry Chow, director of infrastructure with IBM's quantum group, explained, raising qubit counts is just one branch of the company's development process. Releasing the results of that process as part of a long-term road map is important because of the signals it sends to developers and potential end users of quantum computing.

    On the map

    IBM released its road map in 2020. It called for last year's Eagle processor to be the first with more than 100 qubits, correctly predicted Osprey's qubit count, and indicated that the company would be the first to clear 1,000 qubits with next year's Condor. This year's iteration of the road map extends the timeline and provides a lot of additional detail on what the company is doing beyond raising qubit counts.

     

    [Image: IBM's current quantum road map is more elaborate than its initial offering. Credit: IBM]

    The most notable addition is that Condor won't be the only hardware released next year; an additional processor called Heron is on the map that has a lower qubit count, but has the potential to be linked with other processors to form a multi-chip package (a step that one competitor in the space has already taken). When asked what the biggest barrier to scaling qubit count was, Chow answered that "it is size of the actual chip. Superconducting qubits are not the smallest structures—they're actually pretty visible to your eye." Fitting more of them onto a single chip creates challenges for the material structure of the chip, as well as the control and readout connections that need to be routed within it.

     

    "We think that we are going to turn this crank one more time, using this basic single chip type of technology with Condor," Chow told Ars. "But honestly, it's impractical if you start to make single chips that are probably a large proportion of a wafer size." So, while Heron will start out as a side branch of the development process, all the chips beyond Condor will have the capability to form links with additional processors.

     

    Initially, these connections will be via direct superconducting links between hardware on the chips. But longer term, Chow said IBM is working on meter-long links that can survive the near-absolute-zero temperatures the chips require and allow the chips to communicate while maintaining quantum coherence.

     

    But Chow indicated there's a lot of work that's not obviously on the road map. Some of that is related to the hardware needed to control the qubits during operations. The first processors the company experimented with needed a rack of equipment to control five qubits; with Osprey, the entire chip can be controlled with a single rack, and the company is experimenting with chips that can control the qubits while operating at the qubits' super-cold temperatures.

     

    Also missing from the road map is the fact that IBM experiments with process technology for qubits on small systems with a handful of qubits, then shifts any useful developments back to the main branch in the form of interim releases. "You see this with some of our evolutions of later generations of previous birds," Chow said. "The Falcon R8 had a significant increase in coherence times from Falcon R5. And then similarly, earlier this year, we had the third revision of Eagle, Eagle Rev. Three, [that] had about a 3x increase in median coherence times from our initial Eagle that we released last year."

     

    These changes allow users to perform more complicated calculations on a chip with the same qubit count, and may ultimately help users use more of the qubits on existing generations of hardware. "We have a main Osprey, and we already have trailing behind it another generation Osprey," Chow told Ars. "Maybe about 2x improvement, two and a half times."
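    To see why longer coherence times translate into more complicated circuits, a rough rule of thumb is that the number of gate layers you can run is limited by how many gate times fit inside one coherence time. The numbers below are illustrative assumptions, not Osprey's actual specifications:

```python
# Illustrative only: assumed coherence and gate times, not IBM's published figures.
# Longer coherence roughly translates into a proportionally larger depth budget.

def depth_budget(coherence_time_us: float, gate_time_ns: float) -> float:
    """Approximate number of gate layers that fit within one coherence time."""
    return (coherence_time_us * 1_000) / gate_time_ns

baseline = depth_budget(coherence_time_us=100, gate_time_ns=300)  # hypothetical baseline chip
improved = depth_budget(coherence_time_us=300, gate_time_ns=300)  # ~3x coherence, like the gain cited for Eagle Rev. 3
print(f"depth budget: ~{baseline:.0f} layers -> ~{improved:.0f} layers")
```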

    Software too

    Software is a major component on the current road map, which is a large departure from the initial map. IBM has supported the development of software called Qiskit, which allows developers to implement algorithms for quantum computers without having to worry about issuing all the discrete control commands that perform the actual manipulations of qubits. It's now modifying Qiskit to allow it to perform repeated runs of an algorithm and track where errors occur; it can then alter the algorithm's implementation so that it's less likely to run into the circumstances that triggered the error.
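    For a sense of what that abstraction looks like in practice, here is a minimal Qiskit sketch. The circuit and the basis-gate list are illustrative; a real IBM backend supplies its own native gate set and qubit layout, and the runtime handles the low-level control:

```python
# Minimal Qiskit sketch: describe a circuit in terms of abstract gates and let the
# transpiler rewrite it into a hardware-style basis. The basis gates listed here are
# illustrative; a real backend supplies its own native gates and connectivity.
from qiskit import QuantumCircuit, transpile

qc = QuantumCircuit(2)
qc.h(0)        # put qubit 0 into superposition
qc.cx(0, 1)    # entangle qubits 0 and 1 into a Bell pair
qc.measure_all()

compiled = transpile(qc, basis_gates=["rz", "sx", "x", "cz"], optimization_level=1)
print(compiled)  # the developer never issues the underlying control pulses directly
```

    The error-tracking behavior described above layers on top of this: once Qiskit has learned which operations and qubits are the most error-prone, it can rewrite the compiled circuit to steer around them.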

     

    "It trades off time to learn about errors," Chow said, "By running more circuits with different types of operations, different exposures to the noise, you can learn about the actual noise that you're susceptible to." Avoiding that noise, in turn, lowers the overall error rate that an algorithm will experience. "We're actually finding that we can go to depth 10, depth 20 Quantum circuits using error mitigation techniques," he told Ars.

     

    Combined, the process improvements and improved error avoidance will gradually improve the performance of Osprey, although it will take time, and IBM is already working on Osprey's replacement. So the company is being careful not to overpromise. "We're not talking about achieving anything like quantum advantage, or anything like that on this device," Chow said. Instead, the qubit count is about assuring people in the field that IBM can continue along its set road map.

     

    "Everyone wants that 'Well, now, this enables X, Y, and Z,'" Chow said. "But ya know, it's really about the journey, right?"

     

     
