IBM

IBM Wants Its Quantum Supercomputers Running at 4,000-Plus Qubits by 2025 (engadget.com)

An anonymous reader shares a report: Forty years after it first began to dabble in quantum computing, IBM is ready to expand the technology out of the lab and into more practical applications -- like supercomputing! The company has already hit a number of development milestones since it released its previous quantum roadmap in 2020, including the 127-qubit Eagle processor that uses quantum circuits and the Qiskit Runtime API. IBM announced on Wednesday that it plans to further scale its quantum ambitions and has revised the 2020 roadmap with an even loftier goal of operating a 4,000-qubit system by 2025.

Before it sets about building the biggest quantum computer to date, IBM plans to release its 433-qubit Osprey chip later this year and migrate the Qiskit Runtime to the cloud in 2023, "bringing a serverless approach into the core quantum software stack," per Wednesday's release. Those products will be followed later that year by Condor, a quantum chip IBM is billing as "the world's first universal quantum processor with over 1,000 qubits." This rapid jump in qubit count will enable users to run increasingly long quantum circuits, while increasing the processing speed -- measured in CLOPS (circuit layer operations per second) -- from a maximum of 2,900 CLOPS to over 10,000. Then it's just a simple matter of quadrupling that capacity in the span of less than 24 months.
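A quick sanity check of the scaling implied by those roadmap figures, in plain Python. All numbers are taken from the summary above; this is only illustrative arithmetic, not an IBM calculation.

    # Scaling factors implied by the roadmap figures quoted above.
    eagle, osprey, condor_min, target_2025 = 127, 433, 1000, 4000

    print(f"Eagle -> Osprey:      {osprey / eagle:.1f}x qubits")            # ~3.4x
    print(f"Osprey -> Condor:     {condor_min / osprey:.1f}x qubits")       # ~2.3x (at least)
    print(f"Condor -> 2025 goal:  {target_2025 / condor_min:.1f}x qubits")  # ~4x (at most)

    clops_now, clops_goal = 2_900, 10_000
    print(f"CLOPS improvement:    {clops_goal / clops_now:.1f}x")           # ~3.4x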


Comments:
  • by AndrewZX ( 9173721 ) on Tuesday May 10, 2022 @11:48AM (#62519580)
    4000 qubits might sound like a lot, but it really isn't. Due to the noisy and unreliable nature of qubits, you need around 3000 of them just to implement the "checksums" (error correction) needed for reliable results. So most of those qubits go toward overhead and do not contribute to the actual computation. Meanwhile, light-based quantum computers do not have as bad a reliability problem and may scale up more quickly and cheaply, and be easier to program. IBM is not guaranteed a win even if they execute this strategy.
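    A rough sketch of the overhead arithmetic in the comment above; the 3,000-qubit error-correction figure is the commenter's assumption, not an IBM number.

        # Physical-vs-usable qubit arithmetic (illustrative only; the overhead
        # figure comes from the parent comment, not from IBM).
        physical_qubits = 4000
        error_correction_overhead = 3000
        usable_qubits = physical_qubits - error_correction_overhead

        print(usable_qubits)                    # 1000 usable qubits under that assumption
        print(physical_qubits / usable_qubits)  # 4.0 physical qubits per usable qubit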
    • They've announced plans to do something that's never been done before. Since it's never been done before, I'll assume their experience level is at least a match for yours or mine - and when they're done, they'll have considerably more experience than you and I combined. It doesn't matter if this project ever meets your expectations or mine; just that IBM learns how to do more.

      In trying something new, no one is guaranteed to win, even if they execute their strategy perfectly. IBM is a great example of th

    • by drolli ( 522659 )

      > Meanwhile, light based quantum computers do not have as bad a reliability problem and may scale up quicker, cheaper, and easier to program.

      That is somewhere between a gross oversimplification and a wrong statement. YMMV depending on the purpose of the quantum circuit.

    • Hell, it was almost 15 years ago that I read that OPUs (Optical Processing Units) would replace CPUs by 2025, and that their initial processing power would be equivalent to a 50 GHz CPU. Whatever happened to that? Can we just not work with photons like we need to?
      • by Jeremi ( 14640 )

        Can we just not work with photons like we need to?

        Indeed, we can not.

        (Or rather, we can, but the hardware to emit, receive, and process photons is still several orders of magnitude larger than modern on-chip silicon transistors, which means that any "optical chips" built any time soon are going to be either quite simple or extremely large and power-hungry, or both.)

  • A marketing firm.

    • I'm old enough to remember IBM putting out thick books every quarter that announced new products, retired products, and technical research directions. They used to be a technology company, but that changed with a few CEOs/chairmen who decided to milk the organization and weed out R&D. Now they're a shell of what they once were.

    • by mmell ( 832646 )
      The same marketing firm that invented the Blue Gene supercomputing system, virtualization, and the PC-XT.
      • Well, that's not to say that Watson has been a huge success either. [reddit.com]

        • by mmell ( 832646 )
          It may not have been a success - but it was new and previously unavailable. Nobody ever saw a computer play and win at Jeopardy before then.
          • And that's its epitaph.

          • It may not have been a success - but it was new and previously unavailable. Nobody ever saw a computer play and win at Jeopardy before then.

            It's also true that prior to Watson, no computer had ever lost to a human ... because no computer had ever played a human at Jeopardy. While Watson's answering capabilities were indeed impressive, the most important reason it beat Ken Jennings and Brad Rutter was its ability to hit the button far faster than any human. Since Watson and both humans knew the answers to most of the questions, the deciding factor was button timing.

      • Sadly, very much not the same firm. All of those people have moved on and been replaced by MBAs.
  • This quantum computing stuff sounds a lot like cryptocurrency and NFTs. Only 7 people really understand it all, and we can't figure out who those 7 are. And how will we know if we've found them? Seems to me like after 40 years, we could at least get a quantum computer to do some calculation that we could talk about. Say, something like 2 + 2 = ? How fast can it do that? Maybe AndrewZX knows. Explain please.
    • If I understand correctly, the quantum computer will take the 2+2 algorithm and give you all the possible answers to 2+2 at the same time, thanks to quantum superposition.

      • (according to IBM's 127-qubit Eagle processor, it's very likely to be 4)

        • by Jeremi ( 14640 )

          (according to IBM's 127-qubit Eagle processor, it's very likely to be 4)

          That's correct, although for some reason, all calculations resulting in numbers greater than 4 are reported as "a suffusion of yellow".

      • by gweihir ( 88907 )

        Pretty much. For most things (including "supercomputing"), QCs are completely worthless. Even these 4k-qubit chips are basically a lab demo only. It starts with the computations being exceptionally slow. Then you do not actually get 4k qubits; you get a lot less, more like 1k or below, due to error correction. This pretty much means that for the few things QCs can potentially do well, these chips are slower than my 30-year-old programmable pocket calculator.

    • by mmell ( 832646 )
      You can't solve class-NP problems with cryptocurrency.
    • by ceoyoyo ( 59147 )

      Neither quantum computing nor cryptocurrency and NFTs are very complicated. All three do seem to get heaps of bullshit piled on top by people who either don't know how they work or would prefer *you* didn't know how they work.

      Pretty much anything quantum is just linear algebra. There are lots of blog articles, papers, tutorials, simulators, etc. with whatever level of detail you could want.

      For example (couple random ones that came up in a Google search and looked okay):

      https://towardsdatascience.com... [towardsdatascience.com]
      and f
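      To make the "just linear algebra" point above concrete, here is a minimal one-qubit sketch in plain Python (no quantum SDK assumed): a qubit state is a two-component complex vector, a gate is a 2x2 unitary matrix, and measurement probabilities are squared amplitudes.

        import math

        # |0> as a length-2 complex vector.
        state = (1 + 0j, 0 + 0j)

        # The Hadamard gate as a 2x2 matrix.
        h = 1 / math.sqrt(2)
        H = ((h, h),
             (h, -h))

        # Applying a gate is just a matrix-vector multiply.
        def apply(gate, psi):
            return tuple(sum(gate[r][c] * psi[c] for c in range(2)) for r in range(2))

        state = apply(H, state)

        # Measurement probabilities are the squared magnitudes of the amplitudes.
        print([abs(a) ** 2 for a in state])   # ~[0.5, 0.5]: an equal superposition of |0> and |1>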

  • Your answer must:

    a) contain no technical jargon.
    b) contain no Quantum Computing terms.
    c) be easy for the layman, such as 'it can make you a sandwich.'


    -I am waiting.
    • It can theoretically break the encryption that is used to store your OnlyFans password.
    • by LeDopore ( 898286 ) on Tuesday May 10, 2022 @12:24PM (#62519718) Homepage Journal

      It could design superior genes that could help boost crop yields. It could also retroactively break nearly all the public-key encryption humanity has used to date. It (probably) cannot break all possible encryption schemes, and it (probably) cannot solve many of the computer science problems known to be the most difficult.

      It probably won't accomplish anything practical this decade, but it might next decade or later this century.

    • The most powerful types of quantum computer can break the cryptography that keeps web connections secure. IBM's computer probably isn't of this type. Instead it can perhaps solve some puzzles faster than ordinary computers, and it's possible that these puzzles are of a kind that will make businesses more money (e.g. better logistics).

      It's not certain that the less powerful quantum computers are faster than ordinary computers at all. The companies who produce them like to claim that they are, of course.
    • by mmell ( 832646 )
      It can solve certain problems in the class "NP". A quick example is factoring the product of two primes. I can take a pair of prime integers, multiply them together, and get a composite number with at least two known factors. That takes milliseconds. When I hand you that number, it'll take you a lot longer to factorize it than it took me to generate it. And as I increase the difficulty of generating that composite number (say, doubling my difficulty), I increase the difficulty of factorizing it exponentially. The solution t
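      The asymmetry described above (multiply in an instant, factor much more slowly) is easy to demonstrate with a toy example. A sketch in plain Python using trial division; the primes are small and purely illustrative, since real RSA moduli are hundreds of digits long.

        import time

        p, q = 999_983, 1_000_003     # two well-known primes around one million
        n = p * q                     # generating the composite: effectively instant

        def trial_division(n):
            d = 3                     # n is odd here, so even divisors can be skipped
            while d * d <= n:
                if n % d == 0:
                    return d, n // d
                d += 2
            return None

        start = time.perf_counter()
        print(trial_division(n))                        # (999983, 1000003)
        print(f"{time.perf_counter() - start:.3f} s")   # thousands of times slower than computing p * q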
    • This sounds fun! I would like to try your challenge. Let us pick one broadly discussed topic: encrypted data for privacy.

      In classical computing encryption works by having a math equation transform your data, where without knowing one of the numbers, 'a key', you cannot reverse that math to decrypt the data. Nothing prevents someone guessing that key - it is just a number. However due to the equation used, to hack encryption right now you could have a million million computers 'guessing' keys millions of

    • by ZiggyZiggyZig ( 5490070 ) on Tuesday May 10, 2022 @12:39PM (#62519772)

      Hmm. OK, let's go for the sandwich analogy.

      Imagine you're in Subway and you're not sure which sandwich ingredients you want. You have a specific taste in mind that isn't about having this or that ingredient in your sandwich; rather, you want a particular flavor and texture that can only be produced by specific combinations of ingredients (or the absence thereof). As you are a Sandwicher Scientist, you know how to describe how a given combination of ingredients gives rise to a specific flavor. However, from a known specific flavor, you can't work backwards to a specific list of ingredients (possibly because several combinations can give the same flavor, or the texture arises from mixing varied ingredients together into a paste, etc.).

      With classical sandwiching, you have one, or maybe several, Subway employees making sandwiches. They need to create all possible combinations one after the other until they find one that fits your specific taste according to your description. But as Subway tries to cater to more and more customers, they offer more and more possible ingredients, so trying all the combinations quickly becomes unmanageable, even with more employees, as adding a single ingredient to the menu doubles the number of sandwiches to try. Subway would need to double its workforce every time it adds an ingredient, so that you can still get your sandwich combination before you decide to go to McDonald's.

      This is a problem because you are hungry for the right sandwich, and you want it now, and Subway wants your money.

      With quantum sandwiching, each ingredient is turned into a quingredient. The Quandwich Machine has as many quingredients as Subway offers ingredients on its make-your-own-sandwich menu. When Subway adds a new ingredient to its menu, it only needs to add one extra quingredient in the Quandwich Machine to represent its menu in the Machine. The quingredients put together represent a superposition of all the possible combinations of ingredients, and you have provided an algorithm that translates every combination to a taste. So, it's already known what all possible sandwiches taste like. It's then just a matter of you providing a description of the specific taste you're looking for, and the Quandwich Machine will cater for you instantly.

      Writing this made me hungry, and I'll go make myself a sandwich.
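      The classical half of the analogy above, where every new ingredient doubles the number of sandwiches to try, is just subset counting. A tiny illustration in plain Python; the menu below is made up for the example.

        from itertools import combinations

        # Each ingredient is either in the sandwich or not, so a menu of n
        # ingredients allows 2**n possible sandwiches (counting the empty one).
        menu = ["bread", "ham", "cheese", "lettuce", "tomato", "onion", "pickles"]

        total = sum(1 for r in range(len(menu) + 1)
                      for _ in combinations(menu, r))
        print(total, 2 ** len(menu))            # 128 128

        for n in (10, 20, 30, 40):
            print(n, "ingredients ->", 2 ** n, "combinations to try one by one")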

    • Essentially: scientific & engineering progress.

      Quantum computers have the potential to solve some types of problems more efficiently and/or faster than classical computers can. In some cases, orders of magnitude better. Which may help to achieve scientific or engineering breakthroughs (from which we ALL profit), that -without quantum computing- would either have been impossible, much more costly, or much more time-consuming.

      How important this will become in the overall scheme of things - that remai

    • by gweihir ( 88907 )

      Simple: nothing. Ask again in 20 years or so, but do not expect much more.

  • IBM famously achieved 50 qubits in 2017. If we use 4,000 in 2025 as the other data point and model geometric growth akin to Moore's Law, we get an annual qubit growth rate of (4000/50) ^ (1/(2025-2017)) = 1.73x.

    An MIT Technology Review article [technologyreview.com] says 20 million qubits are required to break RSA. So 2025 + ln(20,000,000 / 4,000) / ln(1.73) ≈ 2041 is when a quantum computer would break RSA -- meaning credit card purchases relying on today's HTTPS would no longer be secure. Of course, by then
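    Reproducing that extrapolation in plain Python, with the same inputs as above (50 qubits in 2017, 4,000 planned for 2025, and the 20-million-qubit figure from the cited article); the smooth exponential growth is of course the big assumption.

        import math

        growth = (4000 / 50) ** (1 / (2025 - 2017))
        print(f"implied growth rate: {growth:.2f}x per year")            # ~1.73x

        years = math.log(20_000_000 / 4000) / math.log(growth)
        print(f"RSA-relevant scale reached around {2025 + years:.0f}")   # ~2041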

    • by mmell ( 832646 )

      Look out - revolutionary advances have a funny way of appearing out of nowhere, even where science is concerned.

      Side question- can we rename "Moore's Law"? Something like "Moore's theory", or "the Moore postulate"? "Law" implies something absolute to me - the law of gravity, the laws of thermodynamics, the laws of motion, etc. Moore himself was spitballing when he first expressed his opinion that computing power would double every decade. He seems to have (more or less) nailed it, but I've never seen a

      • by ceoyoyo ( 59147 )

        Laws are observed relationships. Newton's Laws, for example, are just that.

        Sometimes those relationships turn out to have some pretty deep meaning behind them, like F=ma, but not always.

        You should read Moore's article, and a few of the papers on the topic. There is quite a bit of analysis. Also, Moore's Law is not what you think it is (beyond your mistake with the timeline).

    • IBM famously achieved 50 qubits in 2017. If we use 4,000 in 2025 as the other data point and model geometric growth akin to Moore's Law, we get an annual qubit growth rate of (4000/50) ^ (1/(2025-2017)) = 1.73x.

      What does qubit mean? What is the difference between 50 qubits and 4000 qubits in terms of computing power?

      An MIT Technology Review article says 20 million qubits are required to break RSA

      What is the difference between 4000 and 20 million qubits?

      • Similar to how QLC SSDs store 4 bits per cell, one qubit, which is analog in nature, can typically store the equivalent of four conventional bits of information. Or, because a qubit has both phase and amplitude, perhaps 4 conventional bits worth for each of those for a total of 8 conventional bits worth.

        Now the mind-blowing part enters when entanglement comes into the picture. IF (and that's a big "if") all N qubits of an N-bit quantum computer are all entangled together, then the quantum computer can store

        • Now the mind-blowing part enters when entanglement comes into the picture. IF (and that's a big "if") all N qubits of an N-bit quantum computer are all entangled together, then the quantum computer can store and compute upon 2^N of those 4-8 bit values simultaneously. That means one can explore a search space of 2^N (like, say, an N-bit encryption key) -- running a validation test on every possible value simultaneously -- in a few seconds. But in real-life quantum computers, often only a subset of the N qubits can be entangled together simultaneously. And as another Slashdotter noted in this thread, a number of qubits have to be devoted to error correction. Plus algorithm overhead of course.

          This is what I don't understand. How can people talk about qubits and do math and make predictions of when things will happen if what a qubit actually means depends entirely upon undisclosed context and architecture?

          • It's not unlike the "bit wars" for videogame consoles 30 years ago. Being able to claim "32 bits" isn't meaningless; it indicates some rough magnitude of capability.

            However, to address the very valid concern you raise, IBM has introduced the metric Quantum Volume [wikipedia.org].
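            For a sense of scale on the 2^N point discussed above: an N-qubit state is described by 2^N complex amplitudes, which is also why exactly simulating more than a few dozen qubits on classical hardware is hopeless. A small sketch; the 16 bytes per amplitude is an assumption (double-precision real and imaginary parts).

              # Memory needed to hold a full N-qubit state vector classically.
              BYTES_PER_AMPLITUDE = 16
              for n in (10, 30, 50, 127):
                  gib = (2 ** n) * BYTES_PER_AMPLITUDE / 2 ** 30
                  print(f"{n:>3} qubits: 2^{n} amplitudes, ~{gib:.3g} GiB")
              # 10 qubits fit in kilobytes, 30 need ~16 GiB, ~50 already outgrow
              # the biggest classical machines, and 127 (Eagle) is far beyond any RAM.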

    • by gweihir ( 88907 )

      Bullshit. You can make RSA keys longer, but you cannot factor larger RSA moduli with a given QC. That MIT TR article is for RSA-2048. Also, there is absolutely no indication that QCs will even scale to that 20M-qubit size.

  • I want mine running at 10,000+ qubits by 2024!

  • How many furlongs per fortnight is that?
  • Is it too much to ask for useful metrics to at least describe in some meaningful way the scale of the improvement?

    It used to be that each additional qubit doubled your computing power. Then came "topological" and various fanout schemes for error correction. People appear to have retained "qubit" as a marketing term while its real-world meaning has changed radically for the worse, in ways that are entirely dependent on the specific architecture.

    • by gweihir ( 88907 )

      Well, useful metrics would probably show that this whole thing is doomed to failure. So they are not used.

  • Imagine a Beowulf cluster of these things.

  • When IBM reaches the 100,000 qubit barrier, no bitcoin will be left to mine with your petite mining rigs.
    Just like Watson won Jeopardy, IBM will mine all remaining bitcoin with this monster.
