10

Sorry if this is a dumb question.

With conventional processors we increase cores, clock speed and IPC etc. With quantum computers the race seems to be to have the most qubits, seemingly in conflict with how we improve conventional processors. I think I am confusing instruction sets (x86) with bits?

I'm writing an essay on the topic and am trying to compare the progression of conventional computers to that of quantum computers.

I understand that qubits are more akin to memory or cache than processing speed, with an increase in qubits allowing larger inputs to be processed though at the same speed. Could a good analogy be to compare qubits to transistors or RAM?

Thanks for any help. Not finding it easy to wrap my head around this topic.

ArduinoBen
  • 201
  • 2
  • 3

7 Answers

9

"Why do quantum computers have more qubits than classical computers have bits?"

They do not. I'm typing this on a classical laptop with 500 GB of storage, i.e. 8 × 500 billion = 4 trillion bits. No quantum computer has anywhere near that many qubits.
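For scale, here is the arithmetic behind that claim (a quick sketch; the 500 GB drive size is just the example from this answer):

```python
# 500 GB of classical storage, expressed in bits.
gigabytes = 500
bits = gigabytes * 10**9 * 8  # 1 GB = 10**9 bytes, 1 byte = 8 bits
print(f"{bits:,} bits")  # 4,000,000,000,000 bits
```

Current quantum processors have qubit counts in the hundreds to low thousands, many orders of magnitude below this.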

"Could a good analogy be to compare qubits to transistors or RAM?"

RAM. Qubits are analogous to bits. RAM contains rapid access bits. Transistors are used for things like switches and logic gates, analogous to devices that would be used to do things to qubits, but not analogous to qubits themselves.

7

It is generally hard to compare classical and quantum computers, as they are based on different computational paradigms.

Besides the number of qubits, it is also important to track inter-qubit connectivity, gate fidelities, relaxation and dephasing times, etc. These measures have no analog in a classical computer.

Also, it is hard to compare the "clock speed" of classical and quantum processors. Quantum processors work in the kHz or low MHz range, while classical ones operate in the GHz range. This is not a problem, since the power of quantum processors lies in the lower complexity (not in all cases, of course!) of the employed algorithms in comparison with classical processors. So you can, for example, take several problems (algorithms) and compare their complexities in the classical and quantum cases; however, be aware of the assumptions under which the complexities are derived.
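As a rough illustration of such a complexity comparison (a sketch, not a benchmark — real query counts depend on the oracle and error rates): unstructured search over N items takes on the order of N classical queries, while Grover's algorithm needs only about (π/4)·√N.

```python
import math

def classical_search_queries(n_items):
    # Unstructured search: in the worst case, check every item.
    return n_items

def grover_queries(n_items):
    # Grover's algorithm needs about (pi/4) * sqrt(N) oracle queries.
    return math.ceil((math.pi / 4) * math.sqrt(n_items))

for n in (10**3, 10**6, 10**9):
    print(f"N={n}: classical ~{classical_search_queries(n)}, quantum ~{grover_queries(n)}")
```

Even a slow quantum processor wins for large N: the gap grows as √N, which is why algorithmic complexity, not clock speed, is the meaningful comparison.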

Concerning memory, there is an issue with the quantum analog of RAM, so-called qRAM. Currently such memory is not available (with some exceptions). The non-existence of qRAM somewhat hinders the theoretically reachable power of quantum computers. Once qRAM is available, qubits will be akin to registers in a classical processor rather than to memory. So a comparison of qubit counts with classical RAM sizes also says nothing.

If you want to compare the development of classical and quantum computers, I would advise showing how some crucial parameters evolved over time. For classical processors, the number of transistors or the clock speed would be appropriate. For quantum computers, relaxation and dephasing times, the number of qubits, or gate fidelities would be suitable. However, as with any other technology, you will see exponential improvement in these parameters. The question is whether this indicates something new; any new technology evolves rapidly once some turning point is reached.

It also makes sense to compare the different technologies used to physically implement qubits, e.g. trapped ions versus superconducting qubits.

Martin Vesely
  • 15,398
  • 4
  • 32
  • 75
6

To add onto Andreas' answer, there are currently three measures that are taken into account to assess the quality of a quantum computer:

  1. The number of qubits, which upper-bounds (the logarithm of) its quantum volume.
  2. The quantum volume, which measures the size of the largest circuit the machine can run reliably, and hence the quality of the gates applied to the qubits.
  3. The CLOPS (Circuit Layer Operations Per Second), which measures how many circuit layers can be executed per second on the system.

Essentially, the goal is to take into account both the quality of the results and the speed at which one can get them. It is quite hard to compare two quantum computers that differ in two of these quantities.
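To make the qubits/quantum-volume relationship concrete (a sketch of IBM's definition, where QV = 2^n for the largest n×n random circuit the device passes):

```python
import math

def largest_reliable_square_circuit(quantum_volume):
    # IBM defines quantum volume as QV = 2**n, where n is both the width
    # (qubits) and depth (layers) of the largest "square" random circuit
    # the device executes reliably. So n qubits cap QV at 2**n.
    return int(math.log2(quantum_volume))

print(largest_reliable_square_circuit(128))  # 7: a QV-128 device handles 7x7 circuits
```

This is why raw qubit count alone is only an upper bound: a 100-qubit machine with noisy gates can have a far smaller quantum volume than a clean 20-qubit one.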

Concerning your essay about breaking RSA using Shor's algorithm, you'll probably find the quantum volume more interesting than CLOPS. The question of applying Shor's algorithm to numbers whose factors are not already known has been discussed on SE, for instance here and here.

Also, note that you may find this answer interesting; it discusses the problems quantum computers have to deal with and the progress being made on them.

Tristan Nemoz
  • 8,694
  • 3
  • 11
  • 39
5

I don't know how to compare a quantum computer to a classical computer, but maybe this helps:

In a quantum computer, we act with gates onto qubits. Qubits 'store' our information, gates 'change' it.

Indeed, it does seem like companies are racing for higher qubit counts. But in reality, the number of gates we can apply in sequence (the 'circuit depth') is often far more important.

In theory we can add a lot of gates to our circuit. But each gate only works around 99% of the time (the 'gate fidelity'); the other 1% of the time it produces a random result ('noise').

So too many gates just produces gibberish.
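The compounding is easy to see if we model each gate as succeeding independently (a simplification — real noise is more structured, and the 99% figure is just the example above):

```python
def circuit_success_probability(gate_fidelity, n_gates):
    # If each gate independently succeeds with probability `gate_fidelity`,
    # the whole circuit only gives a clean result if every gate succeeds.
    return gate_fidelity ** n_gates

for depth in (10, 100, 1000):
    p = circuit_success_probability(0.99, depth)
    print(f"{depth} gates at 99% fidelity: ~{p:.1%} clean runs")
```

At 99% fidelity, a 1000-gate circuit almost never runs cleanly, which is why fidelity improvements matter as much as qubit counts.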

While the race for more qubits is going on, researchers are also working hard to increase gate fidelity. I think fidelity is just less flashy than qubit counts.

Andreas Burger
  • 311
  • 2
  • 11
3

As others have pointed out, in addition to the number of qubits, researchers are very focused on improving qubit stability and gate fidelity. However, the raw number of qubits is special, because those other desirable properties exhibit diminishing returns. E.g., a fidelity of 99.99% may be "good enough" for most cases, so once we reach the "good enough" point, there's not much point in further research to get another 9. (As a commenter pointed out, the actual "good enough" point may be much better than 99.99%; the point I'm trying to illustrate is that a "good enough" point does exist, beyond which further increasing fidelity yields little to no performance improvement.)

The number of qubits, on the other hand, has essentially infinite demand. The state space you can manipulate in your quantum algorithm scales exponentially with the number of qubits. Just as an example, one of the holy grails of quantum computing is many-body nuclear physics calculations. The more qubits I have available, the more complex the interactions I can model, and the demand is infinite for all practical purposes. E.g., think of a full quantum simulation of a fusion reactor. If you wanted to simulate it fully, without any approximation, the number of qubits you need scales with the number of particles in your plasma, which means "billions of billions" to channel Sagan.
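The exponential scaling also shows up in reverse: simulating n qubits classically requires storing 2^n complex amplitudes. A quick sketch of the memory cost (assuming 16 bytes per complex double, a common choice):

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    # A full n-qubit state has 2**n complex amplitudes; a complex double
    # (two 64-bit floats) takes 16 bytes.
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 30, 50):
    print(f"{n} qubits: {statevector_bytes(n):,} bytes")
```

30 qubits already need ~16 GiB, and 50 qubits ~16 PiB: each added qubit doubles the classical cost, which is exactly why demand for qubits never saturates.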

thegreatemu
  • 191
  • 4
3

I don't know why everyone is underexplaining - I think a little more explanation of the fundamentals is necessary!

As you probably know, a qubit is a two-level quantum system: for example, an electron or some other small particle that's either spin up or spin down (or a superposition of up and down). It can also take other physical forms, but they all replicate this spin-up/spin-down behavior exactly, so you can just think of it like that. Spin down is called a 0 and spin up is called a 1.

Quantum circuits (otherwise known as quantum computers) are totally different from regular computers: they have one input and one output and don't run for very long at all (on the order of a millisecond or microsecond). The input to a quantum circuit is a bunch of qubits, typically all in the 0 state. The output is the same qubits, but modified and in a superposition (so they probably won't all be 0 anymore: they could be something like a superposition of 01101 and 11000).

What a quantum computer does is take those inputted 0s and make them interact (i.e. use "entanglement" to put them into a "superposition") in a way that humans designed, so that the final result out the other end has a high probability of being measured as the correct answer. For example, the input might be 000000 and the output might be a superposition of 100011, 110000, and 100111, with a high probability of 110000. For this 6-qubit example, a picture of the full quantum computer would be: 000000 -> (interaction between the 6 qubits) -> superposition of 100011, 110000, and 100111.
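The smallest real instance of this picture can be simulated with plain linear algebra (a NumPy sketch, not real hardware): two qubits start as 00, a Hadamard gate puts the first into superposition, and a CNOT gate entangles them, giving a superposition of 00 and 11.

```python
import numpy as np

# Single-qubit Hadamard gate and the two-qubit CNOT gate, as matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Both qubits start in the 0 state, i.e. the input 00.
state = np.zeros(4)
state[0] = 1.0

# Hadamard on the first qubit creates superposition; CNOT entangles them.
state = CNOT @ np.kron(H, I) @ state

# Probabilities of measuring 00, 01, 10, 11.
probs = np.abs(state) ** 2
print(probs)  # [0.5, 0, 0, 0.5]: a superposition of 00 and 11
```

Measuring this state gives 00 half the time and 11 half the time, never 01 or 10 — that correlation between the two qubits is the entanglement the answer describes.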

In order to do anything interesting we need enough qubits to describe the problem we're trying to solve. Right now qubits are very unstable, and we don't have enough coherent qubits to describe any interesting, non-trivial problems. We also need the interactions in the circuit to succeed, which again is only possible while the qubits stay coherent. Coherence is the single biggest issue making quantum computers difficult to build.

To answer your question of "Why are there more qubits than bits?": actually, comparing the number of bits needed to describe a problem, most people would say you need more bits than qubits, thanks to the power of superposition. But even this is not quite right: it really depends on the implementation of the quantum algorithm and on what the scientists or algorithm designers take those qubits to mean in the context of the problem (how will they interpret the qubits as an answer?). I hope it's a little clearer now why it makes little sense to compare these two technologies directly. It's best to actually understand what a quantum computer is, and I hope I helped.

Lucas Mumbo
  • 131
  • 3
2

An ordinary computer these days uses 32 or 64 bits for calculations, and can store the results for successive calculations. So a calculation can, in fact, involve many more than 64 bits.

The quantum computer is based on the principle that calculations are performed with correlated bits, called qubits. I will not explain how such correlation is obtained (I don't have that knowledge myself), but it is practically impossible to store qubits. And I suppose it is impossible (practically, and even in theory) to correlate a stored result with a new calculation.

So the only viable solution for now is to use as many qubits as possible in the processor itself.