Understanding Quantum Computers

Article by: Maurizio Di Paolo Emilio

Quantum computers have been an active research topic in recent years, with several companies starting to work on specific projects.

This is the third in a series of articles that take us into the realm of quantum physics. You can read Part 1 here, and Part 2 here.

Quantum computers have some unique properties that make them truly powerful. Advances in quantum technology require a new way of thinking, and experts will need to collaborate across multidisciplinary domains of knowledge, science, and technology. The discovery and design of new materials is one of the first applications where quantum technology could have a significant impact: having quantum computers available on a commercial scale will allow new materials to be simulated and will let us design materials and chemicals with specific properties. Reaching that point will require a commercial-scale quantum computer with millions of qubits. The qubit is a breakthrough concept, and we will talk about it in this article.

Quantum Computing

In a classical computer, each bit (the information that travels on the system bus) is represented by a zero or a one, a binary system. In quantum computing, the element to be considered is the qubit, which can be 0, 1, or both zero and one at the same time. In practice, the qubit is the information that travels on the new quantum system bus. This is possible thanks to the superposition of quantum states, which enables the parallel (rather than sequential) execution of operations and brings with it an exponential increase in calculation speed.

A quantum computer is similar to a typical computer, but it is made up of quantum circuits formed by elementary quantum logic gates. Quantum computers have the potential to address problems that conventional computing solutions cannot handle. The technology on which these computers are based is quantum physics; since a quantum bit (or qubit) can be in multiple states at the same time, it can be used to compute all possible states simultaneously, thus greatly accelerating the resolution of complex problems.

The quantum computer originated with the work of Richard Feynman, who in a 1981 lecture at MIT showed that standard computers are unable to efficiently simulate the evolution of quantum systems. He therefore proposed a basic model of a quantum computer and outlined how it could exponentially exceed the processing capacity of standard computers. However, it took more than ten years before an algorithm appeared that changed the vision of quantum computing: Shor's algorithm. In 1994, Peter Shor developed an algorithm to efficiently factor large integers, exponentially faster than the best classical algorithm running on traditional computers. The classical approach, in fact, would take millions of years to factor numbers consisting of 300 digits. The ability of quantum computers to solve such problems in hours rather than millions of years has sent technology in a new direction.

In 1996, Lov Grover invented a quantum database search algorithm offering a quadratic speedup over the best classical search. In 1997, the first scaled-down version of a quantum computer was demonstrated, and in 1998 a 2-qubit quantum computer was built, but the sector really took off only in 2007, when Canada's D-Wave introduced its 28-qubit quantum computer. About twenty years after those first demonstrations, in 2017, IBM unveiled the first commercial quantum computer, taking the competition to the next level. Some quantum computers can be freely used by anyone who wants to program them: IBM, Rigetti, and Google all offer free access, with open-source tools, to real quantum processing systems. Some engineers are also currently working on a quantum communication system, an important step toward building a quantum version of the Internet.

Let's see, in detail, what qubits are. We have seen that qubits represent the elementary units of information (the "bits") of quantum computing. But how are qubits physically created? How can electronics efficiently control qubits in a quantum system? Quantum computers rest on a few key concepts. The first is quantum superposition, the idea behind Schrödinger's famous cat paradox. Unlike classic bits, which, as we have said, can only have two states, 1 or 0, qubits can be a combination of both. In the paradox, a cat is placed in a steel box along with a Geiger counter, a vial of poison, a hammer, and a radioactive substance. When the radioactive substance decays, the Geiger counter detects it and triggers the hammer, which releases the poison and kills the cat. Radioactive decay is a random process, and we do not know when it will happen: the atom exists in a superposition, both decayed and non-decayed at the same time. In other words, until the box is opened, the state of the cat is completely unknown, because it is tied to the decay of the atom. The cat can thus be considered both alive and dead at the same time until it is observed.

The second concept is entanglement, which binds quantum particles together in both time and space. In a traditional computer, each bit is independent of the state of the next; in quantum processing, qubits are strongly intertwined. The third concept is related to probability amplitudes, which we will discuss later in this series of articles. Mathematical operations on superposed and entangled qubits can act on all of the qubits simultaneously in a single processing step. The quantum computer places a qubit in a certain state and then entangles it with a neighboring one. The rules of quantum physics then let the states and connections of the qubits evolve over time. Eventually, the qubits are examined to get an answer; the main task is to find the correct answer among billions of wrong ones. Unlike the conventional bit, which can be in the zero or the one state, a qubit can exist as a simultaneous "superposition" of both states. Furthermore, in a quantum processor, multiple qubits in superposition are connected to each other so that they form a single entangled group. Entanglement is the basis of the incredible computing power offered by quantum computers and the source of their potential to solve complex tasks beyond the capabilities of traditional supercomputers. In an interview I had with Intel, they used an interesting analogy. Let's explain it!

An easy way to understand quantum computing is to think of a bit as a coin: it can be either in the heads state or in the tails state. Now let's imagine that the coin is spinning. During the rotation, the coin is simultaneously heads and tails; that is, it is in a superposition of the two states. This is the concept behind quantum bits, or qubits. If we put two qubits together and entangle them, we obtain four states at the same time; hence, two entangled qubits represent a combination of four states simultaneously. More generally, n qubits represent 2^n states, so the computing power of a quantum computer grows exponentially with the number of qubits. In theory, if we had 50 of these entangled qubits, we could represent more states than any classical supercomputer can handle. If we had 300 entangled qubits at our disposal, we could represent more states at the same time than there are atoms in the universe. Qubits are special, but fragile: noise and observation cause a loss of information. In reality, it would take millions of high-quality qubits to make a commercial-sized quantum computer; in other words, we need a scalable quantum system for it to be usable on a practical level. Intel and other companies such as IBM are working to advance quantum computer research by addressing several issues related to qubit control.
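As a quick back-of-the-envelope check of these numbers (a Python sketch added here for illustration, not part of the original interview):

def n_states(n):
    """Number of basis states (amplitudes) an n-qubit register can hold."""
    return 2 ** n

print(n_states(3))              # 8 -> same count as the 8 values of a classical 3-bit register
print(n_states(50))             # ~1.1e15 amplitudes; storing them classically would need petabytes
print(n_states(300) > 10 ** 80) # True: more basis states than the ~10^80 atoms in the observable universe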

Let’s dig deeper into quantum mechanics and apply it to computers. We will then come back to the concept of control.

Because quantum phenomena are discrete, it is not possible to describe them with a classical deterministic theory, only with a probabilistic and statistical one. Quantum mechanics provides the probability of measuring a given value, which can be interpreted as follows: given infinitely many identical systems and performing the same measurement on all of them, the distribution of the obtained values is given by the square modulus of the wave function describing the system. In a quantum computer, the fundamental unit of information is the quantum bit, or qubit. To describe this new concept, we use a mathematical notation known as Dirac notation. The state of a classic bit is described by the values 0 and 1; similarly, for qubits, the vectors |0> and |1> are used. The difference between bits and qubits is that a qubit can also be in a superposition of states, as mentioned before. While a standard bit corresponds to two precise physical states, 0 and 1, in the case of a qubit it is not possible to measure its quantum state exactly; we can only associate a probability with each outcome. Formally, a quantum register of n qubits is an element of the 2^n-dimensional Hilbert space C^(2^n), with a computational basis formed by the 2^n basis states of n qubits. Let's analyze the case with two qubits. Similarly to what happens with a single qubit, we can construct the computational basis of the state space from the vectors |00>, |01>, |10>, |11>. A quantum register with two qubits is a superposition of the form:

|ψ> = α00 |00> + α01 |01> + α10 |10> + α11 |11>

with the normalization condition |α00|^2 + |α01|^2 + |α10|^2 + |α11|^2 = 1 on the amplitudes of the coefficients. For a single qubit, consider the state given by the following formula:

|ψ> = α |0> + β |1>

where α and β are complex coefficients, while |0> and |1> are the qubit basis vectors.

We can associate a probability equal to |α|^2 with the state |0> and equal to |β|^2 with the state |1>; α and β are therefore probability amplitudes. An interesting visualization of the qubit can be obtained through a geometric interpretation that associates the states of the qubit with points on the surface of a sphere of unit radius. The south pole of the sphere corresponds to the state |1> and the north pole to the state |0>. This sphere is called the Bloch sphere.
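As an illustration (a minimal Python sketch assuming NumPy is available; the amplitude values are arbitrary), here is how a single-qubit state maps to measurement probabilities and to a point on the Bloch sphere, using the standard convention |ψ> = cos(θ/2)|0> + e^(iφ) sin(θ/2)|1>:

import numpy as np

# Illustrative single-qubit state alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)

p0, p1 = abs(alpha) ** 2, abs(beta) ** 2  # probabilities of measuring |0> and |1>
theta = 2 * np.arccos(abs(alpha))         # polar angle: 0 at the north pole (|0>), pi at the south pole (|1>)
phi = np.angle(beta) - np.angle(alpha)    # relative phase (azimuthal angle)

print(p0, p1)      # 0.5 0.5
print(theta, phi)  # pi/2 pi/2 -> a point on the equator of the Bloch sphere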

Bloch sphere (Source: Wikipedia)

With a 3-bit register, we can represent up to 8 different states (the numbers from 0 to 7), but only one of them at a time. A quantum register composed of 3 qubits, on the other hand, can hold all 8 values simultaneously thanks to a coherent quantum superposition of states, which offers high computing power. The catch is that we can only ever observe one state per measurement, even though we can manipulate all of the superposed qubits at once before reading out the result. Another problem is how to manipulate the qubits without disturbing their state; to address it, we can draw on ideas from Shannon's information theory and introduce error-correction techniques.
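The following minimal Python sketch (assuming NumPy) illustrates the point: a 3-qubit register holds 2^3 = 8 amplitudes at once, yet a measurement returns only one of the 8 basis states, sampled according to the Born rule:

import numpy as np

n = 3
state = np.ones(2 ** n, dtype=complex) / np.sqrt(2 ** n)  # equal superposition of all 8 basis states
probs = np.abs(state) ** 2                                # Born rule: probability of each outcome
outcome = np.random.choice(2 ** n, p=probs)               # a single measurement yields one result
print(format(int(outcome), "03b"))                        # e.g. '101' -- the other 7 states are not observed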

Superposition and entanglement are two key concepts of quantum mechanical theory, and both contribute to the high computational capacity of quantum computers. According to the superposition principle, an electron immersed in a magnetic field can have its spin aligned with the field (in which case the electron is said to be in a spin-up state) or opposite to the field (a spin-down state). Under the quantum laws, a particle can also be in a superposition state, behaving as if it were in both the spin-up and the spin-down state. With entanglement, also known as quantum correlation, particles that have interacted in the past maintain a connection with each other (provided they remain in a completely isolated system). By measuring the spin of one particle, we automatically know the spin of the other: if the first is spin-up, the second will be spin-down, regardless of the distance between them. In quantum computing, this allows quantum states to be transferred from one end of the system to the other (and, in principle, from one end of the world to the other) in a process referred to as quantum teleportation, although a classical communication channel is still required, so no usable information travels faster than light.
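As a rough illustration of this correlation (a minimal NumPy sketch, not taken from the article), we can prepare two qubits in the entangled state (|01> + |10>)/√2 and sample joint measurements; the two results always come out opposite, mirroring the spin-up/spin-down pairing described above:

import numpy as np

# Entangled two-qubit state (|01> + |10>)/sqrt(2); amplitudes ordered as |00>, |01>, |10>, |11>
state = np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2)
probs = np.abs(state) ** 2

for _ in range(5):
    outcome = np.random.choice(4, p=probs)  # joint measurement of both qubits
    q0, q1 = outcome >> 1, outcome & 1      # split the result into the two individual qubits
    print(q0, q1)                           # always opposite: (0, 1) or (1, 0)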

The industrial sector aims to build quantum computers with millions of qubits. Several technologies can be used to control qubits. The first is based on trapped ions, in which the excitation states of a metal ion stimulated with a laser are exploited; these excited states can represent the zero and the one of the qubit. This technology is very similar to that of the atomic clock work recognized by the Nobel Prize in Physics in 2012. The second technology is based on superconducting qubits, in which small rings of a superconducting metal are used to create an artificial atom whose state represents the zero and the one of the qubit. The third technology is that of silicon quantum dots, or spin qubits: by controlling the spin of an electron, this technique defines the zero and one states of the qubit. An analogy with the transistor helps to understand the latter technology. The transistor is essentially a switch: when a voltage or electrical potential is applied, a current of electrons flows through the device. With the spin-qubit technology, instead of having a stream of electrons flowing through the device, a single electron is trapped, creating a single-electron transistor. By putting many of these single-electron transistors together, we can create a network of electrons, and by controlling the potential of individual transistors, we can control the interaction between two adjacent electrons. By placing a single-electron transistor in a magnetic field, we obtain two energy states that can be used as a qubit, allowing us to control a single electron. Spin qubits are made in the same way as transistors. At this point, to improve qubit technology, the same challenges posed by transistors have to be solved: variability in size, gate-oxide defects, and voltage variability.

One of the challenges of quantum computing is the control of qubits. Current qubits are controlled by several racks of electronics connected, via complex wiring, to qubits that sit inside a cryogenic refrigerator shielding them from thermal and electrical noise. As the number of qubits increases, millions of wiring harnesses would have to be created, resulting in an overly complicated hardware situation. The goal, therefore, is to reduce the wiring. Future control chips will improve the scalability of quantum computers toward thousands, or even millions, of qubits by reducing the complexity of the quantum system's interconnections, one of the critical barriers to making quantum computers viable.

A quantum computer does not work like a standard computer: instead of operating on binary arithmetic, it manipulates the probability amplitudes of quantum wave functions (which we discussed earlier) and then samples the resulting probability distributions. The qubit control chip sends microcode to the control electronics, translating the logical operations of the quantum algorithm into the physical operations that must actually be carried out. The microcode tells the control electronics which pulses to send to the qubits and at which instants in time. The software, running on a classical processor, loads and executes the quantum program, or algorithm, and sends the sequences of quantum instructions to the qubit control processor for execution. The program code, which includes both standard and quantum instructions, is generated by a quantum compiler, which determines how to map and schedule quantum operations on the qubits, knowing how the qubits are connected to each other and what their properties are. Quantum compilers have a challenging task, essentially choreographing a qubit dance that positions and moves the qubits to the right place at the right time. The compiled program must also fight against time, because qubits have a very short life, usually fractions of a second, while operations require significant and often variable times.
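As a purely illustrative sketch of this flow (hypothetical names and timings, not any vendor's actual toolchain), a tiny Python "compiler" might turn a gate-level program into timed pulse records for the control electronics:

# Hypothetical gate-level program: (gate name, target qubits)
program = [("H", (0,)), ("CNOT", (0, 1)), ("MEASURE", (0, 1))]

# Hypothetical pulse durations in nanoseconds for each gate type
durations = {"H": 20, "CNOT": 200, "MEASURE": 1000}

def compile_to_pulses(program):
    """Turn logical gates into (start_time_ns, target_qubits, operation) pulse records."""
    schedule, t = [], 0
    for gate, targets in program:
        schedule.append((t, targets, gate))  # which pulse to send, to which qubits, and when
        t += durations[gate]                 # the next pulse starts after this one ends
    return schedule

for pulse in compile_to_pulses(program):
    print(pulse)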

Error correction is another interesting topic that requires extensive work and is currently being researched. It is critical in most quantum computing projects because it helps preserve the fragile quantum states on which computation depends. The operations needed to correct errors are very complex, as they must leave the quantum information unaltered. One way to improve fault tolerance is to delegate part of the computation to a classical CPU; in fact, this hybrid (quantum and classical) approach is needed throughout the computational stack. Let's go deeper into the electronics.
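As a loose classical analogy for the idea (real quantum error correction relies on syndrome measurements over entangled qubits, since quantum states cannot be copied), a simple repetition code in Python protects one logical bit by majority vote:

import random

def encode(bit):
    """Classical repetition code: copy one logical bit onto three physical bits."""
    return [bit, bit, bit]

def add_noise(bits, flip_probability=0.1):
    """Each physical bit may flip independently (a crude stand-in for qubit noise)."""
    return [b ^ (random.random() < flip_probability) for b in bits]

def decode(bits):
    """Majority vote recovers the logical bit as long as at most one physical bit flipped."""
    return int(sum(bits) >= 2)

noisy = add_noise(encode(1))
print(noisy, "->", decode(noisy))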

As with standard computers, a quantum computer is composed of quantum circuits formed by elementary quantum logic gates. Here's an example: consider a one-bit logic gate, the NOT gate, which implements logical negation as defined by a truth table in which 1 → 0 and 0 → 1 (a logical zero turns into a logical one and vice versa). Correspondingly, |0> turns into |1> and |1> into |0>. The operation that implements this transformation is linear, which is a general property of quantum mechanics confirmed at the experimental level. The matrix corresponding to the NOT, called X for historical reasons, is defined by:

X = [ 0  1 ]
    [ 1  0 ]

It acts on any quantum state α |0> + β |1> (with the normalization condition |α|^2 + |β|^2 = 1), swapping the two amplitudes. In addition to the NOT, two other important operations are represented by the Z matrix:

Z = [ 1   0 ]
    [ 0  -1 ]

which acts only on the |1> component, changing its sign, and the Hadamard gate:

H = (1/√2) [ 1   1 ]
           [ 1  -1 ]

The latter operation is used very often in the definition of quantum circuits. The effect of H can be thought of as a NOT that is only half executed, so that the resulting state is not definite but a coherent superposition of the two basis states |0> and |1>. By contrast, modern digital computers encode information through voltages or currents applied to tiny transistors inside integrated circuits, which act as digital or analog elements. Each transistor is addressed by a bus that defines a state of 0 (low voltage) or 1 (high voltage).
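For illustration, the three gates above can be written as matrices and applied to state vectors in a few lines of Python (a sketch assuming NumPy, not part of the original article):

import numpy as np

ket0 = np.array([1, 0], dtype=complex)                       # the state |0>
X = np.array([[0, 1], [1, 0]], dtype=complex)                # NOT gate
Z = np.array([[1, 0], [0, -1]], dtype=complex)               # sign flip on the |1> component
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

print(X @ ket0)        # [0, 1] -> the state |1>
print(H @ ket0)        # equal superposition (|0> + |1>)/sqrt(2)
print(H @ (H @ ket0))  # back to |0>: the Hadamard is its own inverse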

Superconducting QUantum Interference Device (SQUID). B is the magnetic field, V the voltage, and I the induced current. (Source)

Quantum computers have several similarities with the architecture presented above. In this image, we can observe a superconducting qubit (also called a SQUID, Superconducting QUantum Interference Device), which is the basic element of a quantum computer, a quantum "transistor". The term "interference" refers to the fact that the electrons behave as waves in the quantum regime, and their interference patterns give rise to quantum effects. In this case, the basic material is niobium rather than silicon, as in classical transistors. The properties of this material allow the electrons to behave as qubits: when the metal is cooled, it becomes a superconductor and begins to show effects typical of quantum mechanics. The superconducting structure of the qubit encodes two states as tiny magnetic fields pointing in opposite directions. By means of quantum mechanics, we can control these states, written as +1 and -1 or as |ψ> = α |0> + β |1>. Using elements known as superconducting loop couplers, it is possible to create a multi-qubit processor. A programmable quantum device can be designed by putting together many of these elements, qubits and couplers. To control the operation of the qubits, it is important to have a switching structure composed of Josephson junctions capable of addressing each individual qubit (by routing pulses of magnetic information to appropriate points on the chip) and storing the information in a magnetic memory element local to each device.

The Josephson effect is the generation of a current between two superconductors separated by an insulating junction, known as the Josephson junction. The behavior is due to the tunnel effect between pairs of electrons in the two superconductors; if the insulating barrier is too wide, the probability of tunneling is low and the phenomenon does not occur. Together, these Josephson junctions form a quantum processing unit (QPU). The QPU does not have large areas of memory (cache); it looks more like a biological brain than the classic von Neumann architecture of a traditional silicon processor. Qubits can be thought of as neurons, and couplers as synapses that control the flow of information between those neurons. The requirements for a successful quantum implementation include a number of quantum bits large enough to ensure high efficiency, which also implies being able to perform many operations on the quantum bits in a very short time. The algorithms used require the application of numerous types of logic gates to the quantum bits, and to keep the probability of error low enough, these gates must be very accurate.

We have analyzed the physics and then looked at how to apply it. Research and development begin with a deep understanding of the workloads the system must perform, and the nature of those workloads drives the design of the complete computing system. There is still a long way to go before we have a practically usable, commercial-sized quantum computer. Intel, for example, creates its qubits with the same technology used for its advanced transistors. The process uses 300-mm wafers, and each wafer yields thousands of quantum devices on which the qubits are tested; during the process, probes are used to characterize (test and measure) the devices.

The quantum structure of the computer requires very low temperatures to function; in particular, a temperature below approximately 80 mK (millikelvin) is required. The performance of a quantum processor increases with decreasing temperature: the lower the temperature, the better the achievable performance. The latest-generation D-Wave 2000Q system has an operating temperature of about 15 millikelvin. The CPU, or rather the QPU, and parts of the input/output (I/O) system, comprising about 10 kg of material, are cooled to this temperature. To reach temperatures close to absolute zero, the systems use liquid helium as the cooling element. The liquid helium is enclosed within a closed-loop system, where it is recycled and recondensed using pulse-tube technology; this makes the cooling system fit for purpose, as the liquid helium does not need to be replenished on site.

Future applications of quantum computing have captured the imagination of the whole world and, as a result, have been the subject of considerable press interest. Potentially, quantum computers could have a future impact on logistics and shipping services, on the design of new drugs, on areas of scientific interest such as protein folding, on climate risk modeling and analysis, on option pricing, and, of course, on one of the main applications that has captured interest in quantum computers: cryptography. In reality, however, these applications depend on discoveries yet to be made in quantum computing, on both the hardware and the software side, and will take many years to be realized concretely. Since quantum computing is an entirely new type of processing that executes programs in a completely different way, we need hardware, software, and applications developed specifically for it.

It all started with the introduction of the concept of the quantum, which took place during studies of black-body radiation, i.e., radiation from ideal bodies capable of completely absorbing incident radiation. Albert Einstein then used the quantum concept introduced by Planck to explain the photoelectric effect. The rules underlying quantum computing differ enormously from the standard ones: quantum information cannot be copied, but it can be transferred with absolute fidelity; each quantum measurement destroys most of the information, leaving the system in a basis state; and, due to Heisenberg's uncertainty principle, some observables cannot have precisely defined values at the same time.

This article was originally published on EE Times Europe.

Maurizio Di Paolo Emilio holds a Ph.D. in Physics and is a telecommunication engineer and journalist. He has worked on various international projects in the field of gravitational wave research. He collaborates with research institutions to design data acquisition and control systems for space applications. He is the author of several books published by Springer, as well as numerous scientific and technical publications on electronics design.
