Quantum computing: the story of one of the greatest quantum technologies (Part 3)

In the first two parts we introduced quantum computing, and then explained the principles and concepts needed to better understand it. In this part, we will review the physical realizations that have been implemented so far.

Physical realization

Building a quantum computer poses many engineering challenges. Qubits must be constantly protected from decoherence, which is achieved by minimizing their interaction with the environment. Even the best-designed system, however, cannot completely prevent entropy from leaking in. After the qubits have been cooled into a low-entropy state, the machine must complete its calculation and produce an output before thermal noise or other disturbances decohere them. Since decoherence can never be avoided entirely, the system must also include error-correction mechanisms, such as a set of additional qubits that can recover corrupted states before the computation finishes.
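
To make the idea of recovering corrupted states concrete, here is a toy sketch in plain Python/NumPy of the simplest quantum error-correcting code, the three-qubit bit-flip code. It is purely illustrative and not the scheme used by any particular hardware: one logical qubit is spread across three physical qubits, parity checks locate a bit-flip error without disturbing the encoded amplitudes, and the error is then undone.

import numpy as np

I = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.diag([1., -1.])

def op(single, target, n=3):
    # Embed a single-qubit operator acting on qubit `target` into an n-qubit operator.
    mats = [single if q == target else I for q in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

# Encode |psi> = a|0> + b|1> as a|000> + b|111>.
a, b = 0.6, 0.8
encoded = np.zeros(8)
encoded[0b000], encoded[0b111] = a, b

# A stray bit-flip error strikes the middle qubit.
corrupted = op(X, 1) @ encoded

# Syndrome measurement: the parities Z0*Z1 and Z1*Z2 reveal which qubit flipped,
# without revealing (and therefore without destroying) the amplitudes a and b.
s01 = int(round(corrupted @ op(Z, 0) @ op(Z, 1) @ corrupted))
s12 = int(round(corrupted @ op(Z, 1) @ op(Z, 2) @ corrupted))
flipped = {(1, 1): None, (-1, 1): 0, (-1, -1): 1, (1, -1): 2}[(s01, s12)]

# Apply the correction and check that the logical state is restored.
recovered = corrupted if flipped is None else op(X, flipped) @ corrupted
print(np.allclose(recovered, encoded))  # True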

To count as a quantum computer, on the other hand, a machine must provide resources equivalent in power to a universal (Turing-machine-like) model of computation, but it does not have to be built from gates: it can also be constructed as a cluster-state or an adiabatic machine. In an adiabatic system, the answer to a computation is defined as the ground state of a network of qubit interactions, and the machine carries the qubits into that ground state by turning the interactions on slowly. In a cluster-state system, a specific highly entangled quantum state is first built up by manipulating the qubits with a small, non-universal set of gates; universality is then achieved by varying the bases of the subsequent measurements and adapting them to earlier outcomes. Properly built cluster-state and adiabatic systems are equivalent in computational power to gate-based quantum computers, and with certain technologies they are easier to implement.
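
For the adiabatic case, the standard textbook picture (independent of any particular machine) is a Hamiltonian interpolated slowly between an easy-to-prepare starting Hamiltonian and a problem Hamiltonian whose ground state encodes the answer:

H(s) = (1 - s) H_init + s H_problem,    with s = t/T running from 0 to 1.

If the total run time T is large compared with the inverse square of the smallest energy gap encountered along the way, the adiabatic theorem guarantees that a system prepared in the ground state of H_init ends up in the ground state of H_problem, that is, in the answer.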

Finally, any quantum computer design must be scalable. The key point is that an exponential speed-up for certain classes of problems can be obtained by increasing the resources only linearly. While the qubits themselves provide the quantum parallelism, they are not the only component of a physical system: the machine also needs ways to isolate the qubits from the outside world to limit decoherence, to correct corrupted states, to cool the qubits to low entropy in preparation for a computation, and to manipulate and measure them. These supporting resources are often several orders of magnitude larger than those of classical computers, so for the system to be scalable all of them must grow roughly linearly with the number of qubits. Physical quantum computers have so far been implemented as a variety of different systems, which we discuss below in five major categories.
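
A quick back-of-the-envelope calculation shows why a linear increase in qubits corresponds to an exponential increase in computational state space, and why the supporting classical resources have to be managed so carefully: describing n qubits exactly requires 2^n complex amplitudes. The short Python snippet below is purely illustrative and simply prints that count together with the memory a naive classical state-vector simulation would need.

# Describing n qubits requires 2**n complex amplitudes; a naive classical
# state-vector simulation stores each one as a 16-byte complex number.
for n in (20, 30, 40, 50):
    amplitudes = 2 ** n
    print(f"{n:2d} qubits: {amplitudes:.2e} amplitudes, ~{amplitudes * 16 / 2**30:.3g} GiB")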

Atomic trap computers

Atomic-trap designs use individual atoms as qubits. They rely on exotic hardware in which electric fields hold the atoms suspended with nanometer precision at temperatures close to absolute zero. As one would expect, these extreme conditions are effective at isolating the atoms from external disturbances; as a result, the qubits generally retain coherence far longer than is needed to complete a quantum computation, which is the great strength of this technology.

A common subcategory in this field is the trapped-ion computer. In this design, lasers act as the logic gates: the quantum state of each qubit is transformed by applying laser pulses to the ions, and the same mechanism can also entangle qubits with one another. State preparation is accomplished by optical pumping (raising the energy level of electrons with light), which couples the target ion to excited states that eventually decay into a single well-defined state. Measurement is performed with a laser that causes the ion to emit photons if it has collapsed to the state 1 and to emit nothing if it has collapsed to the state 0. These designs have scalability problems: coherence becomes more fragile as more ions are added, which reduces the effectiveness of the lasers used as logic gates.
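
To a first approximation, and ignoring the ion's motion, detuning and noise entirely, a resonant laser pulse applied to a single trapped ion simply rotates its qubit state, with the rotation angle set by the pulse area; a pi pulse flips the qubit and a pi/2 pulse creates an equal superposition. The following NumPy sketch shows only this idealized picture:

import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)

def laser_pulse(theta):
    # Idealized resonant pulse of area theta: R(theta) = exp(-i * theta/2 * X).
    return np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * X

ket0 = np.array([1, 0], dtype=complex)  # ion initialized in |0> by optical pumping

print(np.abs(laser_pulse(np.pi) @ ket0) ** 2)      # pi pulse:   probabilities [0, 1]
print(np.abs(laser_pulse(np.pi / 2) @ ket0) ** 2)  # pi/2 pulse: probabilities [0.5, 0.5]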

An alternative to ion traps is neutral atoms. In this design, arrays of atoms are confined in an optical lattice formed by intersecting laser beams. Qubits can interact through contact interactions between neighboring atoms. The main challenge of this design is controlling the preparation, interaction and measurement of the qubits.

Nuclear magnetic resonance computers

Nuclear magnetic resonance (NMR) is a phenomenon in which atomic nuclei absorb and emit electromagnetic radiation when placed in a magnetic field. NMR is used to study quantum and molecular physics and underlies magnetic resonance imaging (MRI). NMR is a relatively mature technology, and in 1996 methods were proposed for building quantum computers with it.

In general, NMR quantum computers fall into two categories: solid-state and liquid-state. In both cases the qubits are nuclear spins within molecules, with the spin states distinguishing 0 from 1. Unfortunately, a poor signal-to-noise ratio impedes the scalability of NMR designs, and liquid-state NMR in particular has difficulty creating genuine entanglement, which stands in the way of real quantum computation. So while NMR is one of the most mature quantum computing technologies available today, it seems more likely to assist the development of other quantum technologies than to lead to large-scale NMR computers itself.

Photonic or optical computers

This class of quantum computers uses photons as its building blocks. One of the strengths of photonic designs is that photons are relatively resistant to decoherence; the corresponding weakness is that the interactions needed for universal logic are relatively difficult to achieve with this technology.

In 2001, Knill and his colleagues showed that photon-based quantum computing can in principle be made scalable. However, current designs rely on non-deterministic interactions, which limits their usefulness, and research is moving toward deterministic interactions. Simple quantum algorithms have been demonstrated using a cluster-state design built from photonic components. Current circuits use logic gates roughly a centimeter across, far larger than their classical counterparts; but because the power of a quantum computer grows exponentially with a linear increase in physical resources, gates of this size may still be small enough for practical use.

Quantum dot computers

Quantum dots are very small semiconductor crystals whose electrical properties depend on the size and shape of the crystal. In quantum dot computing, the dots serve as qubits. The flow of electrons through a quantum dot can be controlled precisely, which allows accurate measurement of spin and other properties. As with other quantum computing technologies there are several sub-categories, such as electrostatically defined dots and self-assembled dots. In addition, various schemes have been proposed for achieving universal computation with quantum dots, for example arrays of dots each holding a single electron; in these designs the electrons themselves act as the qubits.

Electrostatically defined and self-assembled quantum dots each have weaknesses. Electrostatically defined dots are held back by an extremely short-lived exchange interaction, a major obstacle to fault-tolerant quantum error correction. The main problem with self-assembled dots is their randomness: they form at random locations and do not have homogeneous optical properties. Advanced fabrication techniques are being explored to control the behavior of the dots or even to allow their deterministic placement. Quantum dot computers have demonstrated the ability to control individual operations on a timescale of about one picosecond, showing their potential for ultrafast computation.
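
For the single-electron designs mentioned earlier, it may help to state the standard textbook form of a two-qubit gate (an idealized picture, not a description of any specific device). Two neighboring electron spins are coupled for a short time through the exchange interaction

H_ex(t) = J(t) S_1 · S_2.

Pulsing the coupling so that ∫ J(t) dt = πħ swaps the two spins (up to an overall phase), and half of such a pulse, the "square root of SWAP", is already an entangling gate. Because gate quality depends on how long this coupling can be driven coherently, a short-lived exchange interaction directly limits fault tolerance.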

Superconducting computers

Classical integrated circuits suffer from considerable power leakage; if they were used as quantum circuits, this would cause decoherence almost immediately and prevent any useful computation. In low-temperature superconductors, however, such losses are far smaller, so researchers are trying to build quantum circuits with this technology, with the added benefit that the circuits can be produced using existing fabrication methods. Of all quantum computer designs, superconducting qubits bear the closest physical resemblance to classical bits. They are made of circuits containing a Josephson junction, a thin layer of insulator separating two pieces of superconductor. The flow of electrons across the Josephson junction gives the circuit physical properties that make it suitable for use as a qubit.
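
The textbook Josephson relations make it clearer why such a circuit can behave like an artificial atom; these are the standard relations, not specific to any one qubit design. The supercurrent through the junction and the voltage across it obey

I = I_c sin(φ),    V = (ħ / 2e) dφ/dt,

where φ is the superconducting phase difference across the junction and I_c its critical current. Together they make the junction act as a lossless but nonlinear inductor, and that nonlinearity spaces the circuit's energy levels unevenly, so the two lowest levels can be addressed in isolation and used as the 0 and 1 of a qubit.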

In this design, basic quantum logic gates are constructed by coupling pairs of adjacent qubits, either capacitively or inductively, but such fixed couplings are not very configurable. Researchers have therefore investigated tunable couplings that can switch these interactions on and off, and with them the possibility of achieving configurable quantum computation with superconductors.

Initially it was believed that the macroscopic nature of superconducting qubits would lead to impractically short decoherence times, and the early experiments did indeed show decoherence on nanosecond timescales. Since then, however, decoherence times have risen to a few microseconds, one to two orders of magnitude longer than the time needed to prepare and measure the qubits. Even so, the fight against rapid decoherence remains the most important obstacle to implementing superconducting quantum computers, and it may be necessary to engineer materials at the microscopic level to reduce the noise.


to be continued...
Tags: Quantum Computing