
Quantum Computing


Quantum computing is a theoretical computing model that uses a fundamentally different form of data handling to perform calculations. It rests on a new kind of data unit that could be called non-binary, because it has more than two possible values. A traditional computer works on bits of data that are binary, or Boolean, with only two possible values: 0 or 1. In contrast, a quantum bit, or "qubit," can take the value 1, the value 0, or, before it is measured, a superposition of 1 and 0. Physically, qubits are built on atoms and molecular structures; conceptually, however, many find it helpful to think of a qubit as a binary data unit with superposition.[1]
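To make this concrete, here is a minimal sketch in Python with NumPy (an illustration, not part of the cited source): a qubit is modeled as a normalized vector of two complex amplitudes over the basis states |0> and |1>, and the Born rule gives the probability of measuring each value.

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit is a normalized vector of
# two complex amplitudes over the basis states |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)          # |0>
ket1 = np.array([0, 1], dtype=complex)          # |1>

# A general qubit a|0> + b|1>; here, an equal superposition of both.
qubit = (ket0 + ket1) / np.sqrt(2)

# Born rule: the probability of each measurement outcome is the
# squared magnitude of the corresponding amplitude.
p0, p1 = np.abs(qubit) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")      # 0.50 each
```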



Quantum Computing Fundamentals[2]

All computing systems rely on a fundamental ability to store and manipulate information. Current computers manipulate individual bits, which store information as binary 0 and 1 states. Quantum computers leverage quantum mechanical phenomena to manipulate information. To do this, they rely on quantum bits, or qubits.

Three quantum mechanical properties — superposition, entanglement, and interference — are used in quantum computing to manipulate the state of a qubit.

Superposition: Superposition refers to a combination of states we would ordinarily describe independently. To make a classical analogy, if you play two musical notes at once, what you will hear is a superposition of the two notes.
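A short sketch of the same idea (illustrative Python/NumPy, assuming the standard Hadamard gate as the usual way of placing a qubit into equal superposition):

```python
import numpy as np

# The Hadamard gate takes the definite state |0> into an equal
# superposition of |0> and |1> -- the "two notes at once" of the
# musical analogy above.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

ket0 = np.array([1, 0], dtype=complex)
superposed = H @ ket0
print(np.round(superposed, 3))   # amplitudes 0.707 and 0.707: both outcomes equally likely
```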


Entanglement: Entanglement is a famously counter-intuitive quantum phenomenon describing behavior we never see in the classical world. Entangled particles behave together as a system in ways that cannot be explained using classical logic.
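As a sketch (illustrative Python/NumPy, not from the cited source), the Bell state is the textbook example of an entangled two-qubit system:

```python
import numpy as np

# A two-qubit register has four amplitudes, over |00>, |01>, |10>, |11>.
# The Bell state (|00> + |11>)/sqrt(2) is entangled: it cannot be
# factored into two independent single-qubit states, and measuring one
# qubit instantly fixes the other.
bell = np.zeros(4, dtype=complex)
bell[0] = bell[3] = 1 / np.sqrt(2)       # amplitudes on |00> and |11>

for basis, p in zip(("00", "01", "10", "11"), np.abs(bell) ** 2):
    print(f"P({basis}) = {p:.2f}")       # only 00 and 11 ever occur
```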




Interference: Finally, quantum states can undergo interference due to a phenomenon known as phase. Quantum interference can be understood similarly to wave interference; when two waves are in phase, their amplitudes add, and when they are out of phase, their amplitudes cancel.
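A minimal sketch of interference (illustrative Python/NumPy): applying the Hadamard gate twice returns |0> exactly, because the two paths into |1> carry opposite phases and cancel.

```python
import numpy as np

# Interference: H applied twice sends |0> -> superposition -> |0>.
# The two "paths" into |1> are out of phase and cancel (destructive
# interference); the paths into |0> add (constructive).
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)
ket0 = np.array([1, 0], dtype=complex)

state = H @ (H @ ket0)
print(np.abs(state) ** 2)        # [1. 0.]: the |1> amplitude cancelled
```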



Quantum Computing Models[3]

There are a number of quantum computing models, distinguished by the basic elements into which the computation is decomposed. The four main models of practical importance are:

  • Quantum gate array (computation decomposed into a sequence of few-qubit quantum gates)
  • One-way quantum computer (computation decomposed into a sequence of one-qubit measurements applied to a highly entangled initial state or cluster state)
  • Adiabatic quantum computer, based on quantum annealing (computation decomposed into a slow continuous transformation of an initial Hamiltonian into a final Hamiltonian, whose ground states contain the solution)
  • Topological quantum computer (computation decomposed into the braiding of anyons in a 2D lattice)

The quantum Turing machine is theoretically important, but direct implementation of this model is not pursued in practice. All four models of computation have been shown to be equivalent; each can simulate the others with no more than polynomial overhead.
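As a toy illustration of the first of these, the gate-array model (a sketch, not drawn from the cited sources), the following builds a two-qubit computation as a literal sequence of few-qubit gates, producing the entangled Bell state discussed above:

```python
import numpy as np

# Gate-array model in miniature: the computation is a sequence of
# few-qubit unitary gates applied to a register.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.array([1, 0, 0, 0], dtype=complex)    # register starts in |00>
state = np.kron(H, I) @ state                    # gate 1: H on qubit 0
state = CNOT @ state                             # gate 2: CNOT, qubit 0 controls qubit 1
print(np.round(state, 3))        # (|00> + |11>)/sqrt(2), a Bell state
```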


Quantum Computers Vs. Conventional Computers[4]

Although people often assume that quantum computers must automatically be better than conventional ones, that's by no means certain. So far, just about the only thing we know for certain that a quantum computer could do better than a normal one is factorization: finding two unknown prime numbers that, when multiplied together, give a third, known number. In 1994, while working at Bell Laboratories, mathematician Peter Shor demonstrated an algorithm that a quantum computer could follow to find the "prime factors" of a large number, which would speed up the problem enormously. Shor's algorithm really excited interest in quantum computing because virtually every modern computer (and every secure, online shopping and banking website) uses public-key encryption technology based on the virtual impossibility of finding prime factors quickly (it is, in other words, essentially an "intractable" computer problem). If quantum computers could indeed factor large numbers quickly, today's online security could be rendered obsolete at a stroke. But what goes around comes around, and some researchers believe quantum technology will lead to much stronger forms of encryption. (In 2017, Chinese researchers demonstrated for the first time how quantum encryption could be used to make a very secure video call from Beijing to Vienna.)
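The number-theoretic core of Shor's algorithm can be sketched classically (illustrative Python, not from the cited source; in the real algorithm, only the slow period search below is replaced by a fast quantum subroutine):

```python
from math import gcd

# Shor's reduction: to factor N, find the period r of f(x) = a**x mod N,
# then read factors off gcd(a**(r/2) - 1, N) and gcd(a**(r/2) + 1, N).
# This brute-force period search is what becomes intractable as N grows.
def find_period(a, N):
    r, value = 1, a % N
    while value != 1:
        r += 1
        value = (value * a) % N
    return r

N, a = 15, 7                    # toy example: factor 15 using base 7
r = find_period(a, N)           # r = 4, since 7**4 = 2401 = 1 (mod 15)
p = gcd(a ** (r // 2) - 1, N)   # gcd(48, 15) = 3
q = gcd(a ** (r // 2) + 1, N)   # gcd(50, 15) = 5
print(f"period {r}: {N} = {p} x {q}")
```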

Does that mean quantum computers are better than conventional ones? Not exactly. Apart from Shor's algorithm, and a search method called Grover's algorithm, hardly any other algorithms have been discovered that would be better performed by quantum methods. Given enough time and computing power, conventional computers should still be able to solve any problem that quantum computers could solve, eventually. In other words, it remains to be proven that quantum computers are generally superior to conventional ones, especially given the difficulties of actually building them. Who knows how conventional computers might advance in the next 50 years, potentially making the idea of quantum computers irrelevant, or even absurd.
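Grover's algorithm, mentioned above, can also be simulated in a few lines (an illustrative Python/NumPy sketch, not from the cited source): it amplifies the amplitude of a marked item in an unsorted collection, and over four items a single iteration finds the item with certainty.

```python
import numpy as np

# Grover's search over 4 items (2 qubits); one iteration suffices here.
N, marked = 4, 2
state = np.full(N, 1 / np.sqrt(N))        # uniform superposition

state[marked] *= -1                       # oracle: phase-flip the marked item
state = 2 * state.mean() - state          # diffusion: reflect about the mean

print(np.abs(state) ** 2)                 # probability 1.0 on index 2
```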


History of Quantum Computing[5]

Quantum computing tends to trace its roots back to a 1959 speech by Richard P. Feynman in which he spoke about the effects of miniaturization, including the idea of exploiting quantum effects to create more powerful computers. This speech is also generally considered the starting point of nanotechnology.

Of course, before the quantum effects of computing could be realized, scientists and engineers had to more fully develop the technology of traditional computers. This is why, for many years, there was little direct progress, or even interest, in the idea of making Feynman's suggestion a reality.

In 1985, the idea of "quantum logic gates" was put forth by the University of Oxford's David Deutsch, as a means of harnessing the quantum realm inside a computer. In fact, Deutsch's paper on the subject showed that any physical process could be modeled by a quantum computer.

Nearly a decade later, in 1994, AT&T's Peter Shor devised an algorithm that could use just six qubits to perform some basic factorizations; more qubits would be needed, of course, as the numbers to be factored grew more complex.

A handful of quantum computers have been built. The first, a 2-qubit quantum computer built in 1998, could perform trivial calculations before losing coherence after a few nanoseconds. In 2000, teams successfully built both a 4-qubit and a 7-qubit quantum computer. Research on the subject is still very active, although some physicists and engineers express concerns over the difficulties involved in scaling these experiments up to full-scale computing systems. Still, the success of these initial steps shows that the fundamental theory is sound.


Applications of Quantum Computing[6]

Quantum computing could:

  • Speed up the development of drugs; improve chemical industry manufacturing; desalinate seawater; and even suck carbon dioxide out of the atmosphere to curb climate change.
  • Result in the invention of room temperature superconductors that would be impervious to power drain during electrical transmission.
  • Handle problems of image and speech recognition, and provide real-time language translation.
  • Greatly enhance big data processing from sensors, medical records and stock fluctuations.
  • And generate many other similarly important applications not yet imaginable.


The Advantages and Disadvantages of Quantum Computing[7]

Advantages of Quantum Computing

  • The main advantage of quantum computing is that it can execute certain tasks much faster than a classical computer can. Not every task benefits, however; for many problems a quantum computer is no better than a traditional one.
  • Because a qubit can hold a superposition of states, a quantum computer can effectively work through a vast number of calculations at once, which for some problems yields an exponential speedup (see the sketch after this list).
  • A quantum computer can also perform ordinary classical-algorithm calculations, handling them much as a classical computer does.
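A rough back-of-the-envelope sketch of that exponential scaling (an illustration, not from the cited source): the state of n qubits is described by 2^n complex amplitudes, which is why classically tracking a quantum computation becomes infeasible so quickly.

```python
# An n-qubit state needs 2**n complex amplitudes, so the classical
# memory needed to track it doubles with every added qubit.
for n in (10, 20, 30, 40, 50):
    amplitudes = 2 ** n
    bytes_needed = amplitudes * 16        # 16 bytes per complex128 amplitude
    print(f"{n} qubits -> {amplitudes:,} amplitudes ({bytes_needed:,} bytes)")
```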

Disadvantages of Quantum Computing

  • The main disadvantage of quantum computing is that the technology required to implement a practical quantum computer is not yet available. The coherent quantum state on which the computation depends is destroyed as soon as the qubits are disturbed by their environment, and preserving that coherence is essential to the functioning of a quantum computer.
  • Research into this problem continues, but efforts to find a solution have so far made little decisive progress.


See Also

Affective Computing
Cloud Computing
Big Data
Predictive Analytics
Artificial Intelligence (AI)
Artificial Neural Network (ANN)
Data Mining
Data Analysis
Data Analytics
Machine Learning
Cognitive Computing
Statistical Analysis


References

  1. Definition: What is Quantum Computing? Techopedia
  2. Quantum Computing Fundamentals IBM
  3. Quantum Computing Models Wikipedia
  4. What can quantum computers do that ordinary computers can't? Explain That Stuff
  5. History of Quantum Computing ThoughtCo
  6. Possible Applications of Quantum Computing OWDT
  7. The Advantages and Disadvantage of Quantum Computing 1000projects.org


Further Reading