Tuesday, June 06, 2023, 12:00 pm - 1:00 pm (New York Time)
Understanding Quantum Computing - Part 1
Dr. Abbas Omar
Professor Emeritus
University of Magdeburg, Germany
Abstract: Classical computers are based on the binary coding of digital data. The binary codes are processed by applying Boolean algebra using digital electrical circuits known as logic gates. The latter are simple configurations of transistors and other circuit elements (resistors, capacitors, etc.). The operation of the logic gates is governed by Circuit Theory, which is a simplified form of Classical Electromagnetics.
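To make the Boolean-algebra picture concrete, here is a minimal sketch (added for illustration, not taken from the talk; the Python function names are hypothetical) showing how a universal NAND gate, and an XOR built purely from NANDs, evaluate binary inputs:

# Minimal illustration: Boolean algebra as evaluated by logic gates.
# NAND is universal: any Boolean function can be composed from it.
def nand(a: int, b: int) -> int:
    return 1 - (a & b)

def xor(a: int, b: int) -> int:
    # XOR composed entirely of NAND gates
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "NAND:", nand(a, b), "XOR:", xor(a, b))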
In the 1960s, an observation was made that is nowadays known as "Moore's Law". It predicts a doubling of the number of components (mainly transistors) in an integrated circuit every two years. Based on this, molecular-scale transistor sizes were forecast (state-of-the-art transistors can already be as small as a few atoms), which inevitably invalidates Circuit Theory as a mathematical tool for describing the performance of logic gates. This motivated a number of scientists in the early 1980s to consider replacing these classical-computer building blocks with what are called Quantum Gates, whose operation is fully governed by the laws of Quantum Mechanics. Computers whose building blocks are quantum gates are called Quantum Computers.
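For orientation only (an illustration added here, not a quotation from the abstract), the stated doubling rate corresponds to exponential growth of the component count,

N(t) \approx N_0 \cdot 2^{t/2},

with t in years, so that over twenty years the count grows by a factor of roughly 2^{10} \approx 1000.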
As the classical laws (of either mechanics or electromagnetics) are a limiting case of the more comprehensive quantum ones, it has become evident that quantum computers can implement algorithms that are not available to their classical counterparts. This has encouraged scientists to develop such algorithms, and research and development facilities to build corresponding hardware realizations. Prominent examples are the hardware platforms developed by IBM and Google.
Most approaches to explaining quantum computing rely on the highly academic concepts of Quantum Mechanics. The terminology used is not easily understood by the majority of the interested audience, who have only a general knowledge of the subject. In some cases, this creates the exaggerated perception that this "magic thing" can solve all computational problems much more efficiently and much faster than a classical computer.
In this talk, only graduate-level mathematical tools and terminology are used to explain the concepts underlying quantum computing and the functioning of the corresponding hardware—the Quantum Computer. The main differences between classical and quantum mechanics will be concisely reviewed. Quantum dynamic variables, such as position and momentum in mechanical systems as well as voltage and current in electrical circuits, are shown to behave as random rather than deterministic signals, whose attributes can be used to carry information. It will also be demonstrated that these random signals can be stored and processed in what are called Qubits, the quantum counterpart of classical Bits. The hardware realization of qubits in the form of superconducting circuits—the Transmons—will be explained. Other realizations of qubits, such as trapped ions and quantum dots, will not be considered.
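As a rough illustration of the "random signal stored in a Qubit" idea (a sketch added here, not material from the talk; the numerical values are hypothetical), a single qubit can be modeled as a normalized two-component state whose repeated measurement yields a random bit stream:

# Minimal sketch: a qubit |psi> = alpha|0> + beta|1>; measuring it gives 0 with
# probability |alpha|^2 and 1 with probability |beta|^2.
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)   # an equal superposition (example values)
probs = [abs(alpha) ** 2, abs(beta) ** 2]

# Repeated measurements produce a random rather than deterministic signal.
print("outcomes:", rng.choice([0, 1], size=10, p=probs))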
Because it is impossible to fully isolate dynamic systems from their surroundings, thermal noise and quantum dynamic variables, both being random signals, interact with each other. Unlike the case of deterministic signals, noise corruption takes a different form here: it deteriorates an essential statistical attribute of interacting quantum dynamic variables known as Coherency. The latter is a sort of "memory" that enables different quantum dynamic variables to "remember" each other.
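A common textbook way to picture this loss of Coherency (added here for illustration; the talk may present it differently) is the exponential decay of the off-diagonal, "memory-carrying" elements of a qubit's density matrix,

\rho(t) = \begin{pmatrix} |\alpha|^{2} & \alpha\beta^{*}\, e^{-t/T_{2}} \\ \alpha^{*}\beta\, e^{-t/T_{2}} & |\beta|^{2} \end{pmatrix},

where T_2 denotes the coherence (dephasing) time set by the coupling to the noisy environment.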
Using quantum systems for encoding and processing information therefore requires cooling the systems down to very near absolute zero (0 K) in order to reduce the impact of noise on the coherency. Coding and processing errors due to deteriorated coherency may also require error-correction techniques similar to those known from Channel Coding.
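To give a flavor of the Channel-Coding analogy (an added sketch, not part of the abstract; the Python names are illustrative only), the classical three-bit repetition code with majority-vote decoding is the direct counterpart of the simplest quantum bit-flip error-correction code:

# Three-bit repetition code over a binary symmetric channel with flip probability p.
import random

def encode(bit):
    return [bit, bit, bit]

def noisy_channel(codeword, p_flip):
    return [b ^ (random.random() < p_flip) for b in codeword]

def decode(codeword):
    return int(sum(codeword) >= 2)   # majority vote

random.seed(1)
trials = 10_000
errors = sum(decode(noisy_channel(encode(0), 0.1)) != 0 for _ in range(trials))
print("residual error rate:", errors / trials)   # roughly 3*p^2 (~0.03) versus the raw 0.1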
Speaker’s Bio: Dr. Abbas Omar is Professor Emeritus at the Otto-von-Guericke University of Magdeburg in Germany. He received the B.Sc., M.Sc., and Doktor-Ing. degrees in electrical engineering in 1978, 1982, and 1986, respectively. He has been a professor of electrical engineering since 1990 and was director of the Chair of Microwave and Communication Engineering at the Otto-von-Guericke University of Magdeburg from 1998 until his retirement in 2020. He joined the Petroleum Institute in Abu Dhabi as a Distinguished Professor in 2012 and 2013, organizing research activities for the oil and gas industry in the region. In 2014 and 2015 he chaired the Department of Electrical and Computer Engineering at the University of Akron, Ohio, USA. Dr. Omar has authored and co-authored more than 480 technical papers covering a wide spectrum of research areas. His current research and teaching fields cover health aspects of millimeter-wave radiation, quantum computing, phased arrays and beamforming for massive MIMO, and magnetic-resonance imaging. In the past he has also covered other disciplines, including microwave and acoustic imaging, microwave and millimeter-wave material characterization, indoor positioning, subsurface tomography and ground-penetrating radar, and field-theoretical modeling of microwave systems and components. Dr. Omar is an IEEE Fellow.