
The Essence of Quantum Computing

4.5 Bridging quantum and classical mechanics

 

The brain is the only kind of object capable of understanding that the cosmos is even there, or why there are infinitely many prime numbers, or that apples fall because of the curvature of space-time, or that obeying its own inborn instincts can be morally wrong, or that it itself exists.63 —David Deutsch

Why Nature conforms to the above four postulates at the quantum level, and how they may link to the laws of classical physics (Newton’s laws of motion, Maxwell’s equations of electromagnetism, and Einstein’s theory of general relativity), has confounded the human mind. Yet a strong belief lingers, rooted in Bohr’s “correspondence principle”, that such a link will eventually emerge in the limit where the quantum numbers describing the system are large, meaning either that some quantum numbers of the system are excited to very large values or that the system is described by a large set of quantum numbers. For the present, the gulf between classical physics and quantum physics, as seen in Sections 2 and 3, appears so wide as to be unbridgeable. Even Bohr was pessimistic:

The repeatedly expressed hopes of avoiding the essentially statistical character of quantum mechanical description by the assumption of some causal mechanism underlying the atomic phenomena and hitherto inaccessible to observation would indeed seem to be as vain as any project of doing justice to the increased profundity of the picture of the world achieved by the general theory of relativity by means of the ordinary conceptions of absolute space and time. Above all such hopes would seem to rest upon an underestimate of the fundamental differences between the laws with which we are concerned in atomic physics and the every day experiences which are comprehended so completely by the ideas of classical physics.64

Recently, Peter Renkel has, with remarkable insight, hypothesized such a bridge. He “starts from the generalization of a point-like object and naturally arrives at the quantum state vector of quantum systems in the complex valued Hilbert space, its time evolution and quantum representation of a measurement apparatus of any size. … [He shows] that a measurement apparatus is a special case of a general quantum object. [Finally, he provides an] example of a measurement apparatus of an intermediate size.”65

5. The thermodynamics of computation

 

Modern information theory rests on two important facts: Shannon’s definition of information, and Landauer’s observation that information is physical, because it must always be encoded in a physical system; without such a system it is impossible to store, transmit, process, or receive information. Further, the information held by a physical system contributes to defining the state of the system. Note also that, in a mathematical (and hence computational) model of the laws of Nature, semantic aspects are irrelevant to the interpretation problem. Nature can exist without humans interpreting it!

Shannon’s famous twin papers, A Mathematical Theory of Communication,66 published in July and October 1948, set the foundation for information theory. His seminal contribution was to define the concept of information mathematically, and then to treat the transmission of information as a statistical phenomenon in a way that gave communications engineers a method to determine the capacity of a communication channel in terms of classical bits. We know that information can be expressed and encoded in different but equivalent ways without losing its essential nature. The binary form allows convenient automatic manipulation of information: a computer need only manipulate quite simple things, like integers, to do surprisingly powerful information processing, from document preparation to differential calculus, to translating between human languages, to even mimicking human intelligence. Translation between representations allows us to choose suitable hardware and software technologies for information processing.
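The point can be made concrete with a small Python sketch (the message string and variable names below are purely illustrative): the same piece of information is re-encoded from text to integers to classical bits and then recovered intact, so nothing essential is lost in translation.

    # The same information encoded as text, as integers, and as classical bits.
    message = "apple"

    # Each character becomes an integer (its Unicode code point) ...
    as_integers = [ord(c) for c in message]

    # ... and each integer becomes an 8-bit binary string.
    as_bits = "".join(format(n, "08b") for n in as_integers)

    # Decoding reverses both steps; the original message is recovered exactly.
    recovered = "".join(
        chr(int(as_bits[i:i + 8], 2)) for i in range(0, len(as_bits), 8)
    )
    assert recovered == message
    print(as_bits)  # 0110000101110000011100000110110001100101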

Shannon also defined the crucial concept of information entropy (now called Shannon entropy)67, H(X), analogous to thermodynamic entropy:

H(X) = −Σᵢ pᵢ log₂ pᵢ

which provides a way to estimate the average minimum number of bits needed to encode a string of symbols, based on the frequency of each symbol, that is, the probability pᵢ of its appearance. Shannon entropy does not measure degree of accuracy; it measures the degree of degeneracy of a system. In general, a complicated set of instructions can be reduced to n binary choices. We then have a ready measure of the information content of an object: simply count the number of binary choices, or the state of n binary bits. The state of a binary bit can, by prior agreement, be arbitrarily mapped to the binary alternatives, such as 0 for spin-down of an electron and 1 for spin-up. We can thus use the bit as the unit of information and measure the information content of an information-carrying object as the size of the set of instructions needed to reconstruct the state of the object.
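As an illustration (the function name and sample strings below are merely illustrative), the following Python sketch estimates H(X) from symbol frequencies: a fair coin costs one bit per toss, a heavily biased source costs much less, and a constant source carries no information at all.

    from collections import Counter
    from math import log2

    def shannon_entropy(symbols):
        # H(X) = -sum over i of p_i * log2(p_i), with the probabilities p_i
        # estimated from the observed frequencies of the symbols.
        counts = Counter(symbols)
        total = len(symbols)
        return -sum((c / total) * log2(c / total) for c in counts.values())

    print(shannon_entropy("HTHTHTHT"))  # 1.0   (fair coin: one bit per toss)
    print(shannon_entropy("HHHHHHHT"))  # ~0.54 (biased coin: less than one bit)
    print(shannon_entropy("AAAAAAAA"))  # 0.0   (no uncertainty, no information)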

In 1961, Rolf Landauer, in a seminal paper on reversible computing68, provided a remarkable insight. Since information is physical, a fact recently verified experimentally,69 the laws of physics must place limits on information processing, whether classical or quantum. He insightfully turned to thermodynamics and proved that only logically irreversible operations, those which cause information loss, expend energy. He showed that there is a fundamental asymmetry in the way Nature processes information. He proved the counter-intuitive result that all but one of the operations required in computation could, in principle, be performed in a reversible manner without dissipating heat. For example, copying classical information can be done reversibly and without wasting any energy, but when information is erased there is a minimum energy cost of kT ln 2 per classical bit (about 3 × 10⁻²¹ joules at room temperature) to be paid, where k is the Boltzmann constant and T is the absolute temperature, in kelvin, of the computer’s environment. That is, the erasure of information is inevitably accompanied by the generation of heat. Once a system undergoes an irreversible action, its past history, unless archived, is irrevocably lost.
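A back-of-the-envelope calculation, sketched below in Python with 300 K assumed as a representative room temperature, shows just how small the Landauer bound kT ln 2 is.

    from math import log

    k_B = 1.380649e-23  # Boltzmann constant, joules per kelvin (exact SI value)
    T = 300.0           # assumed room temperature, in kelvin

    # Minimum heat dissipated when one classical bit is erased: kT ln 2.
    landauer_limit = k_B * T * log(2)
    print(f"{landauer_limit:.2e} J per erased bit")             # ~2.87e-21 J

    # Even erasing eight billion bits (one gigabyte) costs only ~2.3e-11 J
    # at this bound, far below what present-day hardware actually dissipates.
    print(f"{landauer_limit * 8e9:.2e} J per gigabyte erased")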

Landauer saw computations as engines converting free energy into waste heat and mathematical work where erasure of information inevitably leads to heat generation. For a lucid explanation of the erasure principle, see Plenio and Vitelli70. For experimental support of Landauer’s principle, see Orlov, et al71. Landauer’s principle and the second law of thermodynamics can indeed be understood as a logical consequence of the underlying reversible laws of physics as seen in the general Hamiltonian formulation of mechanics and in the unitary time evolution in quantum mechanics. Thus, Landauer’s principle connects physics with information theory. The connection led people to think of energy-efficient algorithms at a time when reducing algorithmic complexity was the craze.

What is remarkable about Landauer’s principle is that while computation is “an abstract mathematical process, mapping one set of input bits into another set of output bits”72, it is not at all obvious that there exists a fundamental connection between such a mapping and the microscopic motion associated with heat. It indeed appears remarkable that logical reversibility of a computation (i.e., that inputs can be inferred from the outputs) also implies physical reversibility73. The key to physical reversibility is to ERASE WITH A COPY.74 That information is convertible into energy has been experimentally shown.75

When Landauer published his paper, it was generally assumed that deterministic computation is not necessarily logically reversible and that typical programs are unlikely to be so. Lecerf (1963)76 and Charles Bennett (1973)77 showed that deterministic computation can be simulated by a logically reversible Turing machine. That is, “it is possible to reprogram any deterministic computation as a sequence of logically reversible steps, provided the computation is allowed to save a copy of its input.”78 Bennett further suggested computing with nucleic acids to realize physical reversibility.79
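A toy Python sketch (the function names are illustrative and not Bennett’s construction) conveys the idea: an AND gate by itself is logically irreversible, but a step that carries a copy of its inputs along with its output can be run backwards exactly.

    def and_gate(a, b):
        # Logically irreversible: given the output 0 alone, one cannot tell
        # whether the inputs were (0, 0), (0, 1) or (1, 0).
        return a & b

    def reversible_and(a, b):
        # The step keeps a copy of its inputs, so the mapping is invertible.
        return (a, b, a & b)

    def uncompute(step):
        # Running the step "backwards" recovers the original inputs exactly.
        a, b, _ = step
        return (a, b)

    for a in (0, 1):
        for b in (0, 1):
            assert uncompute(reversible_and(a, b)) == (a, b)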

By analogy with thermodynamics, one may conclude that information which has not been compacted to reveal a useful, interpretable pattern is random information, which on average does not create useful knowledge in a human mind. In this respect, molecular biologists face an acute problem. Until they find a theory that can generate a given DNA sequence, the only way to communicate complete genetic information to another is to send the entire DNA sequence. The equations of physics, on the other hand, have continually progressed towards compactness to an amazing degree. Physics is coded and computable information. Gregory Chaitin had the remarkable insight at a very young age that
a scientific theory is a computer program that calculates the observations, and that the smaller the program is, the better the theory. If there is no theory, that is to say, no program substantially smaller than the data itself, considering them both to be finite binary strings, then the observations are algorithmically random, theory-less, unstructured, incomprehensible and irreducible.80

To Chaitin, theory and computations are intimately connected:

theory = program → Computer → output = experimental data
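In the same spirit, a small illustrative Python sketch contrasts observations that a very short program regenerates (a good “theory”) with algorithmically random observations, for which no substantially shorter description is expected to exist.

    import random

    def theory():
        # The entire "law" fits in one line: the observations are the squares.
        return [n * n for n in range(1000)]

    # The program text is a few dozen characters, yet it reproduces thousands
    # of digits of data exactly: the theory genuinely compresses the data.
    observations = [n * n for n in range(1000)]
    assert theory() == observations

    # Algorithmically random data: no program much shorter than the data itself
    # is expected to reproduce it, so listing it verbatim is the best "theory".
    random_observations = [random.getrandbits(16) for _ in range(1000)]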


[63] Deutsch (2012).
[64] Bohr (1937).
[65] Renkel (2017).
[66] Shannon (1948).
[67] Shannon, undecided whether to name his logarithmic formula ‘information’ or ‘uncertainty’, asked John von Neumann, who responded: Neither, and suggested: “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.” Source: Shannon’s Information Theory. Science4All, 17 March 2013. https://www.science4all.org/article/shannons-information-theory/
[68] Landauer (1961).
[69] Toyabe, et al (2010a, b).
[70] Plenio & Vitelli (2001). See also: Bennett (2003, revised 2011).
[71] Orlov, et al (2012).
[72] Orlov, et al (2012).
[73] Orlov, et al (2012).
[74] Orlov, et al (2012).
[75] Toyabe, et al (2010a, b).
[76] Lecerf (1963).
[77] Bennett (1973). A seminal contribution.
[78] Bennett (2003, revised 2011).
[79] Bennett (1973). See also: Bennett (1982); Thachuk (2013).
[80] Chaitin (2003).
