Vishal Gupta (TE - Elecs - IIT)
The program to re-investigate the fundamental principles of physics from the standpoint of information theory is still in its infancy. However, it already appears to be highly fruitful, and it is this ambitious program that is summarized here.
Historically, the concept of information in physics does not have a clear-cut origin. An important thread can be traced if we consider the paradox of Maxwell's demon, proposed in 1871.
Recall that Maxwell's demon is a creature that opens and closes a trap door between two compartments of a chamber containing gas, and pursues the subversive policy of only opening the door when fast molecules approach it from the right, or slow ones from the left. In this way the demon establishes a temperature difference between the two compartments without doing any work, in violation of the second law of thermodynamics, and consequently permitting a host of contradictions.
In one illustration, the demon sets up a pressure difference by raising the partition only when more gas molecules approach it from the left than from the right. This can be done in a completely reversible manner, as long as the demon's memory stores the random results of its observations of the molecules. The demon's memory, filling with random data, thus in effect gets hotter. The irreversible step is not the acquisition of information, but the loss of information when the demon later clears its memory.
A number of attempts were made to exorcise Maxwell's demon, such as arguments that the demon cannot gather information without doing work, or without disturbing (and thus heating) the gas; both arguments turn out to be untrue. Some were even tempted to propose that the second law of thermodynamics could indeed be violated by the actions of an "intelligent being". It was not until 1929 that Leo Szilard made progress by reducing the problem to its essential components: the demon need merely identify whether a single molecule is to the right or left of a sliding partition, and its action allows a simple heat engine, now called Szilard's engine, to be run. Szilard had still not resolved the problem, since his analysis left unclear whether the act of measurement, whereby the demon learns whether the molecule is to the left or the right, must involve an increase in entropy.
A definitive and clear answer was not forthcoming, surprisingly, until a further fifty years had passed. In the intervening years digital computers were developed, and the physical implications of information gathering and processing were carefully considered. The thermodynamic costs of elementary information manipulations were analyzed by Landauer and others during the 1960s (Landauer 1961, Keyes and Landauer 1970), and those of general computations by Bennett, Fredkin, Toffoli and others during the 1970s. It was found that almost anything could in principle be done in a reversible manner, i.e. with no entropy cost at all. Bennett (1982) made explicit the relation between this work and Maxwell's paradox by proposing that the demon can indeed learn where the molecule is in Szilard's engine without doing any work or increasing any entropy in the environment, and so obtain useful work during one stroke of the engine.
However, the information about the molecule's location must then be present in the demon's memory. As more and more strokes are performed, more and more information gathers in the demon's memory. To complete a thermodynamic cycle, the demon must erase its memory, and it is during this erasure operation that we identify an increase in entropy in the environment, as required by the second law. This completes the essential physics of Maxwell's demon.
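The bookkeeping above can be checked with a back-of-the-envelope calculation. The sketch below (the function name is illustrative, not drawn from any source) evaluates the Landauer bound k_B T ln 2, the minimum heat dissipated when one bit of memory is erased, which exactly repays the work extracted in one stroke of Szilard's engine.

```python
from math import log

# Landauer's principle: erasing one bit of memory dissipates at least
# k_B * T * ln(2) of heat into the environment. One stroke of Szilard's
# engine extracts at most this much work, so clearing the demon's
# one-bit record cancels the gain, saving the second law.
k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_bound(T, bits=1):
    """Minimum heat (in joules) dissipated erasing `bits` bits at temperature T (kelvin)."""
    return bits * k_B * T * log(2)

# At room temperature (300 K) erasing a single bit costs about 2.87e-21 J.
E = landauer_bound(300.0)
```

Tiny as this number is, it is a strict lower bound set by physics, not by engineering.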
Classical information theory is founded on the definition of information. A warning is in order here: although the theory tries to capture much of the normal meaning of the term "information", it can no more do justice to the full richness of that term in everyday language than particle physics can encapsulate the everyday meaning of "charm". "Information" for us will be an abstract term. Much of information theory dates back to the seminal work of Shannon in the 1940s (Slepian 1974). The observation that information can be translated from one form to another is encapsulated and quantified in Shannon's noiseless coding theorem (1948), which quantifies the resources needed to store or transmit a given body of information. Shannon also considered the fundamentally important problem of communication in the presence of noise, and established the noisy channel coding theorem, the central result of classical information theory. Error-free communication even in the presence of noise is achieved by means of "error-correcting codes", whose study is a branch of mathematics in its own right. Indeed, the journal IEEE Transactions on Information Theory is almost entirely taken up with the discovery and analysis of error correction by coding. Pioneering work in this area was done by Golay (1949) and Hamming.
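The noiseless coding theorem can be made concrete with Shannon's entropy, H = -Σ p log₂ p: a memoryless source can be compressed to about H bits per symbol on average, and no further. A minimal sketch (the function name is illustrative):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol.

    Terms with p = 0 contribute nothing, by the convention 0 * log2(0) = 0.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss and cannot be compressed further;
# a heavily biased coin carries far less and compresses well.
H_fair = shannon_entropy([0.5, 0.5])    # 1.0 bit per symbol
H_biased = shannon_entropy([0.9, 0.1])  # about 0.469 bits per symbol
```

The theorem says these numbers are the exact compression limits for the two sources, which is what makes entropy the natural currency of classical information.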