Shannon: information theory
A A satellite in the solar system took pictures of Jupiter and Saturn that were meant to be sent back to Earth. Unfortunately, the satellite malfunctioned, so it took a while for Earth to receive the collected photographs, and eventually the malfunctioning satellite left the solar system. All this information-transmitting technology should be credited to Claude E. Shannon (1916-2001) and his information theory.
B Shannon was born in Petoskey, Michigan. His father, Claude Sr (1862-1934), a descendant of the early New Jersey settlers, was a self-made businessman and, for a while, Judge of Probate. Shannon showed an inclination towards mechanical things. His best subjects were science and mathematics, and at home he constructed such devices as models of planes, a radio-controlled model boat and a wireless telegraph system to a friend’s house half a mile away. While growing up, he worked as a messenger for Western Union. His childhood hero was Thomas Edison, who he later learned was a distant cousin. Both were descendants of John Ogden, a colonial leader and an ancestor of many distinguished people.
C Shannon first began his research in the information field simply to determine whether a piece of information had been transmitted correctly. The main concepts of information theory can be grasped by considering the most widespread means of human communication: language. Two important aspects of a concise language are as follows: First, the most common words (e.g., “a”, “the”, “I”) should be shorter than less common words (e.g., “roundabout”, “generation”, “mediocre”), so that sentences will not be too long. Such a tradeoff in word length is analogous to data compression and is the essential aspect of source coding. Second, if part of a sentence is unheard or misheard due to noise, e.g., a passing car, the listener should still be able to glean the meaning of the underlying message. Such robustness is as essential for an electronic communication system as it is for a language; properly building such robustness into communications is done by channel coding. Source coding and channel coding are the fundamental concerns of information theory, and both are illustrated in the sketches below.
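To make the source-coding tradeoff concrete, here is a minimal sketch of Huffman coding in Python; the word frequencies are hypothetical, chosen only to show that more frequent words receive shorter codewords:

```python
import heapq

def huffman_code(frequencies):
    """Build a prefix code in which frequent symbols get short codewords."""
    # Heap entries: (total weight, tiebreaker, {symbol: codeword-so-far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(frequencies.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, codes1 = heapq.heappop(heap)  # lightest subtree
        w2, _, codes2 = heapq.heappop(heap)  # second-lightest subtree
        # Merging two subtrees prepends one bit: '0' for one, '1' for the other.
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# Hypothetical frequencies: common words end up with shorter codewords,
# mirroring the way "a" and "the" are shorter than "mediocre".
freqs = {"the": 50, "a": 40, "I": 30, "generation": 5, "mediocre": 2}
for word, code in sorted(huffman_code(freqs).items(), key=lambda kv: len(kv[1])):
    print(f"{word:>12} -> {code}")
```

Channel coding can be illustrated, in its very simplest form, by a toy repetition code (not Shannon’s own construction): added redundancy lets the receiver recover the message despite isolated bit flips, much as a listener glosses over a misheard word.

```python
def encode_repetition(bits, n=3):
    """Repeat each bit n times (a toy channel code)."""
    return [b for bit in bits for b in [bit] * n]

def decode_repetition(received, n=3):
    """Majority vote over each group of n received bits."""
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

message = [1, 0, 1, 1]
sent = encode_repetition(message)          # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[4] = 1                                # a single bit flipped by noise
assert decode_repetition(sent) == message  # majority vote still recovers it
```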
D Note that these concerns have nothing to do with the importance of messages. For example, a platitude such as “Thank you; come again” takes about as long to say or write as the urgent plea “Call an ambulance!”, even though the latter may be more important and more meaningful in many contexts. Information theory, however, does not consider message importance or meaning, as these are matters of the quality of data rather than of the quantity and readability of data, which are determined solely by probabilities.
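As a rough numerical illustration (the probabilities below are invented for the example), the amount of Shannon information in a message depends only on how improbable the message is, not on how urgent or meaningful it is:

```python
import math

def self_information_bits(p):
    """Shannon information -log2(p), in bits, of a message with probability p."""
    return -math.log2(p)

# Two phrases of similar length; the rarer one carries more bits,
# regardless of which matters more to the listener.
print(self_information_bits(0.05))    # e.g. "Thank you; come again" -> ~4.3 bits
print(self_information_bits(0.0005))  # e.g. "Call an ambulance!"    -> ~11.0 bits
```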
E Information theory is closely associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, cybernetics, informatics, machine learning, along with systems sciences of many descriptions. Information theory is a broad and deep mathematical theory, with equally broad and deep applications, amongst which is the vital field of coding theory.
F Coding theory is concerned with finding explicit methods, called codes, for increasing the efficiency and reducing the net error rate of data communication over a noisy channel, up to the limit that Shannon proved is the maximum possible for that channel. These codes can be roughly subdivided into data compression (source coding) and error-correction (channel coding) techniques; the rate at which information can be transmitted reliably depends on the amount of noise in the channel. In the case of error correction, it took many years to find the methods Shannon’s work proved were possible. A third class of information theory codes is cryptographic algorithms (both codes and ciphers). Concepts, methods and results from
...