## 2 thoughts on “C. E. Shannon”

1. shinichi Post author

A Mathematical Theory of Communication

by C. E. Shannon

Reprinted with corrections from The Bell System Technical Journal,

Vol. 27, pp. 379–423, 623–656, July, October, 1948.

http://worrydream.com/refs/Shannon%20-%20A%20Mathematical%20Theory%20of%20Communication.pdf


Introduction

The recent development of various methods of modulation such as PCM and PPM which exchange bandwidth for signal-to-noise ratio has intensified the interest in a general theory of communication. A basis for such a theory is contained in the important papers of Nyquist and Hartley on this subject. In the present paper we will extend the theory to include a number of new factors, in particular the effect of noise in the channel, and the savings possible due to the statistical structure of the original message and due to the nature of the final destination of the information.

The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. The system must be designed to operate for each possible selection, not just the one which will actually be chosen since this is unknown at the time of design.

If the number of messages in the set is finite then this number or any monotonic function of this number can be regarded as a measure of the information produced when one message is chosen from the set, all choices being equally likely. As was pointed out by Hartley the most natural choice is the logarithmic function. Although this definition must be generalized considerably when we consider the influence of the statistics of the message and when we have a continuous range of messages, we will in all cases use an essentially logarithmic measure.
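In symbols (a restatement for convenience, not quoted from the paper): for a set of $N$ equally likely messages, Hartley's logarithmic measure of the information produced by one selection is

```latex
H = \log N
```

where the base of the logarithm fixes the unit (base 2 gives bits).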

The logarithmic measure is more convenient for various reasons:

1. It is practically more useful. Parameters of engineering importance such as time, bandwidth, number of relays, etc., tend to vary linearly with the logarithm of the number of possibilities. For example, adding one relay to a group doubles the number of possible states of the relays. It adds 1 to the base 2 logarithm of this number. Doubling the time roughly squares the number of possible messages, or doubles the logarithm, etc.

2. It is nearer to our intuitive feeling as to the proper measure. This is closely related to (1) since we intuitively measure entities by linear comparison with common standards. One feels, for example, that two punched cards should have twice the capacity of one for information storage, and two identical channels twice the capacity of one for transmitting information.

3. It is mathematically more suitable. Many of the limiting operations are simple in terms of the logarithm but would require clumsy restatement in terms of the number of possibilities.
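Point 1 above can be checked numerically. The following is a minimal sketch (my own illustration, not from the paper): each added relay doubles the number of possible states, which adds exactly 1 to the base-2 logarithm.

```python
import math

def bits(num_states: int) -> float:
    """Base-2 logarithmic measure for equally likely states."""
    return math.log2(num_states)

# n on/off relays have 2**n possible states.
for n in range(1, 5):
    states = 2 ** n
    # Adding one relay doubles `states` and adds exactly 1 bit.
    print(f"{n} relays -> {states} states -> {bits(states)} bits")
```

Similarly, squaring the number of possible messages (roughly what doubling the time does) doubles the logarithm, since `log(N**2) == 2 * log(N)`.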

2. shinichi Post author

(sk)

This is the "paper" cited at the beginning of many Japanese textbooks on "information": Shannon's "A Mathematical Theory of Communication". The textbooks say things like:

In that paper, an amount of information is defined by focusing on one particular aspect of "information", and on that basis the discipline now called information theory is constructed. In this course we introduce an outline of Shannon's information theory.

and so on.

But this is not a paper about "information", nor even about "information technology"; it is a paper about "a mathematical theory of communication".

The insensitivity of people who place it at the front of "information" textbooks without the slightest doubt is simply astounding.