Transmission of Information (1927)
September 9, 2023

In "Transmission of Information", Hartley developed a quantitative measure of "information" in 1927. He claimed that information is the outcome of a selection among a finite set of possible messages. Shannon's "A Mathematical Theory of Communication", which builds in part on Hartley's ideas, was published in 1948. Unlike Shannon, who modeled the source of information as a random process, Hartley did not model the source probabilistically.
Hartley argued that, to be of practical engineering value, the measure of information should be proportional to the number of selections. He used the letter \(H\) to denote the amount of information associated with \(n\) selections: $$ H=Kn $$ where \(K\) is a constant that depends on the number \(s\) of symbols available at each selection. Consider two systems with symbol counts \(s_1, s_2\) and constants \(K_1, K_2\). We define \(K_1, K_2\) by the condition that whenever the number of possible sequences is the same for both systems, the amount of information is also the same for both: $$ s_1^{n_1} = s_2^{n_2} $$ $$ H=K_1n_1=K_2n_2 $$ Taking logarithms of the first condition gives \(n_1 \log s_1 = n_2 \log s_2\), so \(K_1/K_2 = n_2/n_1 = \log s_1 / \log s_2\), which yields $$ \frac{K_1}{\log s_1}=\frac{K_2}{\log s_2} $$
This relation can hold for every choice of \(s\) only if \(K\) depends on \(s\) as $$ K=K_0\log s $$ where \(K_0\) is the same for all systems.
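To see that the form \(K = K_0 \log s\) satisfies the condition above, here is a minimal Python check. The two systems \(s_1 = 2, n_1 = 6\) and \(s_2 = 8, n_2 = 2\) are hypothetical examples chosen so that both admit \(2^6 = 8^2 = 64\) possible sequences; any common \(K_0\) then yields the same \(H\):

```python
import math

# Two hypothetical systems chosen so they have the same number of
# possible sequences: 2^6 = 64 = 8^2.
s1, n1 = 2, 6   # binary alphabet, six selections
s2, n2 = 8, 2   # eight-symbol alphabet, two selections
assert s1 ** n1 == s2 ** n2

K0 = 1.0  # arbitrary, but the same K0 must be used for both systems

# With K = K0 * log(s), the information H = K * n agrees:
H1 = K0 * math.log(s1) * n1
H2 = K0 * math.log(s2) * n2
print(H1, H2)  # both equal log(64) ≈ 4.1589
assert math.isclose(H1, H2)
```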
Since \(K_0\) is arbitrary, we can omit it by making the base of the logarithm arbitrary. We can then state that: $$ H=n\log s $$
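As a concrete illustration (not from Hartley's paper), the sketch below computes \(H = n \log s\) with base-2 logarithms, so the unit of information is the bit; the function name `hartley_information` is my own:

```python
import math

def hartley_information(n: int, s: int) -> float:
    """Hartley information H = n * log2(s), measured in bits."""
    return n * math.log2(s)

# Six selections from a binary alphabet carry 6 bits of information,
# the same as two selections from an eight-symbol alphabet.
print(hartley_information(6, 2))  # 6.0
print(hartley_information(2, 8))  # 6.0
```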