The main problem of information and coding theory can be described in a simple way as follows. Imagine that a stream of source data, say in the form of bits (0s and 1s), is being transmitted over a communications channel, such as a telephone line. From time to time, disruptions take place along the channel, causing some of the 0s to be turned into 1s, and vice versa. The question is "How can we tell when the original source data has been changed, and when it has, how can we recover the original data?"
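The simplest answer to this question is the three-fold repetition code: send every bit three times and decode by majority vote, which corrects any single flipped bit per block. A minimal sketch (the function names are illustrative, not from the book):

```python
def encode(bits):
    """Encode each bit by repeating it three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Decode by majority vote over each block of three bits;
    this corrects any single flipped bit per block."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

source = [1, 0, 1, 1]
sent = encode(source)          # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[4] = 1                    # channel noise flips one bit
assert decode(sent) == source  # majority vote recovers the data
```

The price of this protection is a three-fold expansion of the message; finding codes that protect against errors while wasting less bandwidth is the subject of the book's second part.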
This book is the English-language edition.
Contents
Preface
Introduction
Part 1 Information Theory
Chapter 1 Entropy
1.1 Entropy of a Source
1.2 Properties of Entropy
1.3 Additional Properties of Entropy
Chapter 2 Noiseless Coding
2.1 Variable Length Encoding
2.2 Huffman Encoding
2.3 The Noiseless Coding Theorem
Chapter 3 Noisy Coding
3.1 The Discrete Memoryless Channel and Conditional Entropy
3.2 Mutual Information and Channel Capacity
3.3 The Noisy Coding Theorem
3.4 Proof of the Noisy Coding Theorem and Its Strong Converse
Part 2 Coding Theory
Chapter 4 General Remarks on Codes
4.1 Error Detection and Correction
4.2 Minimum Distance Decoding
4.3 Families of Codes
4.4 Codes and Designs
4.5 The Main Coding Theory Problem
Chapter 5 Linear Codes
5.1 Linear Codes and Their Duals
5.2 Weight Distributions
5.3 Maximum Distance Separable Codes
5.4 Invariant Theory and Self-Dual Codes
Chapter 6 Some Linear Codes
6.1 Hamming and Golay Codes
6.2 Reed-Muller Codes
Chapter 7 Finite Fields and Cyclic Codes
Chapter 8 Some Cyclic Codes
Appendix Preliminaries
Tables
References
Symbol Index
Index