Informationstheorie und Codierung
These notes are a classroom record of the course Informationstheorie und Codierung, summarizing the key concepts covered.
Lecturer
Main topics
- Introduction: binomial distribution, (7,4)-Hamming code, parity-check matrix, generator matrix
- Probability, entropy, and inference: entropy, conditional probability, Bayes’ law, likelihood, Jensen’s inequality
- Inference: inverse probability, statistical inference
- The source coding theorem: information content, typical sequences, Chebyshev inequality, law of large numbers
- Symbol codes: unique decodability, expected codeword length, prefix-free codes, Kraft inequality, Huffman coding
- Stream codes: arithmetic coding, Lempel-Ziv coding, Burrows-Wheeler transform
- Dependent random variables: mutual information, data processing lemma
- Communication over a noisy channel: discrete memoryless channel, channel coding theorem, channel capacity
- The noisy-channel coding theorem: jointly-typical sequences, proof of the channel coding theorem, proof of converse, symmetric channels
- Error-correcting codes and real channels: AWGN channel, multivariate Gaussian pdf, capacity of AWGN channel
- Binary codes: minimum distance, perfect codes, why perfect codes are bad, why distance isn’t everything
- Message passing: distributed counting, path counting, low-cost path, min-sum (=Viterbi) algorithm
- Exact marginalization in graphs: factor graphs, sum-product algorithm
- Low-density parity-check codes: density evolution, check node degree, regular vs. irregular codes, girth
- Lossy source coding: transform coding and JPEG compression
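The (7,4)-Hamming code from the introduction can be made concrete with a short sketch. The systematic generator and parity-check matrices below follow one common convention (message bits first, three parity bits appended); the lecture's exact matrices may differ, but the relations H·G^T = 0 and "syndrome = column of H at the error position" hold regardless:

```python
# Sketch of a systematic (7,4) Hamming code over GF(2).
# Codeword layout: [d1 d2 d3 d4 p1 p2 p3] (an assumed convention).

G = [  # generator matrix [I_4 | P]: rows are basis codewords
    [1, 0, 0, 0, 1, 0, 1],
    [0, 1, 0, 0, 1, 1, 0],
    [0, 0, 1, 0, 1, 1, 1],
    [0, 0, 0, 1, 0, 1, 1],
]
H = [  # parity-check matrix [P^T | I_3]; satisfies H * G^T = 0 (mod 2)
    [1, 1, 1, 0, 1, 0, 0],
    [0, 1, 1, 1, 0, 1, 0],
    [1, 0, 1, 1, 0, 0, 1],
]

def encode(bits):
    """Multiply a 4-bit message by G over GF(2)."""
    return [sum(b * g for b, g in zip(bits, col)) % 2 for col in zip(*G)]

def syndrome(word):
    """Compute H * word^T over GF(2); all-zero means a valid codeword."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

cw = encode([1, 0, 1, 1])
assert syndrome(cw) == [0, 0, 0]   # valid codeword
cw[2] ^= 1                         # flip one bit
s = syndrome(cw)                   # nonzero syndrome = column 2 of H
```

Because all seven columns of H are distinct and nonzero, the syndrome uniquely identifies any single-bit error, which is exactly why the code corrects one error.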
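For the entropy chapter, a minimal sketch of Shannon entropy, including the consequence of Jensen's inequality that the uniform distribution maximizes entropy (the example distribution is illustrative, not from the lecture):

```python
import math

def entropy(p):
    """Shannon entropy in bits; terms with p_i = 0 contribute 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]
H = entropy(p)                            # 1.75 bits for this dyadic distribution
assert abs(H - 1.75) < 1e-12
assert H <= math.log2(len(p)) + 1e-12     # Jensen: H(X) <= log2 |X|
```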
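Finally, the channel-capacity topic has a closed form for the binary symmetric channel: C = 1 - H2(f), achieved by a uniform input distribution. A small sketch (the flip probabilities used are illustrative):

```python
import math

def h2(f):
    """Binary entropy function in bits."""
    if f in (0.0, 1.0):
        return 0.0
    return -f * math.log2(f) - (1 - f) * math.log2(1 - f)

def bsc_capacity(f):
    """Capacity of the binary symmetric channel with flip probability f."""
    return 1.0 - h2(f)

assert bsc_capacity(0.0) == 1.0      # noiseless channel: 1 bit per use
assert bsc_capacity(0.5) == 0.0      # output independent of input: useless
```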