Informationstheorie und Codierung

These are lecture notes for the course Informationstheorie und Codierung, mainly a summary of the key concepts covered.

Instructor

Main contents

  1. Introduction: binomial distribution, (7,4)-Hamming code, parity-check matrix, generator matrix (see the code sketch after this list)
  2. Probability, entropy, and inference: entropy, conditional probability, Bayes’ law, likelihood, Jensen’s inequality
  3. Inference: inverse probability, statistical inference
  4. The source coding theorem: information content, typical sequences, Chebyshev inequality, law of large numbers
  5. Symbol codes: unique decodability, expected codeword length, prefix-free codes, Kraft inequality, Huffman coding
  6. Stream codes: arithmetic coding, Lempel-Ziv coding, Burrows-Wheeler transform
  7. Dependent random variables: mutual information, data processing lemma
  8. Communication over a noisy channel: discrete memoryless channel, channel coding theorem, channel capacity
  9. The noisy-channel coding theorem: jointly-typical sequences, proof of the channel coding theorem, proof of converse, symmetric channels
  10. Error-correcting codes and real channels: AWGN channel, multivariate Gaussian pdf, capacity of AWGN channel
  11. Binary codes: minimum distance, perfect codes, why perfect codes are bad, why distance isn’t everything
  12. Message passing: distributed counting, path counting, low-cost path, min-sum (=Viterbi) algorithm
  13. Exact marginalization in graphs: factor graphs, sum-product algorithm
  14. Low-density parity-check codes: density evolution, check node degree, regular vs. irregular codes, girth
  15. Lossy source coding: transform coding and JPEG compression
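As a small illustration of item 1 above, here is a minimal Python sketch of a systematic (7,4) Hamming code built from a generator matrix G = [I4 | P] and parity-check matrix H = [P^T | I3]. The particular choice of P and the helper names `encode`/`correct` are assumptions for this example, not taken from the notes; they show one common convention for single-error correction via syndrome decoding.

```python
import numpy as np

# One common choice of P for a systematic (7,4) Hamming code:
# 4 message bits followed by 3 parity bits.
P = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 1, 1],
              [1, 0, 1]])

G = np.hstack([np.eye(4, dtype=int), P])      # generator matrix, 4 x 7
H = np.hstack([P.T, np.eye(3, dtype=int)])    # parity-check matrix, 3 x 7

assert not (G @ H.T % 2).any()                # G H^T = 0 over GF(2)

def encode(msg):
    """Map a length-4 message to a length-7 codeword (mod-2 arithmetic)."""
    return msg @ G % 2

def correct(received):
    """Correct at most one flipped bit via syndrome decoding."""
    syndrome = H @ received % 2
    if syndrome.any():
        # For a single flip, the syndrome equals the column of H
        # at the flipped position.
        for i in range(7):
            if np.array_equal(H[:, i], syndrome):
                received = received.copy()
                received[i] ^= 1
                break
    return received

msg = np.array([1, 0, 1, 1])
codeword = encode(msg)
noisy = codeword.copy()
noisy[2] ^= 1                                 # flip one bit
assert np.array_equal(correct(noisy), codeword)
print("codeword:", codeword, "-> corrected:", correct(noisy))
```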

Related materials