Advanced Information Theory - 情報理論特論
電気通信大学 — University of Electro-Communications, Tokyo
2010 Fall Semester 後学期
- Class meeting: Thursdays, 10:40 to 12:10. 木曜日2時限
- Classroom (October-December 2010): West Building 1, Room 217 (西1号館 217室)
- Classroom (January-February 2011): West Building 5, Room 101 (西5号館101室)
- Course web site: http://www.lit.ice.uec.ac.jp/kurkoski/ait
- Instructor: Brian KURKOSKI ブライアン クルカスキー (West Building 1, Room 301) kurkoski@ice.uec.ac.jp 0424-43-5295
- Instructor: Hideki Yagi 八木秀樹
教務課のシラバス Official Syllabus
Textbook 教科書
Recommended (but not required) textbooks:
- Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, 2nd Edition, Wiley-Interscience, 2006. book
- David J. C. MacKay, Information Theory, Inference & Learning Algorithms, Cambridge University Press, 2003. Free PDF
Prerequisites
- A previous course on information theory or coding theory is required.
- Previous courses in discrete mathematics or linear algebra are desirable.
- 前もって履修しておくべき科目 情報理論、符号理論
- 前もって履修しておくことが望ましい科目 離散数学、線型代数学
Objectives - 主題
Information theory is a field concerned with mathematical aspects of information and communication. Modern communication systems are based on these ideas, which are normally divided into the areas of "channel coding" and "source coding".
The focus of this class is channel coding. In particular, the goal is an understanding of Shannon's well-known channel coding theorem, as well as of how it applies to practical communication systems. Mutual information and channel capacity, which are used to evaluate channel reliability, are explained, and then the channel coding theorem itself is presented. Recent developments in error-correcting codes and decoding algorithms are reviewed, and codes based upon lattices, along with their decoding algorithms, are discussed.
情報理論は情報・通信の様々な性質を数学的に論じる学問分野である.特に,C.E. Shannonによって示された2つの符号化定理(情報源符号化定理と通信路符号化定理)は現在のデジタル通信システムの礎となっている.本授業では,情報理論の2大テーマのうち「通信路符号化」に焦点を当てて講義する.ここでは,Shannonが示した符号化定理を理解し,符号化定理を実現するための様々な技術を理解することを目標とする.
デジタル通信の信頼性の評価に必要な相互情報量や通信路容量について解説した後,通信路符号化定理について述べる.その後,具体的な誤り訂正符号や復号アルゴリズムを最新の技術も取り入れながら概説する.さらに,通信路符号化の発展として,通信路等化アルゴリズムやLattice符号を用いた通信システムなどに言及する.
Outline - 授業内容
- Lecture 1: Overview, models of digital communications systems
- Lecture 2: Low-density parity-check (LDPC) code construction
- Lecture 3: Belief-propagation decoding of LDPC codes
- Lecture 4: Encoding of LDPC codes
- Lecture 5: Probabilistic inference: Intersymbol-interference channels and the BCJR algorithm
- Lecture 6: Probabilistic inference: Turbo codes
- Lecture 7: Probabilistic inference: Lattice codes
- Lecture 8: Probabilistic inference: Gaussian mixtures for lattices
- Lecture 9: Lattices for the AWGN channel
- Lecture 10: Capacity of the memoryless channel
- Lecture 11: Channel coding theorem: proof
- Lecture 12: Channel coding theorem: proof converse
- Lecture 13: Notable linear codes: Hamming codes and Reed-Solomon codes
- Lecture 14: Capacity-achieving codes: iterative codes
- Lecture 15: Capacity-achieving codes: concatenated codes
- 第1回: ガイダンス、デジタル情報通信と通信路符号化のモデル
- 第2回: 確率的復号可能な符号:LDPC符号の構成
- 第3回: LDPC符号:確率伝搬復号
- 第4回: LDPC符号:符号化
- 第5回: 統計的推論アルゴリズム:符号間干渉通信路とBCJR検出
- 第6回: 統計的推論アルゴリズム:Turbo等化
- 第7回: 統計的推論アルゴリズム:Lattice符号
- 第8回: 統計的推論アルゴリズム:ガウス混合分布
- 第9回: AWGN通信路に対するLattice
- 第10回: 無記憶通信路の通信路容量
- 第11回: 通信路符号化定理:順定理
- 第12回: 通信路符号化定理:逆定理
- 第13回: 重要な線形符号:Hamming符号とReed-Solomon符号
- 第14回: 通信路容量を達成する符号:繰り返し符号
- 第15回: 通信路容量を達成する符号:連接符号
Schedule — スケジュール
Note: Lectures 1-9 are given by Brian Kurkoski. Lectures 10-15 are given by Hideki Yagi.
Lecture 1, October 7
Lecture: Overview ガイダンス
Lecture 2, October 14
Lecture (1) Basic probability terms (2) Bayesian networks and Pearl's example, p. 36 (3) Belief propagation (4) Introduction to coding theory (Justesen text, Chapter 1, vector spaces and definition of a code).
Homework
Lecture 3, October 21
Lecture (1) Additional tips on homework (2) generator and parity-check matrices
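The relationship between the two matrices can be sketched in a few lines. This is an illustrative Python example (the homework for this course uses Matlab); the toy (6,3) code below is an arbitrary choice, not one used in class:

```python
import numpy as np

# Systematic generator matrix G = [I | P] for a toy (6,3) code
# (this particular P is arbitrary, chosen only for illustration).
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1]])
G = np.hstack([np.eye(3, dtype=int), P])
# The matching parity-check matrix is H = [P^T | I].
H = np.hstack([P.T, np.eye(3, dtype=int)])

# Every codeword c = uG satisfies Hc^T = 0 over GF(2),
# which is equivalent to G H^T = 0 (mod 2).
print((G @ H.T) % 2)  # 3x3 all-zeros matrix
```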
Lecture 4, October 28
Lecture (1) Decoding (7,4) Hamming code. (2) Singleton bound. (3) Comparison of RS/BCH, convolutional, LDPC Codes (4) Introduction to LDPC codes: code definition, Tanner graph.
No homework
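A minimal sketch of syndrome decoding for the (7,4) Hamming code from this lecture, in Python for illustration (homework uses Matlab). With the parity-check columns ordered as the binary expansions of 1 through 7, the syndrome directly names the error position:

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code with column j (1-indexed)
# equal to the binary expansion of j, so the syndrome spells out the
# position of a single bit error.
H = np.array([[(j >> k) & 1 for j in range(1, 8)] for k in (2, 1, 0)])

def decode(r):
    """Correct at most one bit error in the received word r (length 7)."""
    s = H @ r % 2                        # syndrome
    pos = int(''.join(map(str, s)), 2)   # 0 means no error detected
    r = r.copy()
    if pos:
        r[pos - 1] ^= 1                  # flip the erroneous bit
    return r

c = np.zeros(7, dtype=int)   # the all-zeros word is always a codeword
r = c.copy()
r[4] ^= 1                    # channel flips bit 5
print(decode(r))             # [0 0 0 0 0 0 0]: the error is corrected
```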
Lecture 5, November 4
Lecture Sum-Product/Belief-Propagation decoding of LDPC codes
Homework 2 pdf
Tip for Problem 1:
- It is not known whether the decoder will converge within the 5 steps. You don't need many iterations; you only need to follow the 5 steps in the problem. [Added Nov. 9]
Tips for Problem 2:
- Use block length N=96 instead of 100 (because the code rate is 1/2, M = N/2; to construct the column-weight-3 Gallager parity-check matrix, it is easier if M is divisible by 3). [added Nov. 4]
- If you are having problems with your decoder, make sure it works for the cases of no errors or few errors. For example, use the all-zeros vector (that is, all the Pr(xi | yi) are 1-p) as the input to your decoder. Then, try adding just one error (one probability is p, the other N-1 are 1-p). The "correct" behavior should be obvious when there are few errors. [added Nov. 5]
- There is one important point about LDPC implementations that I did not mention in class. If the probabilities get too close to 0 or 1, then the decoding can fail. Somewhere in your code, perhaps at the output of the variable node, add statements like this:

    Plim = 1E-4;            % floor/ceiling on message probabilities
    if (q < Plim)
        q = Plim;
    end
    if (q > 1 - Plim)
        q = 1 - Plim;
    end

This way, q is never smaller than Plim and never bigger than 1-Plim. The exact value of Plim is not important; Plim = 1E-3 to 1E-7 should be OK. [added Nov. 5]
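For Problem 2, Gallager's regular construction can be sketched as follows (in Python for illustration, though the homework asks for Matlab; the function name gallager_H and the seed are my own choices):

```python
import numpy as np

def gallager_H(N=96, wc=3, wr=6, seed=0):
    """Gallager's (wc, wr)-regular construction: wc horizontal bands of
    M/wc rows; the first band has wr consecutive ones per row, and the
    remaining bands are random column permutations of it."""
    M = N * wc // wr              # M = 48 here, so the code rate is 1/2
    rows = M // wc                # 16 rows per band
    band = np.zeros((rows, N), dtype=int)
    for i in range(rows):
        band[i, i * wr:(i + 1) * wr] = 1
    rng = np.random.default_rng(seed)
    bands = [band] + [band[:, rng.permutation(N)] for _ in range(wc - 1)]
    return np.vstack(bands)

H = gallager_H()
print(H.shape)                               # (48, 96)
print(sorted(set(map(int, H.sum(axis=0)))))  # [3]: every column has weight 3
print(sorted(set(map(int, H.sum(axis=1)))))  # [6]: every row has weight 6
```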
Lecture 6, November 11
Lecture 7, November 25
Lecture LDPC density evolution for the erasure channel (1) degree distributions (2) messages and decoding rules for BEC (3) all-zeros codeword, ensembles, concentration (4) density evolution and noise threshold
Homework 3 pdf
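The density-evolution recursion for the BEC from this lecture can be run numerically. Here is a sketch in Python (the function name and the bisection tolerances are my own choices); for the (3,6)-regular ensemble it recovers the well-known erasure threshold near 0.429:

```python
def de_threshold(wc=3, wr=6, max_iters=5000, tol=1e-4):
    """Erasure threshold of the (wc, wr)-regular LDPC ensemble on the BEC,
    found by bisection on the channel erasure probability eps."""
    def converges(eps):
        x = eps  # erasure probability of variable-to-check messages
        for _ in range(max_iters):
            # check node: output erased unless all wr-1 inputs are known;
            # variable node: erased only if the channel and all wc-1 inputs erase
            x = eps * (1 - (1 - x) ** (wr - 1)) ** (wc - 1)
            if x < tol:
                return True
        return False

    lo, hi = 0.0, 1.0
    while hi - lo > 1e-4:
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if converges(mid) else (lo, mid)
    return lo

print(round(de_threshold(), 3))  # 0.429 for the (3,6)-regular ensemble
```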
Lecture 8, December 2
Lecture Encoding LDPC codes. Finding canonical form. Richardson-Urbanke encoding. Code rate vs. design rate. Modified array codes.
Homework (1) Use Matlab to implement Richardson-Urbanke Encoding (2) Find parameters of modified array codes (Details are in the slides)
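For context on what "finding canonical form" involves, here is a Python sketch that row-reduces H over GF(2) into [A | I_M] and then encodes systematically. This is plain Gaussian elimination, not the Richardson-Urbanke method itself (which exploits sparsity to keep encoding complexity near-linear), and it assumes the needed pivots exist:

```python
import numpy as np

def canonical_form(H):
    """Row-reduce H over GF(2) so the last M columns become the identity:
    H -> [A | I_M]. Assumes pivots exist; a full implementation would also
    permute columns and track the permutation."""
    H = H.copy() % 2
    M, N = H.shape
    for i in range(M):
        col = N - M + i
        r = next(r for r in range(i, M) if H[r, col])  # pivot row
        H[[i, r]] = H[[r, i]]
        for r2 in range(M):
            if r2 != i and H[r2, col]:
                H[r2] = (H[r2] + H[i]) % 2  # XOR rows over GF(2)
    return H

def encode(Hc, u):
    """Systematic encoding with Hc = [A | I_M]: parity p = A u (mod 2)."""
    M = Hc.shape[0]
    p = Hc[:, : Hc.shape[1] - M] @ u % 2
    return np.concatenate([u, p])

# small worked example: the (7,4) Hamming parity-check matrix
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])
Hc = canonical_form(H)
c = encode(Hc, np.array([1, 0, 1, 1]))
print(Hc @ c % 2)  # [0 0 0]: c satisfies all parity checks
```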
Lecture 9, December 16
Lecture 10, Tuesday, January 18
Lectures 10-12 are held at W5-101 on Tuesday, 9:00 to 10:30
Lecture Entropy. Mutual Information. Capacity. Channel Coding Theorem.
Homework (1) Calculation of entropy (2) Relationship between mutual information and entropy
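The quantities in this lecture are easy to check numerically. A small Python sketch of the binary entropy function and the BSC capacity formula C = 1 - H(p) (function names H2 and bsc_capacity are my own):

```python
from math import log2

def H2(p):
    """Binary entropy function H(p), in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover p: C = 1 - H(p)."""
    return 1.0 - H2(p)

print(H2(0.5))             # 1.0: a fair coin carries exactly one bit
print(bsc_capacity(0.11))  # ~0.5: BSC(0.11) supports about half a bit per use
```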
Lecture 11, Tuesday, January 25
Lecture Capacity. (Direct) Channel Coding Theorem. Random code ensemble. Jointly typical decoding. Law of large numbers.
Homework (1) Conditional entropy on BSC (2) Weak law of large numbers
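The weak law of large numbers behind jointly typical decoding can be demonstrated empirically. A Python sketch (the seed and sample sizes are arbitrary choices):

```python
import random

# Weak law of large numbers: the sample mean of i.i.d. Bernoulli(p)
# variables concentrates around p as n grows, which is what makes
# typical-set arguments work.
random.seed(1)       # fixed seed so the run is reproducible
p = 0.3
for n in (100, 10_000, 1_000_000):
    mean = sum(random.random() < p for _ in range(n)) / n
    print(n, mean)   # |mean - p| typically shrinks like 1/sqrt(n)
```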
Lecture 12, Tuesday, February 1
Lecture (Converse) Channel Coding Theorem. Properties of Entropy and Mutual Information, Fano's Inequality, Data Processing Inequality
Homework (1) Mutual information for a Markov chain (2) Property of Entropy
Lecture 13, Tuesday, February 8
- Begins at 9 a.m. at W5-101
Final Report
- Report PDF
- Deadline: Feb. 21 (Mon), 2011.
- Submission: Hideki Yagi(八木秀樹)’s mailbox on the 2nd floor, W1 building
Assessment Policy - 成績評価方法
- Homework 課題: 40%
- Midterm project 中間レポート: 30%
- Final project 期末レポート: 30%