Information Theory

2015/16
Lecturer: Stefan Dziembowski
TA: Daniel Malinowski


Lecture: Wednesdays 12:15 - 13:45 (room 5440)
Exercises:
SD: Tuesdays 16:15 - 17:45 (room 3150)
DM: Wednesdays 14:15 - 15:45 (room 2280)

Assessment methods and assessment criteria: there will be a mid-term exam and a final written exam consisting of two parts: "theory" and "exercises". The final grade will be a function of the sum 0.2 * A + 0.8 * B, where A and B are the numbers of points obtained in the mid-term and the final exam, respectively.

Lectures:
  • 7.10.2015 Source Coding

    We followed Chapter 1 of [1]. We defined codes, uniquely decodable codes, and instantaneous (prefix) codes. We proved the Kraft and McMillan inequalities. We skipped the material on pages 6-8 (in particular, we did not do the Sardinas-Patterson Theorem).
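    As a small illustration (our addition, not from [1]): Kraft's inequality says that a prefix code over an alphabet of size r with codeword lengths l_1, ..., l_n exists if and only if sum_i r^(-l_i) <= 1, and McMillan's inequality shows that the same bound holds for every uniquely decodable code. A minimal Python sketch that checks the bound:

        # Check Kraft's inequality for a list of codeword lengths.
        from fractions import Fraction

        def kraft_sum(lengths, r=2):
            return sum(Fraction(1, r ** l) for l in lengths)

        print(kraft_sum([1, 2, 3, 3]))  # 1   -> a complete binary prefix code exists
        print(kraft_sum([1, 2, 2, 2]))  # 5/4 -> no binary prefix code has these lengths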

  • 14.10.2015 Optimal Codes

    We followed Chapter 2 of [1] (without Section 2.6). We defined the average word-length of a code and optimal codes, and we proved that optimal codes always exist. Then we introduced Huffman codes (we showed the algorithm for generating them) and proved that Huffman codes are optimal (for any alphabet).
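    As a side note (our sketch, not an excerpt from [1]): binary Huffman coding repeatedly merges the two least probable subtrees, and the codeword of a symbol records the merge directions. A minimal Python version:

        # Binary Huffman coding via a min-heap of (probability, tiebreak, codes).
        import heapq
        from itertools import count

        def huffman(probs):                      # probs: {symbol: probability}
            tiebreak = count()                   # avoids comparing dicts on ties
            heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
            heapq.heapify(heap)
            while len(heap) > 1:
                p0, _, c0 = heapq.heappop(heap)  # two least probable subtrees
                p1, _, c1 = heapq.heappop(heap)
                merged = {s: "0" + w for s, w in c0.items()}
                merged.update({s: "1" + w for s, w in c1.items()})
                heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
            return heap[0][2]

        print(huffman({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
        # {'a': '0', 'b': '10', 'c': '110', 'd': '111'} (ties may give other optimal codes)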

  • 21.10.2015 Entropy

    We talked about extensions of sources (Section 2.6 of [1]). Then we followed Chapter 3 of [1] up to Corollary 3.12. In particular, we introduced the notions of entropy and binary entropy, and proved some basic facts about them. (In Section 3.1 we used the term Inf instead of I to avoid confusion with the "mutual information" that will be introduced later.)
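    For concreteness (our addition), the two notions can be computed as follows:

        # Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits, and the
        # binary entropy function h(p) = H(p, 1 - p).
        from math import log2

        def entropy(probs):
            return -sum(p * log2(p) for p in probs if p > 0)

        def binary_entropy(p):
            return entropy([p, 1 - p])

        print(entropy([0.25] * 4))   # 2.0 bits: uniform distribution on 4 symbols
        print(binary_entropy(0.5))   # 1.0 bit: a fair coin maximizes binary entropy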

  • 28.10.2015 - 16.12.2015: we followed [1], except for the following:
    • we covered the introduction to entropy (e.g. the chain rules and Markov chains) from [3]; see the note after this list,
    • we skipped Section 4.5 (Extension of Shannon's First Theorem to Information Channels),
    • Section 6.6 (Hadamard Matrices and Codes) was covered in the exercises,
    • Fano's inequality was done from [3], Section 2.10,
    • we skipped Section 7.5 (The Golay Codes).
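    The note referred to above (standard statements, cf. Chapter 2 of [3]): the chain rule expands joint entropy into conditional entropies,

        H(X_1, \ldots, X_n) = \sum_{i=1}^{n} H(X_i \mid X_1, \ldots, X_{i-1}),

    and for a Markov chain X -> Y -> Z the data-processing inequality I(X; Z) <= I(X; Y) holds.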

  • 13.01.2016 Quantum cryptography (see [2]), cryptography from noisy channels (the definition of secrecy capacity and Lemma 1 from Section II of [6])
  • 20.01.2016 List-decoding of Hadamard codes [9]

  • 27.01.2016 Average-case extractors, two-source extractors [4], two-party communication complexity.
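    A tiny illustration of the extractor topic above (our addition, not from the lecture materials): the inner product over GF(2) is the classic example of a two-source extractor, producing one nearly uniform bit from two independent sources that each have sufficiently high min-entropy.

        # Inner product over GF(2) of the bit representations of x and y.
        def ip(x: int, y: int) -> int:
            return bin(x & y).count("1") % 2

        print(ip(0b1011, 0b0110))   # <(1,0,1,1), (0,1,1,0)> mod 2 = 1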
Exercises:
  • 07.10 (DM) and 13.10 (SD): We introduced the notions of maximal codes and maximal prefix codes. We solved Exercises 1-3 and Task 1 from [2] (in Exercise 1 we added two examples: f) 0, 010, 11 and g) 00, 10, 111, 001, 101, 0011, 1011). We proved that for a finite prefix code C the following conditions are equivalent: a) C is a maximal code, b) C is a maximal prefix code, c) C satisfies Kraft's inequality with equality. We constructed a prefix code with the given codeword lengths 1, 2, 2, 2, 2, 2, 3, 3 over the ternary alphabet (see the sketch below). We also showed an example of a uniquely decodable code and an infinite word that can be decoded in two different ways. Exercise 4 from [2] was homework.
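    The sketch referred to above (our illustration, with hypothetical helper names): a canonical construction of a prefix code with prescribed codeword lengths, which succeeds whenever Kraft's inequality holds.

        # Build a prefix code with the given codeword lengths over an r-ary
        # alphabet, handing out codewords in canonical (sorted) order.
        # Assumes Kraft's inequality sum_i r^(-l_i) <= 1 holds.
        def prefix_code(lengths, r=3):
            lengths = sorted(lengths)
            words, v, prev = [], 0, lengths[0]
            for l in lengths:
                v *= r ** (l - prev)    # extend the current value to length l
                prev = l
                digits, x = [], v
                for _ in range(l):      # base-r representation, zero-padded
                    digits.append(str(x % r))
                    x //= r
                words.append("".join(reversed(digits)))
                v += 1                  # move on to the next codeword
            return words

        print(prefix_code([1, 2, 2, 2, 2, 2, 3, 3], r=3))
        # ['0', '10', '11', '12', '20', '21', '220', '221']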
  • 14.10 (DM): We solved the homework from the previous exercises. We discussed why Huffman codes are not perfect in real life*: we showed that coding tuples of symbols can be better, as in Example 2.12 from [1]. We solved Exercises 2.3, 2.10, 2.13, 2.14 from [1] and Exercise 3 from [2] (Task 1 was homework for the next week). We also compared the average word-lengths of the codes for sources with the distributions: a) 1/3, 2/3 and b) 0, 1/3, 2/3.
  • 20.10 (SD): We solved the homework from the previous exercises. We discussed why Huffman codes are not perfect in real life*: we showed that coding tuples of symbols can be better, as in Example 2.12 from [1]. We solved Exercises 2.3, 2.10, 2.13, 2.14 from [1].
  • 27.10 (SD)
  • 28.10 (DM): we solved the homework from last week, we showed that H(X, Y) <= H(X) + H(Y) (with equality if and only if X and Y are independent; see the identity below), and we did Exercises 3.3, 3.4, 3.6 and Section 3.7 from [1]. Homework: 2.10 (b) from [3] and Exercise 5.7 from [2].
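    The identity referred to above (standard, cf. [3]): subadditivity holds because

        H(X) + H(Y) - H(X, Y) = I(X; Y) \ge 0,

    with equality exactly when X and Y are independent.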
  • 03.11 (SD)
  • 04.11 (DM): We solved the homework, Exercise 2.12 from [3], Exercise 1 from the mid-term exam from last year, and Exercises 1 and 2 from [here].
  • 10.11 (SD)
  • 17.11 (SD)
  • 18.11 (DM): We did Exercise 3 from [here], and Exercise 4.10 from [1]
  • 25.11 (DM): Exercises 4.12, 4.13, 4.15, Exercise 2 from [here], an example of a random variable X such that H(X) > M and H_inf(X) < 1/M (see the sketch below), and Exercises 5.5 and 5.8 from [1]. Homework: Exercise 1 from [here].
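    The sketch referred to above (our illustration, with the concrete choice M = 10): let X be 0 with probability 0.95 and otherwise uniform over N = 2^200 further values; then H(X) > 10 while H_inf(X) = -log2(max_x Pr[X = x]) < 0.1.

        # Verify the separation between Shannon entropy and min-entropy
        # for this distribution (parameters chosen for M = 10).
        from math import log2

        p0, N = 0.95, 2 ** 200
        shannon = -p0 * log2(p0) + (1 - p0) * log2(N / (1 - p0))
        min_entropy = -log2(p0)
        print(shannon, min_entropy)   # ~10.29 > 10 and ~0.074 < 0.1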
  • 01.12 (SD): Exercises 6.1, 6.20, 6.18 from [1]. Section 6.6 from [1] up to (and including) Lemma 6.28, with the proof.
  • 02.12 (DM): Exercise 1 from [here], Exercises 5.9, 6.1, 6.5, 6.18, 6.20, 6.21 from [1].
  • 08.12 (SD): Example 6.6, Exercise 6.4, Exercise 6.8, Example 6.20, Example 6.22, Exercise 6.10, Exercise 6.21.
  • 15.12 (SD): Exercise 6.6, Exercise 7.5, Exercise 7.7 (and the paragraph above it); we also showed how to obtain the generator matrix of the Hadamard code of length 7 (see the sketch below). Then we started to talk about cryptography: we used slides 15, 19, 20, 31-35 from [pptx,pdf] and the cryptography section from [2].
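    The sketch referred to above (our illustration): the generator matrix of the (punctured) Hadamard code of length 7 has as its columns all nonzero vectors of {0,1}^3.

        # Rows of the 3 x 7 generator matrix: row i lists the i-th bit of
        # every nonzero column index j, so the columns run through exactly
        # the nonzero vectors of {0,1}^3.
        k = 3
        G = [[(j >> i) & 1 for j in range(1, 2 ** k)] for i in range(k)]
        for row in G:
            print(row)
        # [1, 0, 1, 0, 1, 0, 1]
        # [0, 1, 1, 0, 0, 1, 1]
        # [0, 0, 0, 1, 1, 1, 1]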
  • 22.12 (SD): One-time pad (see the sketch below), Shannon's theorem (Lecture 6 of [2]; we did only the first part), pseudorandom generators, CPA-secure encryption, stream ciphers (see [pptx,pdf] and [pptx,pdf]), statistical distance (see [4]).
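    The sketch referred to above (our addition): the one-time pad XORs the message with a uniformly random key of the same length; since the key is used only once, the ciphertext is independent of the message (perfect secrecy).

        # One-time pad over bytes; encryption and decryption are the same XOR.
        import os

        def otp(key: bytes, msg: bytes) -> bytes:
            return bytes(k ^ m for k, m in zip(key, msg))

        msg = b"attack at dawn"
        key = os.urandom(len(msg))    # fresh uniform key of the same length
        ct = otp(key, msg)
        assert otp(key, ct) == msg    # XORing with the key again decrypts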
  • 12.01 (SD): One-time MACs, strongly universal hash families (Chapter 4.5 of [5]; see the sketch below), Blom's scheme.
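    The sketch referred to above (our illustration of a standard construction in the spirit of Chapter 4.5 of [5]): the family h_{a,b}(x) = (a*x + b) mod p, over a prime p, is strongly universal, and tagging a single message with a fresh secret key (a, b) gives a one-time MAC.

        # One-time MAC from the strongly universal family (a*x + b) mod p.
        import random

        p = 2 ** 61 - 1               # a Mersenne prime defining Z_p
        a = random.randrange(p)       # one-time secret key (use the secrets
        b = random.randrange(p)       # module for real cryptographic keys)

        def tag(x: int) -> int:       # x: the message, encoded in Z_p
            return (a * x + b) % p

        print(tag(42))                # the tag is sent along with the message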
  • 19.01 (SD): Introduction to randomness in cryptography (we used [4] up to Sect. 4.4, skipping the probabilistic constructions)
  • 26.01 (SD): Multiparty communication complexity (Proposition 2.3 from [7]), Private Information Retrieval (Sect 3.1, 3.2, 5.1 from [8])
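    A minimal sketch of the basic two-server PIR scheme (our illustration of the idea behind Sect. 3.1 of [8]): the user sends a uniformly random subset S of indices to one server and S xor {i} to the other; each server answers with the XOR of the selected database bits, and XORing the two answers yields x_i, while each server on its own sees only a uniformly random subset.

        # Two-server PIR for a single bit; neither server learns the index i.
        import random

        def answer(db, subset):       # a server XORs the selected bits
            return sum(db[j] for j in subset) % 2

        db = [random.randrange(2) for _ in range(16)]   # n-bit database
        i = 5                                           # index the user wants
        S1 = {j for j in range(len(db)) if random.randrange(2)}
        S2 = S1 ^ {i}                                   # symmetric difference
        assert (answer(db, S1) + answer(db, S2)) % 2 == db[i]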

* Huffman coding requires independent symbols with a known distribution, and it is sometimes better to encode blocks of symbols rather than single symbols.

Mid-term exam: 
  • exercises with solutions [pdf]
  • results [pdf]

Bibliography
  1. G. A. Jones, J. M. Jones: Information and Coding Theory (freely available online from the departmental IP address)
  2. D. Niwiński, M. Strojnowski, M. Wojnarski: Teoria Informacji (in Polish)
  3. Thomas M. Cover, Joy A. Thomas: Elements of Information Theory, 2nd Edition
  4. Sebastian Faust: Randomness in Cryptography
  5. Douglas R. Stinson: Cryptography: Theory and Practice, Third Edition
  6. Ueli M. Maurer: Secret Key Agreement by Public Discussion from Common Information [pdf]
  7. Pavel Pudlák, Vojtěch Rödl, Jiří Sgall: Boolean Circuits, Tensor Ranks, and Communication Complexity [pdf]
  8. Benny Chor, Eyal Kushilevitz, Oded Goldreich, Madhu Sudan: Private Information Retrieval [pdf]
  9. Thomas Holenstein: Complexity Theoretic Cryptography, Chapter 4 [pdf]
Previous course: 2014/15  (in Polish).
  • Example of a mid-term exam [pdf].
  • Mid-term exam from the last year [pdf].
Exams from the past course: [pdf,pdf].

Final results of the exam: [pdf]