CS4450 - Coding and Information Theory

Midterm Review

Fall 2004


These are some questions to help you review for the midterm. They are intended to help you study, but they will not replace studying the book and your notes.

  1. What is our model of a signaling system? Briefly discuss important characteristics and limitations of the model.

  2. What are our two primary goals in encoding data? How are they related to each other?

  3. What is a (finite) probability distribution? What is an event E in a sample space S? What is the probability of an event P(E)? What are independent events?
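
     For a concrete example, the sample space S of a fair six-sided die has six equally
     likely outcomes; an event E is any subset of S, and P(E) is the sum of the
     probabilities of its outcomes. A small Python sketch (the die and the two events
     are purely illustrative) that also checks independence:

        from fractions import Fraction

        S = range(1, 7)                       # sample space of a fair die
        P = {s: Fraction(1, 6) for s in S}    # a finite probability distribution on S

        def prob(event):
            """P(E): sum of the probabilities of the outcomes in the event."""
            return sum(P[s] for s in event)

        even = {2, 4, 6}
        low = {1, 2, 3, 4}

        # Independence: P(E1 and E2) == P(E1) * P(E2)
        print(prob(even & low) == prob(even) * prob(low))   # True: 1/3 == 1/2 * 2/3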

  4. What is conditional probability? Why are we interested in such things? What is Bayes' Theorem?
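
     As a worked example, Bayes' Theorem lets us reverse a conditional probability:
     P(A|B) = P(B|A) P(A) / P(B). A minimal Python sketch (the test numbers below are
     made up purely for illustration):

        p_a = 0.01              # P(A): prior probability of the condition
        p_b_given_a = 0.99      # P(B|A): probability of a positive test given the condition
        p_b_given_not_a = 0.05  # P(B|not A): false positive rate

        # Total probability of a positive test: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
        p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

        # Bayes' Theorem: P(A|B) = P(B|A) P(A) / P(B)
        print(p_b_given_a * p_a / p_b)   # about 0.167 -- small despite the accurate test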

  5. What is a code? What is a codeword? What is an encoding function?

  6. What are some basic properties of ASCII code? In what ways might Morse code be considered superior to ASCII code?

  7. What are uniquely decodable codes, instantaneous codes, comma codes, Huffman codes? What is the prefix property? How is it related to decoding?
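
     A code has the prefix property when no codeword is a prefix of another codeword;
     such codes are exactly the instantaneous codes, decodable symbol by symbol from
     left to right. A small Python check (the two candidate codes are illustrative):

        def has_prefix_property(codewords):
            """True if no codeword is a prefix of a different codeword."""
            return not any(c1 != c2 and c2.startswith(c1)
                           for c1 in codewords for c2 in codewords)

        print(has_prefix_property(["0", "10", "110", "111"]))   # True  -- instantaneous
        print(has_prefix_property(["0", "01", "011", "111"]))   # False -- "0" is a prefix of "01"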

  8. What are the Kraft and McMillan inequalities? What do they tell us?
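
     Both inequalities involve the same sum: for codeword lengths l_1, ..., l_q over a
     code alphabet of r symbols, sum_i r^(-l_i) <= 1. Kraft's inequality says such
     lengths can always be realized by an instantaneous code; McMillan's says every
     uniquely decodable code must already satisfy it. A quick numeric check in Python
     (the length lists are arbitrary examples):

        def kraft_sum(lengths, r=2):
            """sum_i r^(-l_i) over the codeword lengths."""
            return sum(r ** (-l) for l in lengths)

        print(kraft_sum([1, 2, 3, 3]))   # 1.0  -> an instantaneous binary code exists
        print(kraft_sum([1, 1, 2]))      # 1.25 -> no uniquely decodable binary code is possible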

  9. What are some important issues in data compression? What would we expect "fully compressed" data to look like?

  10. What is the definition of average code length? Calculate some average code lengths.
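
     Average code length is L = sum_i p_i l_i, the expected number of code symbols per
     source symbol. A short calculation in Python (the probabilities and codewords are
     illustrative):

        probs = [0.5, 0.25, 0.125, 0.125]
        codewords = ["0", "10", "110", "111"]

        # L = sum_i p_i * l_i
        print(sum(p * len(c) for p, c in zip(probs, codewords)))   # 1.75 bits per symbol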

  11. What are the basic properties of white noise? What is the probability of k errors?
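
     With white noise, each bit is flipped independently with the same probability p,
     so the number of errors in an n-bit block is binomial:
     P(k errors) = C(n, k) p^k (1-p)^(n-k). A small Python sketch (n and p are chosen
     arbitrarily):

        from math import comb

        def prob_k_errors(n, k, p):
            """Probability of exactly k errors in n independent bits, each flipped with probability p."""
            return comb(n, k) * p**k * (1 - p)**(n - k)

        n, p = 7, 0.01
        print(prob_k_errors(n, 0, p))   # about 0.932 -- no errors
        print(prob_k_errors(n, 1, p))   # about 0.066 -- exactly one error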

  12. What are extensions of a code?
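
     The n-th extension treats blocks of n source symbols as single symbols and encodes
     them as units; for a memoryless source the block probabilities multiply. A quick
     look at the second extension of a two-symbol source in Python (the probabilities
     are just for illustration):

        from itertools import product

        source = {"A": 0.75, "B": 0.25}

        # Second extension: pairs of source symbols with product probabilities.
        extension = {a + b: pa * pb
                     for (a, pa), (b, pb) in product(source.items(), repeat=2)}
        print(extension)   # {'AA': 0.5625, 'AB': 0.1875, 'BA': 0.1875, 'BB': 0.0625}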

  13. How do we define the information transmitted by one symbol? What properties does this function have?
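
     The information carried by a symbol of probability p is I(p) = log2(1/p) bits:
     rarer symbols carry more information, I(1) = 0, and information adds over
     independent symbols. A quick illustration in Python:

        from math import log2

        def information(p):
            """I(p) = log2(1/p), the information in bits of a symbol with probability p."""
            return log2(1 / p)

        print(information(0.5))           # 1.0 bit
        print(information(0.125))         # 3.0 bits
        # Additivity over independent symbols: I(p1 * p2) == I(p1) + I(p2)
        print(information(0.5 * 0.125))   # 4.0 bits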

  14. What is the average information transmitted by a single symbol from a set of symbols? What did we call this average?

  15. Given some probabilities, calculate the entropy.
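
     Entropy is the average information per symbol, H = sum_i p_i log2(1/p_i). A sample
     calculation in Python (the two distributions are illustrative):

        from math import log2

        def entropy(probs):
            """H = sum_i p_i * log2(1/p_i) in bits per symbol; terms with p_i = 0 contribute 0."""
            return sum(p * log2(1 / p) for p in probs if p > 0)

        print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits
        print(entropy([0.25, 0.25, 0.25, 0.25]))    # 2.0 bits -- the uniform distribution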

  16. What is the Gibbs inequality?
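
     The Gibbs inequality says that for probability distributions p and q on the same
     symbols, sum_i p_i log2(1/p_i) <= sum_i p_i log2(1/q_i), with equality only when
     p = q; it is the key step in showing entropy is a lower bound on average code
     length. A numerical spot check in Python (the two distributions are arbitrary):

        from math import log2

        p = [0.5, 0.3, 0.2]
        q = [0.4, 0.4, 0.2]

        lhs = sum(pi * log2(1 / pi) for pi in p)               # H(p)
        rhs = sum(pi * log2(1 / qi) for pi, qi in zip(p, q))   # cross term against q
        print(lhs <= rhs)   # True (about 1.485 <= 1.522)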

  17. What probability distribution gives maximum entropy?

  18. What is the relationship between entropy and average code length?

  19. What is Shannon-Fano coding? Given some probabilities, write down a Shannon-Fano code.
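
     One standard construction assigns symbol i a codeword of length
     l_i = ceil(log2(1/p_i)) (these lengths always satisfy the Kraft inequality) and
     reads the codeword off the binary expansion of the cumulative probability. The
     Python sketch below uses that binary-expansion version and assumes the
     probabilities are sorted in decreasing order; your notes may instead use Fano's
     recursive splitting, which produces a different but comparable code.

        from math import ceil, log2

        def shannon_fano_code(probs):
            """Prefix code with lengths ceil(log2(1/p_i)); probs must be sorted in decreasing order."""
            codewords = []
            cumulative = 0.0
            for p in probs:
                length = ceil(log2(1 / p))
                # Take the first `length` bits of the binary expansion of the cumulative probability.
                bits, frac = "", cumulative
                for _ in range(length):
                    frac *= 2
                    bits += "1" if frac >= 1 else "0"
                    frac -= int(frac)
                codewords.append(bits)
                cumulative += p
            return codewords

        print(shannon_fano_code([0.4, 0.3, 0.2, 0.1]))   # ['00', '01', '101', '1110']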

  20. What is the entropy of the extension of a code?

  21. What does the noiseless coding theorem say? What does it mean?

  22. What is a communication channel? What is a memoryless channel? What is a binary symmetric channel? What are forward channel probabilities?
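
     A binary symmetric channel is memoryless and flips each transmitted bit
     independently with the same crossover probability p, so the forward channel
     probabilities are P(receive b | send b) = 1 - p and P(receive the other bit) = p.
     A tiny simulation sketch in Python (p, the seed, and the message are arbitrary):

        import random

        rng = random.Random(0)   # fixed seed so the sketch is reproducible

        def binary_symmetric_channel(bits, p):
            """Flip each bit independently with crossover probability p."""
            return [bit ^ 1 if rng.random() < p else bit for bit in bits]

        sent = [1, 0, 1, 1, 0, 0, 1, 0]
        print(binary_symmetric_channel(sent, p=0.1))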

  23. What is a decision rule? What is an ideal observer? What is the maximum likelihood decision rule?

  24. What is nearest neighbor decoding? When is it appropriate to use?
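
     Nearest neighbor decoding chooses the codeword at the smallest Hamming distance
     from the received word; on a binary symmetric channel with crossover probability
     less than 1/2 (and equally likely codewords) it agrees with maximum likelihood
     decoding. A minimal Python sketch using the length-3 repetition code:

        def hamming_distance(x, y):
            """Number of positions in which two words of the same length differ."""
            return sum(a != b for a, b in zip(x, y))

        def nearest_neighbor_decode(received, codewords):
            """Return the codeword closest to the received word in Hamming distance."""
            return min(codewords, key=lambda c: hamming_distance(received, c))

        repetition_code = ["000", "111"]
        print(nearest_neighbor_decode("010", repetition_code))   # '000' -- one error corrected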

  25. What are the geometric interpretations of the requirements for single and double error detection and correction? What is "Hamming distance" and how is it related to error detection and correction?

  26. How is sphere packing related to error detection/correction?
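
     To correct e errors, the Hamming spheres of radius e around the M codewords of
     length n must not overlap, which gives the sphere-packing (Hamming) bound
     M * sum_{k=0..e} C(n, k) <= 2^n for binary codes. A quick numeric check in Python
     (the parameters are chosen for illustration):

        from math import comb

        def sphere_packing_ok(n, M, e):
            """True if M binary codewords of length n could possibly correct e errors."""
            sphere_volume = sum(comb(n, k) for k in range(e + 1))
            return M * sphere_volume <= 2 ** n

        print(sphere_packing_ok(7, 16, 1))   # True  -- met with equality by the Hamming (7,4) code
        print(sphere_packing_ok(7, 32, 1))   # False -- too many codewords to correct single errors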

  27. What other important and/or interesting topics have we covered that ought to be on this list?

There are surely other items -- review the books and your notes.