CS4450 - Coding and Information Theory
Midterm Review
Fall 2004
These are some questions to help you review for the midterm. They are
intended to help you study, but they will not replace studying the book and
your notes.
- What is our model of a signaling system? Briefly discuss
important characteristics and limitations of the model.
- What are our two primary goals in encoding data? How are they
related to each other?
- What is a (finite) probability distribution? What is an event E
in a sample space S? What is the probability of an event P(E)? What are
independent events?
- What is conditional probability? Why are we interested in such
things? What is Bayes' Theorem?
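For practice, here is a small Python sketch of Bayes' Theorem with made-up numbers (the 0.7 source probability and 0.1 flip probability are purely illustrative):

```python
# Bayes' Theorem: P(A|B) = P(B|A) * P(A) / P(B)
# Hypothetical setup: a channel sends 0 with probability 0.7 and
# flips each transmitted bit independently with probability 0.1.
p_send0 = 0.7
p_send1 = 0.3
p_flip = 0.1

# Forward channel probabilities.
p_recv0_given_send0 = 1 - p_flip   # 0.9
p_recv0_given_send1 = p_flip       # 0.1

# Total probability of receiving a 0.
p_recv0 = p_recv0_given_send0 * p_send0 + p_recv0_given_send1 * p_send1

# Bayes: probability that a 0 was sent, given that a 0 was received.
p_send0_given_recv0 = p_recv0_given_send0 * p_send0 / p_recv0
print(round(p_send0_given_recv0, 4))  # about 0.9545
```

This is exactly the kind of "backward" probability a receiver needs when deciding what was most likely sent.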
- What is a code? What is a codeword? What is an encoding
function?
- What are some basic properties of the ASCII code? In what ways might
Morse code be considered superior to ASCII?
- What are uniquely decodable codes, instantaneous codes, comma
codes, Huffman codes? What is the prefix property? How is it related to
decoding?
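A quick way to test the prefix property yourself: a code is instantaneous exactly when no codeword is a prefix of another. A minimal Python check (the example codewords are illustrative):

```python
def is_prefix_free(codewords):
    """True if no codeword is a prefix of any other codeword
    (the prefix property, i.e., the code is instantaneous)."""
    for a in codewords:
        for b in codewords:
            if a != b and b.startswith(a):
                return False
    return True

# {0, 10, 110, 111} is instantaneous; {0, 01} is not,
# because 0 is a prefix of 01.
print(is_prefix_free(["0", "10", "110", "111"]))  # True
print(is_prefix_free(["0", "01"]))                # False
```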
- What are the Kraft and McMillan inequalities? What do they tell
us?
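The Kraft inequality can be checked numerically: for an r-ary instantaneous code with codeword lengths l_i, the sum of r^(-l_i) must be at most 1. A short sketch:

```python
def kraft_sum(lengths, r=2):
    """Sum of r^(-l) over the codeword lengths l. By the Kraft
    inequality, an instantaneous r-ary code with these lengths
    exists if and only if this sum is <= 1."""
    return sum(r ** (-l) for l in lengths)

# Lengths 1, 2, 3, 3 fit a binary prefix code (sum is exactly 1.0) ...
print(kraft_sum([1, 2, 3, 3]))  # 1.0
# ... but lengths 1, 1, 2 cannot (sum exceeds 1).
print(kraft_sum([1, 1, 2]))     # 1.25
```

McMillan's theorem says the same bound holds even for the larger class of uniquely decodable codes.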
- What are some important issues in data compression? What would
we expect "fully compressed" data to look like?
- What is the definition of average code length? Calculate some
average code lengths.
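For the average-length calculation, a worked example (with illustrative probabilities and lengths):

```python
def average_length(probs, lengths):
    """Expected codeword length: the sum of p_i * l_i."""
    return sum(p * l for p, l in zip(probs, lengths))

# Symbols with probabilities 1/2, 1/4, 1/8, 1/8 encoded as
# 0, 10, 110, 111 (lengths 1, 2, 3, 3).
print(average_length([0.5, 0.25, 0.125, 0.125], [1, 2, 3, 3]))  # 1.75
```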
- What are the basic properties of white noise? What is the
probability of k errors?
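Under the white-noise model each bit is flipped independently with some probability p, so the number of errors in n bits is binomial. A small sketch (n = 7 and p = 0.01 are illustrative):

```python
from math import comb

def prob_k_errors(n, k, p):
    """Probability of exactly k bit errors in n bits when each bit
    is flipped independently with probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Chance of exactly one error in a 7-bit block with p = 0.01.
print(round(prob_k_errors(7, 1, 0.01), 6))
```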
- What are extensions of a code?
- How do we define the information transmitted by one symbol? What
properties does this function have?
- What is the average information transmitted by a single symbol
from a set of symbols? What did we call this average?
- Given some probabilities, calculate the entropy.
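As practice, the entropy of a distribution can be computed directly from the definition H = -Σ p log2 p:

```python
from math import log2

def entropy(probs):
    """Shannon entropy -sum(p * log2 p), in bits; zero-probability
    terms contribute nothing and are skipped."""
    return -sum(p * log2(p) for p in probs if p > 0)

# The distribution 1/2, 1/4, 1/8, 1/8 has entropy exactly 1.75 bits --
# matching the average length of the code 0, 10, 110, 111.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```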
- What is the Gibbs inequality?
- What probability distribution gives maximum entropy?
- What is the relationship between entropy and average code
length?
- What is Shannon-Fano coding? Given some probabilities, write
down a Shannon-Fano code.
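One common Shannon-Fano construction (the recursive-splitting version; your text may present the length-based variant instead) sorts the symbols by probability and repeatedly splits the list into two halves of nearly equal total probability. A sketch, with an illustrative four-symbol alphabet:

```python
def shannon_fano(symbols):
    """Build a Shannon-Fano code (recursive-splitting variant):
    sort symbols by decreasing probability, split into two groups of
    nearly equal total probability, prepend 0 to one group's codes
    and 1 to the other's, and recurse."""
    items = sorted(symbols.items(), key=lambda kv: -kv[1])
    codes = {}

    def assign(group, prefix):
        if len(group) == 1:
            codes[group[0][0]] = prefix or "0"
            return
        total = sum(p for _, p in group)
        best_diff, split = float("inf"), 1
        # Pick the split point that best balances the two halves.
        for i in range(1, len(group)):
            left = sum(p for _, p in group[:i])
            diff = abs(total - 2 * left)
            if diff < best_diff:
                best_diff, split = diff, i
        assign(group[:split], prefix + "0")
        assign(group[split:], prefix + "1")

    assign(items, "")
    return codes

# For 1/2, 1/4, 1/8, 1/8 this recovers the code 0, 10, 110, 111.
print(shannon_fano({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
```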
- What is the entropy of the extension of a code?
- What does the noiseless coding theorem say? What does it mean?
- What is a communication channel? What is a memoryless channel?
What is a binary symmetric channel? What are forward channel probabilities?
- What is a decision rule? What is an ideal observer? What is the
maximum likelihood decision rule?
- What is nearest neighbor decoding? When is it appropriate to
use?
- What are the geometric interpretations of the requirements for
single and double error detection and correction? What is "Hamming distance"
and how is it related to error detection and correction?
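Hamming distance and nearest-neighbor decoding are easy to experiment with; a minimal sketch using the 3-bit repetition code as the example:

```python
def hamming_distance(x, y):
    """Number of positions in which two equal-length words differ."""
    return sum(a != b for a, b in zip(x, y))

def nearest_neighbor(received, codewords):
    """Decode to the codeword closest to the received word
    in Hamming distance."""
    return min(codewords, key=lambda c: hamming_distance(received, c))

# The repetition code {000, 111} has minimum distance 3, so it
# corrects any single error: 010 decodes back to 000.
print(nearest_neighbor("010", ["000", "111"]))  # 000
```

Geometrically, a minimum distance of 2e + 1 means spheres of radius e around the codewords do not overlap, which is why e errors can be corrected.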
- How is sphere packing related to error detection/correction?
- What other important and/or interesting topics have we covered
that ought to be on this list?
There are surely other items -- review the books and your notes.