Review of probability theory, Definition of information measure and entropy: Measure
of information, Average information content of symbols in long independent
sequences, Average information content of symbols in long dependent sequences.
Markoff statistical model for an information source, Entropy and information rate of
a Markoff source, Mutual information, Asymptotic properties of entropy and problem
solving in entropy.
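The entropy and Markoff-source quantities listed in this unit can be illustrated with a minimal Python sketch (the symbol probabilities, transition matrix, and stationary distribution below are made-up illustrative values, not from the syllabus):

```python
import math

def entropy(probs):
    """Shannon entropy H(S) = -sum p*log2(p), in bits/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Average information content of a memoryless binary source
H = entropy([0.9, 0.1])

# Information rate of a two-state Markoff source: weight each state's
# conditional entropy by its stationary probability.
P = [[0.8, 0.2],   # P(next symbol | state 0)
     [0.4, 0.6]]   # P(next symbol | state 1)
pi = [2/3, 1/3]    # stationary distribution solving pi = pi*P (by hand)
H_rate = sum(pi[i] * entropy(P[i]) for i in range(2))
```

Dependence between successive symbols can only lower the rate: here H_rate is below the 1 bit/symbol of an equiprobable memoryless binary source.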
Block codes and their properties, Data compression, Kraft-McMillan inequality and compact
codes, Encoding of the source output, Shannon's encoding algorithm, Coding
strategies, Huffman coding, Shannon-Fano-Elias coding and introduction to arithmetic
coding.
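The source-coding topics above can be tied together in a short sketch: a standard Huffman construction over an illustrative (made-up) symbol distribution, with the resulting codeword lengths checked against the Kraft-McMillan inequality:

```python
import heapq

def huffman(freqs):
    """Build a Huffman code for {symbol: probability}; returns {symbol: bitstring}."""
    # Heap entries carry a unique counter so ties never compare the dicts.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}
code = huffman(probs)
# Kraft-McMillan: sum 2^(-l_i) <= 1 for any uniquely decodable code
assert sum(2 ** -len(w) for w in code.values()) <= 1
```

The resulting average length lies between the source entropy and entropy + 1, as the source coding theorem guarantees for a compact code.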
Introduction to information channels, Communication channels, Discrete
communication channels, Continuous channels, Discrete memoryless channels, Mutual
information, Channel capacity, Channel coding theorem, Differential entropy and
mutual information for continuous ensembles, Channel capacity theorem.
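Two of the capacity results in this unit reduce to closed forms that are easy to compute: the binary symmetric channel's C = 1 - H(p), and the Shannon-Hartley capacity C = B·log2(1 + S/N) for a band-limited continuous channel. A small sketch (the numeric inputs in the tests are illustrative):

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H(p) bits per channel use."""
    return 1.0 - h2(p)

def shannon_hartley(bandwidth_hz, snr):
    """Channel capacity theorem: C = B * log2(1 + S/N) bits per second."""
    return bandwidth_hz * math.log2(1 + snr)
```

Note the two extremes of the BSC: a noiseless channel (p = 0) carries one full bit per use, while p = 0.5 renders the output independent of the input and the capacity drops to zero.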
Introduction to error control coding: Introduction, Types of errors, Examples, Types of
codes. Linear block codes: Matrix description, Error detection and correction, Standard
arrays and table look-up for decoding.
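The matrix description and table look-up decoding of linear block codes can be sketched with the systematic (7,4) Hamming code, G = [I4 | P] and H = [P^T | I3] over GF(2) (a standard textbook example, written here in plain Python):

```python
# Parity submatrix of a systematic (7,4) Hamming code: G = [I4 | P], H = [P^T | I3]
P = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1],
     [1, 1, 1]]
# Columns of H: the rows of P, then the 3x3 identity
H_COLS = P + [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

def encode(m):
    """Codeword = 4 message bits followed by 3 parity bits m*P (mod 2)."""
    parity = [sum(m[i] * P[i][j] for i in range(4)) % 2 for j in range(3)]
    return list(m) + parity

def syndrome(r):
    """s = H * r^T (mod 2); zero iff r is a valid codeword."""
    return [sum(H_COLS[j][k] * r[j] for j in range(7)) % 2 for k in range(3)]

def correct(r):
    """Table look-up decoding: a nonzero syndrome equals the column of H
    at the error position, so flip that bit."""
    s = syndrome(r)
    r = list(r)
    if any(s):
        r[H_COLS.index(s)] ^= 1
    return r
```

Because every column of H is distinct and nonzero, each single-bit error produces a unique syndrome, which is exactly what the standard-array/table look-up argument requires.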
Binary cyclic codes, Algebraic structure of cyclic codes, Encoding using an (n-k)-bit
shift register, Syndrome calculation, BCH codes, RS codes, Golay codes, Shortened
cyclic codes, Burst error correcting codes, Burst and random error correcting codes,
Convolutional codes, Time-domain approach, Transform-domain approach.
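The (n-k)-bit shift-register encoder for cyclic codes is equivalent to polynomial long division over GF(2): append the remainder of x^(n-k)·m(x) divided by the generator g(x). A minimal sketch using the (7,4) cyclic code with g(x) = 1 + x + x^3 (a standard example; bit order and message chosen for illustration):

```python
def cyclic_encode(msg, gen, n):
    """Systematic cyclic encoding: parity = remainder of x^(n-k)*m(x) / g(x).
    Bits are lists with the lowest-degree coefficient first; gen has degree n-k.
    The loop mirrors one pass of the (n-k)-bit shift-register divider."""
    k = len(msg)
    r = n - k
    reg = [0] * r + list(msg)          # x^r * m(x)
    for i in range(n - 1, r - 1, -1):  # GF(2) long division, top degree down
        if reg[i]:
            for j, g in enumerate(gen):
                reg[i - r + j] ^= g
    return reg[:r] + list(msg)         # parity bits, then message bits

# (7,4) cyclic code generated by g(x) = 1 + x + x^3
g = [1, 1, 0, 1]
cw = cyclic_encode([1, 0, 1, 1], g, 7)
```

Syndrome calculation at the receiver is the same division applied to the received word: a valid codeword polynomial is a multiple of g(x), so it leaves a zero remainder.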