Decode Information Theory and Coding for APJAKTU (Semester IV IT course 13)

Review of probability theory
Block code and its properties
Introduction to information channels
Introduction to error control coding
Binary cyclic codes
Second Edition, by Decode

Book Details

Review of probability theory, Definition of information measure and entropy: Measure of information, Average information content of symbols in long independent sequences, Average information content of symbols in long dependent sequences, Markov statistical model for information sources, Entropy and information rate of Markov sources, Mutual information, Asymptotic properties of entropy and problem solving in entropy.

Block code and its properties: Data compression, Kraft-McMillan inequality and compact codes, Encoding of the source output, Shannon's encoding algorithm, Coding strategies, Huffman coding, Shannon-Fano-Elias coding and introduction to arithmetic coding.

Introduction to information channels: Communication channels, Discrete communication channels, Continuous channels, Discrete memoryless channels, Mutual information, Channel capacity, Channel coding theorem, Differential entropy and mutual information for continuous ensembles, Channel capacity theorem.

Introduction to error control coding: Introduction, Types of errors, Examples, Types of codes. Linear block codes: Matrix description, Error detection and correction, Standard arrays and table look-up for decoding.

Binary cyclic codes: Algebraic structures of cyclic codes, Encoding using an (n-k) bit shift register, Syndrome calculation, BCH codes, RS codes, Golay codes, Shortened cyclic codes, Burst error correcting codes, Burst and random error correcting codes. Convolutional codes: Time domain approach, Transform domain approach.
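The contents above list source entropy and Huffman coding among the source coding topics. As a quick illustration of those two ideas only (this sketch is not taken from the book, and the symbol probabilities below are invented for the example), the following Python snippet computes the entropy of a discrete memoryless source, builds a Huffman code for it, and checks that the average code length L lies between H and H + 1 bits/symbol.

```python
# Illustrative sketch: source entropy and Huffman coding for a small,
# made-up symbol distribution. Not code from the book.
import heapq
import math

def entropy(probs):
    """Average information content H = -sum(p * log2 p), in bits/symbol."""
    return -sum(p * math.log2(p) for p in probs.values() if p > 0)

def huffman_code(probs):
    """Return a prefix-free code {symbol: bitstring} via Huffman's algorithm."""
    # Each heap entry: (probability, tie-breaker, {symbol: code-so-far}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least probable subtrees, prepending 0/1 to their codes.
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

if __name__ == "__main__":
    source = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}  # example distribution
    H = entropy(source)
    code = huffman_code(source)
    L = sum(source[s] * len(code[s]) for s in source)
    print(f"H = {H:.3f} bits/symbol, average length L = {L:.2f} bits/symbol")
    # Source coding bound: H <= L < H + 1 for a Huffman code.
```

For the probabilities used here the script reports H ≈ 1.85 bits/symbol and an average Huffman code length of 1.90 bits/symbol, consistent with the H ≤ L < H + 1 bound.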

Additional Information

Edition: Second
Publisher: Technical Publications
ISBN: 9789333210706
No. of Pages (Printed Book): 148


Price: ₹ 80.00

