Abdellatif Zaidi
Assistant Professor
Université Paris-Est Marne-la-Vallée
Lecture Notes (Spring 2012)
- EE-TDI: Information Theory: Lecture Notes, Part I
- EE-TDI: Information Theory: Lecture Notes, Part II
Lectures: Part I
- Introduction
- Entropy, Relative Entropy and Mutual Information
- Entropy, joint entropy and conditional entropy, relative entropy and mutual information, divergence
- Chain rules for entropy, relative entropy and mutual information
- Jensen's inequality and its consequences
- Data processing inequality
- Fano's inequality
- Data Processing and Markov Chains
- Data processing and inverse processing
- Data processing and Markov chains
- Stationary data, memoryless data, Markovian and Gauss-Markovian models
- Examples of data processing
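A standard fact illustrating the data processing topics above: if X -> Y -> Z form a Markov chain (Z is obtained by processing Y alone), then I(X;Z) <= I(X;Y), so no processing of Y, deterministic or random, can increase the information Y carries about X.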
- Properties of Entropy, Mutual Information and Divergence
- Convexity
- Case of random vectors
- Link between entropy and differential entropy, quantization
- Entropy maximization under constraints
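A standard worked example of the entropy notions listed above: for a binary source X with P(X = 1) = p,

    H(X) = -p log2(p) - (1 - p) log2(1 - p)   [bits],

which equals 1 bit at p = 1/2 and is maximal there, consistent with the general bound H(X) <= log2 |X|, with equality if and only if X is uniform.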
Lectures: Part II
- Fundamental limits of Source Coding
- The source coding problem
- Rate-Distortion function, properties
- Shannon source coding theorem
- Examples
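A standard reference example for the source coding items above: a Bernoulli(p) source with Hamming distortion has rate-distortion function

    R(D) = H_b(p) - H_b(D)   for 0 <= D <= min(p, 1 - p),   and R(D) = 0 otherwise,

where H_b denotes the binary entropy function; a Gaussian source of variance sigma^2 with squared-error distortion has R(D) = (1/2) log2(sigma^2 / D) for 0 < D <= sigma^2.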
- Fundamental limits of Channel Coding
- The channel coding problem
- Capacity-Cost function, properties
- Shannon channel coding theorem
- Examples
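A standard reference example for the channel coding items above: the binary symmetric channel with crossover probability p has capacity C = 1 - H_b(p) bits per channel use, and the average-power-constrained AWGN channel with signal power P and noise variance N has C = (1/2) log2(1 + P/N).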
- Data Compression
- Examples of codes
- Kraft inequality, optimal codes, bounds on the optimal code length
- Kraft inequality for uniquely decodable codes
- Huffman codes, comments on Huffman codes
- Arithmetic coding
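A minimal sketch of Huffman coding for the data compression items above, with a Kraft inequality check; the source probabilities and function name below are illustrative, not taken from the lecture notes.

    import heapq

    def huffman_code(probs):
        """Build a binary Huffman code for a dict {symbol: probability}."""
        # Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            p0, _, c0 = heapq.heappop(heap)  # least probable subtree -> bit '0'
            p1, _, c1 = heapq.heappop(heap)  # next least probable    -> bit '1'
            merged = {s: "0" + w for s, w in c0.items()}
            merged.update({s: "1" + w for s, w in c1.items()})
            heapq.heappush(heap, (p0 + p1, count, merged))
            count += 1
        return heap[0][2]

    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}       # illustrative source
    code = huffman_code(probs)
    kraft = sum(2 ** -len(w) for w in code.values())            # Kraft sum <= 1
    avg_len = sum(p * len(code[s]) for s, p in probs.items())   # 1.75 bits = H(X) here
    print(code, kraft, avg_len)

Because the probabilities here are dyadic, the average codeword length coincides with the source entropy; in general the Huffman code's average length lies within one bit of the entropy.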
- Quantization and Applications
- Scalar and vector quantization
- Algorithms for quantization
- Examples, JPEG, JPEG 2000
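A minimal sketch of the Lloyd algorithm for scalar quantizer design, related to the quantization items above; the Gaussian training data, level count and function name are illustrative assumptions.

    import random

    def lloyd_scalar_quantizer(samples, num_levels, iters=50):
        """Design a scalar quantizer by alternating nearest-neighbor partitioning
        and centroid (mean) updates -- the Lloyd algorithm."""
        codebook = sorted(random.sample(samples, num_levels))   # initial levels
        for _ in range(iters):
            # Partition step: assign each sample to its nearest reproduction level.
            cells = [[] for _ in codebook]
            for x in samples:
                idx = min(range(len(codebook)), key=lambda i: (x - codebook[i]) ** 2)
                cells[idx].append(x)
            # Centroid step: replace each level by the mean of its cell.
            codebook = [sum(c) / len(c) if c else codebook[i]
                        for i, c in enumerate(cells)]
        return sorted(codebook)

    random.seed(0)
    data = [random.gauss(0.0, 1.0) for _ in range(5000)]        # illustrative Gaussian source
    levels = lloyd_scalar_quantizer(data, num_levels=4)
    mse = sum(min((x - c) ** 2 for c in levels) for x in data) / len(data)
    print(levels, mse)   # converges to a locally optimal (Lloyd-Max) 4-level quantizer

Vector quantization generalizes the same alternation to blocks of samples, as in the LBG/k-means algorithm.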