This is the course overview site for the ESSLLI course in information theory, which will be taught by Mathias Winther Madsen in the week of 26 July 2021.
The course is about information theory, a branch of probability theory that investigates the fundamental limits on our ability to communicate reliably. By quantifying the concepts of uncertainty and information in a mathematically meaningful way, it allows us to formulate precise results about the amount of information we can hope to transmit in the presence of noise, and evaluate whether a code achieves this maximal efficiency. These results have concrete applications for the design of communication systems, but also profound consequences for linguistics and epistemology.
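To give a taste of the quantities the course revolves around, here is a minimal sketch (not course material, just an illustration) of Shannon entropy, the standard measure of uncertainty in bits:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))   # about 0.47 bits
```

Results like the source and channel coding theorems covered in the course relate this quantity to the best achievable compression rates and transmission rates.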
The course will consist of five online 90-minute lectures starting each morning at 9:45 CEST. The plan is as follows:
| Day | Topic |
| --- | --- |
| Monday | Introduction, entropy, coding |
| Tuesday | Data compression, the source coding theorem |
| Wednesday | Entropy rates, arithmetic coding |
| Thursday | Divergence, gambling |
| Friday | Error correction, the channel coding theorem |
For general information about the ESSLLI program, see esslli2021.unibz.it.
The course will draw on several sources, but two important ones are
Cover and Thomas: Elements of Information Theory (Wiley, 1991)
MacKay: Information Theory, Inference, and Learning Algorithms (Cambridge University Press, 2003, freely available at inference.org.uk)
More materials will be made available on Google Drive as the course progresses. Additional materials are available on Github.
The course will be taught by Mathias Winther Madsen (mathias.winther@gmail.com).
About me: I am a Danish mathematician who currently works as a research engineer for a German robotics company. I received my PhD from the Institute for Logic, Language and Computation (ILLC) in Amsterdam.