Information Theory, Inference, and Learning Algorithms

This book is aimed at senior undergraduates and graduate students in Engineering, Science, Mathematics, and Computing. It expects familiarity with calculus, probability theory, and linear algebra as taught in a first or second year undergraduate course on mathematics for scientists and engineers.

Conventional courses on information theory cover not only the beautiful theoretical ideas of Shannon, but also practical solutions to communication problems. This book goes further, bringing in Bayesian data modelling, Monte Carlo methods, variational methods, clustering algorithms, and neural networks.

Why unify information theory and machine learning? Because they are two sides of the same coin. In the 1960s, a single field, cybernetics, was populated by information theorists, computer scientists, and neuroscientists, all studying common problems. Information theory and machine learning still belong together. Brains are the ultimate compression and communication systems. And the state-of-the-art algorithms for both data compression and error-correcting codes use the same tools as machine learning.

Information Theory, Inference, and Learning Algorithms by David J.C. MacKay (PDF, PostScript, EPUB, DJVU, LaTeX) – 640 pages

About The Author

My name is John Eye and I’m obsessed with ebooks. I love to procrastinate, I’m a bookworm, and I love sharing with the world what free ebooks have to offer.
