Information Theory: From Coding to Learning

$19.99


This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning, equipping students with a uniquely well-rounded and rigorous foundation for further study. It introduces core topics such as data compression, channel coding, and rate-distortion theory using a distinctive finite-blocklength approach. Over 210 end-of-part exercises and numerous examples introduce students to contemporary applications in statistics, machine learning, and modern communication theory. The textbook presents information-theoretic methods with applications in statistical learning and computer science, such as f-divergences, PAC-Bayes and the variational principle, Kolmogorov's metric entropy, strong data-processing inequalities, and entropic upper bounds for statistical estimation. Accompanied by a solutions manual for instructors and by additional standalone chapters on more specialized topics in information theory, this is the ideal introductory textbook for senior undergraduate and graduate students in electrical engineering, statistics, and computer science.

Publisher: Cambridge University Press; 1st edition (February 20, 2025)
Language: English
Hardcover: 748 pages
ISBN-10: 1108832903
ISBN-13: 978-1108832908
Item Weight: 3.65 pounds
Dimensions: 6.9 x 1.6 x 9.8 inches

User Reviews


There are no reviews yet.

