AnyBook4Less.com

Information Theory, Inference & Learning Algorithms

Title: Information Theory, Inference & Learning Algorithms
by David J. C. MacKay
ISBN: 0-521-64298-1
Publisher: Cambridge University Press
Pub. Date: 15 June, 2002
Format: Hardcover
Volumes: 1
List Price(USD): $50.00

Average Customer Rating: 4.33 (3 reviews)

Customer Reviews

Rating: 5
Summary: Brings theory to life
Comment: Fantastically good value, this wide-ranging textbook covers elementary information theory, data compression, and coding theory; machine learning, Bayesian inference, and Monte Carlo methods; and state-of-the-art error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes. Theory and practical examples are covered side by side. Hundreds of exercises are included, many with worked solutions.

Three things are distinctive about this book.
First, it emphasizes the connections between information theory and machine learning: for example, data compression and Bayesian data modelling are two sides of the same coin (see the sketch after this review).
Second, since 1993 there has been a revolution in communication theory, with classical algebraic codes being superseded by sparse graph codes; this text covers these recent developments in detail.
Third, the whole book is available for free online viewing at www.inference.phy.cam.ac.uk/mackay/itila/.

I use this book in all my teaching! :-)
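
The reviewer's point that data compression and Bayesian data modelling are two sides of the same coin can be made concrete with a small calculation. The Python sketch below is a minimal illustration, not taken from the book: under an ideal (Shannon/arithmetic) coder, a symbol x costs roughly -log2 q(x) bits under a model q, so the better the probabilistic model fits the source, the shorter the compressed output. The three-symbol distributions and the function name are invented for this example.

# Minimal sketch: ideal code length under a probabilistic model.
# A symbol x is charged about -log2 q(x) bits, so building a better model
# and building a better compressor are the same optimization problem.
import math

# Hypothetical source distribution over three symbols (made up for illustration).
source = {'a': 0.7, 'b': 0.2, 'c': 0.1}

good_model = {'a': 0.7, 'b': 0.2, 'c': 0.1}   # matches the source exactly
bad_model  = {'a': 1/3, 'b': 1/3, 'c': 1/3}   # ignores the source's structure

def expected_bits_per_symbol(p, q):
    """Expected ideal code length: sum over x of p(x) * -log2 q(x)."""
    return sum(px * -math.log2(q[x]) for x, px in p.items())

print(expected_bits_per_symbol(source, good_model))  # about 1.16 bits (the source entropy)
print(expected_bits_per_symbol(source, bad_model))   # about 1.58 bits

The gap between the two numbers is the relative entropy D_KL(p||q), the extra cost a compressor pays for using the wrong model, which is one way to see why better modelling means better compression.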

Rating: 4
Summary: Good book, but a few arguments need revision from a theorist's perspective
Comment: This review concerns only the coding theory part.

If you want to know what is presently going on in the field of coding theory, with a solid technical foundation, this is the book. Its importance is that it explains why people have been moving in new directions in coding theory, and it provides good information about LDPC codes, turbo codes, and their decoding algorithms. People have solved some problems that arise in coding without going into the depths of mathematics. Until the early 1990s, research in coding was intensely mathematical, and people thought the packing problem was the answer to the coding problem. However, MacKay argues that this conventional view is wrong when one tries to attain the Shannon limit. He gives an argument based on the Gilbert-Varshamov (GV) bound (warning: this argument may not be entirely true; see the sketch after this review).

Now the bad part of the book. MacKay bases his entire argument on the premise that algebraic codes cannot exceed the GV bound. This is wrong. If you look at the MIT notes of Madhu Sudan (the prestigious Nevanlinna Prize winner), he says that random codes are not always the best; specifically, he cites an argument that algebraic-geometry (AG) codes exceed the GV bound. So the packing problem is still relevant to the coding problem, as it could help attain the Shannon limit faster than random codes do. (Warning: Madhu does not say anything about block sizes, but my feeling is that, since AG codes exceed the GV bound faster than random codes, one could approach the Shannon limit with comparatively smaller blocks.) So mathematicians can still hope to contribute to practical coding theory while enriching mathematics.

In spite of this, the book is a must-have for engineers and computer scientists.
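
The GV-bound discussion above can be illustrated with a small calculation. The Python sketch below is a rough illustration of the kind of argument involved, not necessarily MacKay's exact presentation, and the sample flip probabilities are chosen arbitrarily: for a binary symmetric channel with flip probability f, the Shannon limit is C = 1 - H2(f), whereas a code relying only on the GV rate/distance trade-off and bounded-distance decoding needs relative minimum distance about 2f to handle an f fraction of worst-case errors, limiting it to roughly R = 1 - H2(2f).

# Rough sketch: Shannon capacity of the binary symmetric channel versus the
# rate suggested by a GV-bound / bounded-distance-decoding argument.
import math

def binary_entropy(x):
    """H2(x) in bits, with the conventional value 0 at the endpoints."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

for f in (0.01, 0.05, 0.10):                       # illustrative flip probabilities
    capacity = 1 - binary_entropy(f)               # Shannon limit for BSC(f)
    gv_rate = max(0.0, 1 - binary_entropy(2 * f))  # GV rate at relative distance 2f
    print(f"f={f:.2f}  capacity={capacity:.3f}  bounded-distance GV rate={gv_rate:.3f}")

The gap between the two columns is why distance-based packing arguments alone do not reach the Shannon limit; whether algebraic constructions such as AG codes can nevertheless close that gap is exactly the point the reviewer disputes.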

Rating: 4
Summary: A reservoir of information, yet a few problems
Comment: This review concerns only the coding theory part.

If you want to know what is presently going on in the field of coding theory, with a solid technical foundation, this is the book. Its importance is that it explains why people have been moving in new directions in coding theory. People have solved some problems that arise in coding without going into the depths of mathematics. Until the early 1990s, research in coding was intensely mathematical, and people thought the packing problem was the answer to the coding problem. However, MacKay argues that this conventional view is wrong. He gives an argument based on the GV bound.

Now the bad part of the book. MacKay bases his entire argument on the premise that algebraic codes cannot exceed the GV bound. This is wrong. If you look at the MIT notes of Madhu Sudan (the prestigious Nevanlinna Prize winner), he says that random codes are not always the best; specifically, he cites an argument that algebraic-geometry (AG) codes exceed the GV bound. So the packing problem is still relevant to the coding problem, as it could help attain the Shannon limit faster than random codes do. (Warning: Madhu does not say anything about block sizes, but my feeling is that, since AG codes exceed the GV bound faster than random codes, one could approach the Shannon limit with comparatively smaller blocks.) So mathematicians can still hope to contribute to practical coding theory while enriching mathematics.

Another bad part is that the book does not say much about newer problems such as multiple-access channels, broadcast channels, zero-error information theory, and communication complexity, or about the upcoming challenges, open problems, and progress in these areas of information theory. Maybe some bright researcher in the area, like MacKay, could write a book to give direction to these questions.

In spite of this, the book is a must-have for engineers and computer scientists.

Similar Books:

Title: Probability Theory: The Logic of Science
by E. T. Jaynes, G. Larry Bretthorst
ISBN: 0521592712
Publisher: Cambridge University Press
Pub. Date: 10 April, 2003
List Price(USD): $65.00
Title: Information Theory and Statistics
by Solomon Kullback
ISBN: 0486696847
Publisher: Dover Pubns
Pub. Date: 07 July, 1997
List Price(USD): $16.95
Title: The Elements of Statistical Learning
by T. Hastie, R. Tibshirani, J. H. Friedman
ISBN: 0387952845
Publisher: Springer Verlag
Pub. Date: 09 August, 2001
List Price(USD): $82.95
Title: Mathematical Foundations of Information Theory
by A. Ya. Khinchin
ISBN: 0486604349
Publisher: Dover Pubns
Pub. Date: 01 June, 1957
List Price(USD): $8.95
Title: Mathematical Theory of Communication
by Claude E. Shannon, Warren Weaver
ISBN: 0252725484
Publisher: Univ of Illinois Pr (Pro Ref)
Pub. Date: December, 1963
List Price(USD): $15.95

Thank you for visiting www.AnyBook4Less.com and enjoy your savings!

Copyright © 2001-2021
