A First Course in Information Theory (Hardcover)

Raymond W. Yeung

  • Publisher: Harcourt
  • Publication date: 2006-07-03
  • List price: USD $109.00
  • Price: $1,050
  • VIP price: $998 (5% off)
  • Language: English
  • Pages: 412
  • Binding: Hardcover
  • ISBN: 0306467917
  • ISBN-13: 9780306467912

Restocked upon order (ships in 5–7 days)


Description

Founded half a century ago, information theory is a classical yet modern field that is more vibrant than ever. In particular, a number of major research results on the foundations of the theory have appeared during the last ten years. These results enable information theory to be understood and explored in ways that were not possible before, and they open new dimensions in the theory. In short, the depth of information theory goes far beyond what we used to know.

This book integrates the most fundamental topics in information theory with a few selected advanced topics. All concepts and technicalities are explained with clarity. Except for a few classical results, the results included here cannot be found elsewhere in book form. These include the theory of the I-Measure, Shannon-type and non-Shannon-type information inequalities, and network coding theory. Some important implications of information theory for probability theory and group theory are also explained in this book.

ITIP, the software package that comes with the book, is the only software package of its kind that can prove all Shannon-type information inequalities. It is an essential tool for all information theorists.

This book is suitable for use as a textbook, or as a reference to accompany any other textbook in a course on information theory. It is also an essential reference for researchers working in related areas.

'No one since Shannon has had a better appreciation for the mathematical structure of information quantities than Prof. Yeung. ... Yeung unveils a smorgasbord of topics in modern information theory that heretofore have been available only in research papers.'

Toby Berger, Cornell University

Contents

1. The Science of Information.
2. Information Measures.
3. Zero-Error Data Compression.
4. Weak Typicality.
5. Strong Typicality.
6. The I-Measure.
7. Markov Structures.
8. Channel Capacity.
9. Rate Distortion Theory.
10. The Blahut-Arimoto Algorithms.
11. Single-Source Network Coding.
12. Information Inequalities.
13. Shannon-Type Inequalities.
    Appendix 13A: The Basic Inequalities and the Polymatroidal Axioms.
14. Beyond Shannon-Type Inequalities.
15. Multi-Source Network Coding.
    Appendix 15A: Approximation of Random Variables with Infinite Alphabets.
16. Entropy and Groups.
Bibliography. Index.