Advances in Minimum Description Length: Theory and Applications (Hardcover)

Peter D. Grünwald, In Jae Myung, Mark A. Pitt

  • Publisher: MIT
  • Publication date: 2005-02-25
  • Price: $1,550
  • Language: English
  • Pages: 372
  • Binding: Hardcover
  • ISBN: 0262072629
  • ISBN-13: 9780262072625
  • In stock, ships immediately (limited; stock = 5)


The process of inductive inference -- inferring general laws and principles from particular instances -- is the basis of statistical modeling, pattern recognition, and machine learning. The Minimum Description Length (MDL) principle, a powerful method of inductive inference, holds that the best explanation, given a limited set of observed data, is the one that permits the greatest compression of the data -- that the more we are able to compress the data, the more we learn about the regularities underlying the data. Advances in Minimum Description Length is a sourcebook that will introduce the scientific community to the foundations of MDL, recent theoretical advances, and practical applications.
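The compression idea above is often made concrete as a "two-part code": the total description length of the data is the cost of describing a hypothesis plus the cost of describing the data with the help of that hypothesis, and MDL selects the hypothesis minimizing the sum. The sketch below (not taken from the book; the coin-flip data, the 10-bit parameter cost, and the candidate probabilities are illustrative assumptions) shows how a biased-coin model can beat a fair-coin model on skewed data even after paying for its extra parameter:

```python
import math

def data_bits(seq, p):
    # Shannon code length (in bits) of a 0/1 sequence under Bernoulli(p):
    # each symbol x costs -log2 P(x).
    bits = 0.0
    for x in seq:
        q = p if x == 1 else 1.0 - p
        bits += -math.log2(q)
    return bits

def two_part_length(seq, p, param_bits):
    # Two-part MDL code: L(hypothesis) + L(data | hypothesis).
    return param_bits + data_bits(seq, p)

# Illustrative data: 90 heads, 10 tails.
seq = [1] * 90 + [0] * 10

# Hypothesis A: fair coin. No parameter to encode, but every flip costs 1 bit.
len_fair = two_part_length(seq, 0.5, param_bits=0)      # 100 bits

# Hypothesis B: biased coin with p = 0.9, with an assumed 10-bit
# cost for encoding the parameter to fixed precision.
len_biased = two_part_length(seq, 0.9, param_bits=10)

# The biased model compresses the data enough to pay for its parameter,
# so MDL prefers it on this sample.
print(len_fair, len_biased)
```

On near-uniform data the parameter cost is not repaid and the fair-coin hypothesis wins, which is how the two-part code penalizes needless model complexity.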

The book begins with an extensive tutorial on MDL, covering its theoretical underpinnings, practical implications, various interpretations, and underlying philosophy. The tutorial includes a brief history of MDL -- from its roots in the notion of Kolmogorov complexity to the beginning of MDL proper. The book then presents recent theoretical advances, introducing modern MDL methods in a way that is accessible to readers from many different scientific fields. The book concludes with examples of how to apply MDL in research settings that range from bioinformatics and machine learning to psychology.

Peter D. Grünwald is a researcher at CWI, the National Research Institute for Mathematics and Computer Science, Amsterdam, the Netherlands. He is also affiliated with EURANDOM, the European Research Institute for the Study of Stochastic Phenomena, Eindhoven, the Netherlands.

In Jae Myung is Professor in the Department of Psychology and a member of the Center for Cognitive Science at Ohio State University.

Mark A. Pitt is Professor in the Department of Psychology and a member of the Center for Cognitive Science at Ohio State University.


Table of Contents:

Series Foreword vii
Preface ix
I Introductory Chapters 1
1 Introducing the Minimum Description Length Principle
Peter D. Grünwald
2 Minimum Description Length Tutorial
Peter D. Grünwald
3 MDL, Bayesian Inference, and the Geometry of the Space of Probability Distributions
Vijay Balasubramanian
4 Hypothesis Testing for Poisson vs. Geometric Distributions Using Stochastic Complexity
Aaron D. Lanterman
5 Applications of MDL to Selected Families of Models
Andrew J. Hanson and Philip Chi-Wing Fu
6 Algorithmic Statistics and Kolmogorov's Structure Functions
Paul Vitányi
II Theoretical Advances 175
7 Exact Minimax Predictive Density Estimation and MDL
Feng Liang and Andrew Barron
8 The Contribution of Parameters to Stochastic Complexity
Dean P. Foster and Robert A. Stine
9 Extended Stochastic Complexity and Its Applications to Learning
Kenji Yamanishi
10 Kolmogorov's Structure Function in MDL Theory and Lossy Data Compression
Jorma Rissanen and Ioan Tabus
III Practical Applications 263
11 Minimum Message Length and Generalized Bayesian Nets with Asymmetric Languages
Joshua W. Comley and David L. Dowe
12 Simultaneous Clustering and Subset Selection via MDL
Rebecka Jörnsten and Bin Yu
13 An MDL Framework for Data Clustering
Petri Kontkanen, Petri Myllymäki, Wray Buntine, Jorma Rissanen and Henry Tirri
14 Minimum Description Length and Psychological Clustering Models
Michael D. Lee and Daniel J. Navarro
15 A Minimum Description Length Principle for Perception
Nick Chater
16 Minimum Description Length and Cognitive Modeling
Yong Su, In Jae Myung and Mark A. Pitt
Index 435