Introduction to the Mathematical and Statistical Foundations of Econometrics

Herman J. Bierens

  • Publisher: Cambridge
  • Publication date: 2004-12-20
  • List price: $900
  • Member price: $855 (5% off)
  • Language: English
  • Pages: 344
  • Binding: Paperback
  • ISBN: 0521542243
  • ISBN-13: 9780521542241

The focus of this book is on clarifying the mathematical and statistical foundations of econometrics. To that end, the text provides proofs of, or at least motivations for, all the mathematical and statistical results needed to understand modern econometric theory. In this respect it differs from other econometrics textbooks.


Table of Contents:

1. Probability and measure: 1.1 The Texas lotto; 1.2 Quality control; 1.3 Why do we need sigma-algebras of events?; 1.4 Properties of algebras and sigma-algebras; 1.5 Properties of probability measures; 1.6 The uniform probability measure; 1.7 Lebesgue measure and Lebesgue integral; 1.8 Random variables and their distributions; 1.9 Density functions; 1.10 Conditional probability, Bayes's rule, and independence; 1.11 Exercises; 1.A Common structure of the proofs of Theorems 6 and 10; 1.B Extension of an outer measure to a probability measure.

2. Borel measurability, integration, and mathematical expectations: 2.1 Introduction; 2.2 Borel measurability; 2.3 Integral of Borel measurable functions with respect to a probability measure; 2.4 General measurability and integrals of random variables with respect to probability measures; 2.5 Mathematical expectation; 2.6 Some useful inequalities involving mathematical expectations; 2.7 Expectations of products of independent random variables; 2.8 Moment generating functions and characteristic functions; 2.9 Exercises; 2.A Uniqueness of characteristic functions.

3. Conditional expectations: 3.1 Introduction; 3.2 Properties of conditional expectations; 3.3 Conditional probability measures and conditional independence; 3.4 Conditioning on increasing sigma-algebras; 3.5 Conditional expectations as the best forecast schemes; 3.6 Exercises; 3.A Proof of Theorem 3.12.

4. Distributions and transformations: 4.1 Discrete distributions; 4.2 Transformations of discrete random vectors; 4.3 Transformations of absolutely continuous random variables; 4.4 Transformations of absolutely continuous random vectors; 4.5 The normal distribution; 4.6 Distributions related to the normal distribution; 4.7 The uniform distribution and its relation to the standard normal distribution; 4.8 The gamma distribution; 4.9 Exercises; 4.A Tedious derivations; 4.B Proof of Theorem 4.4.

5. The multivariate normal distribution and its application to statistical inference: 5.1 Expectation and variance of random vectors; 5.2 The multivariate normal distribution; 5.3 Conditional distributions of multivariate normal random variables; 5.4 Independence of linear and quadratic transformations of multivariate normal random variables; 5.5 Distribution of quadratic forms of multivariate normal random variables; 5.6 Applications to statistical inference under normality; 5.7 Applications to regression analysis; 5.8 Exercises; 5.A Proof of Theorem 5.8.

6. Modes of convergence: 6.1 Introduction; 6.2 Convergence in probability and the weak law of large numbers; 6.3 Almost sure convergence, and the strong law of large numbers; 6.4 The uniform law of large numbers and its applications; 6.5 Convergence in distribution; 6.6 Convergence of characteristic functions; 6.7 The central limit theorem; 6.8 Stochastic boundedness, tightness, and the Op and op notations; 6.9 Asymptotic normality of M-estimators; 6.10 Hypotheses testing; 6.11 Exercises; 6.A Proof of the uniform weak law of large numbers; 6.B Almost sure convergence and strong laws of large numbers; 6.C Convergence of characteristic functions and distributions.

7. Dependent laws of large numbers and central limit theorems: 7.1 Stationarity and the Wold decomposition; 7.2 Weak laws of large numbers for stationary processes; 7.3 Mixing conditions; 7.4 Uniform weak laws of large numbers; 7.5 Dependent central limit theorems; 7.6 Exercises; 7.A Hilbert spaces.

8. Maximum likelihood theory: 8.1 Introduction; 8.2 Likelihood functions; 8.3 Examples; 8.4 Asymptotic properties of ML estimators; 8.5 Testing parameter restrictions; 8.6 Exercises.