Linear Models and Generalized Linear Models, 3rd Edition (English): PDF Download

  • Author: Rao (USA)
  • Publisher: World Publishing Corporation (Beijing)
  • Year of publication: 2014
  • ISBN: 9787510086342
  • Pages: 572
Book description: This book is a monograph by the renowned statistician C. R. Rao. This expanded and revised third edition incorporates the latest results, making it an invaluable text for learning both the theory and the applications of linear models. The authors present linear models and generalized linear models under minimal assumptions, drawing not only on least squares theory but also on alternative estimation and testing methods based on convex loss functions and generalized estimating equations. Across its chapters and appendices the book covers both theoretical research and practical application, so it suits not only students but also researchers and specialists. Contents: Introduction; The Simple Linear Regression Model; The Multiple Linear Regression Model and Its Extensions; The Generalized Linear Regression Model; Exact and Stochastic Linear Restrictions; Prediction in the Generalized Regression Model; Analysis of Incomplete Data Sets…

1 Introduction 1

1.1 Linear Models and Regression Analysis 1

1.2 Plan of the Book 3

2 The Simple Linear Regression Model 7

2.1 The Linear Model 7

2.2 Least Squares Estimation 8

2.3 Direct Regression Method 10

2.4 Properties of the Direct Regression Estimators 12

2.5 Centered Model 14

2.6 No Intercept Term Model 15

2.7 Maximum Likelihood Estimation 15

2.8 Testing of Hypotheses and Confidence Interval Estimation 17

2.9 Analysis of Variance 20

2.10 Goodness of Fit of Regression 23

2.11 Reverse Regression Method 24

2.12 Orthogonal Regression Method 24

2.13 Reduced Major Axis Regression Method 27

2.14 Least Absolute Deviation Regression Method 29

2.15 Estimation of Parameters when X Is Stochastic 30
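The direct regression estimators covered in Sections 2.2-2.4 have a simple closed form. As a minimal sketch (the function name and toy data below are invented for illustration, not taken from the book):

```python
# Closed-form least-squares fit of y = b0 + b1*x (cf. Sections 2.2-2.4).
# Function name and data are illustrative assumptions, not the book's notation.

def fit_simple_ols(x, y):
    """Return the intercept b0 and slope b1 of the least-squares line."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b1 = sxy / sxx           # slope
    b0 = ybar - b1 * xbar    # intercept
    return b0, b1

# Points lying exactly on y = 1 + 2x are recovered exactly.
b0, b1 = fit_simple_ols([0, 1, 2, 3], [1, 3, 5, 7])
```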

3 The Multiple Linear Regression Model and Its Extensions 33

3.1 The Linear Model 33

3.2 The Principle of Ordinary Least Squares (OLS) 35

3.3 Geometric Properties of OLS 36

3.4 Best Linear Unbiased Estimation 38

3.4.1 Basic Theorems 38

3.4.2 Linear Estimators 43

3.4.3 Mean Dispersion Error 44

3.5 Estimation (Prediction) of the Error Term ε and σ2 45

3.6 Classical Regression under Normal Errors 46

3.6.1 The Maximum-Likelihood (ML) Principle 47

3.6.2 Maximum Likelihood Estimation in Classical Normal Regression 47

3.7 Consistency of Estimators 49

3.8 Testing Linear Hypotheses 51

3.9 Analysis of Variance 57

3.10 Goodness of Fit 59

3.11 Checking the Adequacy of Regression Analysis 61

3.11.1 Univariate Regression 61

3.11.2 Multiple Regression 61

3.11.3 A Complex Example 65

3.11.4 Graphical Presentation 69

3.12 Linear Regression with Stochastic Regressors 70

3.12.1 Regression and Multiple Correlation Coefficient 70

3.12.2 Heterogeneous Linear Estimation without Normality 72

3.12.3 Heterogeneous Linear Estimation under Normality 73

3.13 The Canonical Form 76

3.14 Identification and Quantification of Multicollinearity 77

3.14.1 Principal Components Regression 77

3.14.2 Ridge Estimation 79

3.14.3 Shrinkage Estimates 83

3.14.4 Partial Least Squares 84

3.15 Tests of Parameter Constancy 87

3.15.1 The Chow Forecast Test 88

3.15.2 The Hansen Test 91

3.15.3 Tests with Recursive Estimation 92

3.15.4 Test for Structural Change 93

3.16 Total Least Squares 96

3.17 Minimax Estimation 98

3.17.1 Inequality Restrictions 98

3.17.2 The Minimax Principle 101

3.18 Censored Regression 105

3.18.1 Overview 105

3.18.2 LAD Estimators and Asymptotic Normality 107

3.18.3 Tests of Linear Hypotheses 108

3.19 Simultaneous Confidence Intervals 110

3.20 Confidence Interval for the Ratio of Two Linear Parametric Functions 112

3.21 Nonparametric Regression 112

3.21.1 Estimation of the Regression Function 114

3.22 Classification and Regression Trees (CART) 117

3.23 Boosting and Bagging 121

3.24 Projection Pursuit Regression 124

3.25 Neural Networks and Nonparametric Regression 126

3.26 Logistic Regression and Neural Networks 127

3.27 Functional Data Analysis (FDA) 127

3.28 Restricted Regression 130

3.28.1 Problem of Selection 130

3.28.2 Theory of Restricted Regression 130

3.28.3 Efficiency of Selection 132

3.28.4 Explicit Solution in Special Cases 133

3.29 LINEX Loss Function 135

3.30 Balanced Loss Function 137

3.31 Complements 138

3.31.1 Linear Models without Moments: Exercise 138

3.31.2 Nonlinear Improvement of OLSE for Nonnormal Disturbances 139

3.31.3 A Characterization of the Least Squares Estimator 139

3.31.4 A Characterization of the Least Squares Estimator:A Lemma 140

3.32 Exercises 140
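Among the multicollinearity remedies of Section 3.14, the ridge estimator (Section 3.14.2) has the one-line form (X'X + kI)⁻¹X'y. A hedged sketch, with an invented design matrix and ridge constant:

```python
import numpy as np

# Sketch of the ridge estimator (cf. Section 3.14.2):
#   beta(k) = (X'X + k*I)^{-1} X'y.
# The data and the ridge constant k are invented for illustration.

def ridge(X, y, k):
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
y = np.array([1.0, 2.0, 3.0])

beta_ols = ridge(X, y, 0.0)    # k = 0 reduces to ordinary least squares
beta_ridge = ridge(X, y, 1.0)  # k > 0 shrinks the estimate toward zero
```

Increasing k trades bias for a smaller dispersion, which is the motivation developed alongside the shrinkage estimators of Section 3.14.3.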

4 The Generalized Linear Regression Model 143

4.1 Optimal Linear Estimation of β 144

4.1.1 R1-Optimal Estimators 145

4.1.2 R2-Optimal Estimators 149

4.1.3 R3-Optimal Estimators 150

4.2 The Aitken Estimator 151

4.3 Misspecification of the Dispersion Matrix 153

4.4 Heteroscedasticity and Autoregression 156

4.5 Mixed Effects Model: Unified Theory of Linear Estimation 164

4.5.1 Mixed Effects Model 164

4.5.2 A Basic Lemma 164

4.5.3 Estimation of Xβ (the Fixed Effect) 166

4.5.4 Prediction of Uξ (the Random Effect) 166

4.5.5 Estimation of ε 167

4.6 Linear Mixed Models with Normal Errors and Random Effects 168

4.6.1 Maximum Likelihood Estimation of Linear Mixed Models 171

4.6.2 Restricted Maximum Likelihood Estimation of Linear Mixed Models 174

4.6.3 Inference for Linear Mixed Models 178

4.7 Regression-Like Equations in Econometrics 183

4.7.1 Econometric Models 186

4.7.2 The Reduced Form 190

4.7.3 The Multivariate Regression Model 192

4.7.4 The Classical Multivariate Linear Regression Model 195

4.7.5 Stochastic Regression 196

4.7.6 Instrumental Variable Estimator 197

4.7.7 Seemingly Unrelated Regressions 198

4.7.8 Measurement Error Models 199

4.8 Simultaneous Parameter Estimation by Empirical Bayes Solutions 209

4.8.1 Overview 209

4.8.2 Estimation of Parameters from Different Linear Models 211

4.9 Supplements 215

4.10 Gauss-Markov, Aitken and Rao Least Squares Estimators 216

4.10.1 Gauss-Markov Least Squares 216

4.10.2 Aitken Least Squares 217

4.10.3 Rao Least Squares 218

4.11 Exercises 220

5 Exact and Stochastic Linear Restrictions 223

5.1 Use of Prior Information 223

5.2 The Restricted Least-Squares Estimator 225

5.3 Maximum Likelihood Estimation under Exact Restrictions 227

5.4 Stepwise Inclusion of Exact Linear Restrictions 228

5.5 Biased Linear Restrictions and MDE Comparison with the OLSE 233

5.6 MDE Matrix Comparisons of Two Biased Estimators 236

5.7 MDE Matrix Comparison of Two Linear Biased Estimators 242

5.8 MDE Comparison of Two (Biased) Restricted Estimators 243

5.9 Stein-Rule Estimators under Exact Restrictions 251

5.10 Stochastic Linear Restrictions 252

5.10.1 Mixed Estimator 252

5.10.2 Assumptions about the Dispersion Matrix 254

5.10.3 Biased Stochastic Restrictions 257

5.11 Stein-Rule Estimators under Stochastic Restrictions 261

5.12 Weakened Linear Restrictions 262

5.12.1 Weakly (R, r)-Unbiasedness 262

5.12.2 Optimal Weakly (R, r)-Unbiased Estimators 262

5.12.3 Feasible Estimators—Optimal Substitution of β in β1(β, A) 266

5.12.4 RLSE Instead of the Mixed Estimator 268

5.13 Exercises 269
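The restricted least-squares estimator of Section 5.2 adjusts the OLSE b so that an exact restriction Rβ = r holds: b_R = b + (X'X)⁻¹R'[R(X'X)⁻¹R']⁻¹(r − Rb). A sketch under invented data and an invented restriction:

```python
import numpy as np

# Sketch of the restricted least-squares estimator (cf. Section 5.2) under the
# exact linear restriction R*beta = r:
#   b_R = b + (X'X)^{-1} R' [R (X'X)^{-1} R']^{-1} (r - R*b).
# The design, response, and restriction below are invented for illustration.

def restricted_ls(X, y, R, r):
    S = np.linalg.inv(X.T @ X)
    b = S @ X.T @ y  # unrestricted OLSE
    correction = S @ R.T @ np.linalg.solve(R @ S @ R.T, r - R @ b)
    return b + correction

X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
y = np.array([1.0, 2.0, 3.0])
R = np.array([[1.0, -1.0]])  # restriction: beta_1 = beta_2
r = np.array([0.0])

beta_r = restricted_ls(X, y, R, r)  # satisfies R @ beta_r = r exactly
```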

6 Prediction in the Generalized Regression Model 271

6.1 Introduction 271

6.2 Some Simple Linear Models 271

6.2.1 The Constant Mean Model 271

6.2.2 The Linear Trend Model 272

6.2.3 Polynomial Models 273

6.3 The Prediction Model 274

6.4 Optimal Heterogeneous Prediction 275

6.5 Optimal Homogeneous Prediction 277

6.6 MDE Matrix Comparisons between Optimal and Classical Predictors 280

6.6.1 Comparison of Classical and Optimal Prediction with Respect to the y Superiority 283

6.6.2 Comparison of Classical and Optimal Predictors with Respect to the Xβ Superiority 285

6.7 Prediction Regions 287

6.7.1 Concepts and Definitions 287

6.7.2 On q-Prediction Intervals 289

6.7.3 On q-Intervals in Regression Analysis 291

6.7.4 On (p, q)-Prediction Intervals 292

6.7.5 Linear Utility Functions 294

6.7.6 Normally Distributed Populations—Two-Sided Symmetric Intervals 296

6.7.7 One-Sided Infinite Intervals 298

6.7.8 Utility and Length of Intervals 298

6.7.9 Utility and Coverage 300

6.7.10 Maximal Utility and Optimal Tests 300

6.7.11 Prediction Ellipsoids Based on the GLSE 302

6.7.12 Comparing the Efficiency of Prediction Ellipsoids 305

6.8 Simultaneous Prediction of Actual and Average Values of y 306

6.8.1 Specification of Target Function 307

6.8.2 Exact Linear Restrictions 308

6.8.3 MDEP Using Ordinary Least Squares Estimator 309

6.8.4 MDEP Using Restricted Estimator 309

6.8.5 MDEP Matrix Comparison 310

6.8.6 Stein-Rule Predictor 310

6.8.7 Outside Sample Predictions 311

6.9 Kalman Filter 314

6.9.1 Dynamical and Observational Equations 314

6.9.2 Some Theorems 314

6.9.3 Kalman Model 317

6.10 Exercises 318

7 Sensitivity Analysis 321

7.1 Introduction 321

7.2 Prediction Matrix 321

7.3 Effect of Single Observation on Estimation of Parameters 327

7.3.1 Measures Based on Residuals 328

7.3.2 Algebraic Consequences of Omitting an Observation 329

7.3.3 Detection of Outliers 330

7.4 Diagnostic Plots for Testing the Model Assumptions 334

7.5 Measures Based on the Confidence Ellipsoid 335

7.6 Partial Regression Plots 341

7.7 Regression Diagnostics for Removing an Observation with Graphics 343

7.8 Model Selection Criteria 350

7.8.1 Akaike's Information Criterion 351

7.8.2 Bayesian Information Criterion 353

7.8.3 Mallows' Cp 353

7.8.4 Example 355

7.9 Exercises 356
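The model selection criteria of Section 7.8 penalize fit by model size. As a hedged illustration of Akaike's criterion for linear models with normal errors, using the common form AIC = n·log(RSS/n) + 2p (the RSS values and sample size below are hypothetical):

```python
import math

# Illustrative comparison via Akaike's criterion (cf. Section 7.8.1) in the
# common form AIC = n*log(RSS/n) + 2p for normal-error linear models.
# The residual sums of squares and the sample size are hypothetical.

def aic(rss, n, p):
    return n * math.log(rss / n) + 2 * p

n = 50
aic_small = aic(rss=40.0, n=n, p=2)  # intercept plus one predictor
aic_large = aic(rss=39.5, n=n, p=3)  # extra predictor barely reduces RSS

# The 2p penalty outweighs the tiny RSS reduction, so the smaller model wins.
```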

8 Analysis of Incomplete Data Sets 357

8.1 Statistical Methods with Missing Data 358

8.1.1 Complete Case Analysis 358

8.1.2 Available Case Analysis 358

8.1.3 Filling in the Missing Values 359

8.1.4 Model-Based Procedures 359

8.2 Missing-Data Mechanisms 360

8.2.1 Missing Indicator Matrix 360

8.2.2 Missing Completely at Random 360

8.2.3 Missing at Random 360

8.2.4 Nonignorable Nonresponse 360

8.3 Missing Pattern 360

8.4 Missing Data in the Response 361

8.4.1 Least-Squares Analysis for Filled-up Data—Yates Procedure 362

8.4.2 Analysis of Covariance—Bartlett's Method 363

8.5 Shrinkage Estimation by Yates Procedure 364

8.5.1 Shrinkage Estimators 364

8.5.2 Efficiency Properties 365

8.6 Missing Values in the X-Matrix 367

8.6.1 General Model 367

8.6.2 Missing Values and Loss in Efficiency 368

8.7 Methods for Incomplete X-Matrices 371

8.7.1 Complete Case Analysis 371

8.7.2 Available Case Analysis 371

8.7.3 Maximum-Likelihood Methods 372

8.8 Imputation Methods for Incomplete X-Matrices 373

8.8.1 Maximum-Likelihood Estimates of Missing Values 373

8.8.2 Zero-Order Regression 374

8.8.3 First-Order Regression 375

8.8.4 Multiple Imputation 377

8.8.5 Weighted Mixed Regression 378

8.8.6 The Two-Stage WMRE 382

8.9 Assumptions about the Missing Mechanism 384

8.10 Regression Diagnostics to Identify Non-MCAR Processes 384

8.10.1 Comparison of the Means 384

8.10.2 Comparing the Variance-Covariance Matrices 385

8.10.3 Diagnostic Measures from Sensitivity Analysis 385

8.10.4 Distribution of the Measures and Test Procedure 385

8.11 Treatment of Nonignorable Nonresponse 386

8.11.1 Joint Distribution of (X, Y) with Missing Values Only in Y 386

8.11.2 Conditional Distribution of Y Given X with Missing Values Only in Y 388

8.11.3 Conditional Distribution of Y Given X with Missing Values Only in X 389

8.11.4 Other Approaches 390

8.12 Further Literature 391

8.13 Exercises 391

9 Robust Regression 393

9.1 Overview 393

9.2 Least Absolute Deviation Estimators—Univariate Case 394

9.3 M-Estimates: Univariate Case 398

9.4 Asymptotic Distributions of LAD Estimators 401

9.4.1 Univariate Case 401

9.4.2 Multivariate Case 402

9.5 General M-Estimates 403

9.6 Tests of Significance 407
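The M-estimates of Section 9.3 bound the influence of outliers. A minimal sketch of a univariate Huber location estimate computed by iteratively reweighted averaging; the tuning constant c = 1.345 is a conventional choice, and the data (with one gross outlier) are invented:

```python
# Sketch of a univariate Huber M-estimate of location (cf. Section 9.3),
# computed by iteratively reweighted averaging. The tuning constant c = 1.345
# is a common choice; the data, with one gross outlier, are invented.

def huber_location(x, c=1.345, iters=50):
    mu = sorted(x)[len(x) // 2]  # start from an order-statistic median
    for _ in range(iters):
        w = [1.0 if abs(xi - mu) <= c else c / abs(xi - mu) for xi in x]
        mu = sum(wi * xi for wi, xi in zip(w, x)) / sum(w)
    return mu

data = [0.8, 1.0, 1.2, 0.9, 1.1, 100.0]
mu = huber_location(data)     # stays near 1 despite the outlier
mean = sum(data) / len(data)  # the sample mean is dragged far above the bulk
```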

10 Models for Categorical Response Variables 411

10.1 Generalized Linear Models 411

10.1.1 Extension of the Regression Model 411

10.1.2 Structure of the Generalized Linear Model 413

10.1.3 Score Function and Information Matrix 416

10.1.4 Maximum-Likelihood Estimation 417

10.1.5 Testing of Hypotheses and Goodness of Fit 420

10.1.6 Overdispersion 421

10.1.7 Quasi Loglikelihood 423

10.2 Contingency Tables 425

10.2.1 Overview 425

10.2.2 Ways of Comparing Proportions 427

10.2.3 Sampling in Two-Way Contingency Tables 429

10.2.4 Likelihood Function and Maximum-Likelihood Estimates 430

10.2.5 Testing the Goodness of Fit 432

10.3 GLM for Binary Response 435

10.3.1 Logit Models and Logistic Regression 435

10.3.2 Testing the Model 437

10.3.3 Distribution Function as a Link Function 438

10.4 Logit Models for Categorical Data 439

10.5 Goodness of Fit—Likelihood-Ratio Test 440

10.6 Loglinear Models for Categorical Variables 441

10.6.1 Two-Way Contingency Tables 441

10.6.2 Three-Way Contingency Tables 444

10.7 The Special Case of Binary Response 448

10.8 Coding of Categorical Explanatory Variables 450

10.8.1 Dummy and Effect Coding 450

10.8.2 Coding of Response Models 453

10.8.3 Coding of Models for the Hazard Rate 455

10.9 Extensions to Dependent Binary Variables 457

10.9.1 Overview 458

10.9.2 Modeling Approaches for Correlated Response 460

10.9.3 Quasi-Likelihood Approach for Correlated Binary Response 460

10.9.4 The GEE Method by Liang and Zeger 462

10.9.5 Properties of the GEE Estimate βG 463

10.9.6 Efficiency of the GEE and IEE Methods 465

10.9.7 Choice of the Quasi-Correlation Matrix Rt(α) 465

10.9.8 Bivariate Binary Correlated Response Variables 466

10.9.9 The GEE Method 467

10.9.10 The IEE Method 468

10.9.11 An Example from the Field of Dentistry 469

10.9.12 Full Likelihood Approach for Marginal Models 474

10.10 Exercises 486
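Maximum-likelihood fitting of the logit model (Sections 10.1.4 and 10.3.1) is typically done by Newton-Raphson, i.e. iteratively reweighted least squares using the score X'(y − p) and information matrix X'WX. A hedged sketch on invented toy data:

```python
import numpy as np

# Sketch of maximum-likelihood fitting for the logit model (cf. Sections 10.1.4
# and 10.3.1) by Newton-Raphson / iteratively reweighted least squares.
# The toy data are invented for illustration.

def fit_logit(X, y, iters=25):
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))              # fitted probabilities
        W = p * (1.0 - p)                                # diagonal of the weight matrix
        H = X.T @ (X * W[:, None])                       # information matrix X'WX
        beta = beta + np.linalg.solve(H, X.T @ (y - p))  # Newton step on the score
    return beta

X = np.column_stack([np.ones(6), [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]])
y = np.array([0.0, 0.0, 1.0, 0.0, 1.0, 1.0])
beta = fit_logit(X, y)  # positive slope: success probability rises with x
```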

A Matrix Algebra 489

A.1 Overview 489

A.2 Trace of a Matrix 491

A.3 Determinant of a Matrix 492

A.4 Inverse of a Matrix 494

A.5 Orthogonal Matrices 495

A.6 Rank of a Matrix 495

A.7 Range and Null Space 496

A.8 Eigenvalues and Eigenvectors 496

A.9 Decomposition of Matrices 498

A.10 Definite Matrices and Quadratic Forms 501

A.11 Idempotent Matrices 507

A.12 Generalized Inverse 508

A.13 Projectors 516

A.14 Functions of Normally Distributed Variables 517

A.15 Differentiation of Scalar Functions of Matrices 520

A.16 Miscellaneous Results, Stochastic Convergence 523

B Tables 527

C Software for Linear Regression Models 531

C.1 Software 531

C.2 Special-Purpose Software 536

C.3 Resources 537

References 539

Index 563