Adaptive Filter Theory, 5th Edition (English Edition)



  • E-book points: 24
  • Author: Simon Haykin (Canada)
  • Publisher: Publishing House of Electronics Industry, Beijing
  • Year of publication: 2017
  • ISBN: 9787121322518
  • Pages: 908
Book description: This book is a classic textbook in the field of adaptive signal processing. In 17 chapters it presents the basic theory and methods of adaptive signal processing systematically, comprehensively, and accessibly, reflecting recent theory, techniques, and applications in the field. Topics include: stochastic processes and models, Wiener filters, linear prediction, the method of steepest descent, the method of stochastic gradient descent, the least-mean-square (LMS) algorithm, the normalized LMS algorithm and its generalization, block-adaptive filters, the method of least squares, the recursive least-squares (RLS) algorithm, robustness, finite-precision effects, adaptation in nonstationary environments, Kalman filters, square-root adaptive filtering algorithms, order-recursive adaptive filtering algorithms, and blind deconvolution, together with their applications in communication and information systems.
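As a taste of the book's subject matter, the least-mean-square (LMS) algorithm named in the description above can be sketched in a few lines. This is a minimal illustrative example, not code from the book: the filter length, step size, and test signal below are arbitrary choices for a noiseless system-identification setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown FIR system to identify (illustrative coefficients).
h_true = np.array([0.5, -0.3, 0.1])
M = len(h_true)    # adaptive filter length
mu = 0.05          # step size
n_samples = 5000

x = rng.standard_normal(n_samples)       # white input signal
d = np.convolve(x, h_true)[:n_samples]   # desired (reference) signal

w = np.zeros(M)  # adaptive filter weights
for n in range(M, n_samples):
    u = x[n:n - M:-1]      # most recent M input samples, newest first
    e = d[n] - w @ u       # a priori estimation error
    w = w + mu * e * u     # LMS weight update

print(np.round(w, 3))  # weights converge toward h_true
```

The update costs only O(M) operations per sample, which is the simplicity-versus-convergence-speed trade-off that motivates the book's later chapters on RLS and order-recursive algorithms.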
Contents of Adaptive Filter Theory, 5th Edition (English Edition)

Background and Preview 19

1.The Filtering Problem 19

2.Linear Optimum Filters 22

3.Adaptive Filters 22

4.Linear Filter Structures 24

5.Approaches to the Development of Linear Adaptive Filters 30

6.Adaptive Beamforming 31

7.Four Classes of Applications 35

8.Historical Notes 38

Chapter 1 Stochastic Processes and Models 48

1.1 Partial Characterization of a Discrete-Time Stochastic Process 48

1.2 Mean Ergodic Theorem 50

1.3 Correlation Matrix 52

1.4 Correlation Matrix of Sine Wave Plus Noise 57

1.5 Stochastic Models 58

1.6 Wold Decomposition 64

1.7 Asymptotic Stationarity of an Autoregressive Process 67

1.8 Yule-Walker Equations 69

1.9 Computer Experiment: Autoregressive Process of Order Two 70

1.10 Selecting the Model Order 78

1.11 Complex Gaussian Processes 81

1.12 Power Spectral Density 83

1.13 Properties of Power Spectral Density 85

1.14 Transmission of a Stationary Process Through a Linear Filter 87

1.15 Cramér Spectral Representation for a Stationary Process 90

1.16 Power Spectrum Estimation 92

1.17 Other Statistical Characteristics of a Stochastic Process 95

1.18 Polyspectra 96

1.19 Spectral-Correlation Density 99

1.20 Summary and Discussion 102

Problems 103

Chapter 2 Wiener Filters 108

2.1 Linear Optimum Filtering: Statement of the Problem 108

2.2 Principle of Orthogonality 110

2.3 Minimum Mean-Square Error 114

2.4 Wiener-Hopf Equations 116

2.5 Error-Performance Surface 118

2.6 Multiple Linear Regression Model 122

2.7 Example 124

2.8 Linearly Constrained Minimum-Variance Filter 129

2.9 Generalized Sidelobe Cancellers 134

2.10 Summary and Discussion 140

Problems 142

Chapter 3 Linear Prediction 150

3.1 Forward Linear Prediction 150

3.2 Backward Linear Prediction 157

3.3 Levinson-Durbin Algorithm 162

3.4 Properties of Prediction-Error Filters 171

3.5 Schur-Cohn Test 180

3.6 Autoregressive Modeling of a Stationary Stochastic Process 182

3.7 Cholesky Factorization 185

3.8 Lattice Predictors 188

3.9 All-Pole, All-Pass Lattice Filter 193

3.10 Joint-Process Estimation 195

3.11 Predictive Modeling of Speech 199

3.12 Summary and Discussion 206

Problems 207

Chapter 4 Method of Steepest Descent 217

4.1 Basic Idea of the Steepest-Descent Algorithm 217

4.2 The Steepest-Descent Algorithm Applied to the Wiener Filter 218

4.3 Stability of the Steepest-Descent Algorithm 222

4.4 Example 227

4.5 The Steepest-Descent Algorithm Viewed as a Deterministic Search Method 239

4.6 Virtue and Limitation of the Steepest-Descent Algorithm 240

4.7 Summary and Discussion 241

Problems 242

Chapter 5 Method of Stochastic Gradient Descent 246

5.1 Principles of Stochastic Gradient Descent 246

5.2 Application 1: Least-Mean-Square (LMS) Algorithm 248

5.3 Application 2: Gradient-Adaptive Lattice Filtering Algorithm 255

5.4 Other Applications of Stochastic Gradient Descent 262

5.5 Summary and Discussion 263

Problems 264

Chapter 6 The Least-Mean-Square (LMS) Algorithm 266

6.1 Signal-Flow Graph 266

6.2 Optimality Considerations 268

6.3 Applications 270

6.4 Statistical Learning Theory 290

6.5 Transient Behavior and Convergence Considerations 301

6.6 Efficiency 304

6.7 Computer Experiment on Adaptive Prediction 306

6.8 Computer Experiment on Adaptive Equalization 311

6.9 Computer Experiment on a Minimum-Variance Distortionless-Response Beamformer 320

6.10 Summary and Discussion 324

Problems 326

Chapter 7 Normalized Least-Mean-Square (LMS) Algorithm and Its Generalization 333

7.1 Normalized LMS Algorithm: The Solution to a Constrained Optimization Problem 333

7.2 Stability of the Normalized LMS Algorithm 337

7.3 Step-Size Control for Acoustic Echo Cancellation 340

7.4 Geometric Considerations Pertaining to the Convergence Process for Real-Valued Data 345

7.5 Affine Projection Adaptive Filters 348

7.6 Summary and Discussion 352

Problems 353

Chapter 8 Block-Adaptive Filters 357

8.1 Block-Adaptive Filters: Basic Ideas 358

8.2 Fast Block LMS Algorithm 362

8.3 Unconstrained Frequency-Domain Adaptive Filters 368

8.4 Self-Orthogonalizing Adaptive Filters 369

8.5 Computer Experiment on Adaptive Equalization 379

8.6 Subband Adaptive Filters 385

8.7 Summary and Discussion 393

Problems 394

Chapter 9 Method of Least-Squares 398

9.1 Statement of the Linear Least-Squares Estimation Problem 398

9.2 Data Windowing 401

9.3 Principle of Orthogonality Revisited 402

9.4 Minimum Sum of Error Squares 405

9.5 Normal Equations and Linear Least-Squares Filters 406

9.6 Time-Average Correlation Matrix Φ 409

9.7 Reformulation of the Normal Equations in Terms of Data Matrices 411

9.8 Properties of Least-Squares Estimates 415

9.9 Minimum-Variance Distortionless Response (MVDR) Spectrum Estimation 419

9.10 Regularized MVDR Beamforming 422

9.11 Singular-Value Decomposition 427

9.12 Pseudoinverse 434

9.13 Interpretation of Singular Values and Singular Vectors 436

9.14 Minimum-Norm Solution to the Linear Least-Squares Problem 437

9.15 Normalized LMS Algorithm Viewed as the Minimum-Norm Solution to an Underdetermined Least-Squares Estimation Problem 440

9.16 Summary and Discussion 442

Problems 443

Chapter 10 The Recursive Least-Squares (RLS) Algorithm 449

10.1 Some Preliminaries 449

10.2 The Matrix Inversion Lemma 453

10.3 The Exponentially Weighted RLS Algorithm 454

10.4 Selection of the Regularization Parameter 457

10.5 Updated Recursion for the Sum of Weighted Error Squares 459

10.6 Example: Single-Weight Adaptive Noise Canceller 461

10.7 Statistical Learning Theory 462

10.8 Efficiency 467

10.9 Computer Experiment on Adaptive Equalization 468

10.10 Summary and Discussion 471

Problems 472

Chapter 11 Robustness 474

11.1 Robustness, Adaptation, and Disturbances 474

11.2 Robustness:Preliminary Considerations Rooted in H∞ Optimization 475

11.3 Robustness of the LMS Algorithm 478

11.4 Robustness of the RLS Algorithm 483

11.5 Comparative Evaluations of the LMS and RLS Algorithms from the Perspective of Robustness 488

11.6 Risk-Sensitive Optimality 488

11.7 Trade-Offs Between Robustness and Efficiency 490

11.8 Summary and Discussion 492

Problems 492

Chapter 12 Finite-Precision Effects 497

12.1 Quantization Errors 498

12.2 Least-Mean-Square (LMS) Algorithm 500

12.3 Recursive Least-Squares (RLS) Algorithm 509

12.4 Summary and Discussion 515

Problems 516

Chapter 13 Adaptation in Nonstationary Environments 518

13.1 Causes and Consequences of Nonstationarity 518

13.2 The System Identification Problem 519

13.3 Degree of Nonstationarity 522

13.4 Criteria for Tracking Assessment 523

13.5 Tracking Performance of the LMS Algorithm 525

13.6 Tracking Performance of the RLS Algorithm 528

13.7 Comparison of the Tracking Performance of LMS and RLS Algorithms 532

13.8 Tuning of Adaptation Parameters 536

13.9 Incremental Delta-Bar-Delta (IDBD) Algorithm 538

13.10 Autostep Method 544

13.11 Computer Experiment: Mixture of Stationary and Nonstationary Environmental Data 548

13.12 Summary and Discussion 552

Problems 553

Chapter 14 Kalman Filters 558

14.1 Recursive Minimum Mean-Square Estimation for Scalar Random Variables 559

14.2 Statement of the Kalman Filtering Problem 562

14.3 The Innovations Process 565

14.4 Estimation of the State Using the Innovations Process 567

14.5 Filtering 573

14.6 Initial Conditions 575

14.7 Summary of the Kalman Filter 576

14.8 Optimality Criteria for Kalman Filtering 577

14.9 Kalman Filter as the Unifying Basis for RLS Algorithms 579

14.10 Covariance Filtering Algorithm 584

14.11 Information Filtering Algorithm 586

14.12 Summary and Discussion 589

Problems 590

Chapter 15 Square-Root Adaptive Filtering Algorithms 594

15.1 Square-Root Kalman Filters 594

15.2 Building Square-Root Adaptive Filters on the Two Kalman Filter Variants 600

15.3 QRD-RLS Algorithm 601

15.4 Adaptive Beamforming 609

15.5 Inverse QRD-RLS Algorithm 616

15.6 Finite-Precision Effects 619

15.7 Summary and Discussion 620

Problems 621

Chapter 16 Order-Recursive Adaptive Filtering Algorithm 625

16.1 Order-Recursive Adaptive Filters Using Least-Squares Estimation: An Overview 626

16.2 Adaptive Forward Linear Prediction 627

16.3 Adaptive Backward Linear Prediction 630

16.4 Conversion Factor 633

16.5 Least-Squares Lattice (LSL) Predictor 636

16.6 Angle-Normalized Estimation Errors 646

16.7 First-Order State-Space Models for Lattice Filtering 650

16.8 QR-Decomposition-Based Least-Squares Lattice (QRD-LSL) Filters 655

16.9 Fundamental Properties of the QRD-LSL Filter 662

16.10 Computer Experiment on Adaptive Equalization 667

16.11 Recursive LSL Filters Using A Posteriori Estimation Errors 672

16.12 Recursive LSL Filters Using A Priori Estimation Errors with Error Feedback 675

16.13 Relation Between Recursive LSL and RLS Algorithms 680

16.14 Finite-Precision Effects 683

16.15 Summary and Discussion 685

Problems 687

Chapter 17 Blind Deconvolution 694

17.1 Overview of Blind Deconvolution 694

17.2 Channel Identifiability Using Cyclostationary Statistics 699

17.3 Subspace Decomposition for Fractionally Spaced Blind Identification 700

17.4 Bussgang Algorithm for Blind Equalization 714

17.5 Extension of the Bussgang Algorithm to Complex Baseband Channels 731

17.6 Special Cases of the Bussgang Algorithm 732

17.7 Fractionally Spaced Bussgang Equalizers 736

17.8 Estimation of Unknown Probability Distribution Function of Signal Source 741

17.9 Summary and Discussion 745

Problems 746

Epilogue 750

1. Robustness, Efficiency, and Complexity 750

2. Kernel-Based Nonlinear Adaptive Filtering 753

Appendix A Theory of Complex Variables 770

A.1 Cauchy-Riemann Equations 770

A.2 Cauchy's Integral Formula 772

A.3 Laurent's Series 774

A.4 Singularities and Residues 776

A.5 Cauchy's Residue Theorem 777

A.6 Principle of the Argument 778

A.7 Inversion Integral for the z-Transform 781

A.8 Parseval's Theorem 783

Appendix B Wirtinger Calculus for Computing Complex Gradients 785

B.1 Wirtinger Calculus: Scalar Gradients 785

B.2 Generalized Wirtinger Calculus:Gradient Vectors 788

B.3 Another Approach to Compute Gradient Vectors 790

B.4 Expressions for the Partial Derivatives ∂f/∂z and ∂f/∂z* 791

Appendix C Method of Lagrange Multipliers 792

C.1 Optimization Involving a Single Equality Constraint 792

C.2 Optimization Involving Multiple Equality Constraints 793

C.3 Optimum Beamformer 794

Appendix D Estimation Theory 795

D.1 Likelihood Function 795

D.2 Cramér-Rao Inequality 796

D.3 Properties of Maximum-Likelihood Estimators 797

D.4 Conditional Mean Estimator 798

Appendix E Eigenanalysis 800

E.1 The Eigenvalue Problem 800

E.2 Properties of Eigenvalues and Eigenvectors 802

E.3 Low-Rank Modeling 816

E.4 Eigenfilters 820

E.5 Eigenvalue Computations 822

Appendix F Langevin Equation of Nonequilibrium Thermodynamics 825

F.1 Brownian Motion 825

F.2 Langevin Equation 825

Appendix G Rotations and Reflections 827

G.1 Plane Rotations 827

G.2 Two-Sided Jacobi Algorithm 829

G.3 Cyclic Jacobi Algorithm 835

G.4 Householder Transformation 838

G.5 The QR Algorithm 841

Appendix H Complex Wishart Distribution 848

H.1 Definition 848

H.2 The Chi-Square Distribution as a Special Case 849

H.3 Properties of the Complex Wishart Distribution 850

H.4 Expectation of the Inverse Correlation Matrix Φ⁻¹(n) 851

Glossary 852

Text Conventions 852

Abbreviations 855

Principal Symbols 858

Bibliography 864

Suggested Reading 879

Index 897
