Part I THEORY 1
1 Introduction 3
1.1 Distribution of extremes in random fields 3
1.2 Outline of the method 7
1.3 Gaussian and asymptotically Gaussian random fields 9
1.4 Applications 11
2 Basic examples 15
2.1 Introduction 15
2.2 A power-one sequential test 15
2.3 A kernel-based scanning statistic 24
2.4 Other methods 38
3 Approximation of the local rate 41
3.1 Introduction 41
3.2 Preliminary localization and approximation 43
3.2.1 Localization 43
3.2.2 A discrete approximation 46
3.3 Measure transformation 51
3.4 Application of the localization theorem 55
3.4.1 Checking Condition I 57
3.4.2 Checking Condition V 57
3.4.3 Checking Condition IV 58
3.4.4 Checking Condition II 59
3.4.5 Checking Condition III 63
3.5 Integration 67
4 From the local to the global 71
4.1 Introduction 71
4.2 Poisson approximation of probabilities 72
4.3 Average run length to false alarm 78
5 The localization theorem 87
5.1 Introduction 87
5.2 A simplified version of the localization theorem 88
5.3 The localization theorem 90
5.4 A local limit theorem 95
5.5 Edge effects and higher order approximations 100
Part II APPLICATIONS 103
6 Nonparametric tests: Kolmogorov-Smirnov and Peacock 105
6.1 Introduction 105
6.1.1 Classical analysis of the Kolmogorov-Smirnov test 106
6.1.2 Peacock's test 108
6.2 Analysis of the one-dimensional case 109
6.2.1 Preliminary localization 110
6.2.2 An approximation by a discrete grid 111
6.2.3 Measure transformation 114
6.2.4 The asymptotic distribution of the local field and the global term 115
6.2.5 Application of the localization theorem and integration 117
6.2.6 Checking the conditions of the localization theorem 119
6.3 Peacock's test 120
6.4 Relations to scanning statistics 123
7 Copy number variations 125
7.1 Introduction 125
7.2 The statistical model 127
7.3 Analysis of statistical properties 131
7.3.1 The alternative distribution 131
7.3.2 Preliminary localization and approximation 132
7.3.3 Measure transformation 132
7.3.4 The localization theorem and the local limit theorem 133
7.3.5 Checking Condition V* 137
7.3.6 Checking Condition II* 137
7.4 The false discovery rate 140
8 Sequential monitoring of an image 143
8.1 Introduction 143
8.2 The statistical model 146
8.3 Analysis of statistical properties 148
8.3.1 Preliminary localization 149
8.3.2 Measure transformation, the localization theorem, and integration 155
8.3.3 Checking the conditions of the localization theorem 157
8.3.4 Checking Condition V 157
8.3.5 Checking Condition IV 158
8.3.6 Checking Condition II 159
8.4 Optimal change-point detection 161
9 Buffer overflow 165
9.1 Introduction 165
9.2 The statistical model 169
9.2.1 The process of demand from a single source 169
9.2.2 The integrated process of demand 171
9.3 Analysis of statistical properties 172
9.3.1 The large deviation factor 172
9.3.2 Preliminary localization 174
9.3.3 Approximation by a cruder grid 175
9.3.4 Measure transformation 179
9.3.5 The localization theorem 180
9.3.6 Integration 183
9.3.7 Checking the conditions of the localization theorem 184
9.3.8 Checking Condition IV 184
9.3.9 Checking Condition V 185
9.3.10 Checking Condition II 185
9.4 Heavy tail distribution, long-range dependence, and self-similarity 186
10 Computing Pickands' constants 191
10.1 Introduction 191
10.1.1 The double-sum method 192
10.1.2 The method based on the likelihood ratio identity 193
10.1.3 Pickands' constants 195
10.2 Representations of constants 196
10.3 Analysis of statistical error 199
10.4 Enumerating the effect of local fluctuations 204
Appendix: Mathematical background 209
A.1 Transforms 209
A.2 Approximations of sums of independent random elements 211
A.3 Concentration inequalities 214
A.4 Random walks 215
A.5 Renewal theory 215
A.6 The Gaussian distribution 216
A.7 Large sample inference 217
A.8 Integration 218
A.9 Poisson approximation 219
A.10 Convexity 220
References 221
Index 223