1 Introduction to digital communication 1
1.1 Standardized interfaces and layering 3
1.2 Communication sources 5
1.2.1 Source coding 6
1.3 Communication channels 7
1.3.1 Channel encoding (modulation) 10
1.3.2 Error correction 11
1.4 Digital interface 12
1.4.1 Network aspects of the digital interface 12
1.5 Supplementary reading 14
2 Coding for discrete sources 16
2.1 Introduction 16
2.2 Fixed-length codes for discrete sources 18
2.3 Variable-length codes for discrete sources 19
2.3.1 Unique decodability 20
2.3.2 Prefix-free codes for discrete sources 21
2.3.3 The Kraft inequality for prefix-free codes 23
2.4 Probability models for discrete sources 26
2.4.1 Discrete memoryless sources 26
2.5 Minimum L̄ for prefix-free codes 27
2.5.1 Lagrange multiplier solution for the minimum L̄ 28
2.5.2 Entropy bounds on L̄ 29
2.5.3 Huffman's algorithm for optimal source codes 31
2.6 Entropy and fixed-to-variable-length codes 35
2.6.1 Fixed-to-variable-length codes 37
2.7 The AEP and the source coding theorems 38
2.7.1 The weak law of large numbers 39
2.7.2 The asymptotic equipartition property 40
2.7.3 Source coding theorems 43
2.7.4 The entropy bound for general classes of codes 44
2.8 Markov sources 46
2.8.1 Coding for Markov sources 48
2.8.2 Conditional entropy 48
2.9 Lempel-Ziv universal data compression 51
2.9.1 The LZ77 algorithm 51
2.9.2 Why LZ77 works 53
2.9.3 Discussion 54
2.10 Summary of discrete source coding 55
2.11 Exercises 56
3 Quantization 67
3.1 Introduction to quantization 67
3.2 Scalar quantization 68
3.2.1 Choice of intervals for given representation points 69
3.2.2 Choice of representation points for given intervals 69
3.2.3 The Lloyd-Max algorithm 70
3.3 Vector quantization 72
3.4 Entropy-coded quantization 73
3.5 High-rate entropy-coded quantization 75
3.6 Differential entropy 76
3.7 Performance of uniform high-rate scalar quantizers 78
3.8 High-rate two-dimensional quantizers 81
3.9 Summary of quantization 84
3.10 Appendixes 85
3.10.1 Nonuniform scalar quantizers 85
3.10.2 Nonuniform 2D quantizers 87
3.11 Exercises 88
4 Source and channel waveforms 93
4.1 Introduction 93
4.1.1 Analog sources 93
4.1.2 Communication channels 95
4.2 Fourier series 96
4.2.1 Finite-energy waveforms 98
4.3 ℒ2 functions and Lebesgue integration over [-T/2, T/2] 101
4.3.1 Lebesgue measure for a union of intervals 102
4.3.2 Measure for more general sets 104
4.3.3 Measurable functions and integration over [-T/2, T/2] 106
4.3.4 Measurability of functions defined by other functions 108
4.3.5 ℒ1 and ℒ2 functions over [-T/2, T/2] 108
4.4 Fourier series for ℒ2 waveforms 109
4.4.1 The T-spaced truncated sinusoid expansion 111
4.5 Fourier transforms and ℒ2 waveforms 114
4.5.1 Measure and integration over R 116
4.5.2 Fourier transforms of ℒ2 functions 118
4.6 The DTFT and the sampling theorem 120
4.6.1 The discrete-time Fourier transform 121
4.6.2 The sampling theorem 122
4.6.3 Source coding using sampled waveforms 124
4.6.4 The sampling theorem for [Δ-W, Δ+W] 125
4.7 Aliasing and the sinc-weighted sinusoid expansion 126
4.7.1 The T-spaced sinc-weighted sinusoid expansion 127
4.7.2 Degrees of freedom 128
4.7.3 Aliasing: a time-domain approach 129
4.7.4 Aliasing: a frequency-domain approach 130
4.8 Summary 132
4.9 Appendix: Supplementary material and proofs 133
4.9.1 Countable sets 133
4.9.2 Finite unions of intervals over [-T/2, T/2] 135
4.9.3 Countable unions and outer measure over [-T/2, T/2] 136
4.9.4 Arbitrary measurable sets over [-T/2, T/2] 139
4.10 Exercises 143
5 Vector spaces and signal space 153
5.1 Axioms and basic properties of vector spaces 154
5.1.1 Finite-dimensional vector spaces 156
5.2 Inner product spaces 158
5.2.1 The inner product spaces R^n and C^n 158
5.2.2 One-dimensional projections 159
5.2.3 The inner product space of ℒ2 functions 161
5.2.4 Subspaces of inner product spaces 162
5.3 Orthonormal bases and the projection theorem 163
5.3.1 Finite-dimensional projections 164
5.3.2 Corollaries of the projection theorem 165
5.3.3 Gram-Schmidt orthonormalization 166
5.3.4 Orthonormal expansions in ℒ2 167
5.4 Summary 169
5.5 Appendix: Supplementary material and proofs 170
5.5.1 The Plancherel theorem 170
5.5.2 The sampling and aliasing theorems 174
5.5.3 Prolate spheroidal waveforms 176
5.6 Exercises 177
6 Channels, modulation, and demodulation 181
6.1 Introduction 181
6.2 Pulse amplitude modulation (PAM) 184
6.2.1 Signal constellations 184
6.2.2 Channel imperfections: a preliminary view 185
6.2.3 Choice of the modulation pulse 187
6.2.4 PAM demodulation 189
6.3 The Nyquist criterion 190
6.3.1 Band-edge symmetry 191
6.3.2 Choosing {p(t-kT); k ∈ Z} as an orthonormal set 193
6.3.3 Relation between PAM and analog source coding 194
6.4 Modulation: baseband to passband and back 195
6.4.1 Double-sideband amplitude modulation 195
6.5 Quadrature amplitude modulation (QAM) 196
6.5.1 QAM signal set 198
6.5.2 QAM baseband modulation and demodulation 199
6.5.3 QAM: baseband to passband and back 200
6.5.4 Implementation of QAM 201
6.6 Signal space and degrees of freedom 203
6.6.1 Distance and orthogonality 204
6.7 Carrier and phase recovery in QAM systems 206
6.7.1 Tracking phase in the presence of noise 207
6.7.2 Large phase errors 208
6.8 Summary of modulation and demodulation 208
6.9 Exercises 209
7 Random processes and noise 216
7.1 Introduction 216
7.2 Random processes 217
7.2.1 Examples of random processes 218
7.2.2 The mean and covariance of a random process 220
7.2.3 Additive noise channels 221
7.3 Gaussian random variables, vectors, and processes 221
7.3.1 The covariance matrix of a jointly Gaussian random vector 224
7.3.2 The probability density of a jointly Gaussian random vector 224
7.3.3 Special case of a 2D zero-mean Gaussian random vector 227
7.3.4 Z = AW, where A is orthogonal 228
7.3.5 Probability density for Gaussian vectors in terms of principal axes 228
7.3.6 Fourier transforms for joint densities 230
7.4 Linear functionals and filters for random processes 231
7.4.1 Gaussian processes defined over orthonormal expansions 232
7.4.2 Linear filtering of Gaussian processes 233
7.4.3 Covariance for linear functionals and filters 234
7.5 Stationarity and related concepts 235
7.5.1 Wide-sense stationary (WSS) random processes 236
7.5.2 Effectively stationary and effectively WSS random processes 238
7.5.3 Linear functionals for effectively WSS random processes 239
7.5.4 Linear filters for effectively WSS random processes 239
7.6 Stationarity in the frequency domain 242
7.7 White Gaussian noise 244
7.7.1 The sinc expansion as an approximation to WGN 246
7.7.2 Poisson process noise 247
7.8 Adding noise to modulated communication 248
7.8.1 Complex Gaussian random variables and vectors 250
7.9 Signal-to-noise ratio 251
7.10 Summary of random processes 254
7.11 Appendix: Supplementary topics 255
7.11.1 Properties of covariance matrices 255
7.11.2 The Fourier series expansion of a truncated random process 257
7.11.3 Uncorrelated coefficients in a Fourier series 259
7.11.4 The Karhunen-Loève expansion 262
7.12 Exercises 263
8 Detection, coding, and decoding 268
8.1 Introduction 268
8.2 Binary detection 271
8.3 Binary signals in white Gaussian noise 273
8.3.1 Detection for PAM antipodal signals 273
8.3.2 Detection for binary nonantipodal signals 275
8.3.3 Detection for binary real vectors in WGN 276
8.3.4 Detection for binary complex vectors in WGN 279
8.3.5 Detection of binary antipodal waveforms in WGN 281
8.4 M-ary detection and sequence detection 285
8.4.1 M-ary detection 285
8.4.2 Successive transmissions of QAM signals in WGN 286
8.4.3 Detection with arbitrary modulation schemes 289
8.5 Orthogonal signal sets and simple channel coding 292
8.5.1 Simplex signal sets 293
8.5.2 Biorthogonal signal sets 294
8.5.3 Error probability for orthogonal signal sets 294
8.6 Block coding 298
8.6.1 Binary orthogonal codes and Hadamard matrices 298
8.6.2 Reed-Muller codes 300
8.7 Noisy-channel coding theorem 302
8.7.1 Discrete memoryless channels 303
8.7.2 Capacity 304
8.7.3 Converse to the noisy-channel coding theorem 306
8.7.4 Noisy-channel coding theorem, forward part 307
8.7.5 The noisy-channel coding theorem for WGN 311
8.8 Convolutional codes 312
8.8.1 Decoding of convolutional codes 314
8.8.2 The Viterbi algorithm 315
8.9 Summary of detection, coding, and decoding 317
8.10 Appendix: Neyman-Pearson threshold tests 317
8.11 Exercises 322
9 Wireless digital communication 330
9.1 Introduction 330
9.2 Physical modeling for wireless channels 334
9.2.1 Free-space, fixed transmitting and receiving antennas 334
9.2.2 Free-space, moving antenna 337
9.2.3 Moving antenna, reflecting wall 337
9.2.4 Reflection from a ground plane 340
9.2.5 Shadowing 340
9.2.6 Moving antenna, multiple reflectors 341
9.3 Input/output models of wireless channels 341
9.3.1 The system function and impulse response for LTV systems 343
9.3.2 Doppler spread and coherence time 345
9.3.3 Delay spread and coherence frequency 348
9.4 Baseband system functions and impulse responses 350
9.4.1 A discrete-time baseband model 353
9.5 Statistical channel models 355
9.5.1 Passband and baseband noise 358
9.6 Data detection 359
9.6.1 Binary detection in flat Rayleigh fading 360
9.6.2 Noncoherent detection with known channel magnitude 363
9.6.3 Noncoherent detection in flat Rician fading 365
9.7 Channel measurement 367
9.7.1 The use of probing signals to estimate the channel 368
9.7.2 Rake receivers 373
9.8 Diversity 376
9.9 CDMA: the IS95 standard 379
9.9.1 Voice compression 380
9.9.2 Channel coding and decoding 381
9.9.3 Viterbi decoding for fading channels 382
9.9.4 Modulation and demodulation 383
9.9.5 Multiaccess interference in IS95 386
9.10 Summary of wireless communication 388
9.11 Appendix: Error probability for noncoherent detection 390
9.12 Exercises 391
References 398
Index 400