Excellent Foreign Books in Information Science and Technology Series: Fundamentals of Information Theory (English)

  • Author: Raymond W. Yeung (Canada)
  • Publisher: Science Press, Beijing
  • Year of Publication: 2012
  • ISBN: 9787030344564
  • Pages: 412
Book description: This book not only covers the classical content of information theory, but also gives, for the first time, a unified treatment of recent developments such as the I-Measure, network coding, and information inequalities, as well as the relationship between information theory and group theory. It also includes ITIP, a software package for information inequalities. With numerous examples, illustrations, and exercises, the book is suitable as a textbook or reference for graduate students and senior undergraduates in information theory, and also serves as a reference for practitioners in related fields.
Table of Contents of Fundamentals of Information Theory (English)

1. THE SCIENCE OF INFORMATION 1

2. INFORMATION MEASURES 5

2.1 Independence and Markov Chains 5

2.2 Shannon's Information Measures 10

2.3 Continuity of Shannon's Information Measures 16

2.4 Chain Rules 17

2.5 Informational Divergence 19

2.6 The Basic Inequalities 23

2.7 Some Useful Information Inequalities 25

2.8 Fano's Inequality 28

2.9 Entropy Rate of Stationary Source 32

Problems 36

Historical Notes 39

3. ZERO-ERROR DATA COMPRESSION 41

3.1 The Entropy Bound 42

3.2 Prefix Codes 45

3.2.1 Definition and Existence 45

3.2.2 Huffman Codes 48

3.3 Redundancy of Prefix Codes 54

Problems 58

Historical Notes 59

4. WEAK TYPICALITY 61

4.1 The Weak AEP 61

4.2 The Source Coding Theorem 64

4.3 Efficient Source Coding 66

4.4 The Shannon-McMillan-Breiman Theorem 68

Problems 70

Historical Notes 71

5. STRONG TYPICALITY 73

5.1 Strong AEP 73

5.2 Strong Typicality Versus Weak Typicality 81

5.3 Joint Typicality 82

5.4 An Interpretation of the Basic Inequalities 92

Problems 93

Historical Notes 94

6. THE I-MEASURE 95

6.1 Preliminaries 96

6.2 The I-Measure for Two Random Variables 97

6.3 Construction of the I-Measure μ* 100

6.4 μ* Can Be Negative 103

6.5 Information Diagrams 105

6.6 Examples of Applications 112

Appendix 6.A: A Variation of the Inclusion-Exclusion Formula 119

Problems 121

Historical Notes 124

7. MARKOV STRUCTURES 125

7.1 Conditional Mutual Independence 126

7.2 Full Conditional Mutual Independence 135

7.3 Markov Random Field 140

7.4 Markov Chain 143

Problems 146

Historical Notes 147

8. CHANNEL CAPACITY 149

8.1 Discrete Memoryless Channels 153

8.2 The Channel Coding Theorem 158

8.3 The Converse 160

8.4 Achievability of the Channel Capacity 166

8.5 A Discussion 171

8.6 Feedback Capacity 174

8.7 Separation of Source and Channel Coding 180

Problems 183

Historical Notes 186

9. RATE-DISTORTION THEORY 187

9.1 Single-Letter Distortion Measures 188

9.2 The Rate-Distortion Function R(D) 191

9.3 The Rate-Distortion Theorem 196

9.4 The Converse 204

9.5 Achievability of R_I(D) 206

Problems 212

Historical Notes 214

10. THE BLAHUT-ARIMOTO ALGORITHMS 215

10.1 Alternating Optimization 216

10.2 The Algorithms 218

10.2.1 Channel Capacity 218

10.2.2 The Rate-Distortion Function 223

10.3 Convergence 226

10.3.1 A Sufficient Condition 227

10.3.2 Convergence to the Channel Capacity 230

Problems 231

Historical Notes 231

11. SINGLE-SOURCE NETWORK CODING 233

11.1 A Point-to-Point Network 234

11.2 What is Network Coding? 236

11.3 A Network Code 240

11.4 The Max-Flow Bound 242

11.5 Achievability of the Max-Flow Bound 245

11.5.1 Acyclic Networks 246

11.5.2 Cyclic Networks 251

Problems 259

Historical Notes 262

12. INFORMATION INEQUALITIES 263

12.1 The Region Γ*_n 265

12.2 Information Expressions in Canonical Form 267

12.3 A Geometrical Framework 269

12.3.1 Unconstrained Inequalities 269

12.3.2 Constrained Inequalities 270

12.3.3 Constrained Identities 272

12.4 Equivalence of Constrained Inequalities 273

12.5 The Implication Problem of Conditional Independence 276

Problems 277

Historical Notes 278

13. SHANNON-TYPE INEQUALITIES 279

13.1 The Elemental Inequalities 279

13.2 A Linear Programming Approach 281

13.2.1 Unconstrained Inequalities 283

13.2.2 Constrained Inequalities and Identities 284

13.3 A Duality 285

13.4 Machine Proving – ITIP 287

13.5 Tackling the Implication Problem 291

13.6 Minimality of the Elemental Inequalities 293

Appendix 13.A: The Basic Inequalities and the Polymatroidal Axioms 297

Problems 298

Historical Notes 300

14. BEYOND SHANNON-TYPE INEQUALITIES 301

14.1 Characterizations of Γ*_2, Γ*_3, and Γ̄*_n 302

14.2 A Non-Shannon-Type Unconstrained Inequality 310

14.3 A Non-Shannon-Type Constrained Inequality 315

14.4 Applications 321

Problems 324

Historical Notes 325

15. MULTI-SOURCE NETWORK CODING 327

15.1 Two Characteristics 328

15.1.1 The Max-Flow Bounds 328

15.1.2 Superposition Coding 330

15.2 Examples of Application 335

15.2.1 Multilevel Diversity Coding 335

15.2.2 Satellite Communication Network 336

15.3 A Network Code for Acyclic Networks 337

15.4 An Inner Bound 340

15.5 An Outer Bound 342

15.6 The LP Bound and Its Tightness 346

15.7 Achievability of R_in 350

Appendix 15.A: Approximation of Random Variables with Infinite Alphabets 360

Problems 361

Historical Notes 364

16. ENTROPY AND GROUPS 365

16.1 Group Preliminaries 366

16.2 Group-Characterizable Entropy Functions 372

16.3 A Group Characterization of Γ̄*_n 377

16.4 Information Inequalities and Group Inequalities 380

Problems 384

Historical Notes 387

Bibliography 389

Index 403
