Coding and Information Theory for Engineering Applications (English Edition)

Coding and Information Theory for Engineering Applications (English Edition): PDF e-book download

Industrial Technology

  • E-book points: 12
  • Author: Richard B. Wells (USA)
  • Publisher: China Machine Press, Beijing
  • Year of publication: 2002
  • ISBN: 711110983X
  • Pages: 305
Book description:
Contents of Coding and Information Theory for Engineering Applications (English Edition)

1. DISCRETE SOURCES AND ENTROPY 1

1.1 Overview of Digital Communication and Storage Systems 1

1.2 Discrete Information Sources and Entropy 2

1.2.1 Source alphabets and entropy, 2

1.2.2 Joint and conditional entropy, 6

1.2.3 Entropy of symbol blocks and the chain rule, 8

1.3 Source Coding 10

1.3.1 Mapping functions and efficiency, 10

1.3.2 Mutual information, 12

1.3.3 A brief digression on encryption, 14

1.3.4 Summary of section 1.3, 15

1.4 Huffman Coding 16

1.4.1 Prefix codes and instantaneous decoding, 16

1.4.2 Construction of Huffman codes, 17

1.4.3 Hardware implementation approaches, 19

1.4.4 Robustness of Huffman coding efficiency, 20

1.5 Dictionary Codes and Lempel-Ziv Coding 21

1.5.1 The rationale behind dynamic dictionary coding, 21

1.5.2 A linked-list LZ algorithm, 22

1.5.3 The decoding process, 25

1.5.4 Large-block requirement of LZ compression, 26

1.6 Arithmetic Coding 28

1.6.1 Code-word length and the asymptotic equipartition property, 28

1.6.2 The arithmetic coding method, 30

1.6.3 Decoding arithmetic codes, 32

1.6.4 Other issues in arithmetic coding, 33

1.7 Source Models and Adaptive Source Coding 34

1.8 Chapter Summary 35

References 36

Exercises 37

2. CHANNELS AND CHANNEL CAPACITY 39

2.1 The Discrete Memoryless Channel Model 39

2.1.1 The transition probability matrix, 39

2.1.2 Output entropy and mutual information, 41

2.2 Channel Capacity and the Binary Symmetric Channel 43

2.2.1 Maximization of mutual information and channel capacity, 43

2.2.2 Symmetric channels, 45

2.3 Block Coding and Shannon's Second Theorem 48

2.3.1 Equivocation, 48

2.3.2 Entropy rate and the channel-coding theorem, 49

2.4 Markov Processes and Sources with Memory 51

2.4.1 Markov processes, 51

2.4.2 Steady-state probability and the entropy rate, 54

2.5 Markov Chains and Data Processing 56

2.6 Constrained Channels 58

2.6.1 Modulation theory and channel constraints, 58

2.6.2 Linear and time-invariant channels, 60

2.7 Autocorrelation and Power Spectrum of Sequences 62

2.7.1 Statistics of time sequences, 62

2.7.2 The power spectrum, 64

2.8 Data Translation Codes 68

2.8.1 Constraints on data sequences, 68

2.8.2 State space and trellis descriptions of codes, 70

2.8.3 Capacity of a data translation code, 73

2.9 (d,k) Sequences 75

2.9.1 Run-length-limited codes and maxentropic sequences, 75

2.9.2 Power spectrum of maxentropic sequences, 77

2.10 Chapter Summary 82

References 83

Exercises 83

3. RUN-LENGTH-LIMITED CODES 89

3.1 General Considerations for Data Translation Coding 89

3.2 Prefix Codes and Block Codes 91

3.2.1 Fixed-length block codes, 91

3.2.2 Variable-length block codes, 92

3.2.3 Prefix codes and the Kraft inequality, 94

3.3 State-Dependent Fixed-Length Block Codes 96

3.4 Variable-Length Fixed-Rate Codes 98

3.5 Look-Ahead Codes 102

3.5.1 Code-word concatenation, 102

3.5.2 The k constraint, 104

3.5.3 Informal and formal design methods, 105

3.6 DC-Free Codes 107

3.6.1 The running digital sum and the digital sum variation, 107

3.6.2 State-splitting and matched spectral null codes, 109

3.7 Chapter Summary 114

Exercises 115

References 115

4. LINEAR BLOCK ERROR-CORRECTING CODES 117

4.1 General Considerations 117

4.1.1 Channel coding for error correction, 117

4.1.2 Error rates and error distribution for the binary symmetric channel, 118

4.1.3 Error detection and correction, 121

4.1.4 The maximum likelihood decoding principle, 123

4.1.5 Hamming distance code capability, 124

4.2 Binary Fields and Binary Vector Spaces 126

4.2.1 The binary field, 126

4.2.2 Representing linear codes in a vector space, 130

4.3 Linear Block Codes 131

4.3.1 Elementary properties of vector spaces, 131

4.3.2 Hamming weight, Hamming distance, and the Hamming cube, 133

4.3.3 The Hamming sphere and bounds on redundancy requirements, 135

4.4 Decoding Linear Block Codes 136

4.4.1 Complete decoders and bounded-distance decoders, 136

4.4.2 Syndrome decoders and the parity-check theorem, 138

4.5 Hamming Codes 140

4.5.1 The design of Hamming codes, 140

4.5.2 The dual code of a Hamming code, 143

4.5.3 The expanded Hamming code, 144

4.6 Error Rate Performance Bounds for Linear Block Error-Correcting Codes 147

4.6.1 Block error rates, 147

4.6.2 Bit error rate, 148

4.7 Performance of Bounded-Distance Decoders with Repeat Requests 149

4.7.1 Approximate error performance, 152

4.7.2 Effective code rate of ARQ systems, 154

4.7.3 ARQ protocols, 156

4.8 Chapter Summary 157

References 158

Exercises 158

5. CYCLIC CODES 160

5.1 Definition and Properties of Cyclic Codes 160

5.2 Polynomial Representation of Cyclic Codes 162

5.3 Polynomial Modulo Arithmetic 164

5.3.1 Polynomial rings, 164

5.3.2 Some important algebraic identities, 166

5.4 Generation and Decoding of Cyclic Codes 169

5.4.1 Generator, parity-check, and syndrome polynomials, 169

5.4.2 Systematic cyclic codes, 169

5.4.3 Hardware implementation of encoders for systematic cyclic codes, 171

5.4.4 Hardware implementation of decoders for cyclic codes, 174

5.4.5 The Meggitt decoder, 175

5.5 Error-Trapping Decoders 178

5.5.1 Updating the syndrome during correction, 178

5.5.2 Burst error patterns and error trapping, 180

5.6 Some Standard Cyclic Block Codes 184

5.6.1 The Hamming codes, 184

5.6.2 BCH codes, 185

5.6.3 Burst-correcting codes, 186

5.6.4 Cyclic redundancy check codes, 187

5.7 Simple Modifications to Cyclic Codes 189

5.7.1 Expanding a code, 189

5.7.2 Shortening a code, 190

5.7.3 Noncyclicity of shortened codes, 193

5.7.4 Interleaving, 194

5.8 Chapter Summary 197

References 197

Exercises 198

6. CONVOLUTIONAL CODES 201

6.1 Definition of Convolutional Codes 201

6.2 Structural Properties of Convolutional Codes 205

6.2.1 The state diagram and trellis representations, 205

6.2.2 Transfer functions of convolutional codes, 207

6.3 The Viterbi Algorithm 210

6.4 Why the Viterbi Algorithm Works I: Hard-Decision Decoding 215

6.4.1 Maximum likelihood under hard-decision decoding, 215

6.4.2 Error event probability, 217

6.4.3 Bounds on bit error rate, 219

6.5 Some Known Good Convolutional Codes 221

6.6 Why the Viterbi Algorithm Works II: Soft-Decision Decoding 223

6.6.1 Euclidean distance and maximum likelihood, 223

6.6.2 Elimination of ties and information loss, 226

6.6.3 Calculation of the likelihood metric, 228

6.7 The Traceback Method of Viterbi Decoding 229

6.8 Punctured Convolutional Codes 234

6.8.1 Puncturing, 234

6.8.2 Good punctured convolutional codes, 236

6.9 Chapter Summary 238

References 239

Exercises 239

7. TRELLIS-CODED MODULATION 242

7.1 Multiamplitude/Multiphase Discrete Memoryless Channels 242

7.1.1 I-Q modulation, 242

7.1.2 The n-ary PSK signal constellation, 243

7.1.3 PSK error rate, 245

7.1.4 Quadrature amplitude modulation, 247

7.2 Systematic Recursive Convolutional Encoders 248

7.3 Signal Mapping and Set Partitioning 251

7.4 Known Good Trellis Codes for PSK and QAM 254

7.5 Chapter Summary 257

References 257

Exercises 258

8. INFORMATION THEORY AND CRYPTOGRAPHY 259

8.1 Cryptosystems 259

8.1.1 Basic elements of cipher systems, 259

8.1.2 Some simple cipher systems, 261

8.2 Attacks on Cryptosystems 265

8.3 Perfect Secrecy 266

8.4 Language Entropy and Successful Ciphertext Attacks 269

8.4.1 The key-equivocation theorem, 269

8.4.2 Spurious keys and key equivocation, 270

8.4.3 Language redundancy and unicity distance, 271

8.5 Computational Security 272

8.6 Diffusion and Confusion 274

8.7 Product Cipher Systems 275

8.7.1 Commuting, noncommuting, and idempotent product ciphers, 276

8.7.2 Mixing transformations and good product ciphers, 278

8.8 Codes 279

8.9 Public-Key Cryptosystems 280

8.10 Other Issues 281

8.11 Chapter Summary 281

References 282

Exercises 283

9. SHANNON'S CODING THEOREMS 285

9.1 Random Coding 285

9.2 The Average Random Code 287

9.3 A Discussion of Shannon's Second Theorem 289

9.4 Shannon-Fano Coding 290

9.5 Shannon's Noiseless-Coding Theorem 292

9.6 A Few Final Words 293

References 294

ANSWERS TO SELECTED EXERCISES 295

INDEX 299
