Applied Coding and Information Theory for Engineers (English Edition)

Price: ¥36.00

Author: Richard B. Wells (USA)
Publisher: 機(jī)械工業(yè)出版社 (China Machine Press)
Series: 信息科學(xué)經(jīng)典原版教材 (Classic Original Textbooks in Information Science)
Tags: none

ISBN: 9787111109839  Publication date: 2002-10-01  Binding: paperback
Format: 26 cm  Pages: 305

Synopsis

From an engineering-application perspective, this book uses a large number of examples to present information theory and coding techniques systematically, making the material readily accessible to practicing engineers and to students of electronics and communications. This quality is rarely found in comparable textbooks published in China or abroad in recent years.

About the Author

No author biography is available for this title.

Table of Contents

PREFACE
1 DISCRETE SOURCES AND ENTROPY
1.1 Overview of Digital Communication and Storage Systems
1.2 Discrete Information Sources and Entropy
1.2.1 Source alphabets and entropy
1.2.2 Joint and conditional entropy
1.2.3 Entropy of symbol blocks and the chain rule
1.3 Source Coding
1.3.1 Mapping functions and efficiency
1.3.2 Mutual information
1.3.3 A brief digression on encryption
1.3.4 Summary of section
1.4 Huffman Coding
1.4.1 Prefix codes and instantaneous decoding
1.4.2 Construction of Huffman codes
1.4.3 Hardware implementation approaches
1.4.4 Robustness of Huffman coding efficiency
1.5 Dictionary Codes and Lempel-Ziv Coding
1.5.1 The rationale behind dynamic dictionary coding
1.5.2 A linked-list LZ algorithm
1.5.3 The decoding process
1.5.4 Large-block requirement of LZ compression
1.6 Arithmetic Coding
1.6.1 Code-word length and the asymptotic equipartition property
1.6.2 The arithmetic coding method
1.6.3 Decoding arithmetic codes
1.6.4 Other issues in arithmetic coding
1.7 Source Models and Adaptive Source Coding
1.8 Chapter Summary
References
Exercises
2 CHANNELS AND CHANNEL CAPACITY
2.1 The Discrete Memoryless Channel Model
2.1.1 The transition probability matrix
2.1.2 Output entropy and mutual information
2.2 Channel Capacity and the Binary Symmetric Channel
2.2.1 Maximization of mutual information and channel capacity
2.2.2 Symmetric channels
2.3 Block Coding and Shannon's Second Theorem
2.3.1 Equivocation
2.3.2 Entropy rate and the channel-coding theorem
2.4 Markov Processes and Sources with Memory
2.4.1 Markov processes
2.4.2 Steady-state probability and the entropy rate
2.5 Markov Chains and Data Processing
2.6 Constrained Channels
2.6.1 Modulation theory and channel constraints
2.6.2 Linear and time-invariant channels
2.7 Autocorrelation and Power Spectrum of Sequences
2.7.1 Statistics of time sequences
2.7.2 The power spectrum
2.8 Data Translation Codes
2.8.1 Constraints on data sequences
2.8.2 State space and trellis descriptions of codes
2.8.3 Capacity of a data translation code
2.9 (d,k) Sequences
2.9.1 Run-length-limited codes and maxentropic sequences
2.9.2 Power spectrum of maxentropic sequences
2.10 Chapter Summary
References
Exercises
3 RUN-LENGTH-LIMITED CODES
3.1 General Considerations for Data Translation Coding
3.2 Prefix Codes and Block Codes
3.2.1 Fixed-length block codes
3.2.2 Variable-length block codes
3.2.3 Prefix codes and the Kraft inequality
3.3 State-Dependent Fixed-Length Block Codes
3.4 Variable-Length Fixed-Rate Codes
3.5 Look-Ahead Codes
3.5.1 Code-word concatenation
3.5.2 The K constraint
3.5.3 Informal and formal design methods
3.6 DC-Free Codes
3.6.1 The running digital sum and the digital sum variation
3.6.2 State-splitting and matched spectral null codes
3.7 Chapter Summary
References
Exercises
4 LINEAR BLOCK ERROR-CORRECTING CODES
4.1 General Considerations
4.1.1 Channel coding for error correction
4.1.2 Error rates and error distribution for the binary symmetric channel
4.1.3 Error detection and correction
4.1.4 The maximum likelihood decoding principle
4.1.5 Hamming distance and code capability
4.2 Binary Fields and Binary Vector Spaces
4.2.1 The binary field
4.2.2 Representing linear codes in a vector space
4.3 Linear Block Codes
4.3.1 Elementary properties of vector spaces
4.3.2 Hamming weight, Hamming distance, and the Hamming cube
4.3.3 The Hamming sphere and bounds on redundancy requirements
4.4 Decoding Linear Block Codes
4.4.1 Complete decoders and bounded-distance decoders
4.4.2 Syndrome decoders and the parity-check theorem
4.5 Hamming Codes
4.5.1 The design of Hamming codes
4.5.2 The dual code of a Hamming code
4.5.3 The expanded Hamming code
4.6 Error Rate Performance Bounds for Linear Block Error-Correcting Codes
4.6.1 Block error rates
4.6.2 Bit error rate
4.7 Performance of Bounded-Distance Decoders with Repeat Requests
4.7.1 Approximate error performance
4.7.2 Effective code rate of ARQ systems
4.7.3 ARQ protocols
4.8 Chapter Summary
References
Exercises
5 CYCLIC CODES
5.1 Definition and Properties of Cyclic Codes
5.2 Polynomial Representation of Cyclic Codes
5.3 Polynomial Modulo Arithmetic
5.3.1 Polynomial rings
5.3.2 Some important algebraic identities
5.4 Generation and Decoding of Cyclic Codes
5.4.1 Generator, parity-check, and syndrome polynomials
5.4.2 Systematic cyclic codes
5.4.3 Hardware implementation of encoders for systematic cyclic codes
5.4.4 Hardware implementation of decoders for cyclic codes
5.4.5 The Meggitt decoder
5.5 Error-Trapping Decoders
5.5.1 Updating the syndrome during correction
5.5.2 Burst error patterns and error trapping
5.6 Some Standard Cyclic Block Codes
5.6.1 The Hamming codes
5.6.2 BCH codes
5.6.3 Burst-correcting codes
5.6.4 Cyclic redundancy check codes
5.7 Simple Modifications to Cyclic Codes
5.7.1 Expanding a code
5.7.2 Shortening a code
5.7.3 Noncyclicity of shortened codes
5.7.4 Interleaving
5.8 Chapter Summary
References
Exercises
6 CONVOLUTIONAL CODES
6.1 Definition of Convolutional Codes
6.2 Structural Properties of Convolutional Codes
6.2.1 The state diagram and trellis representations
6.2.2 Transfer functions of convolutional codes
6.3 The Viterbi Algorithm
6.4 Why the Viterbi Algorithm Works I: Hard-Decision Decoding
6.4.1 Maximum likelihood under hard-decision decoding
6.4.2 Error event probability
6.4.3 Bounds on bit error rate
6.5 Some Known Good Convolutional Codes
6.6 Why the Viterbi Algorithm Works II: Soft-Decision Decoding
6.6.1 Euclidean distance and maximum likelihood
6.6.2 Elimination of ties and information loss
6.6.3 Calculation of the likelihood metric
6.7 The Traceback Method of Viterbi Decoding
6.8 Punctured Convolutional Codes
6.8.1 Puncturing
6.8.2 Good punctured convolutional codes
6.9 Chapter Summary
References
Exercises
7 TRELLIS-CODED MODULATION
7.1 Multiamplitude/Multiphase Discrete Memoryless Channels
7.1.1 I-Q modulation
7.1.2 The n-ary PSK signal constellation
7.1.3 PSK error rate
7.1.4 Quadrature amplitude modulation
7.2 Systematic Recursive Convolutional Encoders
7.3 Signal Mapping and Set Partitioning
7.4 Known Good Trellis Codes for PSK and QAM
7.5 Chapter Summary
References
Exercises
8 INFORMATION THEORY AND CRYPTOGRAPHY
8.1 Cryptosystems
8.1.1 Basic elements of ciphersystems
8.1.2 Some simple ciphersystems
8.2 Attacks on Cryptosystems
8.3 Perfect Secrecy
8.4 Language Entropy and Successful Ciphertext Attacks
8.4.1 The key-equivocation theorem
8.4.2 Spurious keys and key equivocation
8.4.3 Language redundancy and unicity distance
8.5 Computational Security
8.6 Diffusion and Confusion
8.7 Product Cipher Systems
8.7.1 Commuting, noncommuting, and idempotent product ciphers
8.7.2 Mixing transformations and good product ciphers
8.8 Codes
8.9 Public-Key Cryptosystems
8.10 Other Issues
8.11 Chapter Summary
References
Exercises
9 SHANNON'S CODING THEOREMS
9.1 Random Coding
9.2 The Average Random Code
9.3 A Discussion of Shannon's Second Theorem
9.4 Shannon-Fano Coding
9.5 Shannon's Noiseless-Coding Theorem
9.6 A Few Final Words
References
ANSWERS TO SELECTED EXERCISES
INDEX
