
信息論與編碼(英文版) (Information Theory and Coding, English Edition)

Price: ¥24.00

Author: 梁建武 (Liang Jianwu) et al.
Publisher: 水利水電出版社
Series: 21世紀(jì)高等院校規(guī)劃教材 (21st Century Planned Textbooks for Colleges and Universities)
Tags: Computer Theory

ISBN: 9787508455693    Publication date: 2008-07-01    Binding: Paperback
Format: 16mo    Pages: 196

Synopsis

This book focuses on the fundamental theory of classical information theory and seeks to connect that theory with the coding theory used in engineering practice, introducing a number of practical applications along the way. The book is organized into seven chapters, covering the basic theory of information measurement, lossless source coding, limited-distortion source coding, channel coding and its applications, and related topics. It emphasizes basic concepts and explains them in plain, accessible language. Against the background of today's rapidly developing information and communication systems, the book uses many examples, figures, and tables to illustrate concepts and theories, while avoiding lengthy and difficult formula proofs. To deepen readers' understanding, each chapter ends with a suitable set of exercises for readers to choose from. The book can serve as a textbook or reference for bilingual instruction of electronic-information majors at colleges and universities, and as a reference for practitioners in communications, telecommunications, electronics, and related fields.

About the Author

梁建武 (Liang Jianwu) teaches at Central South University (中南大學(xué)). His co-authored works include 《網(wǎng)頁(yè)制作與設(shè)計(jì)實(shí)訓(xùn)》 (Web Page Production and Design Practical Training), among others.

Table of Contents

Chapter 1 Introduction
 Contents
 Before it starts, there is something that must be known
 1.1 What is Information?
 1.2 What’s Information Theory?
  1.2.1 Origin and Development of Information Theory
  1.2.2 The application and achievement of Information Theory methods
 1.3 Formation and Development of Information Theory
 Questions and Exercises
 Biography of Claude Elwood Shannon
Chapter 2 Basic Concepts of Information Theory
 Contents
 Preparation knowledge
 2.1 Self-information and conditional self-information
  2.1.1 Self-Information
  2.1.2 Conditional Self-Information
 2.2 Mutual information and conditional mutual information
 2.3 Source entropy
  2.3.1 Introduction of entropy
  2.3.2 Mathematical description of source entropy
  2.3.3 Conditional entropy
  2.3.4 Joint entropy (communal entropy)
  2.3.5 Basic properties and theorems of source entropy
 2.4 Average mutual information
  2.4.1 Definition
  2.4.2 Physical significance of average mutual information
  2.4.3 Properties of average mutual information
 2.5 Continuous source
  2.5.1 Entropy of the continuous source (also called differential entropy)
  2.5.2 Mutual information of the continuous random variable
 Questions and Exercises
 Additional reading materials
Chapter 3 Discrete Source Information
 Contents
 3.1 Mathematical model and classification of the source
 3.2 The discrete source without memory
 3.3 Multi-symbol discrete stationary source
 3.4 Source entropy of discrete
  4.2.4 Relationship between entropy, equivocation (channel doubt degree) and mutual information
 4.3 The discrete channel without memory and its channel capacity
 4.4 Channel capacity
  4.4.1 Concept of channel capacity
  4.4.2 Discrete channel without memory and its channel capacity
  4.4.3 Continuous channel and its channel capacity
Chapter 5 Lossless source coding
 Contents
 5.1 Lossless coder
 5.2 Lossless source coding
  5.2.1 Fixed length coding theorem
  5.2.2 Variable-length source coding
 5.3 Lossless source coding theorems
  5.3.1 Classification of code and main coding method
  5.3.2 Kraft theorem
  5.3.3 Lossless variable-length source coding theorem (Shannon's first theorem)
 5.4 Pragmatic examples of lossless source coding
  5.4.1 Huffman coding
  5.4.2 Shannon coding and Fano coding
 5.5 The Lempel-Ziv algorithm
 5.6 Run-Length Encoding and the PCX format
 Questions and Exercises
Chapter 6 Limited distortion source coding
 Contents
 6.1 The starting point of limited distortion theory
 6.2 Distortion measurement
  6.2.1 Distortion function
  6.2.2 Average distortion
 6.3 Information rate distortion function
 6.4 Properties of R(D)
  6.4.1 Minimum of D and R(D)
  6.4.2 Dmax and R(Dmax)
  6.4.3 The convexity of R(D)
  6.4.4
 Questions and exercises
Bibliography
