
Graduate Student: Lin, Tzu-Chun (林子鈞)
Thesis Title: 負熵與資訊理論詮釋 (Negative Entropy and an Information-Theoretic Interpretation)
Advisor: Hwang, Chi-Chuan (黃吉川)
Degree: Master's
Department: Department of Engineering Science, College of Engineering
Year of Publication: 2013
Academic Year of Graduation: 101 (2012-13)
Language: Chinese
Pages: 80
Keywords (Chinese): 資訊理論, 負熵, 信道容量, 數據壓縮, 信源編碼
Keywords (English): Information theory, negative entropy, channel capacity, data compression, source coding
Abstract:

    Information systems are now ubiquitous, and advancing them to the next generation depends on the continued development of information theory. In computational science, information, entropy, and uncertainty can be treated as equivalent quantities: information is negative entropy, and from this idea a systematic mathematical treatment of information can be developed. This thesis traces entropy from its thermodynamic origins to information theory and, building on the foundations of probability and statistics, draws out the physical meaning of many of the theory's central theorems. It then reinterprets information theory for the case in which the negative-entropy picture must accommodate increasing entropy, restating the classical theory from an alternative point of view. We hope that discussing information and entropy from these different perspectives offers an interdisciplinary vantage point, helps other fields clarify the relationship between the two quantities, and makes the core ideas of the theory quicker to grasp.
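
    The equivalence the abstract invokes can be stated compactly. The following is a standard formulation after Shannon [3] and Brillouin [31], not notation taken from the thesis itself; in particular, the negentropy symbol J is Brillouin's:

        H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i                % Shannon entropy of a discrete source, in bits
        J(X) = H_{\max} - H(X) = \log_2 n - H(X) \ge 0       % negentropy: the gap below the maximum (uniform) entropy

    J(X) vanishes for the uniform, maximally uncertain distribution and grows as the source becomes more ordered; learning about X lowers H(X) and raises J(X) by the same amount, which is the precise sense in which information is negative entropy.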

Table of Contents:
    Abstract (Chinese); Abstract (English); Acknowledgements; Table of Contents; List of Figures; List of Tables; Nomenclature
    Chapter 1: Introduction
        1-1 Research Background
        1-2 Literature Review
        1-3 Motivation
        1-4 Organization of the Thesis
    Chapter 2: Thermodynamic and Information-Entropy Theory
        2-1 Introduction to Classical Thermodynamics
            2-1-1 The Second Law of Thermodynamics
        2-2 Entropy in Statistical Thermodynamics
            2-2-1 The Gibbs Entropy Formula
            2-2-2 Boltzmann's Principle
        2-3 Information Entropy
            2-3-1 The Definition of Information
            2-3-2 Entropy and Information
            2-3-3 Probability Theory
            2-3-4 Random Variables (r.v.)
            2-3-5 Independent and Identically Distributed (i.i.d.) Variables
            2-3-6 The Probability Mass Function (pmf)
            2-3-7 Basic Definitions of Information Theory
            2-3-8 Information Entropy
            2-3-9 Binary Entropy
            2-3-10 Maximum Entropy
        2-4 Overview of Coding
            2-4-1 Introduction to Coding
            2-4-2 Mathematical Definitions for Source Coding
            2-4-3 Uniquely Decodable Codes
            2-4-4 Instantaneous Codes
            2-4-5 Classes of Codes
        2-5 Lossless Source Coding
            2-5-1 Shannon-Fano-Elias Coding
            2-5-2 Huffman Coding
        2-6 The Information-Transmission Model
    Chapter 3: Redefining the Quantity of Information
        3-1 Information Entropy and Negative Entropy
        3-2 Redefining the Quantity of Information
    Chapter 4: Reinterpreting Information Theory from the New Viewpoint
        4-1 Information Entropy
            4-1-1 Shannon Entropy (Discrete Case)
            4-1-2 Joint Entropy
            4-1-3 Conditional Entropy
            4-1-4 Relative Entropy
            4-1-5 Mutual Information
        4-2 The Data-Processing Inequality
        4-3 Fano's Inequality
        4-4 Compression Theory
            4-4-1 Typical Sequences
            4-4-2 The Kraft Inequality and Expected Code Length
            4-4-3 Shannon's First Theorem (the Noiseless Source-Coding Theorem)
        4-5 Data Transmission
            4-5-1 Shannon's Second Theorem (the Noisy-Channel Coding Theorem)
        4-6 Results and Discussion
    Chapter 5: Conclusions and Future Outlook
    References
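
    As a concrete illustration of the quantities surveyed in the outline above (the entropy and binary entropy of Sections 2-3-8 and 2-3-9, the negentropy of Section 3-1, and the channel capacity of Section 4-5-1), here is a minimal sketch in Python. It follows the standard textbook formulas (e.g., Cover and Thomas [47]); the function names and example numbers are ours, not the thesis's, whose own calculations are not shown in this record:

        import math

        def entropy(probs, base=2.0):
            """Shannon entropy H(X) = -sum_i p_i log(p_i) of a discrete distribution."""
            return -sum(p * math.log(p, base) for p in probs if p > 0)

        def binary_entropy(p):
            """Binary entropy function H(p) of a Bernoulli(p) source (Section 2-3-9)."""
            return entropy([p, 1.0 - p])

        def negentropy(probs):
            """Gap below maximum entropy, J = log2(n) - H(X) (cf. Section 3-1):
            zero for the uniform distribution, positive for more ordered sources."""
            return math.log2(len(probs)) - entropy(probs)

        def bsc_capacity(p):
            """Capacity C = 1 - H(p), in bits per use, of a binary symmetric channel
            with crossover probability p (Shannon's second theorem, Section 4-5-1)."""
            return 1.0 - binary_entropy(p)

        if __name__ == "__main__":
            uniform = [0.25, 0.25, 0.25, 0.25]   # maximum entropy, zero negentropy
            skewed = [0.7, 0.1, 0.1, 0.1]        # more ordered, hence lower entropy
            print(entropy(uniform))    # 2.0 bits
            print(entropy(skewed))     # about 1.357 bits
            print(negentropy(skewed))  # about 0.643 bits of "order"
            print(bsc_capacity(0.11))  # about 0.5 bits per channel use

    The skewed source shows the bookkeeping that the abstract describes: its entropy deficit relative to the uniform source is exactly its negentropy, the amount of information the source's structure already carries.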

References:
    [1] H. Nyquist, "Certain factors affecting telegraph speed," Bell System Technical Journal, vol. 3, pp. 324-346, Apr. 1924.
    [2] R. V. L. Hartley, "Transmission of information," Bell System Technical Journal, vol. 7, pp. 535-563, Jul. 1928.
    [3] C. E. Shannon, "A mathematical theory of communication," Bell System Technical Journal, vol. 27, pp. 379-423, 1948.
    [4] C. E. Shannon, "Communication in the presence of noise," Proceedings of the Institute of Radio Engineers, vol. 37, pp. 10-21, 1949.
    [5] W. J. McGill, "Multivariate information transmission," Psychometrika, vol. 19, pp. 97-116, 1954.
    [6] G. A. Miller and W. G. Madow, On the Maximum Likelihood Estimate of the Shannon-Wiener Measure of Information: Bolling Air Force Base, 1954.
    [7] D. V. Lindley, "On a measure of the information provided by an experiment," Annals of Mathematical Statistics, vol. 27, pp. 986-1005, 1956.
    [8] C. E. Shannon, "The zero error capacity of a noisy channel," IRE Transactions on Information Theory, vol. 2, pp. 8-19, 1956.
    [9] L. Breiman, "The individual ergodic theorem of information theory," Annals of Mathematical Statistics, vol. 28, pp. 809-811, 1957.
    [10] A. I. Khinchin, Mathematical Foundations of Information Theory: Dover Publications, 1957.
    [11] A. T. Bharucha-Reid, Elements of the Theory of Markov Processes and Their Applications: Dover Publications, 1960.
    [12] A. Birnbaum, "On the foundations of statistical inference: binary experiments," Annals of Mathematical Statistics, vol. 32, pp. 414-435, 1961.
    [13] S. Golomb, "A new derivation of the entropy expressions," IRE Transactions on Information Theory, vol. 7, pp. 166-167, 1961.
    [14] J. Karush, "A simple proof of an inequality of McMillan," IRE Transactions on Information Theory, vol. 7, p. 118, 1961.
    [15] W. W. Harman, Principles of the Statistical Theory of Communication: McGraw-Hill, 1963.
    [16] S. Kullback, Information Theory and Statistics: Dover Publications, 1997.
    [17] R. W. Hamming, "Error detecting and error correcting codes," Bell System Technical Journal, vol. 29, pp. 147-160, 1950.
    [18] D. A. Huffman, "A method for the construction of minimum-redundancy codes," Proceedings of the Institute of Radio Engineers, vol. 40, pp. 1098-1101, 1952.
    [19] A. Feinstein, "Error bounds in noisy channels without memory," IRE Transactions on Information Theory, vol. 1, pp. 13-14, 1955.
    [20] B. McMillan, "Two inequalities implied by unique decipherability," IRE Transactions on Information Theory, vol. 2, pp. 115-116, 1956.
    [21] A. Feinstein, "On the coding theorem and its converse for finite-memory channels," Information and Control, vol. 2, pp. 25-44, 1959.
    [22] R. Karp, "Minimum-redundancy coding for the discrete noiseless channel," IEEE Transactions on Information Theory, vol. 7, pp. 27-38, 1961.
    [23] W. W. Peterson, "Error correcting codes," Current Science, vol. 30, p. 480, 1961.
    [24] J. Wozencraft and B. Reiffen, "Sequential decoding," Data Processing, vol. 3, p. 57, 1961.
    [25] S. Golomb, "Run-length encodings," IEEE Transactions on Information Theory, vol. 12, pp. 399-401, 1966.
    [26] F. Jelinek and K. Schneider, "On variable-length-to-block coding," IEEE Transactions on Information Theory, vol. 18, pp. 765-774, 1972.
    [27] J. Rissanen and G. G. Langdon, Jr., "Arithmetic coding," IBM Journal of Research and Development, vol. 23, pp. 149-162, 1979.
    [28] C. M. Hoffmann, "A note on unique decipherability," Lecture Notes in Computer Science, vol. 176, pp. 50-63, 1984.
    [29] I. H. Witten, R. M. Neal, and J. G. Cleary, "Arithmetic coding for data compression," Communications of the ACM, vol. 30, pp. 520-540, 1987.
    [30] E. Schrödinger, What Is Life? The Physical Aspect of the Living Cell. Cambridge: The University Press; New York: The Macmillan Company, 1945.
    [31] L. Brillouin, Science and Information Theory: Academic Press, 1956.
    [32] L. Brillouin, "Thermodynamics, statistics, and information," American Journal of Physics, vol. 29, p. 318, 1961.
    [33] E. T. Jaynes, "Information theory and statistical mechanics," Physical Review, vol. 106, pp. 620-630, 1957.
    [34] L. Szilard, "On the decrease of entropy in a thermodynamic system by the intervention of intelligent beings," Behavioral Science, vol. 9, pp. 301-310, 1964.
    [35] R. J. Solomonoff, "A formal theory of inductive inference. Part I," Information and Control, vol. 7, pp. 1-22, 1964.
    [36] R. J. Solomonoff, "A formal theory of inductive inference. Part II," Information and Control, vol. 7, pp. 224-254, 1964.
    [37] A. N. Kolmogorov, "Three approaches to the quantitative definition of information," International Journal of Computer Mathematics, vol. 2, p. 157, 1968.
    [38] G. J. Chaitin, "On the length of programs for computing finite binary sequences: statistical considerations," Journal of the ACM, vol. 16, pp. 145-159, 1969.
    [39] G. J. Chaitin, "Randomness and mathematical proof," Scientific American, vol. 232, pp. 47-52, 1975.
    [40] G. J. Chaitin, "Algorithmic information theory," IBM Journal of Research and Development, vol. 21, pp. 350-359, 1977.
    [41] C. H. Bennett, "The thermodynamics of computation: a review," International Journal of Theoretical Physics, vol. 21, pp. 905-940, 1982.
    [42] W. H. Zurek, "Algorithmic randomness and physical entropy," Physical Review A, vol. 40, pp. 4731-4751, Oct. 1989.
    [43] W. H. Zurek, Algorithmic Information Content, Church-Turing Thesis, Physical Entropy, and Maxwell's Demon, 1990.
    [44] M. Li and P. M. B. Vitányi, An Introduction to Kolmogorov Complexity and Its Applications: Springer, 2008.
    [45] I. Prigogine, "Dissipative structures, dynamics and entropy," International Journal of Quantum Chemistry, pp. 443-456, 1975.
    [46] M. A. Nielsen and I. L. Chuang, Quantum Computation and Quantum Information: Cambridge University Press, 2000.
    [47] T. M. Cover and J. A. Thomas, Elements of Information Theory (Wiley Series in Telecommunications and Signal Processing): Wiley-Interscience, 2006.
    [48] I. E. Richardson, Video Codec Design: Developing Image and Video Compression Systems: John Wiley & Sons, 2002.
    [49] S. Kullback and R. A. Leibler, "On information and sufficiency," Annals of Mathematical Statistics, vol. 22, pp. 79-86, 1951.
    [50] N. Abramson, Information Theory and Coding: McGraw-Hill, 1963.
    [51] M. Fannes, "A continuity property of the entropy density for spin lattice systems," Communications in Mathematical Physics, vol. 31, pp. 291-294, 1973.

Full-text availability: on campus from 2019-09-01; off campus from 2019-09-01.