Information theory
Information theory is a branch of applied mathematics, electrical engineering, and computer science concerned with the quantification, storage, and communication of information. It was originally developed by Claude E. Shannon to find fundamental limits on signal processing and communication operations such as data compression. Since its inception in Shannon's landmark 1948 paper "A Mathematical Theory of Communication", its applications have broadened to many other areas, including statistical inference, natural language processing, cryptography, neurobiology, the evolution and function of molecular codes, model selection in ecology, thermal physics, quantum computing, linguistics, plagiarism detection, pattern recognition, anomaly detection, and other forms of data analysis.
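As a brief illustration of what "quantification of information" means (a standard definition due to Shannon, with the symbols X and p(x) introduced here only for illustration), the entropy of a discrete random variable X with probability mass function p(x) is

    H(X) = -\sum_{x} p(x) \log_2 p(x)

measured in bits when the logarithm is taken to base 2; a single fair coin flip, for example, carries H(X) = 1 bit of information.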