"The field was originally established by the works of Harry Nyquist and Ralph Hartley, in the 1920s, and Claude Shannon in the 1940s."
It studies the processing, transmission, and storage of information, including how complex information can self-organize over time.
Entropy: A measure of uncertainty or randomness in a system.
Information: A measure of reduction in uncertainty or increase in knowledge.
Coding Theory: The study of efficient encoding and decoding of information.
Data Compression: The process of encoding data using fewer bits than the original representation.
Entropy Coding: A family of compression techniques that use entropy-based models to assign shorter codes to frequently occurring symbols (see the Huffman sketch after this list).
Shannon's Information Theory: The foundational framework of the field, introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication".
Channel Capacity: The maximum rate at which information can be transmitted reliably through a communication channel.
Noisy Channel Coding: The study of encoding techniques that can correct errors introduced by a noisy communication channel.
Lossless Compression: Data compression that retains all the original information without any loss.
Lossy Compression: Data compression that sacrifices some quality for a smaller file size.
Information Theory in Machine Learning: The application of information theory to algorithms that learn from data.
Kolmogorov Complexity: A measure of the complexity of an object, defined as the length of the shortest program that can reproduce it.
Information Theory in Cryptography: The use of information theory to design secure communication protocols.
Algorithmic Information Theory: The study of the information content of objects in terms of the shortest algorithm (program) that can generate them, closely related to Kolmogorov complexity.
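As a rough illustration of the entropy coding entry above, here is a minimal Huffman-coding sketch in Python. The input string, the tie-breaking order, and the exact codewords produced are arbitrary choices for the example, not part of any standard:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code: frequent symbols get shorter bit strings."""
    freq = Counter(text)
    if len(freq) == 1:                      # degenerate case: one distinct symbol
        return {next(iter(freq)): "0"}
    # Heap entries: (total frequency, tie-breaker, {symbol: code so far})
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        # Merging two subtrees prepends '0' to one side and '1' to the other,
        # so every symbol's final code is its path from the root of the tree.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

codes = huffman_code("abracadabra")
print(codes)                                 # e.g. {'a': '0', 'r': '110', ...}
encoded = "".join(codes[s] for s in "abracadabra")
print(len(encoded), "bits, versus", 8 * len("abracadabra"), "bits at one byte per symbol")
```

The frequent symbol 'a' ends up with the shortest codeword, which is exactly the idea behind entropy coding.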
"The field, in applied mathematics, is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering."
"Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process."
"For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy, less uncertainty) than specifying the outcome from a roll of a die (with six equally likely outcomes)."
"Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy."
"Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory, and information-theoretic security."
"Applications of fundamental topics of information theory include source coding/data compression (e.g. for ZIP files), and channel coding/error detection and correction (e.g. for DSL)."
"Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, and the development of the Internet."
"The theory has also found applications in other areas, including statistical inference, cryptography, neurobiology, perception, linguistics, the evolution and function of molecular codes (bioinformatics), thermal physics, molecular dynamics, quantum computing, black holes, information retrieval, intelligence gathering, plagiarism detection, pattern recognition, anomaly detection, and even art creation."