List of information theory topics
This is a list of information theory topics, by Wikipedia page.
- A Mathematical Theory of Communication – "A Mathematical Theory of Communication" is an influential 1948 article by mathematician Claude E. Shannon. As of November 2011, Google Scholar has listed more than 48,000 unique citations of the article and the later-published book version.
- algorithmic information theory – Algorithmic information theory is a subfield of information theory and computer science that concerns itself with the relationship between computation and information.
- arithmetic encoding
- channel capacity – In electrical engineering, computer science and information theory, channel capacity is the tightest upper bound on the amount of information that can be reliably transmitted over a communications channel.
- Communication Theory of Secrecy Systems – Communication Theory of Secrecy Systems is a paper published in 1949 by Claude Shannon discussing cryptography from the viewpoint of information theory. It is one of the foundational treatments of modern cryptography.
- conditional entropy – In information theory, the conditional entropy quantifies the remaining entropy of a random variable Y given that the value of another random variable X is known. It is referred to as the entropy of Y conditional on X, and is written H(Y|X).
- conditional quantum entropy – The conditional quantum entropy is an entropy measure used in quantum information theory. It is a generalization of the conditional entropy of classical information theory.
- confusion and diffusion – In cryptography, confusion and diffusion are two properties of the operation of a secure cipher which were identified by Claude Shannon in his paper Communication Theory of Secrecy Systems, published in 1949.
- cross entropy – In information theory, the cross entropy between two probability distributions measures the average number of bits needed to identify an event from a set of possibilities, if a coding scheme is used based on a given probability distribution q, rather than the "true" distribution p.
- data compression – In computer science and information theory, data compression, source coding or bit-rate reduction is the process of encoding information using fewer bits than the original representation would use.
- entropy encoding – In information theory, an entropy encoding is a lossless data compression scheme that is independent of the specific characteristics of the medium.
- Fisher information – In mathematical statistics and information theory, the Fisher information is the variance of the score. In Bayesian statistics, the asymptotic distribution of the posterior mode depends on the Fisher information and not on the prior.
- Hick's law – Hick's law, named after British psychologist William Edmund Hick and also known as the Hick–Hyman law, describes the time it takes for a person to make a decision as a function of the number of possible choices. The Hick–Hyman law assesses cognitive information capacity in choice reaction experiments.
- Hirschman uncertainty
- Huffman encoding (see the coding sketch after this list)
- information bottleneck method – The information bottleneck method is a technique introduced by Naftali Tishby et al. for finding the best tradeoff between accuracy and complexity when summarizing a random variable X, given a joint probability distribution between X and an observed relevant variable Y.
- information entropy – In information theory, entropy is a measure of the uncertainty associated with a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message, usually in units such as bits (see the worked sketch after this list).
- information theoretic security – A cryptosystem is information-theoretically secure if its security derives purely from information theory. That is, it is secure even when the adversary has unlimited computing power. The adversary simply does not have enough information to break the security.
- information theory – Information theory is a branch of applied mathematics and electrical engineering involving the quantification of information. Information theory was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and on reliably storing and communicating data.
- joint entropy
- Kullback-Leibler divergence
- lossless data compression – Lossless data compression is a class of data compression algorithms that allows the exact original data to be reconstructed from the compressed data. The term lossless is in contrast to lossy data compression, which only allows an approximation of the original data to be reconstructed.
- negentropy – The negentropy, also negative entropy or syntropy, of a living system is the entropy that it exports to keep its own entropy low; it lies at the intersection of entropy and life.
- principle of maximum entropy – In Bayesian probability, the principle of maximum entropy is a postulate which states that, subject to known constraints, the probability distribution which best represents the current state of knowledge is the one with largest entropy.
- quantum information science – Quantum information science is an area of study based on the idea that information science depends on quantum effects in physics. It includes theoretical issues in computational models as well as more experimental topics in quantum physics, including what can and cannot be done with quantum information.
- range encoding – Range encoding is a data compression method defined by G. Nigel N. Martin in a 1979 paper. Range encoding is a form of arithmetic coding that was historically of interest for avoiding some patents on particular later-developed arithmetic coding techniques.
- redundancy (information theory) – Redundancy in information theory is the number of bits used to transmit a message minus the number of bits of actual information in the message. Informally, it is the amount of wasted "space" used to transmit certain data.
- Rényi entropy – In information theory, the Rényi entropy, a generalisation of Shannon entropy, is one of a family of functionals for quantifying the diversity, uncertainty or randomness of a system.
- self-information – In information theory, self-information is a measure of the information content associated with the outcome of a random variable. It is expressed in a unit of information, for example bits, nats, or hartleys.
- Shannon limit
- Shannon's law – Shannon's law may refer to the Shannon–Hartley theorem, any statement defining the theoretical maximum rate at which error-free digits can be transmitted over a bandwidth-limited channel in the presence of noise.
- Shannon's theorem
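Several of the quantities listed above (information entropy, cross entropy, Kullback-Leibler divergence) can be computed directly for small discrete distributions. The Python sketch below is illustrative only and is not taken from any of the linked articles; the `fair` and `biased` coin distributions are made-up examples.

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution {outcome: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def cross_entropy(p, q):
    """Cross entropy H(p, q) in bits: average code length when events drawn from p
    are encoded with a scheme that is optimal for q rather than p."""
    return -sum(p[x] * math.log2(q[x]) for x in p if p[x] > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = H(p, q) - H(p), the coding overhead in bits."""
    return cross_entropy(p, q) - entropy(p)

# Hypothetical example: a fair coin versus a biased model of it.
fair = {"heads": 0.5, "tails": 0.5}
biased = {"heads": 0.9, "tails": 0.1}

print(entropy(fair))                # 1.0 bit of uncertainty per toss
print(entropy(biased))              # about 0.47 bits
print(cross_entropy(fair, biased))  # about 1.74 bits per toss
print(kl_divergence(fair, biased))  # about 0.74 bits of overhead per toss
```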
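Huffman encoding, listed above as an example of entropy encoding and lossless data compression, builds a prefix code by repeatedly merging the two least-frequent symbols. The following Python sketch is a minimal illustration under that description, not a reference implementation; the frequency table is a hypothetical example.

```python
import heapq

def huffman_code(freqs):
    """Build a binary prefix code for two or more symbols by repeatedly merging
    the two lowest-weight subtrees (Huffman's algorithm)."""
    # Heap entries: (total weight, unique tiebreaker, {symbol: code built so far}).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

# Hypothetical frequency table; more frequent symbols receive shorter codewords.
codes = huffman_code({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5})
print(codes)  # e.g. {'a': '0', 'c': '100', 'b': '101', 'f': '1100', 'e': '1101', 'd': '111'}
```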