
Information theory



Information theory, or communication theory, mathematical discipline that aims to maximize the information that can be conveyed by communications systems and to minimize the errors that arise in the course of transmission. The information content of a message is conventionally quantified in bits (binary digits). Each bit represents a simple alternative: in terms of a message, a yes or a no; in terms of the components of an electrical circuit, a switch that is open or closed. Mathematically, the bit is usually represented as 0 or 1. Complex messages can be represented as a series of such alternatives. Only five bits are needed to specify any letter of the alphabet, given an appropriate code, since five binary choices distinguish 2^5 = 32 possibilities, more than the 26 letters.

With information thus quantified, the theory employs statistical methods to analyze practical communications problems. Errors that arise in the transmission of signals, caused by random disturbances termed noise, can be minimized by incorporating redundancy: more bits are transmitted than are strictly necessary to encode the message, so that if some are altered in transit, enough information remains for the signal to be correctly interpreted. Handling redundant information costs something in transmission speed or capacity, but the reduction in message errors compensates for the loss.

Information theorists often point to an analogy between the thermodynamic concept of entropy and the uncertainty, or average information content, of a signal.
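The alphabet example and the role of redundancy can be made concrete in a few lines of code. The following Python sketch is an illustration added here, not part of the original entry; the five-bit letter code, the triple-repetition scheme, and all function names are assumptions chosen for simplicity. It encodes a letter in five bits, shows that majority voting over three repeated copies corrects a single transmission error, and computes the Shannon entropy that underlies the analogy mentioned above.

import math
import string

# ceil(log2(26)) = 5: five yes/no alternatives suffice to distinguish
# all 26 letters of the alphabet.
bits_per_letter = math.ceil(math.log2(26))
assert bits_per_letter == 5

def encode(letter):
    """Encode a lowercase letter as a 5-bit string (an assumed, simple code)."""
    return format(string.ascii_lowercase.index(letter), "05b")

def add_redundancy(bits):
    """Triple-repetition code: transmit every bit three times."""
    return "".join(b * 3 for b in bits)

def correct(received):
    """Majority vote over each group of three recovers each original bit,
    provided noise corrupted at most one copy per group."""
    groups = (received[i:i + 3] for i in range(0, len(received), 3))
    return "".join("1" if g.count("1") >= 2 else "0" for g in groups)

def entropy(probabilities):
    """Shannon entropy in bits: the average information content per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

sent = add_redundancy(encode("q"))                # 15 bits instead of 5
noisy = sent[:4] + ("0" if sent[4] == "1" else "1") + sent[5:]  # noise flips one bit
assert correct(noisy) == encode("q")              # the letter still decodes correctly

# A uniform choice among 26 letters carries log2(26), about 4.7 bits per letter.
assert abs(entropy([1 / 26] * 26) - math.log2(26)) < 1e-9

The triple-repetition scheme is the simplest possible error-correcting code and triples the transmission cost; practical systems use far more efficient codes, but the trade-off it illustrates, redundancy exchanged for reliability, is the one described above.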



See also: Mathematics.
