A branch of cybernetics (q.v.) that attempts to define the amount of information required to control a process of a given complexity. Information in this narrow technical sense is measured in bits. A bit is the amount of information required to specify one of two alternatives, for example, to distinguish between the digits 1 and 0 in the binary notation used by computers.
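The relationship between bits and alternatives generalizes: n bits can distinguish 2^n alternatives, so labeling N alternatives requires at least log2(N) bits, rounded up. A minimal sketch (not from the source) illustrating this:

```python
import math

def bits_to_distinguish(n_alternatives):
    """Minimum whole number of bits needed to label n distinct alternatives.

    One bit distinguishes 2 alternatives, two bits distinguish 4, and so on;
    in general, ceil(log2(n)) bits suffice.
    """
    return math.ceil(math.log2(n_alternatives))

print(bits_to_distinguish(2))    # 1 bit: enough to tell 0 from 1
print(bits_to_distinguish(256))  # 8 bits: one byte labels 256 values
```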
Information theory is a discipline in applied mathematics concerned with quantifying information, with the goal of enabling as much data as possible to be reliably stored on a medium or communicated over a channel. The key measure, known as information entropy, is usually expressed as the average number of bits needed for storage or communication. For example, if a daily weather description carries 3 bits of entropy, then, over enough days, we can encode the daily weather using an average of approximately 3 bits per day (each bit being a 0 or a 1).
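The entropy of a source is computed from the probabilities of its possible messages via Shannon's formula, H = -Σ p·log2(p). A small sketch (the weather distributions below are hypothetical, chosen only for illustration) showing how the "3 bits per day" figure could arise:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Eight equally likely weather descriptions: a uniform choice among
# 8 outcomes carries exactly log2(8) = 3 bits of entropy.
uniform_weather = [1 / 8] * 8
print(entropy(uniform_weather))  # 3.0

# A skewed distribution (one outcome much more likely) is more
# predictable, so its entropy is lower than the uniform case.
skewed_weather = [0.7, 0.1, 0.1, 0.05, 0.05]
print(entropy(skewed_weather))
```

The skewed case shows why entropy is an average: likely outcomes can be given short codes and rare ones long codes, bringing the mean bits per day below a fixed-length encoding.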