Information theory is the branch of probability theory that deals with uncertainty, accuracy, and information content in the transmission of messages. It can be applied to any system of communication (electric signals, fiber optic pulses, speech, etc.). Random signals, known as noise, are often added to a message during transmission, so that the signal received differs from the signal sent. Information theory is used to work out the probability that a particular signal received is the same as the signal sent.

In transmitting a sequence of numbers, their sum might also be transmitted so that the receiver will know there is an error when the sum does not correspond to the rest of the message. The sum itself gives no extra information, simply a confirmation.

The statistics of choosing a message out of all possible messages (letters of the alphabet or binary digits, for example) determine the amount of information it contains. Information is measured in bits (binary digits). If one out of two possible signals is sent, the information content is one bit. A choice of one out of four possible signals contains more information (two bits), even though the signal itself might be the same.
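The two ideas above, a transmitted sum used to detect errors and the information content of a choice among equally likely signals, can be sketched in a few lines of Python. This is a minimal illustration, not a practical coding scheme; the function names are chosen here for clarity and the bit count assumes all messages are equally likely.

```python
import math

def information_bits(num_choices):
    """Information content, in bits, of selecting one message out of
    num_choices equally likely possibilities (log base 2)."""
    return math.log2(num_choices)

def transmit_with_checksum(numbers):
    """Append the sum of the sequence as a simple check value,
    as described above: it confirms the message but adds no information."""
    return numbers + [sum(numbers)]

def verify_checksum(received):
    """Return True when the trailing sum matches the rest of the message."""
    *payload, checksum = received
    return sum(payload) == checksum
```

For example, `information_bits(2)` gives 1.0 bit and `information_bits(4)` gives 2.0 bits, matching the one-out-of-two and one-out-of-four cases in the text, while `verify_checksum(transmit_with_checksum([3, 1, 4]))` returns `True` and a corrupted sequence such as `[3, 2, 4, 8]` fails the check. Note that this simple sum cannot detect every error: two mistakes that cancel out leave the sum unchanged.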