What is Information Theory?

I am thankful to Max Andrews for this info:

Information theory is the branch of probability theory that deals with uncertainty, accuracy, and information content in the transmission of messages. It can be applied to any system of communication (electric signals, fiber-optic pulses, speech, etc.). Random signals, known as noise, are often added to a message during the transmission process, altering the signal received from the one sent. Information theory is used to work out the probability that a particular signal received is the same as the signal sent.

In transmitting a sequence of numbers, their sum might also be transmitted so that the receiver will know there is an error when the sum does not correspond to the rest of the message. The sum itself gives no extra information, simply a confirmation.

The statistics of choosing a message out of all possible messages (letters of the alphabet or binary digits, for example) determine the amount of information contained in it. Information is measured in bits (binary digits). If one out of two possible signals is sent, then the information content is one bit. A choice of one out of four possible signals contains more information, although the signal itself might be the same.
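The two ideas above, a transmitted sum used as an error check and information measured as a choice among equally likely alternatives, can be sketched in a few lines of Python. This is only an illustration; the function names are my own, not from any standard library.

```python
import math

def information_content(num_choices: int) -> float:
    # Bits of information in one choice among equally likely alternatives:
    # log base 2 of the number of possibilities.
    return math.log2(num_choices)

# One of two possible signals carries 1 bit; one of four carries 2 bits,
# even though the transmitted signal itself might look the same.
print(information_content(2))  # 1.0
print(information_content(4))  # 2.0

def transmit_with_checksum(numbers):
    # Append the sum so the receiver can detect a transmission error.
    return numbers + [sum(numbers)]

def verify(message):
    # The last element is the checksum; it confirms but adds no new information.
    *data, checksum = message
    return sum(data) == checksum

msg = transmit_with_checksum([3, 1, 4])
print(verify(msg))   # True: sum matches, message presumed intact
msg[1] += 1          # simulate noise corrupting one number
print(verify(msg))   # False: sum no longer matches, error detected
```

Note that the checksum only detects an error; it cannot say which number was corrupted, and some combinations of errors (ones that cancel out in the sum) would go unnoticed.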

For more information see John Daintith and John Clark’s The Facts on File Dictionary of Mathematics (New York: Market Book House, 1999), 97.

