Information Theory


1. Introduction

Information theory is the mathematical theory of data communication and storage, generally considered to have been founded in 1948 by Claude E. Shannon. The central paradigm of classical information theory is the engineering problem of transmitting information over a noisy channel. The main result of this theory is Shannon's noisy-channel coding theorem, which states that reliable communication is possible over unreliable channels: it is possible to surround a noisy channel with appropriate encoding and decoding systems such that messages can be communicated at any rate less than (but arbitrarily close to) the channel capacity with an arbitrarily small probability of error.
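
As a small illustration of channel capacity, consider the binary symmetric channel, the standard textbook example: it flips each transmitted bit with probability p, and its capacity is C = 1 - H(p), where H is the binary entropy function. A minimal Python sketch (the function names here are only illustrative):

    import math

    def binary_entropy(p: float) -> float:
        """H(p) = -p*log2(p) - (1 - p)*log2(1 - p), the entropy of a biased coin."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(p: float) -> float:
        """Capacity, in bits per channel use, of a binary symmetric channel
        that flips each transmitted bit with probability p."""
        return 1.0 - binary_entropy(p)

    # A channel that flips 11% of its bits still supports reliable communication
    # at any rate below roughly 0.5 bits per channel use.
    print(bsc_capacity(0.11))   # ~0.50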

Information theory in the 1950s was sometimes classified as a branch of the then-voguish field called "cybernetics", which included many aspects of the potential machine representation of the world. It is a broad and deep mathematical theory, with equally broad and deep applications, chief among them coding theory.

Coding theory is concerned with finding explicit methods, called codes, for increasing the efficiency and fidelity of data communication over a noisy channel, up to the limit that Shannon proved is attainable. These codes can be roughly subdivided into data compression codes and error-correction codes. It took many years to find the good codes whose existence Shannon proved.
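
As a toy sketch of an error-correcting code (the 3-fold repetition code, chosen here purely for simplicity; it is nowhere near the capacity-approaching codes the theory is really after), each bit is transmitted three times and decoded by majority vote:

    def encode_repetition(bits, n=3):
        """Encode each bit by repeating it n times (a trivial error-correcting code)."""
        return [b for b in bits for _ in range(n)]

    def decode_repetition(coded, n=3):
        """Decode by majority vote over each block of n received bits."""
        return [1 if sum(coded[i:i + n]) > n // 2 else 0
                for i in range(0, len(coded), n)]

    message = [1, 0, 1, 1]
    sent = encode_repetition(message)               # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
    received = sent.copy()
    received[1] = 0                                 # the channel flips one bit
    print(decode_repetition(received) == message)   # True: the error is corrected

The repetition code corrects any single flipped bit per block, but only at a rate of 1/3 bit per channel use; finding codes that approach the channel capacity far more efficiently is the central concern of coding theory.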

A third class of codes comprises cryptographic ciphers; concepts from coding theory and information theory are widely used in cryptography and cryptanalysis (see the article on the deciban for an interesting historical application).

Information theory is also used in intelligence gathering, gambling, statistics, and even music composition.

2. Redundancy

Redundancy in information theory is the number of bits used to transmit a message minus the number of bits of actual information in the message. Data compression is a way to eliminate such redundancy, while checksums are a way of adding redundancy.
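
As a small sketch of both directions (in Python, using the standard-library zlib module; the message is an arbitrary example), compression strips redundancy from a repetitive message, while appending a CRC-32 checksum deliberately adds a few redundant bytes that let the receiver detect corruption:

    import zlib

    message = b"abababababababababababababababab"    # 32 bytes of highly redundant data

    # Data compression removes redundancy: fewer bits carry the same information.
    compressed = zlib.compress(message)
    print(len(message), "->", len(compressed))       # 32 -> well under 32 bytes

    # A checksum adds redundancy: extra bits carrying no new information,
    # but allowing the receiver to detect transmission errors.
    checksum = zlib.crc32(message)
    packet = message + checksum.to_bytes(4, "big")   # 4 redundant bytes appended

    received, trailer = packet[:-4], packet[-4:]
    print(zlib.crc32(received) == int.from_bytes(trailer, "big"))   # True if uncorrupted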

3. Entropy

Entropy is a concept in thermodynamics (see thermodynamic entropy), statistical mechanics and information theory. The concepts of information and entropy have deep links with one another, although it took many years for the development of the theories of statistical mechanics and information theory to make this apparent. This section is concerned with information entropy, the information-theoretic formulation of entropy.

The basic concept of entropy in information theory has to do with how much randomness there is in a signal or random event. An alternative way to look at this is to ask how much information is carried by the signal.

As an example, consider some English text, encoded as a string of letters, spaces and punctuation (so our signal is a string of characters). Since some characters are not very likely (e.g. 'z') while others are very common (e.g. 'e'), the string of characters is not really as random as it might be. On the other hand, since we cannot predict what the next character will be, it does have some 'randomness'. Entropy is a measure of this randomness, suggested by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".
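
Shannon's measure is H = -Σ p(x) log2 p(x), where the sum runs over the possible symbols x and p(x) is the probability of symbol x. As a small sketch (in Python, with the example strings chosen arbitrarily), the per-character entropy of a text can be estimated from its empirical character frequencies:

    import math
    from collections import Counter

    def empirical_entropy(text: str) -> float:
        """Shannon entropy H = -sum p(x) * log2 p(x), estimated from the
        character frequencies of the given text, in bits per character."""
        counts = Counter(text)
        total = len(text)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    print(empirical_entropy("aaaa"))    # 0.0: a completely predictable signal
    print(empirical_entropy("abcd"))    # 2.0: four equally likely symbols
    print(empirical_entropy("the quick brown fox jumps over the lazy dog"))
    # Roughly 4.4 bits per character, below the log2(27) ~ 4.75 that 26 letters
    # plus a space would carry if every character were equally likely.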
