
Information Theory

Humans live in a world full of information. People gain knowledge because they receive, collect and produce information. Information is a key concept in the social sciences as well as in digital communication (Mark Burgin, 2010). Information theory is a field of scientific study that grew out of Claude Shannon's landmark 1948 paper. Shannon used "information" only in a descriptive sense, to mean the output of an information source, and he stayed resolutely within the framework of telecommunication. Claude Shannon's Mathematical Theory of Communication has guided communications experts and engineers in their pursuit of faster, more efficient and more robust communications systems.

Information theory grew during the 1940s and 1950s out of electrical engineers' need to design practical communication devices. Despite its practical origins, it is a profound mathematical theory (Shannon and Weaver, 1949) concerned with the most basic aspects of the communication process. In essence, it is a framework for investigating fundamental issues such as the efficiency of information representation and the limits on reliable communication. The practical value of the theory stems from its many powerful theorems, which are used to compute optimal efficiency bounds for any given communication process. These ideal bounds serve as benchmarks that guide the design of better information systems.

Claude Shannon was not the only academician who created information theory. It was the result of contributions made by many distinct individuals, from different backgrounds, who offered their ideas and expanded upon one another's work. Indeed, the diversity of their perspectives and interests shaped the direction of information theory. In simple terms, information theory refers to the mathematics of communication in a broad sense, addressing the theoretical and experimental aspects of information transmission and processing.

In information theory, any device, system or process that generates messages as its output is commonly referred to as an information source. Although each source has its own representation that it uses to put out messages, sources generally represent their messages as combinations of symbols selected from their alphabet, the list of all possible symbols they are capable of producing. These symbols are often called the source symbols or representation elements. The choice of alphabet and the way the symbols are used to construct messages constitutes a representation, or code; devising such a representation is known as source coding. An important fact about 'natural' information sources is that they never produce messages which are random combinations of their symbols. Instead, their messages tend to possess regularities, known as statistical structure: the way symbols are put together to form messages obeys certain statistical rules that are source specific. A simple illustration of this statistical structure appears in the sketch below.
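
The following minimal Python sketch illustrates this statistical structure under an assumed setting: a short English string stands in for the output of a 'natural' source, and the sample text is purely illustrative. Counting symbol frequencies shows that such a source does not use its alphabet uniformly.

from collections import Counter

# Illustrative sample output of a "natural" information source: English text.
# A real source would be much larger, but even a short message shows the skew.
message = "information theory studies the statistical structure of messages"

# Empirical frequency of each symbol (letter or space) in the message.
counts = Counter(message)
total = sum(counts.values())

# Print symbols from most to least frequent; the distribution is far from
# uniform, which is exactly the statistical structure described above.
for symbol, count in counts.most_common():
    print(repr(symbol), round(count / total, 3))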


Information theory is, at its core, Shannon's contribution. To explain the notion of information, he recognised the critical relationships among the elements of a communication system: the power of the signal at the source, the bandwidth or frequency range of the information channel through which the signal travels, and the noise of the channel (such as unpredictable static on a radio), which alters the signal by the time it reaches the last element of the system, the receiver, which must decode the signal.

In the telecommunication field, a channel is the path over a wire or fibre or, in wireless systems, the slice of radio spectrum used to transmit the message through free space. Shannon's equations told engineers how much information could be transmitted over the channels of an ideal system. He also explained mathematically the principles of data compression, which rest on the observation that only the information essential to understanding a message needs to be transmitted. Shannon further showed how information could be transmitted over noisy channels at error rates that engineers could control.
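
The relationship among bandwidth, signal power and noise described above is captured by the Shannon-Hartley capacity formula, C = B log2(1 + S/N). The short Python sketch below illustrates it; the bandwidth and signal-to-noise figures are assumed values chosen only for illustration.

from math import log2

def channel_capacity(bandwidth_hz: float, signal_power: float, noise_power: float) -> float:
    # Shannon-Hartley capacity in bits per second for an ideal band-limited
    # channel with additive white Gaussian noise: C = B * log2(1 + S/N).
    return bandwidth_hz * log2(1 + signal_power / noise_power)

# Assumed figures: a 3 kHz telephone-style channel with a signal-to-noise
# ratio of 1000 (about 30 dB).
print(channel_capacity(bandwidth_hz=3000.0, signal_power=1000.0, noise_power=1.0))
# Roughly 29,900 bits per second: the upper bound on the reliable transmission rate.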

The most important concept in information theory is entropy (Shannon and Weaver, 1949). Shannon borrowed the concept of entropy from thermodynamics, where it describes the amount of disorder in a system. In information theory, entropy measures the amount of uncertainty in an unknown or random quantity.
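
Concretely, for a random quantity taking values with probabilities p1, ..., pn, the entropy is H = -(p1 log2 p1 + ... + pn log2 pn), measured in bits. The minimal Python sketch below computes this for a fair coin and for a heavily biased coin; the probability values are assumed for illustration.

from math import log2

def entropy(probabilities):
    # Shannon entropy H = -sum(p * log2(p)) in bits; zero-probability outcomes are skipped.
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))    # 1.0

# A heavily biased coin is nearly predictable, so its entropy is much lower.
print(entropy([0.99, 0.01]))  # about 0.081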

To summarize, information theory in the technical sense in which it is used today was created by Claude Shannon and was introduced as a means to study and solve problems of communication and the transmission of signals over channels.