
Information Theory

Information is the source of a communication system, whether it is analog or digital. Information theory is the mathematical study of the coding of information, along with its quantification, storage, and communication.

Conditions of Occurrence of Events

If we consider an event, there are three conditions of occurrence.

  • If the event has not occurred, there is a condition of uncertainty.

  • If the event has just occurred, there is a condition of surprise.

  • If the event has occurred some time back, there is a condition of having some information.

Hence, these three conditions occur at different times. The differences between them help us gain knowledge of the probabilities of occurrence of events.

Entropy

When we observe the possibilities of occurrence of an event, and how surprising or uncertain it would be, we are trying to get an idea of the average information content coming from the source of the event.

Entropy can be defined as a measure of the average information content per source symbol. Claude Shannon, the “father of information theory”, gave a formula for it as

$$H = -\sum_{i} p_i \log_b p_i$$

where $p_i$ is the probability of occurrence of character number $i$ in a given stream of characters and $b$ is the base of the logarithm used. Hence, this is also called Shannon’s entropy.
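As a quick illustration, here is a minimal Python sketch of this formula; the function name and the example distributions are assumptions chosen for illustration only.

```python
import math

def shannon_entropy(probs, base=2):
    # H = -sum(p_i * log_b(p_i)); the units depend on the base
    # (bits for base 2, nats for base e).
    # Terms with p = 0 are skipped, using the convention 0 log 0 = 0.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit per symbol; a biased one carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.469
```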

The amount of uncertainty remaining about the channel input after observing the channel output is called the conditional entropy. It is denoted by $H(x \mid y)$.
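A short sketch of how $H(x \mid y)$ can be computed from a joint input–output distribution follows; the dictionary-based representation and the binary-symmetric-channel numbers are assumptions made for illustration.

```python
import math

def conditional_entropy(joint, base=2):
    # H(x|y) = -sum over (x, y) of p(x, y) * log_b(p(x, y) / p(y)),
    # where `joint` maps (x, y) pairs to probabilities.
    p_y = {}
    for (x, y), p in joint.items():
        p_y[y] = p_y.get(y, 0.0) + p
    return -sum(p * math.log(p / p_y[y], base)
                for (x, y), p in joint.items() if p > 0)

# Assumed example: binary symmetric channel, crossover 0.1, uniform input.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
print(conditional_entropy(joint))  # ~0.469 bits still uncertain about x
```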

Discrete Memoryless Source

A source that emits data at successive intervals, independently of previous values, is termed a discrete memoryless source.

The source is discrete because it is considered not over a continuous time interval but at discrete time instants. It is memoryless because each output is fresh at every instant, with no dependence on previous values.
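Such a source can be sketched in a few lines of Python: each symbol is drawn independently from the same fixed distribution. The alphabet and probabilities below are illustrative assumptions.

```python
import random

def dms(symbols, probs, n):
    # Discrete memoryless source: each emission is an independent draw
    # from the same distribution, ignoring all earlier outputs.
    return random.choices(symbols, weights=probs, k=n)

print("".join(dms(["A", "B", "C"], [0.5, 0.3, 0.2], 20)))
```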

Source Coding

According to the definition, “Given a discrete memoryless source of entropy $H(\delta)$, the average code-word length $\bar{L}$ for any source encoding is bounded as $\bar{L} \geq H(\delta)$”.

In simpler words, the code word (for example, the Morse code for the word QUEUE is --.- ..- . ..- .) is always greater than or equal to the source code (QUEUE in the example). That is, the number of symbols in the code word is greater than or equal to the number of letters in the source word.
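The bound can be checked numerically. The sketch below uses a hypothetical four-symbol source with a hand-assigned prefix code; since the probabilities are dyadic (powers of 1/2), the bound happens to be met with equality.

```python
import math

# Hypothetical four-symbol source and a hand-assigned prefix code.
probs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}
code = {"A": "0", "B": "10", "C": "110", "D": "111"}

entropy = -sum(p * math.log2(p) for p in probs.values())
avg_len = sum(p * len(code[s]) for s, p in probs.items())

print(entropy, avg_len)  # 1.75 1.75 -- L_bar >= H holds with equality here
```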

Channel Coding

Channel coding in a communication system introduces redundancy in a controlled manner, so as to improve the reliability of the system, whereas source coding reduces redundancy to improve the efficiency of the system.

Channel coding consists of two actions.

  • Mapping the incoming data sequence into a channel input sequence.

  • Inverse mapping the channel output sequence into an output data sequence.

The final target is to minimize the overall effect of the channel noise.

The mapping is done by the transmitter, with the help of an encoder, whereas the inverse mapping is done at the receiver by a decoder.
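As a concrete, if simple, illustration of both actions, the sketch below uses a rate-1/3 repetition code over an assumed binary symmetric channel; the encoder maps each data bit into three channel bits, and the decoder inverse-maps by majority vote. This is an illustrative assumption, not a scheme prescribed by the text.

```python
import random

def encode(bits, n=3):
    # Map the incoming data sequence into a channel input sequence
    # by repeating each bit n times (a repetition code).
    return [b for bit in bits for b in [bit] * n]

def noisy_channel(bits, flip_prob=0.1):
    # Assumed channel model: flip each bit with probability flip_prob.
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(received, n=3):
    # Inverse-map the channel output sequence by majority vote
    # over each block of n received bits.
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

data = [1, 0, 1, 1, 0]
print(decode(noisy_channel(encode(data))))  # usually recovers [1, 0, 1, 1, 0]
```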
