Shannon entropy

Shannon’s entropy quantifies the amount of information contained in a variable, and it provided the inspiration for a theory built around the notion of information.

Example

Input

Suppose we have a symbol set $\{A, B, C, D, E\}$ where the symbol occurrence frequencies are:

$$P(A)=0.5,\quad P(B)=0.2,\quad P(C)=0.1,\quad P(D)=0.1,\quad P(E)=0.1$$

Solution

The average minimum number of bits needed to represent a symbol is:

$$H(X) = -\bigl[\,0.5\log_2 0.5 + 0.2\log_2 0.2 + 3\,(0.1\log_2 0.1)\,\bigr]$$

$$H(X) = -\bigl[\,-0.5 - 0.46439 - 0.99658\,\bigr]$$

$$H(X) \approx 1.96\ \text{bits}$$
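
As a quick check, this calculation can be reproduced in a few lines of Python. This is a standalone sketch for verification, not part of the tool itself:

```python
import math

# Occurrence frequencies of the symbols A, B, C, D, E from the example
probabilities = [0.5, 0.2, 0.1, 0.1, 0.1]

# H(X) = -sum(p * log2(p)) over all symbols
entropy = -sum(p * math.log2(p) for p in probabilities)

print(f"H(X) = {entropy:.2f} bits")  # prints: H(X) = 1.96 bits
```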

What is Shannon’s entropy?

Shannon’s entropy is a metric that quantifies the amount of information contained in a variable. It provided the inspiration for the entire theory built around the notion of information.

Storage and transmission of data can intuitively be expected to be tied to the amount of information involved. For instance, the information may be the result of a coin toss. This information is often stored in a Boolean variable that can take the values 0 or 1, representing whether the coin toss came up heads or not. In digital storage and transmission technology, this Boolean variable is typically represented by a single "bit", the basic unit of digital information storage and transmission.

However, this bit directly stores the value of the variable, i.e., the raw data of the coin toss's outcome. It doesn't succinctly capture the information within the coin toss, e.g., whether the coin is biased or unbiased, and, if biased, how biased.

Shannon’s entropy metric, by contrast, quantifies, among other things, the absolute minimum amount of storage and transmission needed to succinctly capture any information (as opposed to raw data), and in typical cases that amount is smaller than what is required to store or transmit the raw data behind the information. Shannon’s entropy metric also suggests a way of representing the information in that smaller number of bits.
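
To make this concrete, compare a fair coin with a biased one: the fair coin carries the full bit of information per toss, while a heavily biased coin carries less. A minimal Python sketch (the bias value 0.9 is just an illustration):

```python
import math

def coin_entropy(p_heads):
    """Entropy in bits of a coin that lands heads with probability p_heads.

    Assumes 0 < p_heads < 1 so both log terms are defined.
    """
    p_tails = 1 - p_heads
    return -(p_heads * math.log2(p_heads) + p_tails * math.log2(p_tails))

print(coin_entropy(0.5))  # fair coin: 1.0 bit per toss
print(coin_entropy(0.9))  # biased coin: ~0.47 bits per toss
```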

The formula of Shannon Entropy

Here is the formula for calculating Shannon entropy.

$$H = -\sum_{i} p(i)\,\log_2 p(i)$$

where $p(i)$ is the probability of the $i$-th symbol.
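
The formula translates directly into code. Here is a minimal Python sketch of a general entropy function (the function name is ours, not something the tool exposes):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p(i) * log2(p(i))), in bits.

    probs: a sequence of probabilities summing to 1. Zero probabilities
    are skipped, since p * log2(p) tends to 0 as p approaches 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.2, 0.1, 0.1, 0.1]))  # ~1.96
```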

How to use this Shannon entropy tool

This tool is easy to use, and it has a simple layout that makes it easy to understand.

The tool doesn't require any registration, and it is completely free to use online from anywhere, on a phone or on a desktop, whichever is more comfortable.

On your screen there is only one text box in this tool, but you will usually have several numbers to enter.

You may wonder how to type multiple numbers into one box, but there is no need to worry: just separate the numbers with a space or a comma.

Once your numbers are typed in, simply click on the calculate button to get the answer.
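
If you want the same behaviour offline, a small sketch that mimics the tool's input format might look like the following. The exact parsing rules of the online tool are an assumption here; this version accepts numbers separated by spaces and/or commas:

```python
import math

def entropy_from_text(text):
    """Parse space- or comma-separated probabilities and return the entropy.

    The input convention (spaces and/or commas) mirrors the tool's
    described format; this is our own helper, not the tool's code.
    """
    probs = [float(tok) for tok in text.replace(",", " ").split()]
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_from_text("0.5, 0.2, 0.1, 0.1, 0.1"))  # ~1.96
```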

Tip: bookmark this tool so that you can easily find it again in the future.