Understanding Information Entropy in Information Theory
Information entropy, a key concept in information theory, measures the average amount of information in a message. Source entropy and binary source entropy are explained with examples, along with maximum source entropy for both binary and non-binary sources. Learn how to calculate entropy for different sources and grasp the concept of uncertainty in information transmission.
Al-Mustaqbal University College
Department of Computer Engineering Techniques
Information Theory and Coding
Fourth Stage
By: M.Sc. Ridhab Sami
Lecture 5: Average Information (Entropy)
Average information (entropy): In information theory, entropy is the average amount of information contained in each message received.

1. Source Entropy: If the source produces messages with unequal probabilities, then P(x_i), i = 1, 2, 3, ..., n are different, and so are the self-informations I(x_i). The statistical average of I(x_i) over i gives the average amount of uncertainty associated with the source X. This average is called the source entropy, denoted H(X), and is given by:

H(X) = \sum_{i=1}^{n} P(x_i) I(x_i) = -\sum_{i=1}^{n} P(x_i) \log_2 P(x_i)   bits/symbol
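As a concrete illustration, here is a minimal Python sketch of this formula; the function name source_entropy is our own choice, not from the lecture:

```python
import math

def source_entropy(probs):
    """Source entropy H(X) = -sum(P(xi) * log2(P(xi))), in bits/symbol."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Zero-probability messages contribute nothing (0 * log 0 -> 0 by convention).
    return -sum(p * math.log2(p) for p in probs if p > 0)
```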
Example: Find the entropy of the source producing the following messages:
P(x_1) = 0.25, P(x_2) = 0.1, P(x_3) = 0.15, and P(x_4) = 0.5

Solution:

H(X) = -\sum_{i=1}^{4} P(x_i) \log_2 P(x_i)
H(X) = -(0.25 \log_2 0.25 + 0.1 \log_2 0.1 + 0.15 \log_2 0.15 + 0.5 \log_2 0.5)
     = -(0.25 \ln 0.25 + 0.1 \ln 0.1 + 0.15 \ln 0.15 + 0.5 \ln 0.5) / \ln 2
H(X) = 1.742 bits/symbol
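Running the sketch on these probabilities reproduces the result:

```python
H = source_entropy([0.25, 0.1, 0.15, 0.5])
print(H)  # ~1.7427 bits/symbol; the lecture rounds this to 1.742
```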
2. Binary Source Entropy: In information theory, the binary entropy function, denoted H(X) or H_b(X), is defined as the entropy of a Bernoulli process with probability p of one of the two values. Mathematically, the Bernoulli trial is modeled as a random variable X that can take on only two values, 0 and 1, with:

P(0) + P(1) = 1

We have H(X) = -\sum_{i=1}^{n} P(x_i) \log_2 P(x_i), so for n = 2:

H_b(X) = -\sum_{i=1}^{2} P(x_i) \log_2 P(x_i) = -(P(0) \log_2 P(0) + P(1) \log_2 P(1))   bits/symbol
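A sketch of the binary case on top of source_entropy; the helper name binary_entropy is our own:

```python
def binary_entropy(p0):
    """Binary source entropy Hb(X) with P(0) = p0 and P(1) = 1 - p0."""
    return source_entropy([p0, 1.0 - p0])
```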
Example: Find the entropy of a binary source if P(0) = 0.2.

Solution:

P(1) = 1 - P(0) = 1 - 0.2 = 0.8

Then:

H_b(X) = -(0.2 \log_2 0.2 + 0.8 \log_2 0.8) = -(0.2 \ln 0.2 + 0.8 \ln 0.8) / \ln 2 = 0.722 bits/symbol
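The same number from the sketch:

```python
print(round(binary_entropy(0.2), 3))  # 0.722 bits/symbol
```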
3. Maximum Source Entropy: For a binary source, if P(0) = P(1) = 0.5, then the entropy is:

H_b(X) = -(0.5 \log_2 0.5 + 0.5 \log_2 0.5) = -\log_2 \frac{1}{2} = \log_2 2 = 1 bit
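A one-line check with the binary_entropy sketch above:

```python
print(binary_entropy(0.5))  # 1.0 bit, the maximum of the binary entropy function
```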
For any non-binary source, if all n messages are equiprobable, then P(x_i) = 1/n, so that:

H(X)_{max} = -\sum_{i=1}^{n} \frac{1}{n} \log_2 \frac{1}{n} = \log_2 n   bits/symbol

which is the maximum value of the source entropy. Also, H(X) = 0 if one of the messages has the probability of a certain event, i.e. P(x) = 1.
Example: A source emits 8 characters with equal probability. Find the maximum entropy H(X)_{max}.

Solution: n = 8

H(X)_{max} = \log_2 n = \log_2 8 = 3 bits/symbol
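Both routes to this value agree, as a quick check with the earlier sketch shows:

```python
n = 8
print(math.log2(n))               # 3.0 bits/symbol
print(source_entropy([1/n] * n))  # 3.0, from the general formula
```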
4. Source Entropy Rate: It is the average rate of the amount of information produced per second:

R(X) = H(X) × (rate of producing the symbols)   bits/sec = bps

The unit of H(X) is bits/symbol and the rate of producing the symbols is in symbols/sec, so the unit of R(X) is bits/sec. Equivalently,

R(X) = H(X) / \bar{\tau}

where \bar{\tau} = \sum_{i=1}^{n} P(x_i) \tau_i is the average time duration of the symbols, and \tau_i is the time duration of the symbol x_i.
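A sketch of the rate calculation, reusing source_entropy; the name entropy_rate and its argument names are ours:

```python
def entropy_rate(probs, durations):
    """R(X) = H(X) / tau_bar, where tau_bar = sum(P(xi) * tau_i) in seconds."""
    tau_bar = sum(p * t for p, t in zip(probs, durations))
    return source_entropy(probs) / tau_bar
```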
Example: A source produces dots (.) and dashes (-) with P(dot) = 0.65. The time duration of a dot is 200 ms and that of a dash is 800 ms. Find the average source entropy rate.

Solution:

P(dash) = 1 - P(dot) = 1 - 0.65 = 0.35
H(X) = -(0.65 \log_2 0.65 + 0.35 \log_2 0.35) = 0.934 bits/symbol
\tau_{dot} = 200/1000 = 0.2 sec,   \tau_{dash} = 800/1000 = 0.8 sec
\bar{\tau} = \sum_{i=1}^{n} P(x_i) \tau_i = 0.2 × 0.65 + 0.8 × 0.35 = 0.41 sec
R(X) = H(X) / \bar{\tau} = 0.934 / 0.41 = 2.278 bits/sec
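The same result from the sketch (durations in seconds):

```python
R = entropy_rate([0.65, 0.35], [0.2, 0.8])
print(round(R, 3))  # 2.278 bits/sec
```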
Example: A discrete source emits one of five symbols once every millisecond. The symbol probabilities are 1/2, 1/4, 1/8, 1/16 and 1/16, respectively. Calculate the information rate.

Solution:

H(X) = -\sum_{i=1}^{5} P(x_i) \log_2 P(x_i)
     = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 + \frac{1}{8}\log_2 8 + \frac{1}{16}\log_2 16 + \frac{1}{16}\log_2 16
     = 0.5 + 0.5 + 0.375 + 0.25 + 0.25 = 1.875 bits/symbol

R(X) = H(X) / \bar{\tau} = 1.875 / 10^{-3} = 1875 bits/sec = 1.875 kbps
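And the corresponding check with the sketches above; every symbol lasts one millisecond, so all durations are 1e-3 s:

```python
probs = [1/2, 1/4, 1/8, 1/16, 1/16]
print(source_entropy(probs))            # 1.875 bits/symbol
print(entropy_rate(probs, [1e-3] * 5))  # 1875.0 bits/sec = 1.875 kbps
```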