Information Entropy in Information Theory

 
Information Theory and Coding
Fourth stage

By:
MSc. Ridhab Sami

Al-Mustaqbal University College
Department of Computer Engineering Techniques

Lecture 5
Average information (entropy)
 
Average information (entropy):
 
 
In information theory, entropy is the average amount of information contained in each message received.
 
1. Source Entropy:

If the source produces messages with unequal probabilities, then $I(x_i)$, $i = 1, 2, 3, \dots, n$ are different. The statistical average of $I(x_i)$ over $i$ gives the average amount of uncertainty associated with the source X. This average is called the source entropy, denoted $H(X)$, and is given by:

$H(X) = \sum_{i=1}^{n} p(x_i)\, I(x_i) = -\sum_{i=1}^{n} p(x_i) \log_2 p(x_i)$   bits/symbol
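As a small illustration (not part of the original lecture notes), the source entropy formula can be sketched in Python; the function name `source_entropy` and the list-of-probabilities interface are assumptions:

```python
import math

def source_entropy(probs):
    """Return H(X) = -sum(p(x_i) * log2 p(x_i)) in bits/symbol.

    probs: list of message probabilities, assumed to sum to 1.
    Zero-probability terms are skipped (their limiting contribution is 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)
```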
 
Example:
 
Find the entropy of the source producing the following messages:

$P(x_1) = 0.25$, $P(x_2) = 0.1$, $P(x_3) = 0.15$, and $P(x_4) = 0.5$

Solution:

$H(X) = -\sum_{i=1}^{4} p(x_i) \log_2 p(x_i) = -(0.25 \log_2 0.25 + 0.1 \log_2 0.1 + 0.15 \log_2 0.15 + 0.5 \log_2 0.5) \approx 1.742$   bits/symbol
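A minimal, self-contained numeric check of this example (illustrative only):

```python
import math

probs = [0.25, 0.1, 0.15, 0.5]
H = -sum(p * math.log2(p) for p in probs)
print(H)  # ~1.7427 bits/symbol, i.e. about 1.742 as above
```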
 
2. Binary Source Entropy:

In information theory, the binary entropy function, denoted $H(X)$ or $H_b(X)$, is defined as the entropy of a Bernoulli process with probability $p$ of one of two values. Mathematically, the Bernoulli trial is modeled as a random variable $X$ that can take on only two values, 0 and 1, with:

$P(0) + P(1) = 1$

We have:

$H(X) = -\sum_{i=1}^{n} p(x_i) \log_2 p(x_i)$

Then, for the binary case ($n = 2$):

$H_b(X) = -\big(P(0) \log_2 P(0) + P(1) \log_2 P(1)\big)$   bits/symbol
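As an illustrative sketch (not from the lecture), the binary entropy function could be written as follows; the name `binary_entropy` is an assumption:

```python
import math

def binary_entropy(p0):
    """Return H_b(X) = -(p0*log2(p0) + p1*log2(p1)) in bits, with p1 = 1 - p0."""
    p1 = 1.0 - p0
    return -sum(p * math.log2(p) for p in (p0, p1) if p > 0)
```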
 
Example:
 
Find the entropy for a binary source if P(0) = 0.2.

Solution:

P(1) = 1 − P(0) = 1 − 0.2 = 0.8

Then:

$H_b(X) = -(0.2 \log_2 0.2 + 0.8 \log_2 0.8) \approx 0.722$   bits/symbol
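A quick self-contained check of this example (illustrative only):

```python
import math

p0, p1 = 0.2, 0.8
Hb = -(p0 * math.log2(p0) + p1 * math.log2(p1))
print(round(Hb, 3))  # 0.722 bits/symbol
```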
 
3. Maximum Source Entropy:
 
For a binary source, if $P(0) = P(1) = 0.5$, then the entropy is:

$H_b(X) = -(0.5 \log_2 0.5 + 0.5 \log_2 0.5) = \log_2 2 = 1$   bit
 
For any non-binary source, if all messages are equiprobable, then $P(x_i) = 1/n$, so that:

$H(X) = H(X)_{max} = -\sum_{i=1}^{n} \frac{1}{n} \log_2 \frac{1}{n} = \log_2 n$   bits/symbol
 
This is the maximum value of the source entropy. Also, $H(X) = 0$ if one of the messages has the probability of a certain event, i.e. $p(x) = 1$.
 
Example:

A source emits 8 characters with equal probability. Find the maximum entropy $H(X)_{max}$.

Solution:

n = 8

$H(X)_{max} = \log_2 n = \log_2 8 = 3$   bits/symbol
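The equiprobable maximum can be confirmed numerically (illustrative sketch only): for n equal-probability messages the general formula collapses to $\log_2 n$.

```python
import math

n = 8
probs = [1 / n] * n                         # 8 equiprobable messages
H = -sum(p * math.log2(p) for p in probs)   # general entropy formula
print(H, math.log2(n))                      # both give 3.0 bits/symbol
```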
 
4. Source Entropy Rate:
 
It is the average amount of information produced by the source per second.
 
$R(X) = H(X) \times \text{rate of producing the symbols}$   [bits/sec = bps]
 
The unit of H(X) is bits/symbol and the rate of producing the symbols is in symbols/sec, so the unit of R(X) is bits/sec.
 
When the symbol durations differ, the rate can also be written as:

$R(X) = \frac{H(X)}{\bar{\tau}}$,   where   $\bar{\tau} = \sum_{i=1}^{n} \tau_i\, p(x_i)$

Here $\bar{\tau}$ is the average time duration of the symbols and $\tau_i$ is the time duration of the symbol $x_i$.
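As an illustration (not part of the lecture), the entropy rate via the average symbol duration could be sketched as follows; the function name and argument layout are assumptions:

```python
import math

def entropy_rate(probs, durations):
    """Return R(X) = H(X) / tau_bar in bits/sec.

    probs:     message probabilities p(x_i)
    durations: symbol durations tau_i in seconds
    """
    H = -sum(p * math.log2(p) for p in probs if p > 0)        # bits/symbol
    tau_bar = sum(t * p for p, t in zip(probs, durations))    # sec/symbol
    return H / tau_bar
```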
 
Example:

A source produces dots '.' and dashes '-' with P(dot) = 0.65. The time duration of a dot is 200 ms and that of a dash is 800 ms. Find the average source entropy rate.
 
Solution:

P(dash) = 1 − P(dot) = 1 − 0.65 = 0.35

$H(X) = -(0.65 \log_2 0.65 + 0.35 \log_2 0.35) = 0.934$   bits/symbol

$\bar{\tau} = 0.2 \times 0.65 + 0.8 \times 0.35 = 0.41$   sec

$R(X) = \frac{H(X)}{\bar{\tau}} = \frac{0.934}{0.41} = 2.278$   bits/sec
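A self-contained numeric check of this example (illustrative only):

```python
import math

p = [0.65, 0.35]      # P(dot), P(dash)
tau = [0.2, 0.8]      # durations in seconds
H = -sum(pi * math.log2(pi) for pi in p)            # ~0.934 bits/symbol
tau_bar = sum(pi * ti for pi, ti in zip(p, tau))    # 0.41 sec
print(round(H / tau_bar, 3))                        # ~2.278 bits/sec
```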
 
Example:

A discrete source emits one of five symbols once every millisecond. The symbol probabilities are 1/2, 1/4, 1/8, 1/16 and 1/16, respectively. Calculate the information rate.

Solution:

$H(X) = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 + \frac{1}{8}\log_2 8 + \frac{1}{16}\log_2 16 + \frac{1}{16}\log_2 16 = 0.5 + 0.5 + 0.375 + 0.25 + 0.25 = 1.875$   bits/symbol

$R(X) = \frac{H(X)}{\bar{\tau}} = \frac{1.875}{10^{-3}} = 1875$   bits/sec
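A short self-contained check of this calculation (illustrative only):

```python
import math

probs = [1/2, 1/4, 1/8, 1/16, 1/16]
H = -sum(p * math.log2(p) for p in probs)   # 1.875 bits/symbol
print(H, H / 1e-3)                          # 1.875 bits/symbol and 1875.0 bits/sec
```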