
Understanding Mutual Information in Information Theory
Explore the concept of mutual information in information theory, focusing on noisy channels, properties of mutual information, joint entropy, conditional entropy, marginal entropies, and the relationship between joint entropy, conditional entropy, and transinformation in data transmission.
Presentation Transcript
Al-Mustaqbal University College, Department of Computer Engineering Techniques
Information Theory and Coding, Fourth Stage
By: MSc. Ridhab Sami
Lecture 6: Mutual Information for a Noisy Channel
Mutual information for a noisy channel: Consider the set of symbols x_1, x_2, ..., x_n that the transmitter may produce. The receiver may receive y_1, y_2, ..., y_m. Theoretically, if noise and jamming are neglected, then set X = set Y. The amount of information that y_j provides about x_i is called the mutual information between x_i and y_j, expressed in terms of the a priori probability P(x_i) and the a posteriori probability P(x_i | y_j):

I(x_i, y_j) = \log_2 \frac{P(x_i \mid y_j)}{P(x_i)} bits
Properties of I(x_i, y_j):
1. It is symmetric: I(x_i, y_j) = I(y_j, x_i).
2. I(x_i, y_j) > 0 if the a posteriori probability is greater than the a priori probability; y_j provides positive information about x_i.
3. I(x_i, y_j) = 0 if the a posteriori probability equals the a priori probability, which is the case of statistical independence, when y_j provides no information about x_i.
4. I(x_i, y_j) < 0 if the a posteriori probability is less than the a priori probability; y_j provides negative information about x_i, i.e., y_j adds ambiguity.
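As a quick numeric illustration (a sketch of mine, not from the lecture), the following Python snippet evaluates I(x_i, y_j) for made-up a priori and a posteriori probabilities, showing one case for each sign:

```python
from math import log2

def pointwise_mutual_info(p_x, p_x_given_y):
    """I(xi, yj) = log2(a posteriori / a priori), in bits."""
    return log2(p_x_given_y / p_x)

# Made-up probabilities illustrating the three sign cases:
print(pointwise_mutual_info(0.25, 0.50))  # a posteriori > a priori -> +1.0 bit
print(pointwise_mutual_info(0.25, 0.25))  # equal (independence)    ->  0.0 bits
print(pointwise_mutual_info(0.25, 0.10))  # a posteriori < a priori -> about -1.32 bits
```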
1. Joint entropy: In information theory, joint entropy is a measure of the uncertainty associated with a set of variables; P(x, y) is the joint probability.

Entropy: H(X) = -\sum_{i=1}^{n} P(x_i) \log_2 P(x_i)

Joint entropy: H(X, Y) = -\sum_{j=1}^{m} \sum_{i=1}^{n} P(x_i, y_j) \log_2 P(x_i, y_j) bits/symbol
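A minimal Python sketch of this formula (an illustration, not part of the slides), assuming the joint distribution is given as a nested list with one row per y_j; zero entries are skipped since 0 log_2 0 is taken as 0:

```python
from math import log2

def joint_entropy(p_xy):
    """H(X,Y) = -sum over i,j of P(xi,yj) * log2 P(xi,yj), in bits/symbol."""
    return -sum(p * log2(p)
                for row in p_xy   # one row per yj
                for p in row      # one entry per xi
                if p > 0)         # 0 * log2(0) is taken as 0
```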
2. Conditional entropy: In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable Y given that the value of another random variable X is known; P(x | y) is the conditional probability.

Conditional entropy: H(X \mid Y) = -\sum_{j=1}^{m} \sum_{i=1}^{n} P(x_i, y_j) \log_2 P(x_i \mid y_j) bits/symbol

(and analogously H(Y \mid X), with P(y_j \mid x_i) inside the logarithm).
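Under the same layout assumption (rows indexed by y_j, columns by x_i), H(X|Y) can be computed from the joint matrix alone, since P(x_i | y_j) = P(x_i, y_j) / P(y_j); again a sketch of mine, not lecture code:

```python
from math import log2

def conditional_entropy_x_given_y(p_xy):
    """H(X|Y) = -sum P(xi,yj) * log2 P(xi|yj), rows of p_xy indexed by yj."""
    h = 0.0
    for row in p_xy:
        p_y = sum(row)                  # marginal P(yj)
        for p in row:
            if p > 0:
                h -= p * log2(p / p_y)  # P(xi|yj) = P(xi,yj) / P(yj)
    return h
```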
3. Marginal entropies: Marginal entropy is a term usually used to denote both the source entropy H(X), defined as before, and the receiver entropy H(Y), given by:

Source entropy: H(X) = -\sum_{i=1}^{n} P(x_i) \log_2 P(x_i)

Receiver entropy: H(Y) = -\sum_{j=1}^{m} P(y_j) \log_2 P(y_j) bits/symbol
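The marginals themselves drop out of the joint matrix by summing rows (for P(y_j)) and columns (for P(x_i)); a small sketch under the same layout assumption as above:

```python
from math import log2

def entropy(p):
    """H = -sum p_k * log2 p_k over a probability vector (bits/symbol)."""
    return -sum(pk * log2(pk) for pk in p if pk > 0)

def marginals(p_xy):
    """P(y) from row sums, P(x) from column sums of the joint matrix."""
    p_y = [sum(row) for row in p_xy]
    p_x = [sum(col) for col in zip(*p_xy)]
    return p_x, p_y
```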
4. Relationship between joint, conditional and transinformation:

Noise entropy: H(Y \mid X) = H(X, Y) - H(X)

Loss entropy: H(X \mid Y) = H(X, Y) - H(Y)

We also have the transinformation (average mutual information):

I(X, Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X)
Example: The joint probability of a system is given by (columns x_1, x_2, x_3; rows y_1, y_2):

P(X, Y) = [ 0.5    0      0.0625
            0.25   0.125  0.0625 ]

Find:
1. Marginal entropies.
2. Joint entropy.
3. Conditional entropies.
4. The transinformation.
Solution: P(x) = [0.75  0.125  0.125], P(y) = [0.5625  0.4375]

1. Marginal entropies:

H(X) = -\sum_{i=1}^{3} P(x_i) \log_2 P(x_i) = -(0.75 \log_2 0.75 + 2 \times 0.125 \log_2 0.125) = 1.06127 bits/symbol

H(Y) = -\sum_{j=1}^{2} P(y_j) \log_2 P(y_j) = -(0.5625 \log_2 0.5625 + 0.4375 \log_2 0.4375) = 0.9887 bits/symbol
2. Joint entropy:

H(X, Y) = -\sum_{j=1}^{2} \sum_{i=1}^{3} P(x_i, y_j) \log_2 P(x_i, y_j)
= -(0.5 \log_2 0.5 + 0.25 \log_2 0.25 + 0.125 \log_2 0.125 + 2 \times 0.0625 \log_2 0.0625) = 1.875 bits/symbol

(The zero entry contributes nothing, since 0 \log_2 0 is taken as 0.)
3. Conditional entropies:

H(Y \mid X) = H(X, Y) - H(X) = 1.875 - 1.06127 = 0.813 bits/symbol

H(X \mid Y) = H(X, Y) - H(Y) = 1.875 - 0.9887 = 0.886 bits/symbol

4. The transinformation:

I(X, Y) = H(X) - H(X \mid Y) = 1.06127 - 0.886 = 0.175 bits/symbol
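For completeness, here is a self-contained Python sketch (my addition, not lecture material) that reproduces all four results of this example from the joint probability matrix and confirms that both forms of the transinformation identity agree:

```python
from math import log2

def entropy(p):
    """H = -sum p_k * log2 p_k over nonzero entries (bits/symbol)."""
    return -sum(pk * log2(pk) for pk in p if pk > 0)

# Joint probabilities P(xi, yj): rows are y1, y2; columns are x1, x2, x3.
p_xy = [[0.5,  0.0,   0.0625],
        [0.25, 0.125, 0.0625]]

p_y = [sum(row) for row in p_xy]        # [0.5625, 0.4375]
p_x = [sum(col) for col in zip(*p_xy)]  # [0.75, 0.125, 0.125]

h_x  = entropy(p_x)                               # ~1.0613 bits/symbol
h_y  = entropy(p_y)                               # ~0.9887 bits/symbol
h_xy = entropy([p for row in p_xy for p in row])  # 1.875 bits/symbol

h_y_given_x = h_xy - h_x                # noise entropy, ~0.813
h_x_given_y = h_xy - h_y                # loss entropy,  ~0.886
i_xy = h_x - h_x_given_y                # transinformation, ~0.175

print(h_x, h_y, h_xy, h_y_given_x, h_x_given_y, i_xy)
assert abs(i_xy - (h_y - h_y_given_x)) < 1e-12  # both forms agree
```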