
Belief Propagation Tensor Notation Guide
This appendix explores an alternate tensor notation for the Belief Propagation algorithm from Section 2, covering tensor multiplication, marginalization, and rank-r tensors. It works through detailed examples of the outer product, pointwise product, and matrix-vector product, then steps through sum-product belief propagation on factor graphs: message initialization, message passing, and computation of exact marginals.
Section 2 Appendix: Tensor Notation for BP
Tensor Notation for BP
In Section 2, BP was introduced with notation that defined messages and beliefs as functions. This appendix presents an alternate (and very concise) notation for the Belief Propagation algorithm using tensors.
Tensor Notation
Tensor multiplication (pointwise over any shared axes): C(x, y) = A(x) · B(y)
Tensor marginalization (sum out an axis): B(y) = Σ_x A(x, y)
Tensor Notation
A rank-r tensor is:
  an axis-labeled array with arbitrary indices
  = a database with column headers
  = a real function with r keyword arguments
Tensor multiplication (vector outer product): given A indexed by X ∈ {1, 2} with A(1) = 3, A(2) = 5, and B indexed by Y ∈ {red, blue} with B(red) = 4, B(blue) = 6, the product C(X, Y) = A(X) · B(Y) is:

  C        Y = red   Y = blue
  X = 1    12        18
  X = 2    20        30
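The outer product above can be sketched in plain Python, treating an axis-labeled tensor as a dict from indices to values (the dict-as-tensor representation is ours, not the slides'):

```python
# The slide's outer product, with dicts as axis-labeled tensors.
A = {1: 3, 2: 5}                 # rank-1 tensor indexed by X
B = {"red": 4, "blue": 6}        # rank-1 tensor indexed by Y
# No shared axes, so the product pairs every X index with every Y index.
C = {(x, y): A[x] * B[y] for x in A for y in B}
# C[(1, "red")] == 12, C[(2, "blue")] == 30
```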
Tensor Notation
Tensor multiplication (vector pointwise product): given A and B both indexed by X ∈ {a, b} with A(a) = 3, A(b) = 5 and B(a) = 4, B(b) = 6, the product C(X) = A(X) · B(X) is:

  C(a) = 3 · 4 = 12,  C(b) = 5 · 6 = 30
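In the same dict-as-tensor sketch, a shared axis means matching entries multiply:

```python
# The slide's pointwise product: both tensors share the axis X.
A = {"a": 3, "b": 5}
B = {"a": 4, "b": 6}
C = {x: A[x] * B[x] for x in A}  # entries with the same X index multiply
# C == {"a": 12, "b": 30}
```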
Tensor Notation
Tensor multiplication (matrix-vector product): given A indexed by X ∈ {1, 2} and Y ∈ {red, blue},

  A        Y = red   Y = blue
  X = 1    3         5
  X = 2    4         6

and B indexed by Y with B(red) = 7, B(blue) = 8, the product C(X, Y) = A(X, Y) · B(Y) is:

  C        Y = red   Y = blue
  X = 1    21        40
  X = 2    28        48

Summing C over Y completes the usual matrix-vector product.
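Sketched with the same dict representation, the shared Y axis broadcasts B across A's rows, and a final sum over Y recovers the familiar matrix-vector product:

```python
# The slide's matrix-vector example: A shares its Y axis with B.
A = {(1, "red"): 3, (1, "blue"): 5, (2, "red"): 4, (2, "blue"): 6}
B = {"red": 7, "blue": 8}
# Tensor multiplication: B(Y) broadcasts across the X axis of A.
C = {(x, y): A[(x, y)] * B[y] for (x, y) in A}
# Summing C over Y finishes the usual matrix-vector product.
Av = {}
for (x, y), v in C.items():
    Av[x] = Av.get(x, 0) + v
# Av == {1: 61, 2: 76}
```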
Tensor Notation
Tensor marginalization: given A indexed by X ∈ {1, 2} and Y ∈ {red, blue},

  A        Y = red   Y = blue
  X = 1    3         4
  X = 2    5         6

summing out X gives B(Y) = Σ_X A(X, Y):

  B(red) = 3 + 5 = 8,  B(blue) = 4 + 6 = 10
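Marginalization in the dict sketch is a sum over the axis being removed:

```python
# The slide's marginalization: sum out the X axis of A(X, Y).
A = {(1, "red"): 3, (1, "blue"): 4, (2, "red"): 5, (2, "blue"): 6}
B = {}
for (x, y), v in A.items():
    B[y] = B.get(y, 0) + v       # accumulate over X for each Y
# B == {"red": 8, "blue": 10}
```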
Sum-Product Belief Propagation
Input: a factor graph with no cycles
Output: exact marginals for each variable and factor
Algorithm:
1. Initialize the messages to the uniform distribution.
2. Choose a root node.
3. Send messages from the leaves to the root, then send messages from the root to the leaves.
4. Compute the beliefs (unnormalized marginals).
5. Normalize the beliefs and return the exact marginals.
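The steps above can be sketched on a tiny chain-shaped factor graph X1 -- fA -- X2 -- fB -- X3; the graph, domains, and potential values below are made up for illustration, and tensors are plain dicts:

```python
from itertools import product

# Hypothetical chain factor graph with made-up potentials.
doms = {"X1": [0, 1], "X2": [0, 1], "X3": [0, 1]}
fA = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}  # psi_A(X1, X2)
fB = {(0, 0): 1.0, (0, 1): 4.0, (1, 0): 2.0, (1, 1): 1.0}  # psi_B(X2, X3)

def f2v(f, in_msg, to_first_axis):
    """Factor-to-variable message: multiply in the one incoming
    message, then marginalize out the other variable."""
    out = {}
    for (a, b), v in f.items():
        if to_first_axis:
            out[a] = out.get(a, 0.0) + v * in_msg[b]
        else:
            out[b] = out.get(b, 0.0) + v * in_msg[a]
    return out

def normalize(t):
    z = sum(t.values())
    return {k: v / z for k, v in t.items()}

uniform = {0: 1.0, 1: 1.0}
# Steps 2-3: root the tree at X3; pass messages leaves-to-root ...
m_fA_X2 = f2v(fA, uniform, to_first_axis=False)   # X1 -> fA -> X2
m_fB_X3 = f2v(fB, m_fA_X2, to_first_axis=False)   # X2 -> fB -> X3
# ... then root-to-leaves.
m_fB_X2 = f2v(fB, uniform, to_first_axis=True)    # X3 -> fB -> X2
m_fA_X1 = f2v(fA, m_fB_X2, to_first_axis=True)    # X2 -> fA -> X1
# Steps 4-5: beliefs = pointwise products of incoming messages, normalized.
belief_X1 = normalize(m_fA_X1)
belief_X2 = normalize({x: m_fA_X2[x] * m_fB_X2[x] for x in doms["X2"]})
belief_X3 = normalize(m_fB_X3)

# Sanity check against brute-force marginals: on a tree, BP is exact.
joint = {(a, b, c): fA[(a, b)] * fB[(b, c)]
         for a, b, c in product(*doms.values())}
Z = sum(joint.values())
exact_X2 = {x: sum(v for (a, b, c), v in joint.items() if b == x) / Z
            for x in doms["X2"]}
assert all(abs(belief_X2[x] - exact_X2[x]) < 1e-12 for x in doms["X2"])
```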
Sum-Product Belief Propagation
The algorithm maintains two kinds of beliefs and two kinds of messages over the variables and factors of the graph:
Beliefs:
  variable belief: b_i(x_i) = ∏_{α ∈ N(i)} μ_{α→i}(x_i)
  factor belief: b_α(x_α) = ψ_α(x_α) · ∏_{i ∈ N(α)} μ_{i→α}(x_α[i])
Messages:
  variable-to-factor: μ_{i→α}(x_i) = ∏_{β ∈ N(i) \ {α}} μ_{β→i}(x_i)
  factor-to-variable: μ_{α→i}(x_i) = Σ_{x_α : x_α[i] = x_i} ψ_α(x_α) · ∏_{j ∈ N(α) \ {i}} μ_{j→α}(x_α[j])
Sum-Product Belief Propagation: Variable Belief
The belief at variable X1 is the pointwise product of its three incoming messages, each a tensor over the tag domain {v, n, p}:

  message 1: (1, 2, 2)
  message 2: (0.1, 3, 1)
  message 3: (4, 1, 0)
  belief at X1: (1 · 0.1 · 4, 2 · 3 · 1, 2 · 1 · 0) = (0.4, 6, 0)
Sum-Product Belief Propagation: Variable Message
The message from X1 to a factor is the pointwise product of all incoming messages except the one from that factor:

  (1, 2, 2) ⊙ (0.1, 3, 1) = (0.1, 6, 2)
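Both the variable belief and the variable message above are pointwise products over the same domain, and can be sketched together (dicts as tensors; the tag domain {v, n, p} is from the slides):

```python
# Three messages arriving at X1, each a tensor over {v, n, p}.
msgs = [{"v": 1.0, "n": 2.0, "p": 2.0},
        {"v": 0.1, "n": 3.0, "p": 1.0},
        {"v": 4.0, "n": 1.0, "p": 0.0}]
dom = ["v", "n", "p"]
# Variable belief: pointwise product of ALL incoming messages.
belief = {x: msgs[0][x] * msgs[1][x] * msgs[2][x] for x in dom}
# belief == {"v": 0.4, "n": 6.0, "p": 0.0}
# Variable-to-factor message: same product, minus the message that
# came from the recipient factor (here, the third one).
msg_out = {x: msgs[0][x] * msgs[1][x] for x in dom}
# msg_out == {"v": 0.1, "n": 6.0, "p": 2.0}
```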
Sum-Product Belief Propagation: Factor Belief
The belief at a factor ψ over (X1, X3), with X1 ∈ {v, n, p} and X3 ∈ {d, n}, is ψ times all of its incoming messages. With

  ψ          X3 = d   X3 = n
  X1 = v     0.1      8
  X1 = n     3        0
  X1 = p     1        1

and messages μ_{X1→ψ} = (4, 1, 0), μ_{X3→ψ} = (8, 0.2), the factor belief b(x1, x3) = ψ(x1, x3) · μ_{X1→ψ}(x1) · μ_{X3→ψ}(x3) is:

  b          X3 = d               X3 = n
  X1 = v     0.1 · 4 · 8 = 3.2    8 · 4 · 0.2 = 6.4
  X1 = n     3 · 1 · 8 = 24       0 · 1 · 0.2 = 0
  X1 = p     1 · 0 · 8 = 0        1 · 0 · 0.2 = 0
Sum-Product Belief Propagation: Factor Message
The message from ψ to X1 multiplies in the message from X3, then sums X3 out:

  μ_{ψ→X1}(v) = 0.1 · 8 + 8 · 0.2 = 0.8 + 1.6 = 2.4
  μ_{ψ→X1}(n) = 3 · 8 + 0 · 0.2 = 24 + 0 = 24
  μ_{ψ→X1}(p) = 1 · 8 + 1 · 0.2 = 8 + 0.2 = 8.2
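The factor belief and factor message can be sketched with the same numbers, again using dicts as tensors:

```python
# Factor psi over (X1, X3), X1 in {v, n, p}, X3 in {d, n}.
psi = {("v", "d"): 0.1, ("v", "n"): 8.0,
       ("n", "d"): 3.0, ("n", "n"): 0.0,
       ("p", "d"): 1.0, ("p", "n"): 1.0}
mu_X1 = {"v": 4.0, "n": 1.0, "p": 0.0}   # message X1 -> psi
mu_X3 = {"d": 8.0, "n": 0.2}             # message X3 -> psi
# Factor belief: psi times every incoming message, pointwise.
belief = {(a, b): v * mu_X1[a] * mu_X3[b] for (a, b), v in psi.items()}
# e.g. belief[("n", "d")] == 3 * 1 * 8 == 24.0
# Factor-to-variable message to X1: multiply in X3's message only
# (everything except X1's own), then sum X3 out.
msg_to_X1 = {a: sum(psi[(a, b)] * mu_X3[b] for b in mu_X3) for a in mu_X1}
# e.g. msg_to_X1["n"] == 3 * 8 + 0 * 0.2 == 24.0
```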
Loopy Belief Propagation
Input: a factor graph with cycles
Output: approximate marginals for each variable and factor
Algorithm:
1. Initialize the messages to the uniform distribution.
2. Send messages until convergence. Normalize them when they grow too large.
3. Compute the beliefs (unnormalized marginals).
4. Normalize the beliefs and return the approximate marginals.
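The loopy variant can be sketched on the smallest cyclic factor graph: two variables joined by two factors, so messages can circulate. All potential values below are made up for illustration, and the resulting beliefs are approximations, not guaranteed exact marginals:

```python
# Hypothetical cycle: X1 and X2 connected by BOTH factors fA and fB.
fA = {(0, 0): 4.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 1.0}
fB = {(0, 0): 3.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 2.0}

def normalize(t):
    z = sum(t.values())
    return {k: v / z for k, v in t.items()}

def f2v(f, in_msg, to_first_axis):
    # Multiply in the one incoming message, sum out the other variable,
    # and normalize so messages never grow too large.
    out = {0: 0.0, 1: 0.0}
    for (a, b), v in f.items():
        if to_first_axis:
            out[a] += v * in_msg[b]
        else:
            out[b] += v * in_msg[a]
    return normalize(out)

def sweep(m):
    # One synchronous round: every factor-to-variable message updates,
    # using each variable's OTHER incoming message as its input.
    return {("fA", 1): f2v(fA, m[("fB", 2)], True),
            ("fA", 2): f2v(fA, m[("fB", 1)], False),
            ("fB", 1): f2v(fB, m[("fA", 2)], True),
            ("fB", 2): f2v(fB, m[("fA", 1)], False)}

# Step 1: initialize all messages to the uniform distribution.
m = {k: {0: 0.5, 1: 0.5}
     for k in [("fA", 1), ("fA", 2), ("fB", 1), ("fB", 2)]}
# Step 2: send messages until (hopefully) converged.
for _ in range(100):
    m = sweep(m)
# Steps 3-4: approximate marginal of X1 from its incoming messages.
belief_X1 = normalize({x: m[("fA", 1)][x] * m[("fB", 1)][x]
                       for x in (0, 1)})
```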