My FREE book about Bayesian Networks, Bayesuvius, continues to grow. It currently has 33 chapters. The purpose of this blog post is to announce the release of a new Bayesuvius chapter on Belief Propagation (BP).

Belief Propagation (BP), also known as Message Passing, was first proposed in 1982 by Judea Pearl to simplify the exact evaluation of probability marginals of Bayesian Networks (bnets). It gives exact results for trees and polytrees (i.e., for bnets that have a single connected component and whose underlying undirected graph contains no cycles). For bnets with loops, it gives approximate results (loopy belief propagation), and it has been generalized to the junction tree (JT) algorithm, which gives exact results for general bnets with loops.

The JT algorithm starts by clustering the loops of a bnet into bigger nodes so as to transform the bnet into a tree. Then it applies BP to the ensuing tree. The first breakthrough paper to achieve this agenda in full was by Lauritzen and Spiegelhalter (LS) in 1988. When it first came out, the LS algorithm caused quite a stir, and it led to the creation of many bnet companies, many of which continue to exist and flourish today.

So why is BP important?

BP yields a huge reduction in the number of operations (additions and multiplications) necessary to calculate the marginals of the probability distribution associated with a classical bnet. BP also works for quantum bnets. In the quantum case, the nodes carry complex probability amplitudes rather than probabilities, and one seeks to calculate coherent sums.
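To make the reduction in operation count concrete, here is a minimal sketch (not taken from Bayesuvius or Quantum Fog; the chain bnet, node count, and state count are my own illustrative choices). For a chain bnet x0 → x1 → … → x5, brute-force marginalization sums the full joint over every configuration, which is exponential in the number of nodes, whereas passing a single message node-to-node costs only one small matrix-vector product per edge:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 3  # illustrative sizes: 6 nodes, each with 3 states

# CPTs for a chain bnet x0 -> x1 -> ... -> x5: P(x0), then P(x_k | x_{k-1})
p0 = rng.random(d)
p0 /= p0.sum()
cpts = []
for _ in range(n - 1):
    m = rng.random((d, d))
    m /= m.sum(axis=1, keepdims=True)  # rows: parent value, cols: child value
    cpts.append(m)

def brute_marginal_last():
    """Marginal of the last node by summing the joint over all d**n configs."""
    marg = np.zeros(d)
    for vals in itertools.product(range(d), repeat=n):
        p = p0[vals[0]]
        for k in range(1, n):
            p *= cpts[k - 1][vals[k - 1], vals[k]]
        marg[vals[-1]] += p
    return marg

def bp_marginal_last():
    """Same marginal by pushing one length-d message through each CPT:
    roughly n*d**2 multiplications instead of order d**n terms."""
    msg = p0
    for m in cpts:
        msg = msg @ m  # sum over the parent, keep the child
    return msg

assert np.allclose(brute_marginal_last(), bp_marginal_last())
```

The two routines agree; the savings come purely from the distributive law, which lets each sum be pushed past the factors that do not depend on its variable.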

My open source program Quantum Fog has implemented BP for both classical and quantum bnets since its first release at github in Dec. 2015. Quantum Fog implements the junction tree algorithm (which uses BP) for both probabilities and probability amplitudes.

It is also possible, though Quantum Fog doesn't do it so far, to implement BP directly from the message passing equations invented by Judea Pearl, and then to use that approach to do loopy belief propagation for both classical and quantum bnets. This should be feasible because the recursive message passing equations of BP do not care whether the messages being passed are complex valued (quantum bnet case) or real valued (classical bnet case); they only care about the graph structure of the bnet.
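The point that the recursion is blind to whether messages are real or complex can be seen in a toy sketch (this is my own illustration, not Quantum Fog code): the same chain-passing function handles probabilities and probability amplitudes, since NumPy only changes the dtype it carries through:

```python
import numpy as np

rng = np.random.default_rng(1)
d, n = 2, 5

def run_chain(msg, mats):
    # Downstream pass on a chain: the recursion never inspects the
    # dtype of the messages, only the sequence of edges (graph structure).
    for m in mats:
        msg = msg @ m
    return msg

# Classical bnet: real probabilities, row-normalized CPTs
real_msg = rng.random(d)
real_msg /= real_msg.sum()
real_mats = [rng.random((d, d)) for _ in range(n - 1)]
for m in real_mats:
    m /= m.sum(axis=1, keepdims=True)

# Quantum bnet: complex probability amplitudes (illustrative, unnormalized)
amp_msg = rng.random(d) + 1j * rng.random(d)
amp_mats = [rng.random((d, d)) + 1j * rng.random((d, d)) for _ in range(n - 1)]

print(run_chain(real_msg, real_mats).dtype)  # float64
print(run_chain(amp_msg, amp_mats).dtype)    # complex128
```

In the complex case the pass computes a coherent sum of amplitudes; the classical case is the ordinary sum of probabilities.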

I won't encumber the reader of this blog post with an exact statement of those recursive equations. For that level of technicality, I refer the reader to Bayesuvius's chapter on BP. What I will do is show the graphic that I give in that chapter to motivate those equations. Here it is, with its caption, which reads like a short story:

The yellow node is a gossip monger. It receives messages from all the green nodes, and then it relays a joint message to the red node. The union of the green nodes and the red node equals the full neighborhood of the yellow node. There are two possible cases: the red node is either a parent or a child of the yellow one. As usual, we use arrows with dashed (resp., dotted) shafts for downstream (resp., upstream) messages.

© 2020 TechTarget, Inc.
