Article Teaser: Tensor networks can denote many physical quantities other than probabilities, so they are not tailor-made for the job of representing probabilities the way Bayesian networks are. Judea Pearl won the Turing Award for his work on Bayesian networks. Geoffrey Hinton, one of the most famous neural-network researchers, gives Judea Pearl and his Bayesian networks full credit for motivating the invention of probabilistic AI, which is the natural analogue of quantum AI, except that in quantum AI the probabilities of probabilistic AI are replaced by probability amplitudes. Probabilistic AI software includes TensorFlow's Probability module, the Pyro extension of PyTorch, PyMC, Stan, Edward, WinBUGS, and many others.

Tensor Networks (tnets) have a very long and illustrious history in General Relativity and High Energy Physics: 't Hooft and Veltman used them in *Diagrammar* to study the renormalization of gauge theories, Penrose applied them to General Relativity, and Cvitanovic used them in his "Birdtracks" book to study Group Theory.

Quantum circuits are clearly a form of tnet. Quantum circuit diagrams were pioneered by Deutsch, who extended the diagrams used by people working on classical reversible computation (e.g., Toffoli and Bennett) by making those diagrams complex valued instead of real valued, and by adding single-qubit rotations to the CNOTs and doubly controlled NOTs (a.k.a. Toffoli gates) that were already in use in reversible computation.
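To make the "quantum circuits are tnets" point concrete, here is a minimal NumPy sketch (an illustration of the standard conventions, not any particular library's API): each gate is a complex-valued tensor, and wiring gates together is index contraction. It builds a Bell state by contracting a Hadamard tensor and a CNOT tensor with an input state.

```python
import numpy as np

# Single-qubit Hadamard gate: a rank-2 complex tensor (2x2 matrix).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate: a rank-4 tensor indexed (out_ctrl, out_targ, in_ctrl, in_targ).
# The target output flips exactly when the control input is 1.
CNOT = np.zeros((2, 2, 2, 2), dtype=complex)
for c in range(2):
    for t in range(2):
        CNOT[c, (c + t) % 2, c, t] = 1.0

# Initial state |00>, stored as a rank-2 tensor of amplitudes psi[a, b].
psi = np.zeros((2, 2), dtype=complex)
psi[0, 0] = 1.0

# "Wiring up" the circuit = contracting shared indices.
psi = np.einsum('ai,ib->ab', H, psi)        # H on qubit 0
psi = np.einsum('ctab,ab->ct', CNOT, psi)   # CNOT(control=0, target=1)

# psi is now the Bell state: amplitude 1/sqrt(2) on |00> and |11>.
print(np.round(psi, 6))
```

The contraction order chosen by `einsum` here mirrors reading the circuit diagram left to right; a tnet library's job is largely to pick such orders efficiently for much larger networks.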

Guifre Vidal is often credited with being one of the first to use tnets for quantum computing, although he appears to have started using tnets relatively late in their history. His first arXiv paper to use the term "Tensor Network" in the title, dated 2005, is:

Classical simulation of quantum many-body systems with a tree tensor network, by Yaoyun Shi, Luming Duan, and Guifre Vidal.

The use of tnets in quantum computing software has exploded recently, beginning with Google's first release, in May 2019, of a software library called "TensorNetwork":

https://github.com/google/TensorNetwork
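To illustrate the kind of bookkeeping such a library automates, here is a small hand-rolled tensor-network contraction in plain NumPy (an illustrative sketch, not the TensorNetwork library's own API): three tensors joined in a ring, contracted over their shared "bond" indices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three tensors joined in a ring: A[i,j], B[j,k], C[k,i].
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 5))
C = rng.standard_normal((5, 3))

# Contracting every shared index at once yields a scalar,
# equal to trace(A @ B @ C).
full = np.einsum('ij,jk,ki->', A, B, C)

# The contraction order affects cost, not the result:
step1 = np.einsum('ij,jk->ik', A, B)    # contract bond j first
value = np.einsum('ik,ki->', step1, C)  # then bonds k and i
```

Choosing a good contraction order is the hard part for large networks, and is one of the main services a tnet library provides.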

A part of tnet history that is never mentioned by the writers and users of Google's tnet software is its **strong** connection to a much deeper tool, Quantum Bayesian Networks (qbnets). Just as in General Relativity, where tensor notation can be replaced by the far more geometric language of differential forms, in quantum computing, tensor notation can be replaced by the far deeper language of qbnets. Why deeper? Briefly put, unlike tnets, qbnets are a natural generalization to quantum mechanics of classical Bayesian nets. Thus, unlike tnets, qbnets are directly relatable to a rich vein of advances, dating back many decades, by Bayesian network pioneers like Judea Pearl and hierarchical-model pioneers like Andrew Gelman, and to an equally rich vein of software for Bayesian networks, hierarchical models, MCMC, and so on, such as WinBUGS, Stan, Edward, PyMC, TensorFlow's Probability module, and PyTorch's Pyro extension. I'm afraid that the quantum tnet community is ignoring its precursors and taking credit for inventing the wheel.
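The "natural generalization" claim can be sketched in a few lines of NumPy (an illustrative toy, not any specific qbnet formalism): a classical Bayesian net's joint distribution is itself a tensor contraction of conditional probability tables, and the quantum version replaces those real probabilities with complex amplitudes whose squared magnitudes give probabilities.

```python
import numpy as np

# A two-node Bayesian net A -> B: the joint distribution P(A,B) is the
# contraction of a prior P(A) with a conditional table P(B|A).
P_A = np.array([0.6, 0.4])                 # P(A)
P_B_given_A = np.array([[0.9, 0.1],        # P(B | A=0)
                        [0.2, 0.8]])       # P(B | A=1)
joint = np.einsum('a,ab->ab', P_A, P_B_given_A)   # P(A,B), sums to 1

# Quantum analogue: replace each probability with a complex amplitude.
# (Here the amplitudes are chosen as real square roots for simplicity;
# genuine qbnets allow complex phases, which produce interference.)
amp_A = np.sqrt(P_A).astype(complex)
amp_B_given_A = np.sqrt(P_B_given_A).astype(complex)
amp_joint = np.einsum('a,ab->ab', amp_A, amp_B_given_A)

# Probabilities are recovered as squared magnitudes of the amplitudes.
probs = np.abs(amp_joint) ** 2
```

With real, non-negative amplitudes the two pictures coincide (`probs == joint`); the quantum structure only shows its teeth once phases and interference enter, which is exactly what tnets and qbnets are built to track.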


© 2020 Data Science Central
