

Decision Trees (DTs) are certainly cool and it is not my intention to belittle them here. Compared to Bayesian Networks (bnets), they seem easier to construct. In fact, I just wrote a poem dedicated to Decision Trees. It starts "I think that I shall never see an AI as lovely as a decision tree". DTs do have some drawbacks, but, like I said, the purpose of this blog post is not to criticize them.

The real purpose of this post is to point out that there is a secret romance going on between Decision Trees and Bayesian Networks. Say you have a simple binary DT with YES and NO branches. Then you can construct an equivalent bnet with exactly the same tree graph. You turn the branches into arrows pointing down from the apex root node, and each fork in the tree becomes a node of the bnet. However, the nodes have to have three states instead of two: NO, YES and NULL. This third NULL state is a small overhead, a small price to pay; in return, you get to keep the tree structure in the equivalent bnet.
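For concreteness, here is a minimal Python sketch of one way to encode the construction. It assumes the convention that a node is forced into the NULL state whenever its parent did not answer with the branch leading to it, and keeps its own YES/NO probabilities otherwise; that convention, and the names `Fork` and `build_cpts`, are illustrative choices for this sketch, not notation taken from Bayesuvius.

```python
# Sketch: turn a binary decision tree into a bnet with the same tree graph,
# where each node gets a third state NULL meaning "this branch was never reached".
# All names here are illustrative.

NO, YES, NULL = "NO", "YES", "NULL"
STATES = (NO, YES, NULL)

class Fork:
    """One fork of the decision tree: a yes/no question plus optional children."""
    def __init__(self, name, p_yes, no_child=None, yes_child=None):
        self.name = name            # bnet node name
        self.p_yes = p_yes          # P(answer = YES) when this fork is reached
        self.no_child = no_child    # fork reached on a NO answer
        self.yes_child = yes_child  # fork reached on a YES answer

def build_cpts(fork, parent=None, branch=None, cpts=None):
    """Return {node_name: CPT}.  The root's CPT is unconditional; every other
    node is conditioned on its parent, and is NULL with probability 1 unless
    the parent answered with the branch that leads to it."""
    if cpts is None:
        cpts = {}
    if parent is None:
        cpts[fork.name] = {YES: fork.p_yes, NO: 1 - fork.p_yes, NULL: 0.0}
    else:
        cpt = {}
        for pa_state in STATES:
            if pa_state == branch:   # parent branched into this fork
                cpt[pa_state] = {YES: fork.p_yes, NO: 1 - fork.p_yes, NULL: 0.0}
            else:                    # branch not taken, so this node is NULL
                cpt[pa_state] = {YES: 0.0, NO: 0.0, NULL: 1.0}
        cpts[fork.name] = cpt
    if fork.no_child:
        build_cpts(fork.no_child, fork, NO, cpts)
    if fork.yes_child:
        build_cpts(fork.yes_child, fork, YES, cpts)
    return cpts

# Tiny example tree: the root asks Q0; on NO it asks Q2, on YES it asks Q1.
root = Fork("Q0", 0.6,
            no_child=Fork("Q2", 0.3),
            yes_child=Fork("Q1", 0.8))
for name, cpt in build_cpts(root).items():
    print(name, cpt)
```

Running the example prints one conditional probability table per fork: the root gets an unconditional distribution over NO, YES, NULL, while each child node's table shows it inheriting its YES/NO probabilities only when its parent took the branch into it, and being NULL otherwise. That is the sense in which the bnet reproduces the tree's branching behavior on the same graph.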

I explain all this more precisely in my book Bayesuvius, where I describe this technique in 2 new chapters, entitled:

  1. Decision Trees
  2. Binary Decision Diagrams
