
AI vs Deep Learning vs Machine Learning

Summary: Which of these terms mean the same thing: AI, Deep Learning, Machine Learning? Are you sure? While there's overlap, none of these is a complete subset of the others, and none completely explains the others.

 

Take this quiz.

Which of the following are substantially the same things?

A. AI

B. Deep Learning

C.  Machine Learning

(Select your answer)

1.  A and B

2.  B and C

3.  A and C

4.  All of the above.

For as precise a profession as we data scientists purport to be, we are sometimes way too casual with our language.  Read several articles about AI, Deep Learning, and Machine Learning and you will come away confused about whether these are all the same or all different.  Imagine how confused non-data scientists must be.

The truth is that each of these terms overlaps the others in a Venn diagram, but none is a perfect subset of the others and none completely explains the others.  Let us explain.

 

Artificial Intelligence (AI)

For all the conversation and progress in the field of AI, we are still nowhere near either the Terminator or even Rosie the Robot.  The core of fictional (or perhaps aspirational) AI is Artificial General Intelligence (AGI), which does not currently exist.  Artificial General Intelligence is the Holy Grail of AI.  If and when it arrives, it will be the true killer app (we hope not literally) that allows machines to function as humans do in our wildly chaotic environment.

Fully functional AGI would be able to pass a number of these increasingly difficult proposed tests:

The Turing Test (of Sci-Fi fame):  Convince a human through conversation that it is not a robot.

The Coffee Test (Wozniak):  An AGI-enabled robot enters a strange house and successfully brews a cup of coffee.

The Robot College Student Test (Goertzel):  An AGI-enabled robot enrolls in a college, takes and passes the same exams a human would, and obtains a degree.  Now that we can get degrees entirely online, the robot may be optional.

The Employment Test (Nilsson):  A machine is given the task of working an economically important job, and must perform as well as or better than humans in the same job.

As for AGI, you can see we’re not even close.

Researchers speak in terms of Broad versus Narrow AI.  The Nest thermostat is narrow AI; the Terminator is broad AI.  They also speak in terms of Strong versus Weak AI.  Strong AI would genuinely simulate human thinking.  Weak AIs are pragmatic solutions that result in human-like behavior but don't simulate human reasoning.  See a good review article here. 

The part of AI that is rapidly being adopted into all sorts of consumer and business applications is based on the big three: image processing, text processing, and speech processing (a more complete list would break out image recognition and auto labeling, facial recognition, text to speech, speech to text, auto translation, sentiment analysis, and emotion analytics in image, video, text, and speech).

The breakthrough advances of the last 12 months have resulted in many of these functionalities reaching 99% accuracy, up from 95% just a few years ago.  Accuracy of 95% or less didn't cut it for users; 99% makes these apps acceptable.

Image, text, and speech processing (plus robotics) are the eyes, ears, hands, and mouth of AI.  Now, like the Scarecrow in The Wizard of Oz: if I only had a brain (AGI).

 

Deep Learning

Essentially all of the advancements in image, speech, and text AI made in the last five years have been the result of a class of algorithms known collectively as Deep Learning.  These are major advances in the general family of Artificial Neural Net algorithms, which can now detect patterns without features or characteristics being defined in advance.  They are hybrid supervised learners in that you must still show the NN thousands of pictures of cats, but without the requirement to predefine characteristics such as fur, four legs, or a tail.
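
To make that concrete, here is a minimal sketch (assuming PyTorch; the network, image size, and data batch are purely illustrative) of a small convolutional net whose layers learn visual features directly from pixels during supervised training, with no hand-coded characteristics anywhere:

```python
# Illustrative only: a tiny CNN trained on labeled images.
# No "fur", "tail", or "four legs" features are ever defined --
# the convolutional layers discover useful features during training.
import torch
import torch.nn as nn

class TinyCatNet(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):                      # x: batch of 3x64x64 RGB images
        return self.classifier(self.features(x).flatten(1))

model = TinyCatNet()
loss_fn = nn.CrossEntropyLoss()                # still supervised: labels required
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One illustrative training step on a fake batch of 8 "cat / not cat" images
images = torch.randn(8, 3, 64, 64)
labels = torch.randint(0, 2, (8,))
optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
```

The point is the combination: the labels are still required (supervised), but the features themselves are learned.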

The breakthrough technology here has two parts.  First, advances in the types and complexity of Neural Net algorithms.  Second, hardware: the ability to run these over distributed parallel systems using superfast GPU and FPGA chips on increasingly large networks of processors.

 

AI versus Deep Learning

It is common today to equate AI and Deep Learning, but this would be inaccurate on two counts.

1. AI is broader than just Deep Learning and text, image, and speech processing.  In fact, AI has been around in many forms for much longer than Deep Learning, albeit in not quite such consumer-friendly forms.

Ten and even 20 years ago, AI existed in the form of what was then most commonly called 'expert systems'.  As the name implied, these systems distilled large quantities of domain-specific knowledge, such as how to diagnose a disease or how to select a particularly complex multi-featured device like an airbag, and allowed non-experts to reach an 'expert-level' conclusion. 

Originally, these decision-tree-like apps were built manually based on the input of large numbers of human experts.  More recently, they can be evolved more or less automatically using multi-objective decision tree algorithms, which are part of legacy data science, not Deep Learning.
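
As a rough illustration (using scikit-learn and the standard iris dataset, not any actual expert-system product), a decision tree learned automatically from data produces exactly the kind of human-readable if/then rules that expert systems once encoded by hand:

```python
# Illustrative only: learn expert-system-style rules from data with a
# classic decision tree -- legacy data science, no Deep Learning involved.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=3).fit(X, y)

# Print the learned if/then rules, much like a hand-built expert system
print(export_text(tree, feature_names=load_iris().feature_names))
```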

2. Deep Learning is broader than AI.  Deep Learning, as an evolved form of neural nets, can be used to solve regular data science problems in the same way that neural net algorithms have always been used.  A good example is Amazon's current major investment in Deep Learning to create better recommenders that enhance shopping.  Any segmentation or regression problem, from buyer behavior to value prediction, is open to the application of Deep Learning.
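
Here is a brief sketch of that idea (using scikit-learn's MLPRegressor as a small stand-in for a deeper network, on synthetic data; everything here is illustrative): the same neural-net machinery applied to an ordinary tabular value-prediction problem rather than to images or speech:

```python
# Illustrative only: a neural net used for plain tabular regression,
# the kind of problem "regular" data science has always handled.
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic data standing in for, say, buyer behavior or value prediction
X, y = make_regression(n_samples=2000, n_features=20, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0),
)
model.fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))
```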

 

Machine Learning

The least precise phrase in this triad is 'machine learning'.  There are two distinct schools of thought and usage around this term, and while both are correct, their conflicting usage is confusing, to say the least.

In the new usage, machine learning is defined to mean only algorithms used in unsupervised pattern identification.  This is intended to be essentially the same thing as Deep Learning, leaving aside the inconvenient fact that Deep Learning still needs to be shown examples of correct output, making it not truly 'unsupervised'.

In the more traditional usage, still widely used (including by me), machine learning means the application of any computer-enabled algorithm to a data set to find a pattern in the data.  This encompasses basically all types of data science algorithms: supervised, unsupervised, segmentation, classification, and regression.

It’s clear that the traditional usage would be much broader than Deep Learning but would not encompass manually assembled expert systems in AI.
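
A small sketch of that broader, traditional sense (scikit-learn and the iris dataset, purely for illustration): an unsupervised segmentation and a supervised classification both count as machine learning, and neither involves a neural net:

```python
# Illustrative only: two classic "machine learning" algorithms in the
# traditional, broad sense of the term -- no Deep Learning required.
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)

# Unsupervised: segment the data with no labels at all
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Supervised: classify using the known labels
classifier = LogisticRegression(max_iter=1000).fit(X, y)

print("First ten segments:", segments[:10])
print("Training accuracy:", round(classifier.score(X, y), 3))
```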

Next time you’re reading about AI, Deep Learning, and Machine Learning, make your understanding more precise and more valuable by applying these explanations.

 

 

About the author:  Bill Vorhies is Editorial Director for Data Science Central and has practiced as a data scientist and commercial predictive modeler since 2001.  He can be reached at:

[email protected]
