The Paths to AI
There are three paths to “artificial intelligence” – algorithmic, heuristic, and inferential. Algorithmic learning is essentially programming – telling someone (or a computer) exactly how to accomplish a particular task. It requires an expert (a programmer), but once the algorithm is written, it generally runs very fast and needs comparatively little data.
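The algorithmic path can be illustrated with a short hand-written rule. In this sketch (the rule and examples are invented for illustration), an expert encodes the decision logic directly in code, so no training data is needed and classification is a single fast function call.

```python
# Algorithmic path: an expert encodes the decision logic by hand.
# No training data is required; the "knowledge" lives in the code itself.
# (The rule and examples below are invented purely for illustration.)

def classify_triangle(a: float, b: float, c: float) -> str:
    """Classify a triangle by its side lengths using explicit rules."""
    if a + b <= c or a + c <= b or b + c <= a:
        return "not a triangle"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

print(classify_triangle(3, 3, 3))  # equilateral
print(classify_triangle(3, 4, 5))  # scalene
```

The trade-off described above is visible here: the rules are fast and need no data, but they only work because a human expert already knew them.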
The heuristic path involves teaching someone (or something) how to learn by examining data to create models. It encompasses the use of statistics (data science) to estimate the likelihood of specific events and build decision trees, and the use of machine learning to perform better classifications. Heuristic approaches tend to be computation- and data-heavy when creating the models, but can significantly cut down classification times once the models exist.
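A minimal sketch of the heuristic path, using an invented toy dataset: instead of hand-coding a rule, we fit a one-split "decision stump" (the simplest decision tree) by searching the data for the threshold that misclassifies the fewest training examples.

```python
# Heuristic path: learn a simple model (a one-split "decision stump") from
# labeled data, rather than hand-coding the rule. The dataset is invented.

def fit_stump(points):
    """Find the threshold on x that best separates labels 0 and 1."""
    best_t, best_err = None, float("inf")
    for t in sorted(x for x, _ in points):
        # Predict 1 when x >= t; count mistakes on the training data.
        err = sum((x >= t) != bool(y) for x, y in points)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

# Toy training data: (feature value, label)
data = [(1.0, 0), (1.5, 0), (2.0, 0), (3.5, 1), (4.0, 1), (4.5, 1)]
threshold = fit_stump(data)
predict = lambda x: int(x >= threshold)
print(threshold)       # 3.5
print(predict(0.5))    # 0
print(predict(5.0))    # 1
```

Note the trade the paragraph describes: the fit step scans the whole dataset (computation up front), but the resulting model classifies any new point with a single comparison.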
Inferencing isn’t talked about as much, but it is in many ways just as important. It involves the use of graphs to store, identify, and query logical patterns. Inferencing is useful for least-distance problems when the number of data points is comparatively small, and it provides ways of encoding and classifying abstract properties and relationships that are difficult to capture in more heuristic approaches. Moreover, inferencing indexes relationships so that they can be retrieved quickly with comparatively little complexity.
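The least-distance queries mentioned above can be sketched with a small graph and Dijkstra's algorithm. This is an illustrative example (the graph itself is invented): knowledge is stored as explicit relationships, and a query walks those relationships directly rather than fitting a model.

```python
import heapq

# Inferential path: store knowledge as a graph of explicit relationships,
# then query it. Dijkstra's algorithm answers a least-distance question.
# (The graph below is an invented example.)

def shortest_distance(graph, start, goal):
    """Return the minimum total edge weight from start to goal."""
    dist = {start: 0}
    queue = [(0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry; a shorter path was already found
        for neighbor, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(queue, (nd, neighbor))
    return float("inf")  # goal unreachable from start

# Adjacency list: node -> [(neighbor, edge weight), ...]
graph = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 2), ("D", 5)],
    "C": [("D", 1)],
}
print(shortest_distance(graph, "A", "D"))  # 4  (A -> B -> C -> D)
```

The setup cost the next paragraph mentions shows up here too: the graph must be built and maintained before any query can run, but once it exists, each query is fast and fully explainable.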
It is becoming increasingly evident that none of these approaches, taken by itself, is sufficient to develop a good artificial intelligence system. Algorithms provide performance but at the cost of flexibility. Heuristics give flexibility but at the cost of both explainability (complexity) and computation. Inferencing provides performance and speed in execution, at the cost of significant setup. The best results come when all three are used together: heuristics can simplify the initial classification domain while relying upon inferencing to keep complexity down and provide a connection to human knowledge systems. Algorithms can both optimize the heuristic systems and reduce the overall computational load. Inferencing can capture this knowledge and store it for faster retrieval. Not surprisingly, there are indications that our own brains utilize similar principles, making it possible for us to do incredible things with our three-pound human organic computers.
This is why we run Data Science Central, and why we are expanding its focus to consider the width and breadth of digital transformation in our society. Data Science Central is your community. It is a chance to learn from other practitioners, and a chance to communicate what you know to the data science community overall. I encourage you to submit original articles and to make your name known to the people who will be hiring in the coming year. As always, let us know what you think.
In medias res,
Data Science Central Editorial Calendar
DSC is looking for editorial content specifically in these areas for May, with these topics likely having higher priority than other incoming articles.
DSC Featured Articles
Picture of the Week
To make sure you keep getting these emails, please add [email protected] to your email client’s address book.
Join Data Science Central | Comprehensive Repository of Data Science and ML Resources
Videos | Search DSC | Post a Blog | Ask a Question
Follow us on Twitter: @DataScienceCtrl | @AnalyticBridge
This email, and all related content, is published by Data Science Central, a division of TechTarget, Inc.
275 Grove Street, Newton, Massachusetts, 02466 US
Copyright 2021 TechTarget, Inc. All rights reserved. Designated trademarks, brands, logos and service marks are the property of their respective owners.