Summary: Recurrent Neural Nets (RNNs) are at the core of the most common AI applications in use today, but we are rapidly recognizing broad classes of time-series problems where they don't fit well. Several alternatives are already in use, and one that has just been introduced, the ODE net, is a radical departure from our usual way of thinking about the solution.
Added by William Vorhies on March 11, 2019 at 7:30am
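For readers new to the idea: an ODE net replaces a stack of discrete layers with a continuous dynamics function that an ODE solver integrates forward in time. The sketch below is my own illustration, not code from the article; the dynamics function `f` (a single tanh layer) and the fixed-step Euler solver are deliberate simplifications of what real ODE nets use (adaptive solvers such as Dormand-Prince).

```python
import numpy as np

def f(h, t, W):
    # Hypothetical dynamics function: a single tanh layer.
    # A real ODE net would learn W by backpropagating through the solver.
    return np.tanh(h @ W)

def odenet_forward(h0, W, t0=0.0, t1=1.0, steps=10):
    """Integrate dh/dt = f(h, t, W) from t0 to t1 with fixed-step Euler.
    The hidden state evolves continuously instead of jumping layer to layer."""
    h, dt = h0, (t1 - t0) / steps
    for i in range(steps):
        h = h + dt * f(h, t0 + i * dt, W)
    return h
```

The depth of the network effectively becomes a solver setting (`steps`) rather than an architectural choice.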
Summary: There are some interesting use cases where combining CNNs and RNN/LSTMs seems to make sense, and a number of researchers are pursuing this. However, the latest trends in CNNs may make this obsolete.
Summary: Our starting assumption that sequence problems (language, speech, and others) are the natural domain of RNNs is being challenged. Temporal Convolutional Nets (TCNs), which are our workhorse CNNs with a few new features, are outperforming RNNs on major applications today. It looks like RNNs may well be history.
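The "few new features" that turn a standard CNN into a TCN are mainly causal and dilated convolutions: each output can only see the present and past, and dilation grows the receptive field exponentially with depth. A minimal single-channel sketch of that building block (my own illustration, not the article's code):

```python
import numpy as np

def causal_dilated_conv1d(x, w, dilation=1):
    """Causal dilated 1-D convolution: output[t] depends only on
    x[t], x[t-d], x[t-2d], ... -- no leakage from the future.
    x: (T,) signal; w: (K,) kernel, w[0] applied to the oldest tap."""
    T, K = len(x), len(w)
    pad = (K - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])  # left-pad so output stays causal
    y = np.zeros(T)
    for t in range(T):
        for k in range(K):
            y[t] += w[k] * xp[t + k * dilation]
    return y
```

Stacking such layers with dilations 1, 2, 4, 8, ... lets a TCN cover long sequences with far fewer layers than the sequential steps an RNN would need.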
Summary: Deep Learning, based on deep neural nets, is launching a thousand ventures but leaving tens of thousands behind. Transfer Learning (TL), a method of reusing previously trained deep neural nets, promises to make these applications available to everyone, even those with very little labeled data.
Added by William Vorhies on April 17, 2018 at 12:25pm
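In practice, transfer learning usually means freezing a pre-trained feature extractor and training only a small new head on your limited labeled data. A toy numpy sketch of that split (the "pretrained" weights here are random stand-ins purely for illustration; real TL would load weights from a network trained on a large corpus):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for weights learned on a large dataset (illustrative only).
W_pretrained = rng.standard_normal((4, 8))

def features(x):
    # Frozen feature extractor: the reused weights are never updated.
    return np.tanh(x @ W_pretrained)

def fit_head(X, y, lr=0.5, epochs=200):
    """Train only a new logistic-regression head on the frozen features.
    Because so few parameters are learned, very little labeled data is needed."""
    F = features(X)
    w = np.zeros(F.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(F @ w)))   # sigmoid predictions
        w -= lr * F.T @ (p - y) / len(y)     # gradient step on log loss
    return w
```

The design point is that only `fit_head` touches your labeled examples; everything expensive happened once, on someone else's data.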
Summary: There are several things holding back our use of deep learning methods, and chief among them is that they are complicated and hard to use. Now there are three platforms that offer Automated Deep Learning (ADL) so simple that almost anyone can do it.
Summary: As a profession we do a pretty poor job of agreeing on good naming conventions for really important parts of our professional lives. "Machine Learning" is just the most recent case in point. It has had a perfectly good definition for a very long time, but now the deep learning folks are trying to hijack the term. Come on, folks. Let's make up our minds.
Summary: There's a three-way technology race to bring faster, easier, cheaper, and smarter AI. High Performance Computing is available today, but so are new commercial versions of actual quantum computers and neuromorphic spiking neural nets. These two new entrants are going to revolutionize AI and deep learning starting now.
Summary: We are approaching a time when we need to be concerned that our AI robots may indeed harm us. The rapid increase in the conversation about what ethics should apply to AI is appropriate, but it needs to be focused on the real threats, not just the wild imaginings of the popular press. Here are some data points to help you think about this: what our concerns should be today, and what they should be in the future.
Added by William Vorhies on October 24, 2017 at 9:26am
Summary: The data science press is so dominated by articles on AI and Deep Learning that it has led some folks to wonder whether Deep Learning has made traditional machine learning irrelevant. Here we explore both sides of that argument.
Summary: Convolutional Neural Nets are getting all the press, but it's Recurrent Neural Nets that are the real workhorse of this generation of AI.
Added by William Vorhies on October 24, 2016 at 3:53pm
Summary: Here’s some background on how 3rd generation Spiking Neural Nets are progressing and news about a first commercial rollout.
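For context, the basic unit of a spiking neural net is something like the leaky integrate-and-fire neuron: it accumulates input current, leaks charge over time, and emits a discrete spike when its membrane potential crosses a threshold. A toy sketch of that dynamic (the parameters are illustrative, not from any specific neuromorphic chip):

```python
def lif_spikes(inputs, tau=10.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: the membrane potential v leaks
    toward rest, integrates each input, and fires a spike (1) when it
    crosses v_thresh, after which it resets. One time step per input."""
    v, spikes = 0.0, []
    for i in inputs:
        v += (-v / tau) + i      # leak term plus input current (dt = 1)
        if v >= v_thresh:
            spikes.append(1)
            v = v_reset          # reset after spiking
        else:
            spikes.append(0)
    return spikes
```

Unlike the continuous activations in a conventional deep net, information here lives in the timing of spikes, which is what makes these nets such a good match for low-power event-driven hardware.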