Summary: As we have become ever more enamored with DNNs, whose accuracy and utility have been paced only by their complexity, we will need to answer the question of whether we will ever really be able to explain what goes on inside.
Added by William Vorhies on May 11, 2020 at 2:41pm — No Comments
Mobile health (mHealth) is considered one of the most transformative drivers of health informatics for the delivery of ubiquitous medical applications. Machine learning has proven to be a powerful tool for classifying medical images to detect various diseases. However, supervised machine learning requires a large amount of data to train the model, and storing and processing that data poses considerable system-requirement challenges for mobile applications. Therefore, many studies focus on…
Added by AI on August 16, 2019 at 6:00am — No Comments
Summary: The ability to train large-scale CNNs directly on your cell phone, without sending the data round trip to the cloud, is the key to next-gen AI applications like real-time computer vision and safe self-driving cars. The problem is that our current GPU AI chips won't get us there. But neuromorphic chips look like they will.
…
Added by William Vorhies on June 4, 2019 at 9:00am — No Comments
Summary: Recurrent Neural Nets (RNNs) are at the core of the most common AI applications in use today, but we are rapidly recognizing broad time series problem types where they don't fit well. Several alternatives are already in use, and one that's just been introduced, ODE net, is a radical departure from our way of thinking about the solution.
…
Added by William Vorhies on March 11, 2019 at 7:30am — No Comments
Summary: In the literal blink of an eye, image-based AI has gone from high-cost, high-risk projects to quick and reasonably reliable ones. C-level execs looking for AI techniques to exploit need to revisit their assumptions and move these up the list. Here's what's changed.
For data scientists these are miraculous times. We tend to think of miracles as something that occurs instantaneously, but in our world that's not quite so. Still, the rate…
Added by William Vorhies on March 4, 2019 at 9:41am — No Comments
Summary: Not having enough labeled training data is a huge barrier to realizing the equally large benefits that deep learning applications could deliver. Here are five strategies for getting around the data problem, including the latest in One Shot Learning.
For at least the last two years we’ve been in an…
Added by William Vorhies on January 28, 2019 at 9:56am — 1 Comment
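One widely used tactic for stretching a small labeled image set is data augmentation; it is offered here only as an illustration of the kind of strategy the post above surveys, not necessarily one of its five. A minimal sketch, assuming TensorFlow/Keras, with purely illustrative layer choices and parameters:

```python
# Minimal sketch (assumed TensorFlow/Keras): random transforms that let the
# same labeled images yield slightly different training examples each epoch.
import numpy as np
from tensorflow.keras import layers, models

augment = models.Sequential([
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.1),   # rotate by up to +/- 10% of a full turn
    layers.RandomZoom(0.1),
])

images = np.random.rand(4, 64, 64, 3).astype("float32")  # toy batch
augmented = augment(images, training=True)  # random transforms apply in training mode
print(augmented.shape)  # (4, 64, 64, 3)
```

Placed in front of a classifier, this effectively enlarges a small labeled set without collecting new data.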
Summary: This may be the golden age of deep learning, but a lot can be learned by looking at where deep neural nets aren't working yet. This can be a guide to calming the hype. It can also be a roadmap to future opportunities once these barriers are behind us.
We are living in the golden age of deep…
Added by William Vorhies on November 18, 2018 at 11:14am — No Comments
Summary: There are some interesting use cases where combining CNNs and RNN/LSTMs seems to make sense, and a number of researchers are pursuing this. However, the latest trends in CNNs may make this approach obsolete.
There are things that just don’t seem to go together. Take oil and water for instance. Both valuable, but try putting…
Added by William Vorhies on August 14, 2018 at 7:34am — 3 Comments
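For readers wondering what combining CNNs and RNN/LSTMs looks like in practice, here is a minimal sketch, assuming TensorFlow/Keras: a small CNN is applied to every frame of a short clip via TimeDistributed, and an LSTM reads the resulting sequence of per-frame features. All shapes and layer sizes are arbitrary placeholders, not taken from the post.

```python
# Illustrative CNN-into-LSTM sketch (assumed TensorFlow/Keras).
from tensorflow.keras import layers, models

num_frames, height, width, channels = 16, 64, 64, 3  # placeholder clip shape

model = models.Sequential([
    layers.Input(shape=(num_frames, height, width, channels)),
    # The CNN runs independently on each frame in the sequence
    layers.TimeDistributed(layers.Conv2D(16, 3, activation="relu")),
    layers.TimeDistributed(layers.MaxPooling2D()),
    layers.TimeDistributed(layers.Flatten()),
    # The LSTM consumes the per-frame feature vectors in temporal order
    layers.LSTM(32),
    layers.Dense(1, activation="sigmoid"),  # e.g., one binary label per clip
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```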
Summary: Deep Learning, based on deep neural nets, is launching a thousand ventures but leaving tens of thousands behind. Transfer Learning (TL), a method of reusing previously trained deep neural nets, promises to make these applications available to everyone, even those with very little labeled data.
Added by William Vorhies on April 17, 2018 at 12:25pm — No Comments
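The reuse idea behind Transfer Learning fits in a few lines. A minimal sketch, assuming TensorFlow/Keras and an ImageNet-pretrained MobileNetV2 as the donor network: the pretrained base is frozen and only a small new head is trained on your own limited labels. This illustrates the general technique, not the post's specific recipe; num_classes and the input size are placeholders.

```python
# Transfer-learning sketch (assumed TensorFlow/Keras + ImageNet weights).
from tensorflow.keras import layers, models
from tensorflow.keras.applications import MobileNetV2

num_classes = 5  # placeholder for your own task

base = MobileNetV2(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # keep the previously learned features fixed

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(num_classes, activation="softmax"),  # new task-specific head
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=5)  # train on your small labeled set
```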
Summary: There are several things holding back our use of deep learning methods and chief among them is that they are complicated and hard. Now there are three platforms that offer Automated Deep Learning (ADL) so simple that almost anyone can do it.
Added by William Vorhies on April 10, 2018 at 8:18am — 1 Comment
Summary: As a profession we do a pretty poor job of agreeing on good naming conventions for really important parts of our professional lives. “Machine Learning” is just the most recent case in point. It’s had a perfectly good definition for a very long time, but now the deep learning folks are trying to hijack the term. Come on folks. Let’s make up our minds.
Added by William Vorhies on December 4, 2017 at 3:30pm — 6 Comments
In machine learning, a convolutional neural network (CNN, or ConvNet) is a class of neural networks that has successfully been applied to image recognition and analysis. In this project I've applied this class of models to stock market prediction, combining stock prices with sentiment analysis. The implementation of the network has been made…
Added by Mattia Brusamento on November 18, 2017 at 8:30am — 6 Comments
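The project's actual implementation isn't reproduced here, but a hypothetical version of the idea, a 1D CNN reading a sliding window of daily features where each day contributes a price-based return and a sentiment score, might look like the following sketch in TensorFlow/Keras. The window length, layer sizes, and toy data are placeholders.

```python
# Hypothetical 1D-CNN sketch for price + sentiment windows (assumed Keras).
import numpy as np
from tensorflow.keras import layers, models

window, n_features = 30, 2  # 30 trading days; [daily return, sentiment score]

model = models.Sequential([
    layers.Input(shape=(window, n_features)),
    layers.Conv1D(32, kernel_size=5, activation="relu"),
    layers.MaxPooling1D(),
    layers.Conv1D(64, kernel_size=3, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(1, activation="sigmoid"),  # predicted probability of an up move
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Toy data, only to show the expected input/target shapes
X = np.random.rand(100, window, n_features).astype("float32")
y = np.random.randint(0, 2, size=(100,))
model.fit(X, y, epochs=1, verbose=0)
```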
Summary: There's a three-way technology race to bring faster, easier, cheaper, and smarter AI. High Performance Computing is available today, but so are new commercial versions of actual Quantum computers and Neuromorphic Spiking Neural Nets. These two new entrants are going to revolutionize AI and deep learning starting now.
Added by William Vorhies on November 14, 2017 at 8:07am — 4 Comments
Summary: We are approaching a time when we need to be concerned that our AI robots may indeed harm us. The rapid increase in the conversation about what ethics should apply to AI is appropriate but needs to be focused on the real threats, not just the wild imaginings of the popular press. Here are some data points to help you in thinking about this, what our concerns should be today, and what our concerns should be in the future.
…
Added by William Vorhies on October 24, 2017 at 9:26am — No Comments
Summary: We are swept up by the rapid advances in AI and deep learning, and tend to laugh off AI’s failures as good fodder for YouTube videos. But those failures are starting to add up. It’s time to take a hard look at the weaknesses in AI and where that’s leading us.
…
Added by William Vorhies on April 18, 2017 at 8:04am — No Comments
Summary: The data science press is so dominated by articles on AI and Deep Learning that it has led some folks to wonder whether Deep Learning has made traditional machine learning irrelevant. Here we explore both sides of that argument.
On Quora the other day I saw a question from an aspiring data scientist that asked – since all the…
Added by William Vorhies on December 13, 2016 at 9:24am — 4 Comments
Summary: Convolutional Neural Nets are getting all the press but it’s Recurrent Neural Nets that are the real workhorse of this generation of AI.
We’ve paid a lot of attention lately to Convolutional Neural Nets…
Added by William Vorhies on October 24, 2016 at 3:53pm — No Comments
Summary: Here’s some background on how 3rd generation Spiking Neural Nets are progressing and news about a first commercial rollout.
Recently we wrote about the development of AI and neural nets beyond the second-generation Convolutional and Recurrent Neural Nets (CNNs/RNNs), which have come on so strong and dominate the…
Added by William Vorhies on October 18, 2016 at 8:04am — 1 Comment
Summary: What comes next after Deep Learning? How do we get to Artificial General Intelligence? Adversarial Machine Learning is an emerging space that points in that direction and shows that AGI is closer than we think.
Deep Learning and Convolutional Neural Nets (CNNs) have given us dramatic improvements in image, speech, and text recognition over the last two years. They suffer from the flaw, however, that…
Added by William Vorhies on September 27, 2016 at 10:25am — 3 Comments