Summary: In a comprehensive study of 18 recently presented DNN advancements in top-N recommenders, only 7 presented sufficient data to allow reproduction. Worse, of the 7 that could be reproduced, none showed an actual improvement over simple linear and KNN techniques once those baselines were properly optimized.
Added by William Vorhies on May 19, 2020 at 12:53pm
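To make concrete what a "simple KNN technique" baseline looks like in top-N recommendation, here is a minimal sketch of an item-based KNN recommender using cosine similarity. The interaction matrix and the function name `item_knn_topn` are invented for illustration; they are not from the study itself.

```python
import numpy as np

# Toy user-item interaction matrix (rows: users, cols: items).
# All data here is illustrative, not from the study.
R = np.array([
    [1, 1, 0, 0, 1],
    [1, 0, 1, 0, 0],
    [0, 1, 1, 1, 0],
    [1, 1, 0, 1, 0],
], dtype=float)

def item_knn_topn(R, user, n=2):
    """Score unseen items by cosine item-item similarity to the user's history."""
    norms = np.linalg.norm(R, axis=0)
    norms[norms == 0] = 1.0                    # guard against empty item columns
    sim = (R.T @ R) / np.outer(norms, norms)   # item-item cosine similarity
    scores = sim @ R[user]                     # aggregate similarity to seen items
    scores[R[user] > 0] = -np.inf              # mask items already interacted with
    return np.argsort(-scores)[:n]
```

Baselines of roughly this complexity, with tuned neighborhood sizes and weighting, are what the reproduced DNN recommenders failed to beat.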
Summary: As we have become ever more enamored with DNNs, whose accuracy and utility have been paced only by their complexity, we will need to answer the question of whether we will ever really be able to explain what goes on inside.
Added by William Vorhies on May 11, 2020 at 2:41pm
Summary: Booz Allen just launched a one-stop shop for all manner of pretested DNN models. They’re even guaranteeing the price. This makes buying just like picking accounting, CRM, or HRIS software. Equally important, it’s a genius example of platform strategy to lock in customers and lock out competitors.
Summary: If you are guiding your company’s digital journey, to what extent should you be advising them to adopt deep learning AI methods versus traditional and mature machine learning techniques?
Summary: Recurrent Neural Nets (RNNs) are at the core of the most common AI applications in use today, but we are rapidly recognizing broad time series problem types where they don’t fit well. Several alternatives are already in use, and one that’s just been introduced, ODE net, is a radical departure from our way of thinking about the solution.
Added by William Vorhies on March 11, 2019 at 7:30am
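For context on what makes RNNs both powerful and awkward for some time series problems, here is a minimal sketch of the vanilla RNN recurrence, h_t = tanh(Wx·x_t + Wh·h_{t-1} + b). The dimensions and weight scales are arbitrary choices for illustration only; this processes the sequence strictly step by step, which is exactly the discrete, fixed-interval assumption that alternatives like ODE net relax.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen only to show the recurrence.
hidden, inp = 4, 3
Wx = rng.normal(scale=0.1, size=(hidden, inp))   # input-to-hidden weights
Wh = rng.normal(scale=0.1, size=(hidden, hidden))  # hidden-to-hidden weights
b = np.zeros(hidden)

def rnn_forward(xs, h0=None):
    """Step a vanilla RNN over a sequence: h_t = tanh(Wx x_t + Wh h_{t-1} + b)."""
    h = np.zeros(hidden) if h0 is None else h0
    states = []
    for x in xs:
        h = np.tanh(Wx @ x + Wh @ h + b)  # hidden state carries the history
        states.append(h)
    return np.stack(states)
```

Because each step consumes exactly one observation, irregularly sampled or continuous-time series are an uneasy fit for this loop.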
Summary: How about we develop an ML platform that any domain expert can use to build a deep learning model without help from specialist data scientists, in a fraction of the time and cost? The good news is the folks at the Stanford DAWN project are hard at work on just such a platform, and the initial results are extraordinary.
Added by William Vorhies on September 4, 2018 at 8:02am
Summary: Deep Learning, based on deep neural nets, is launching a thousand ventures but leaving tens of thousands behind. Transfer Learning (TL), a method of reusing previously trained deep neural nets, promises to make these applications available to everyone, even those with very little labeled data.
Added by William Vorhies on April 17, 2018 at 12:25pm
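The core idea of transfer learning, reusing a previously trained network and training only a small new head on a little labeled data, can be sketched as follows. Everything here is a stand-in: the "pretrained" extractor is just a fixed random projection, and the data and names (`features`, `train_head`) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a frozen, pretrained feature extractor (in practice, a deep
# net trained on a large dataset); here just a fixed random ReLU projection.
W_frozen = rng.normal(scale=0.25, size=(8, 16))

def features(x):
    return np.maximum(0, x @ W_frozen.T)  # frozen forward pass (never updated)

# Tiny labeled set for the new task: transfer learning trains only the head.
X = rng.normal(size=(20, 16))
y = (X[:, 0] > 0).astype(float)

def train_head(X, y, steps=200, lr=0.5):
    """Fit a logistic-regression head on frozen features by gradient descent."""
    F = features(X)
    w = np.zeros(F.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(F @ w)))       # predicted probabilities
        w -= lr * F.T @ (p - y) / len(y)         # gradient step on the head only
    return w
```

Because only the small head is trained, far fewer labeled examples are needed than training the whole network from scratch would require, which is the promise the summary above describes.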