Summary: Workforce forecasting and scheduling applications are rapidly upgrading their use of AI. Time series forecasting techniques ranging from simple Holt-Winters to complex DNNs and Multiple Temporal Aggregation are available on some but not all platforms. Increasingly, AI differentiates the usefulness of these apps.
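The Holt-Winters method named above can be sketched in a few lines of plain Python. This is a minimal additive (triple exponential smoothing) version; the smoothing constants (alpha, beta, gamma), the 7-day seasonal period, and the toy staffing-demand series are illustrative assumptions, not parameters taken from any particular forecasting platform.

```python
# Minimal additive Holt-Winters (triple exponential smoothing) sketch.
# Smoothing constants and the example data are illustrative only.

def holt_winters_additive(series, m, alpha=0.3, beta=0.1, gamma=0.2, horizon=4):
    """Fit additive Holt-Winters on `series` (length >= 2*m, seasonal
    period m) and forecast `horizon` steps ahead."""
    # Initialize level, trend, and seasonal components from the first two seasons.
    season_avgs = [sum(series[i * m:(i + 1) * m]) / m for i in range(2)]
    level = season_avgs[0]
    trend = (season_avgs[1] - season_avgs[0]) / m
    seasonals = [series[i] - season_avgs[0] for i in range(m)]

    for t, y in enumerate(series):
        s = seasonals[t % m]
        last_level = level
        level = alpha * (y - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonals[t % m] = gamma * (y - level) + (1 - gamma) * s

    # Out-of-sample forecasts reuse the last fitted seasonal pattern.
    n = len(series)
    return [level + (h + 1) * trend + seasonals[(n + h) % m]
            for h in range(horizon)]

# Hypothetical example: six weeks of daily staffing demand with a slow
# upward trend and a strong 7-day weekly pattern.
demand = [20 + 0.5 * week + [0, 1, 2, 8, 9, 12, 4][d]
          for week in range(6) for d in range(7)]
forecast = holt_winters_additive(demand, m=7, horizon=7)
```

The forecast for the coming week tracks both the trend and the weekly shape, which is exactly what makes Holt-Winters a useful baseline before reaching for DNN-based methods.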
Added by William Vorhies on January 28, 2020 at 2:15pm
Summary: If you are guiding your company’s digital journey, to what extent should you be advising them to adopt deep learning AI methods versus traditional, mature machine learning techniques?
Summary: In the literal blink of an eye, image-based AI has gone from high-cost, high-risk projects to quick and reasonably reliable. C-level execs looking for AI techniques to exploit need to revisit their assumptions and move these up the list. Here’s what’s changed.
For data scientists these are miraculous times. We tend to think of miracles as something that occurs instantaneously but in our world that’s not quite so. Still the rate…
Added by William Vorhies on March 4, 2019 at 9:41am
Summary: A shortage of labeled training data is a huge barrier to capturing the equally large benefits that could be had from deep learning applications. Here are five strategies for getting around the data problem, including the latest in One Shot Learning.
Summary: This may be the golden age of deep learning but a lot can be learned by looking at where deep neural nets aren’t working yet. This can be a guide to calming the hype. It can also be a roadmap to future opportunities once these barriers are behind us.
Summary: How about we develop an ML platform that any domain expert can use to build a deep learning model without help from specialist data scientists, in a fraction of the time and cost? The good news is the folks at the Stanford DAWN project are hard at work on just such a platform and the initial results are extraordinary.
Added by William Vorhies on September 4, 2018 at 8:02am
Summary: There are some interesting use cases where combining CNNs and RNN/LSTMs seems to make sense, and a number of researchers are pursuing this. However, the latest trends in CNNs may make this obsolete.
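The CNN-into-RNN/LSTM pattern described above can be sketched as a forward pass in NumPy: a 1D convolution extracts local features from a sequence, and a single LSTM cell then consumes the resulting feature sequence and summarizes it into one hidden-state vector. All shapes, weights, and names here are illustrative assumptions for exposition, not the architecture of any specific paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels):
    """x: (T, C_in); kernels: (K, C_in, C_out). 'Valid' 1D convolution + ReLU."""
    K = kernels.shape[0]
    T_out = x.shape[0] - K + 1
    out = np.stack([np.tensordot(x[t:t + K], kernels, axes=([0, 1], [0, 1]))
                    for t in range(T_out)])
    return np.maximum(out, 0.0)  # (T_out, C_out)

def lstm_last_state(seq, Wx, Wh, b):
    """Run a single LSTM cell over seq (T, D); return the final hidden state."""
    H = Wh.shape[0]
    h, c = np.zeros(H), np.zeros(H)
    for x_t in seq:
        z = x_t @ Wx + h @ Wh + b              # all four gates at once, (4H,)
        i, f, g, o = np.split(z, 4)
        sig = lambda v: 1.0 / (1.0 + np.exp(-v))
        i, f, o = sig(i), sig(f), sig(o)
        c = f * c + i * np.tanh(g)             # cell state update
        h = o * np.tanh(c)                     # hidden state update
    return h

# Toy input: a length-20 sequence with 3 channels (e.g. per-frame features).
x = rng.normal(size=(20, 3))
kernels = rng.normal(scale=0.1, size=(5, 3, 8))   # CNN front end
H = 16
Wx = rng.normal(scale=0.1, size=(8, 4 * H))
Wh = rng.normal(scale=0.1, size=(H, 4 * H))
b = np.zeros(4 * H)

features = conv1d(x, kernels)                     # (16, 8) feature sequence
embedding = lstm_last_state(features, Wx, Wh, b)  # (16,) summary vector
```

The design point is the division of labor: the convolution handles local spatial/temporal pattern detection cheaply, while the LSTM handles long-range ordering — which is also why newer all-convolutional sequence models, as the summary notes, can make the recurrent half unnecessary.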
Summary: Quantum computing is already being used in deep learning and promises dramatic reductions in processing time and resource utilization to train even the most complex models. Here are a few things you need to know.
Added by William Vorhies on June 13, 2017 at 8:00am