Summary: Since Google introduced BERT NLP models in 2018, they have become the go-to choice for language tasks. New evidence, however, shows that LSTM models may widely outperform BERT, meaning you may need to evaluate both approaches for your NLP project.
Added by William Vorhies on September 21, 2020 at 12:28pm
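To make the "evaluate both" advice concrete, here is a minimal sketch of an LSTM baseline you could score against a fine-tuned BERT on the same held-out split. The layer sizes, vocabulary size, and the "bert-base-uncased" checkpoint named in the comments are illustrative assumptions, not the article's setup.

```python
# A BiLSTM text classifier to serve as the baseline side of the comparison.
import torch
import torch.nn as nn

class BiLSTMClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden=256, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True,
                            bidirectional=True)
        self.head = nn.Linear(2 * hidden, num_classes)

    def forward(self, token_ids):
        emb = self.embed(token_ids)      # (batch, seq, embed)
        out, _ = self.lstm(emb)          # (batch, seq, 2*hidden)
        return self.head(out[:, -1, :])  # classify from the last time step

# Dummy batch: 4 sequences of 16 token ids from a 10k-word vocabulary.
x = torch.randint(0, 10_000, (4, 16))
logits = BiLSTMClassifier(vocab_size=10_000)(x)
print(logits.shape)  # torch.Size([4, 2])

# The BERT side would typically be fine-tuned with Hugging Face, e.g.:
# from transformers import AutoModelForSequenceClassification
# bert = AutoModelForSequenceClassification.from_pretrained(
#     "bert-base-uncased", num_labels=2)
# Then evaluate both models on the same held-out split and compare.
```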
Summary: Recurrent Neural Nets (RNNs) are at the core of the most common AI applications in use today, but we are rapidly recognizing broad time series problem types where they don't fit well. Several alternatives are already in use, and one that has just been introduced, ODE net, is a radical departure from our way of thinking about the solution.
Added by William Vorhies on March 11, 2019 at 7:30am
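The ODE net idea (Chen et al., 2018) replaces a stack of discrete layers with a learned continuous dynamics function that gets integrated over time. The sketch below uses fixed-step Euler integration as a simplification of my own; the paper itself uses adaptive solvers (see the authors' torchdiffeq library), and the layer sizes here are illustrative assumptions.

```python
# A learned dynamics function dh/dt = f(h, t), integrated instead of stacked.
import torch
import torch.nn as nn

class ODEFunc(nn.Module):
    def __init__(self, dim=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(),
                                 nn.Linear(dim, dim))

    def forward(self, t, h):
        return self.net(h)

def euler_odeint(func, h0, t0=0.0, t1=1.0, steps=10):
    """Integrate dh/dt = func(t, h) from t0 to t1 with fixed Euler steps."""
    h, dt = h0, (t1 - t0) / steps
    for i in range(steps):
        h = h + dt * func(t0 + i * dt, h)
    return h

h0 = torch.randn(4, 32)           # a batch of hidden states
h1 = euler_odeint(ODEFunc(), h0)  # "depth" becomes integration time
print(h1.shape)                   # torch.Size([4, 32])
```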
Summary: There are some interesting use cases where combining CNNs and RNN/LSTMs seems to make sense, and a number of researchers are pursuing this approach. However, the latest trends in CNNs may make this obsolete.
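One common shape such a combination takes is a CNN encoding each frame of a short video clip, with an LSTM modeling the resulting sequence of frame features. This is a minimal sketch of that pattern; the input shapes and layer sizes are illustrative assumptions, not drawn from the article.

```python
# CNN per-frame encoder feeding an LSTM sequence model.
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    def __init__(self, num_classes=5, feat_dim=64, hidden=128):
        super().__init__()
        # Per-frame CNN encoder, applied to every frame independently.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim))
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, clips):                  # (batch, time, 3, H, W)
        b, t = clips.shape[:2]
        feats = self.cnn(clips.flatten(0, 1))  # (batch*time, feat_dim)
        feats = feats.view(b, t, -1)           # (batch, time, feat_dim)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1])           # classify from last step

x = torch.randn(2, 8, 3, 32, 32)  # 2 clips of 8 frames each
print(CNNLSTM()(x).shape)         # torch.Size([2, 5])
```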
Summary: Our starting assumption that sequence problems (language, speech, and others) are the natural domain of RNNs is being challenged. Temporal Convolutional Nets (TCNs), which are our workhorse CNNs with a few new features, are outperforming RNNs on major applications today. It looks like RNNs may well be history.
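The "few new features" are chiefly causal (left-padded) convolutions with exponentially growing dilation, so each output sees only the past while the receptive field grows quickly with depth. The sketch below shows just that core mechanism; Bai et al.'s full TCN also adds residual connections and weight normalization, omitted here, and the layer sizes are illustrative assumptions.

```python
# Causal dilated 1-D convolutions: the core of a Temporal Convolutional Net.
import torch
import torch.nn as nn

class CausalConv1d(nn.Module):
    def __init__(self, in_ch, out_ch, kernel=3, dilation=1):
        super().__init__()
        self.pad = (kernel - 1) * dilation       # pad the left side only
        self.conv = nn.Conv1d(in_ch, out_ch, kernel, dilation=dilation)

    def forward(self, x):                        # x: (batch, ch, time)
        x = nn.functional.pad(x, (self.pad, 0))  # no peeking at the future
        return self.conv(x)

tcn = nn.Sequential(
    CausalConv1d(1, 16, dilation=1), nn.ReLU(),
    CausalConv1d(16, 16, dilation=2), nn.ReLU(),
    CausalConv1d(16, 16, dilation=4), nn.ReLU())

x = torch.randn(4, 1, 100)  # 4 univariate series, 100 time steps
print(tcn(x).shape)         # torch.Size([4, 16, 100]); length preserved
```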
Summary: Several things are holding back our use of deep learning methods, and chief among them is that they are complicated and hard to use. Now there are three platforms that offer Automated Deep Learning (ADL) so simple that almost anyone can do it.