Summary: Since BERT NLP models were first introduced by Google in 2018, they have become the go-to choice for many NLP tasks. New evidence, however, shows that LSTM models may outperform BERT by a wide margin on some problems, meaning you may need to evaluate both approaches for your NLP project.
Added by William Vorhies on September 21, 2020 at 12:00pm
Summary: Objectively identifying hateful or abusive speech on social media platforms would allow those platforms to better control it. However, to be objective and free of bias, that identification would have to be independent of the author, especially where elected officials are involved.
Added by William Vorhies on June 8, 2020 at 2:25pm
Summary: Contextually intelligent, NLP-based interactive assistants are one of the next big things for AI/ML. The underlying tech is already here, borrowed from recommendation engines. The need to be more efficient and to become AI-augmented in our decision making is here now. Getting the contextual awareness right is the hard part.
Added by William Vorhies on October 28, 2019 at 9:43am
Summary: 99% of our applications of NLP have to do with chatbots or translation. This is a very interesting story about expanding the bounds of NLP and feature creation to predict bestselling novels. The authors created over 20,000 NLP features, about 2,700 of which proved predictive, achieving a 90% accuracy rate in predicting NYT bestsellers.
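The authors' actual feature set isn't public, but the general approach — deriving many stylometric features from raw text and keeping the ones that prove predictive — can be sketched. The toy features below are illustrative stand-ins, not the features from the article:

```python
# Hypothetical sketch of stylometric NLP feature creation, in the spirit of
# the bestseller-prediction work described above. These toy features are
# illustrative stand-ins, not the authors' actual 20,000+ features.
def text_features(text):
    words = text.split()
    sentences = [s for s in text.split(".") if s.strip()]
    n_words = max(len(words), 1)
    n_sents = max(len(sentences), 1)
    return {
        "word_count": len(words),
        "avg_word_len": sum(len(w) for w in words) / n_words,
        "avg_sentence_len": len(words) / n_sents,       # words per sentence
        "type_token_ratio": len({w.lower() for w in words}) / n_words,
    }

feats = text_features("It was a dark night. The wind howled. She ran.")
print(feats["word_count"])  # 10
```

In the full pipeline, vectors like these would be fed to a classifier trained on labeled bestsellers vs. non-bestsellers, and non-predictive features pruned.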
Summary: Recurrent Neural Nets (RNNs) are at the core of the most common AI applications in use today, but we are rapidly recognizing broad time series problem types where they don't fit well. Several alternatives are already in use, and one that's just been introduced, ODE-Net, is a radical departure from our way of thinking about the solution.
Added by William Vorhies on March 11, 2019 at 7:30am
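The core idea behind ODE-Net is easy to see in miniature: a ResNet's residual update is one Euler step of an ordinary differential equation, and ODE-Net replaces the stack of discrete steps with a continuous solve. A minimal sketch, where the linear dynamics `f` stand in for a learned network:

```python
import math

# Sketch of the idea behind ODE-Net: a residual block h_{t+1} = h_t + f(h_t)
# is one Euler step of the ODE dh/dt = f(h). ODE-Net makes the step
# continuous and hands the integration to an ODE solver.
def f(h):
    return -0.5 * h  # toy linear dynamics standing in for a learned network

def resnet_like(h0, steps, dt=1.0):
    """A stack of residual blocks = fixed-step Euler integration."""
    h = h0
    for _ in range(steps):
        h = h + dt * f(h)  # residual update == one Euler step
    return h

# As dt shrinks, the output approaches the exact ODE solution h0 * exp(-0.5 * t).
print(resnet_like(1.0, steps=100, dt=0.02))  # close to math.exp(-1.0)
```

This is why depth becomes a solver setting rather than an architecture choice: the "number of layers" is just how finely the solver steps through time.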
Summary: Our starting assumption that sequence problems (language, speech, and others) are the natural domain of RNNs is being challenged. Temporal Convolutional Nets (TCNs), which are our workhorse CNNs with a few new features, are outperforming RNNs on major applications today. It looks like RNNs may well be history.
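The "few new features" are mainly causal, dilated 1-D convolutions: each output depends only on current and past inputs, and dilation spaces the filter taps apart to widen the receptive field without extra parameters. A minimal sketch with a toy kernel and input (not code from the article):

```python
# Minimal sketch of the causal, dilated 1-D convolution at the heart of a
# TCN: the output at time t depends only on inputs at times <= t, and
# dilation spaces the taps to widen the receptive field.
def causal_dilated_conv(x, kernel, dilation=1):
    pad = (len(kernel) - 1) * dilation
    xp = [0.0] * pad + list(x)  # left-pad so we never look into the future
    out = []
    for t in range(len(x)):
        # taps at t, t - dilation, t - 2*dilation, ... (padded coordinates)
        s = sum(kernel[k] * xp[t + pad - k * dilation] for k in range(len(kernel)))
        out.append(s)
    return out

# With kernel [1, 1], each output is x[t] + x[t-1]: causal by construction.
print(causal_dilated_conv([0, 1, 2, 3, 4, 5], [1.0, 1.0]))
# [0.0, 1.0, 3.0, 5.0, 7.0, 9.0]
```

Stacking such layers with dilations 1, 2, 4, 8, … gives an exponentially growing history window while every timestep is computed in parallel, which is the practical edge over RNNs.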
Summary: This is the second in our chatbot series. Here we explore Natural Language Understanding (NLU), the front end of all chatbots. We'll discuss the programming necessary to build rules-based chatbots and then look at the use of the deep learning algorithms that are the basis for AI-enabled chatbots.
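A rules-based chatbot front end can be sketched as a list of pattern-to-intent rules plus canned responses; the patterns and replies below are purely illustrative, not taken from the article:

```python
import re

# Minimal sketch of a rules-based chatbot front end: regex patterns map
# user utterances to intents, and each intent maps to a canned response.
# The rules and responses here are illustrative examples only.
RULES = [
    (re.compile(r"\b(hi|hello|hey)\b", re.I), "greet"),
    (re.compile(r"\b(price|cost|how much)\b", re.I), "pricing"),
    (re.compile(r"\b(bye|goodbye)\b", re.I), "farewell"),
]
RESPONSES = {
    "greet": "Hello! How can I help you today?",
    "pricing": "Our plans start at $10/month.",
    "farewell": "Goodbye!",
    None: "Sorry, I didn't understand that.",
}

def classify(utterance):
    """Return the first matching intent, or None (the fallback)."""
    for pattern, intent in RULES:
        if pattern.search(utterance):
            return intent
    return None

def reply(utterance):
    return RESPONSES[classify(utterance)]

print(reply("Hey there"))        # greeting path
print(reply("How much is it?"))  # pricing path
```

A deep-learning NLU replaces the hand-written `RULES` list with a trained intent classifier, but the surrounding intent-to-response structure often stays the same.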
Summary: This is the first in a series about chatbots. In this first installment we cover the basics, including their brief technological history, uses, basic design choices, and where deep learning comes into play. In subsequent articles we'll describe in more detail how they are actually programmed, along with best-practice dos and don'ts.
Summary: Gartner says that predictive analytics is a mature technology, yet only one company in eight is currently utilizing this ability to predict the future of sales, finance, production, and virtually every other area of the…
Added by William Vorhies on August 13, 2014 at 10:54am