Chatbots are a hot topic. Conversational commerce and Artificial Intelligence are at their peak; if you haven’t already experimented with chatbots for your customer service, now’s the time. But how do you create a chatbot yourself? Let’s review the options, from the simplest to the most complicated.
The simplest chatbots don’t bother processing the text typed by the user at all. Instead, the user chooses from a limited set of predefined “phrases” by pressing the appropriate phrase-button. The Landbot.io builder is a good example of this approach. Slash-command Slack bots also fall into this category.
These bots don’t require NLU and the complications it brings, and so they are extremely easy to implement. In many cases, you don’t need to write a single line of code: there are tools that let you create such a bot and design its dialogue flows in a visual editor.
Menu-based chatbots are used successfully by many businesses, but they are nowhere near as engaging as other types of bots. Still, for basic customer service with simple queries and predefined conversation flows, they provide the best value for money. They are also a great option for quickly testing a market or an idea, in the spirit of the lean approach.
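At its core, a menu-based bot is just a small state machine: each state shows a prompt and a fixed set of buttons, and each button press moves the user to another state. Here is a minimal sketch of that idea; the states, prompts, and function names are illustrative and not tied to any specific platform.

```python
# A toy menu-based bot: the user never types free text, they only press
# one of the predefined buttons, so no NLU is needed at all.
MENU = {
    "start": {
        "prompt": "Hi! What can I help you with?",
        "options": {"Order status": "order", "Opening hours": "hours"},
    },
    "order": {
        "prompt": "Please enter your order number using the keypad.",
        "options": {"Back": "start"},
    },
    "hours": {
        "prompt": "We are open 9-18, Mon-Fri.",
        "options": {"Back": "start"},
    },
}

def handle_press(state: str, button: str) -> str:
    """Return the next dialogue state; unknown buttons keep the current state."""
    return MENU[state]["options"].get(button, state)

state = "start"
state = handle_press(state, "Opening hours")
print(MENU[state]["prompt"])  # -> We are open 9-18, Mon-Fri.
```

Because every possible transition is spelled out in the `MENU` table, this is exactly the kind of flow that visual editors like the one mentioned above let you draw without writing code.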
Now, let’s step out of the comfort zone of controlled user behaviour into the realm of unrestricted free text input. And here comes…
The simplest thing you can do when processing free text input is to detect keywords and phrase patterns and react to them. Given the complexity and variability of natural language, this approach is crude. Nevertheless, it is impressive how far it can take you if you need a highly specialised bot for queries with a limited and very specific vocabulary.
Rule-based chatbots rely on simple text-processing tools (read: regular expressions), and so they can be up and running quickly, without any knowledge of modern NLU science. Unfortunately, as the project expands with additional functions and a broader vocabulary, the bot becomes too complex to maintain and develop.
Poor scalability, coupled with the inability to fully embrace the richness of natural language, makes this approach the least recommended. But if you still want to try it, check out ChatScript.
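The pattern-matching approach described above can be sketched in a few lines: an ordered list of regular expressions, each paired with a canned response, where the first matching rule wins. The rules and responses here are made up for illustration.

```python
import re

# A toy rule-based handler: each rule is a regex pattern plus a response.
# Rules are checked in order, and the first match wins -- which is exactly
# why these bots get hard to maintain as the rule list grows.
RULES = [
    (re.compile(r"\b(hi|hello|hey)\b", re.I), "Hello! How can I help?"),
    (re.compile(r"\border\s+status\b", re.I), "Please give me your order number."),
    (re.compile(r"\brefund\b", re.I), "I can help with refunds. What did you buy?"),
]

def reply(message: str) -> str:
    for pattern, response in RULES:
        if pattern.search(message):
            return response
    return "Sorry, I didn't understand that."

print(reply("Can I get a refund?"))  # -> I can help with refunds. What did you buy?
```

Note that a message like “Hey, what’s my order status?” would hit the greeting rule before the order-status rule; resolving such overlaps is precisely the maintenance burden that grows with every new rule.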
Finally, we are entering the realm of true NLU. But this does not need to be too hard since you can create…
Everything is simpler with SaaS, and thanks to the cloud we have a good selection of SaaS NLU providers. Not surprisingly, the most famous names come from FAMGA.
Most of these services provide APIs for intent classification. Here’s how they work. First, you define a model: essentially a set of intents the user may express while chatting with the bot, plus sample utterances for each of them (the more, the better). Then you let the service train on this model. After that, the NLU-as-a-Service can analyse actual user input in real time and tell you which intent it most likely corresponds to. Unlike a rule-based bot, such a service can recognize the intent even if the real user’s message is quite different from what the developers anticipated.
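The define-train-classify workflow above can be illustrated with a toy classifier. Real services use statistical models trained on the sample utterances; the token-overlap scorer below is only a stand-in to make the shape of the API concrete, and all intent names and utterances are invented.

```python
# A toy stand-in for an intent-classification service: the "model" is a
# set of intents, each with sample utterances, and classification picks
# the intent whose samples best overlap the incoming message.
MODEL = {
    "check_balance": ["what is my balance", "how much money do i have"],
    "transfer_money": ["send money to a friend", "make a transfer"],
}

def classify(message: str) -> str:
    """Return the intent whose sample utterances share the most words with the message."""
    words = set(message.lower().split())
    def score(intent: str) -> int:
        return max(len(words & set(u.split())) for u in MODEL[intent])
    return max(MODEL, key=score)

print(classify("how much money is left"))  # -> check_balance
```

Note that “how much money is left” matches no sample utterance exactly, yet still lands on the right intent; a production service does the same thing far more robustly with learned models rather than word overlap.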
The actual development process is somewhat more nuanced, and you’d be better off entrusting it to experts who understand the concepts behind it. Apart from that, everything is straightforward, and the project kick-off is as fast as with the rule-based approach. At the same time, NLU-as-a-Service bots scale better as their functionality grows.
Chatbots of this type offer a middle ground between development cost and the bot’s potential. NLUaaS gives you good results when interpreting user goals from conversational queries, but that may still not be enough. If you need a universal mechanism for entity recognition, support for complex contexts, the ability to nest dialogues and so on, you will want to go with…
If you need more flexibility in how your NLU works, you have to build your own stack from open-source, free, or purchased components. For example, you can choose the Rasa Stack, a drop-in replacement for the intent classification services mentioned above.
“You will need to roll out your own infrastructure, but for this additional cost you also get the ability to fine-tune the NLU pipeline to the exact domain you need and achieve better results. SaaS solutions cannot do that, as they are created to work for a broad range of customers,” says Andriy Skuratov, R&D Manager at ELEKS.
Alternatively, you can drop intent classification altogether and build your bot logic directly around a Named Entity Recognition process using, for example, NLP4J from Emory University or CoreNLP from Stanford, whichever works best for you.
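To make the idea of entity-driven bot logic concrete, here is a minimal sketch: extract entities from the message first, then route the conversation on what was found rather than on a classified intent. A real project would call an NER library such as CoreNLP; the regex-based extractor and entity types below are placeholders for illustration only.

```python
import re

# A toy sketch of bot logic built directly around entity recognition:
# find entities first, then decide what to do based on which ones appear.
ENTITY_PATTERNS = {
    "ORDER_ID": re.compile(r"\b[A-Z]{2}\d{6}\b"),
    "DATE": re.compile(r"\b\d{4}-\d{2}-\d{2}\b"),
}

def extract_entities(message: str) -> dict:
    """Return the first match for each entity type found in the message."""
    return {name: m.group(0)
            for name, pattern in ENTITY_PATTERNS.items()
            if (m := pattern.search(message))}

def handle(message: str) -> str:
    entities = extract_entities(message)
    if "ORDER_ID" in entities:
        return f"Looking up order {entities['ORDER_ID']}..."
    return "Which order do you mean?"

print(handle("Where is my order AB123456?"))  # -> Looking up order AB123456...
```

The routing step stays the same whether the entities come from a regex placeholder like this or from a full NER pipeline; swapping in CoreNLP would only change `extract_entities`.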
The amazing things such bots can do don’t come for free: projects of this type tend to take longer, need more infrastructure, and require specific skills and expertise. In general, they are more expensive than the SaaS approach.
But in return, you get a more flexible system, an NLU that can keep getting better, higher product quality, and thus the ability to create a more advanced and engaging experience for your users.
So, if quality is what you are after, this is the way to go. But if you need to push the quality over the top, or the problem you are solving is very specific, you will need to build your chatbot from scratch.
To do this, you need to involve experts in the NLU field who can create a custom, low-level solution for your problem. Naturally, building a chatbot from scratch is the most expensive and time-consuming option. We explored the capabilities of Natural Language Understanding (NLU) in our article here to help you understand how chatbots work and where they can potentially fail; you can use these insights to build your own NLP engine.
As you can see, when it comes to creating a chatbot, there is a whole range of options, from the simplest and most budget-friendly to the most sophisticated and expensive.