At Dashbot, we’ve processed more than 50 billion chatbot and voice messages. One of the most popular use cases is customer service, across verticals including retail, travel, finance, telecom, consumer products, and more. More and more enterprises are moving from live-agent chat to automated chatbots to reduce costs and provide 24/7 support for customers.
As we’ve mentioned in previous posts, the chatbot data is all unstructured – users can say, write, or send whatever they want. It’s even more pronounced in customer service – users tell you what they want and how they feel.
The repercussions can be greater too. How you respond to customers’ questions is even more critical, as it can be the difference between a happy customer and a lost one.
A common message we see across customer service chatbots is a request to cancel – cancelling a subscription, an order, or perhaps an account.
It can be difficult to think of all the ways a user might ask to cancel their account – especially when they’re upset. For example, users may say, “I want to cancel my account,” “how do I cancel my account,” “cancel my account,” “cancel my f**n acct,” “cncl acct,” or any other variation of cancel, with or without proper spelling and grammar.
Even the best Natural Language Processing (“NLP”) engines need to be trained and may not get all the variations.
At Dashbot, we built a Phrase Clustering algorithm to group similar messages together, whether you’re using an NLP engine or not. To cluster the messages, we use a model trained on a large corpus of sentences to understand semantic similarity. This means messages are considered more similar when the meaning of the message is similar, rather than relying on syntactic similarity (part of speech) or lexical similarity (word frequency similarity). The more often words or phrases appear in the same context in the sentences within the training data, the more similar the words or phrases are.
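Dashbot’s actual clustering model is trained on a large corpus and measures semantic similarity, but the grouping mechanics can be sketched with a simple greedy, threshold-based pass. In the sketch below, the bag-of-words `embed` function is a toy stand-in for a real semantic embedding model (it only captures word overlap, not meaning), and all names and thresholds are illustrative:

```python
from collections import Counter
from math import sqrt

def embed(message):
    """Toy bag-of-words vector. A stand-in for a real semantic
    embedding model trained on a large sentence corpus."""
    return Counter(message.lower().split())

def cosine(u, v):
    """Cosine similarity between two word-count vectors."""
    dot = sum(u[w] * v[w] for w in u)
    norm = sqrt(sum(c * c for c in u.values())) * sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0

def cluster(messages, threshold):
    """Greedy clustering: each message joins the first cluster whose
    representative (its first message) is at least `threshold` similar,
    otherwise it starts a new cluster."""
    clusters = []
    for msg in messages:
        vec = embed(msg)
        for group in clusters:
            if cosine(vec, embed(group[0])) >= threshold:
                group.append(msg)
                break
        else:
            clusters.append([msg])
    return clusters

messages = [
    "cancel my account",
    "i want to cancel my account",
    "please cancel my account",
    "when will my order arrive",
]
print(cluster(messages, threshold=0.6))
```

With a word-overlap stand-in like this, the three cancel variations group together while the unrelated delivery question lands in its own cluster; a real semantic model would additionally pull in variants like “cncl acct” that share no surface words at all.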
If you’re using an NLP engine, Phrase Clustering helps show where the NLP is breaking down. For example, in the “cancel” cases above, sometimes a message maps to the appropriate cancel Intent, sometimes to an unrelated Intent, and other times to the fallback, “I don’t understand” Intent – even though all the messages are effectively the same: the user asking to cancel. Seeing those mismatches grouped together makes it quick to spot where to retrain the NLP and improve response effectiveness.
For some chatbots, knowing a user wants to cancel is an opportunity to try to better understand why the user wants to cancel and perhaps win them over.
At Dashbot, we have a “Live Person Takeover” feature to enable live agents to pause a chatbot and interact with a user. We often see this used in conjunction with an alert around the cancel Intent, to insert an agent to try and help the user if possible.
We took a look at Kaggle’s publicly available customer service data sent to brands’ Twitter accounts and applied our Phrase Clustering to it as well.
A common request we see amongst our customers, and that comes through in the public data, is a request for help.
Clustering the public help messages together, we see messages like “please I need help,” “hi I need help,” “hey I need help,” and many more.
Across our own customer data, we see much more variation – especially in text-based chatbots. Users don’t just write “please,” they write “plz,” “pls,” “plzz,” “plese,” and many other variations of “please help.” Without a clustering algorithm, these may appear as different messages, and your NLP may match the wrong Intent or no Intent at all – sending the user an “I don’t understand” response, which can further frustrate the user.
There is much more to it than just looking for the word “help.” Users may be asking for help, but they could also be thanking you for the help. As seen in the cluster below, users are thanking the brand for its help, rather than asking for help.
On a related note, depending on the “similarity threshold” of the phrase clustering, you can get a more nuanced idea of the user’s Intent. For example, with the “thank you” response at a 90% similarity threshold, we see four different types of thank you clusters: one thanking for the response (“thanks for the response”), one for the info provided (“thanks for the info”), a “thank you” plus a follow-on action (“thank you will do”), and a “thank you” plus a status (“thank you it’s working”).
As the user messages get longer, there is more room for variation which can affect the size and similarity of the clusters.
Have you ever placed an online order and wanted to know when your package will arrive? It’s a common question as we see in the following responses – and you can see not all users are as kind about it.
The above messages are clustered together at a 90% similarity threshold. A higher threshold results in stricter clusters, which may limit the number of similar messages within a group and increase the overall number of clusters.
Lowering the similarity threshold produces broader clusters, although they may be too dissimilar to be of value. For example, looking at a popular e-tailer’s public messages at a 60% similarity threshold results in clusters like the ones below.
Here you can see the users’ messages are a mixture of delivery and return issues. Depending on the number and length of the messages, the clustering similarity threshold may need to be adjusted to get meaningful results.
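The threshold trade-off can be seen even in a toy setup. The sketch below uses a simple lexical cosine similarity as a stand-in for Dashbot’s semantic model, with a greedy grouping pass; the messages and thresholds are illustrative, not the actual 90%/60% figures from our system:

```python
from collections import Counter
from math import sqrt

def similarity(a, b):
    """Toy lexical cosine similarity over word counts -- a stand-in
    for semantic similarity from a trained sentence model."""
    u, v = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(u[w] * v[w] for w in u)
    norm = sqrt(sum(c * c for c in u.values())) * sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0

def cluster(messages, threshold):
    """Greedy clustering: join the first cluster whose representative
    message is at least `threshold` similar, else start a new cluster."""
    clusters = []
    for msg in messages:
        for group in clusters:
            if similarity(msg, group[0]) >= threshold:
                group.append(msg)
                break
        else:
            clusters.append([msg])
    return clusters

messages = [
    "where is my package",
    "my package has not arrived",
    "i want to return my package",
]
# A loose threshold merges delivery and return issues into one broad cluster;
# a stricter threshold keeps them apart.
print(len(cluster(messages, threshold=0.4)))  # broad clusters, fewer of them
print(len(cluster(messages, threshold=0.6)))  # strict clusters, more of them
```

At the loose threshold, the delivery questions and the return request collapse into one mixed cluster; at the stricter threshold, they separate – the same effect described above with the e-tailer’s public messages.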
As more enterprises move from live-agent chat to automated chatbots for customer service, the ability to handle the unstructured nature of the messages becomes more critical. Applying Phrase Clustering techniques to the messages can help identify where NLP is breaking down and improve the response effectiveness of the chatbots. Dashbot provides a variety of tools, including phrase clustering, to better understand users’ Intents and optimize user engagement, satisfaction, and conversions.
Dashbot is a conversational analytics platform that enables enterprises and developers to increase engagement, acquisition, and monetization through actionable data and tools.
In addition to traditional analytics like engagement and retention, we provide chatbot specific metrics including NLP response effectiveness, sentiment analysis, conversational analytics, and the full chat session transcripts.
We also have tools to take action on the data, like our live-person takeover of chat sessions and push notifications for re-engagement.
We support Facebook Messenger, Alexa, Google Home, Slack, Twitter, Kik, SMS, web chat, and any other conversational interface.