How Smart Does My Chatbot Need to Be, and How Conversational?
The recent news that Microsoft is working on a conversational UI for Cortana shouldn’t be a surprise to anyone in the business world. Live chat with an agent has already become popular, and chat or conversation, whether with an agent or a chatbot, saves time, increases customer satisfaction, and is the preferred method of contacting a company for 63% of customers. Add to this the ‘always on’ nature of virtual assistants, and it’s clear why conversation is such a large part of the future of CX.
For your chatbot to really succeed as a viable conversational interface for service across any number of channels, it needs to offer your customer true support, complete with Natural Language Processing. If you’re thinking about conversational UI for your company, here are some key elements to keep in mind.
Phraseology is complicated at the best of times, but when you’re turning to a virtual assistant for an answer, you don’t have time for “Call me a taxi.” “OK, you’re a taxi”-style dad jokes or misunderstandings. Turns of phrase are important in a natural conversation, and your chatbot should be able to manage and understand these seamlessly. That also includes working towards understanding terms like book titles or show names, to avoid problems like the one the user below experienced.
Christopher Manning, the Thomas M. Siebel Professor in Machine Learning at Stanford, works with Natural Language Processing to help computers “learn the soft changes in meaning over time” to the words we use and create natural-sounding language for conversational UI. While all AI will glean data from the internet, a knowledge base, or wherever it is given access, the real challenge is sifting out what is relevant to each conversation and coming out with accurate responses. Ensuring the right management of your solution can help your bot improve and learn, as can keeping the option of channeling customers to a human if they just can’t get the bot to “speak their language.” As with any evolving technology, there is a learning curve, but one could argue that it’s a question of how steep, and how quickly it can be overcome. The best solutions continue to grow and learn, and the ability to pick up phrases is just one example of this understanding of nuance in language.
In a normal conversation, we don’t need to use full sentences every time to remind each other what we’re talking about. We can ask add-on questions and expect to be understood. Here, the Poncho weather bot can’t keep the conversation going past one layer of questioning, which is sure to create frustration for the user.
With the ability to ‘remember’ the discussion at hand, a customer could use a banking bot, for example, to ask about the exchange rate from dollars to euros, then continue with “How about yen?” and expect to get a dollar-yen exchange rate too.
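The carryover described above can be sketched as a simple slot-filling context. This is a minimal illustration, not a real NLU stack: the currency pairs, rates, and `Context` class are hypothetical stand-ins for an intent parser and a live rates API.

```python
# Illustrative, hard-coded rates; a real bot would call a rates API.
RATES = {("USD", "EUR"): 0.92, ("USD", "JPY"): 150.0}

class Context:
    """Remembers the slots of the previous request so a follow-up
    like "How about yen?" can reuse the missing pieces."""

    def __init__(self):
        self.slots = {}

    def handle(self, base=None, quote=None):
        # Fill any slot the user left out from the previous turn.
        if base is None:
            base = self.slots.get("base")
        if quote is None:
            quote = self.slots.get("quote")
        if base is None or quote is None:
            return "Which currencies do you mean?"
        self.slots = {"base": base, "quote": quote}
        return f"1 {base} = {RATES[(base, quote)]} {quote}"

ctx = Context()
print(ctx.handle(base="USD", quote="EUR"))  # full question
print(ctx.handle(quote="JPY"))              # follow-up: "How about yen?"
```

Without the stored slots, the second call would have to ask the user to repeat the base currency, which is exactly the frustration the Poncho example shows.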
Before building your bot, it’s vital to consider what you want it to achieve. What will its function be, and how will it help your customer? If it best fits your brand and works within limited capabilities, it’s fine to have a bot that does only one task, for example checking hotel room availability or confirming flight times. Maybe the point is to give information, or to extend your branding. Many complaints about Microsoft Zo were prompted by conversations similar to the one above, which seemed gimmicky and without any value whatsoever. The problem is less that she didn’t understand what people wanted from her, as any bot will confront scenarios in which a user is “testing” it or asking for something beyond its scope. Part of what makes it problematic is her inability to provide straight, accurate answers, combined with the insistence on being cutesy. What could be seen as merely ineffective is then instead interpreted as irritating.
Ask yourself the following questions before you start building your chatbot.
• Which tasks will my chatbot manage?
• What added value will my bot offer to customers?
• How is it simpler or more interesting than the existing way my users interact with us?
• Why will customers prefer engaging with us through this channel?
Take the eBay ShopBot, for example, which allows customers to browse from the chat tool rather than search the website itself. By removing friction from the shopping journey, this encourages customers who might not know exactly what they are looking for. In a very natural way, it gives them a conversational tool to bounce ideas off, one which has the added bonus of the entire eBay database to draw suggestions from.
If you make sure that the limits of your bot are clear from the outset, you can avoid customer dissatisfaction. No one is going to expect your weather app to be able to order you a pizza for example. The most important thing is that it performs what it can do seamlessly. The example above gives the user an example to try, and then fails to recognise even that simple instruction.
As Facebook scales back on its Messenger bots following a 70% failure rate, this is more important than ever. If your bot hits a wall, ensure that built-in escalation is part and parcel of your interface, so that you can seamlessly transfer a customer over to human support when necessary. 88% of US respondents told Business Insider that this was an essential part of successful conversational UI for them.
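One common way to build in that escalation is a confidence threshold with a failed-turn counter. The sketch below assumes a hypothetical intent classifier returning an `(intent, confidence)` pair; the threshold values and the toy classifier are illustrative, not a production design.

```python
# Assumed tuning values for this sketch; real values come from testing.
CONFIDENCE_THRESHOLD = 0.6
MAX_FAILED_TURNS = 2

def respond(turns, classify):
    """Reply to each user turn; after repeated low-confidence
    turns, hand the conversation off to a human agent."""
    failed = 0
    replies = []
    for utterance in turns:
        intent, confidence = classify(utterance)
        if confidence < CONFIDENCE_THRESHOLD:
            failed += 1
            if failed >= MAX_FAILED_TURNS:
                replies.append("Let me connect you with a human agent.")
                break
            replies.append("Sorry, I didn't catch that. Could you rephrase?")
        else:
            failed = 0  # a successful turn resets the counter
            replies.append(f"Handling intent: {intent}")
    return replies

# Stand-in classifier: only recognises greetings.
def toy_classify(text):
    return ("greet", 0.9) if "hello" in text.lower() else ("unknown", 0.2)

print(respond(["hello", "qwxz", "qwxz again"], toy_classify))
```

The key design choice is breaking out of the loop on handoff: once the customer is escalated, the bot should stop replying rather than keep apologising.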
It goes without saying that conversational UI has to understand what your customers are trying to say. But what about the things your customers are not saying, or didn’t mean to say? As in the example above, many bots fail when a customer tries to take back something they have said or changes their mind. Being able to recognise when a user has made a mistake or said something in error is essential for a successful conversation.
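Handling a change of mind can be approximated by watching for correction cues and overwriting the previous slot instead of starting a new request. The cue phrases and the single `"day"` slot below are illustrative assumptions, a minimal sketch rather than real correction handling.

```python
# Hypothetical cue phrases that signal the user is correcting themselves.
CORRECTION_CUES = ("no wait", "i meant", "actually", "scratch that")

def detect_correction(utterance):
    """Return True if the utterance looks like a self-correction."""
    text = utterance.lower()
    return any(cue in text for cue in CORRECTION_CUES)

def apply_turn(state, utterance, new_value=None):
    """If the turn is a correction, overwrite the last slot instead of
    treating the message as a brand-new request."""
    if detect_correction(utterance) and new_value is not None:
        state["day"] = new_value  # replace, don't append
        return f"Got it, changed to {new_value}."
    return "OK."

state = {"day": "Monday"}
print(apply_turn(state, "No wait, I meant Tuesday", "Tuesday"))
print(state["day"])
```

A bot without this branch would book both Monday and Tuesday, which is exactly the failure mode the examples above describe.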
Integrating your chatbot with your knowledge base and analytics means that you can cross-reference conversations with customer data, purchase records, and other user behavior, giving you a better idea of how to improve both your bot’s responses and your customer support overall.
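At its simplest, that cross-referencing is a join between conversation logs and customer records. The in-memory records below are made up for illustration; in practice this join would happen in your CRM or analytics warehouse.

```python
# Hypothetical conversation log and customer records.
conversations = [
    {"customer_id": 1, "intent": "refund", "resolved": False},
    {"customer_id": 2, "intent": "order_status", "resolved": True},
]
customers = {
    1: {"name": "Ada", "orders": 12},
    2: {"name": "Bo", "orders": 1},
}

# Surface unresolved conversations alongside customer context,
# so the support team can prioritise follow-up.
follow_up = [
    {**conv, **customers[conv["customer_id"]]}
    for conv in conversations
    if not conv["resolved"]
]
print(follow_up)
```

Even this trivial view already answers a useful question: which unresolved bot conversations belong to high-value customers?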