Have you ever yelled at a customer service agent over the phone? How about an AI-powered virtual customer service agent? If you answered yes to the latter, then thanks, you’ve made a significant contribution to the future of AI.
That’s because Intelligent Virtual Assistants, and the machine learning “brains” behind them, need exposure to natural human language to learn and adapt to the world around them. And how you speak to them matters. Picking up the intricacies of spoken language requires exposure to slang, back-and-forth conversations, figures of speech, new words, curses, and everything else that sounds natural to the human ear.
It’s not what you say, it’s how you say it
When we talk to personal virtual assistants, like Alexa or Siri, we tend to change our speech patterns to fit the “formula” that works for the technology. Listen to any person ask their phone about state capitals or salmon recipes: they over-pronounce words, exaggerate consonants, and speak in short, clipped sentences. It’s a form of human-to-machine “dialect” we’ve developed to ensure the technology understands what we’re saying. In other words, rather than us teaching AI to understand us, AI is re-teaching us how to speak.
But in the customer service space, enterprise Intelligent Virtual Assistants allow for a much more natural, open-ended way of speaking. It’s the difference between deciphering “Where is the closest hotel?” and “So I want to stay at a hotel nearby for the next three nights. I need a king size bed with a view of the city. What’s the closest place I can get?” Ask your smartphone the second version and you’ll be lucky if it even pulls up a hotel website. But the machine learning “brains” of enterprise IVAs are fed a higher volume and wider array of input. Instead of simply hearing direct questions like “How much does the moon weigh?”, they field a far broader mix of requests, phrasings, and follow-ups.
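To see the size of that gap, here’s a toy sketch in Python. It’s purely illustrative, not how any real assistant is built: a rigid command matcher on one side, and a parser that pulls details out of rambling speech on the other.

```python
# Toy illustration, not any vendor's API: the gap between matching a
# rigid command and pulling details out of an open-ended request.
import re

OPEN_ENDED = ("So I want to stay at a hotel nearby for the next three "
              "nights. I need a king size bed with a view of the city. "
              "What's the closest place I can get?")

def parse_command(utterance):
    """Smartphone-style parsing: only one fixed phrasing is understood."""
    if utterance.lower().startswith("where is the closest hotel"):
        return {"intent": "find_hotel"}
    return None  # anything conversational falls through

def parse_open_ended(utterance):
    """Enterprise-IVA-style parsing: fill slots from natural speech."""
    text = utterance.lower()
    slots = {"intent": "find_hotel"}
    match = re.search(r"next (\w+) nights?", text)
    if match:
        slots["nights"] = match.group(1)  # "three"
    if "king size bed" in text:
        slots["bed"] = "king"
    if "view of the city" in text:
        slots["view"] = "city"
    return slots

print(parse_command(OPEN_ENDED))     # None: the rigid parser gives up
print(parse_open_ended(OPEN_ENDED))  # {'intent': 'find_hotel', 'nights': 'three', ...}
```

Real IVAs use trained language models rather than keyword rules, of course, but the contrast holds: one approach needs you to speak its formula, the other meets your speech where it is.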
They’re also becoming more comfortable with deciphering human emotion. Since customer service is inherently focused on problems, many customers start their interaction already frustrated. As a result, the assistant is left to work with an angry customer shouting their issue in a jumble of sentence fragments, disorganized thoughts, and possible expletives: a firehose of information, some of it relevant, most of it not.
And this is exactly what AI needs to hear.
Customer service assistants are team players
But how are customer service assistants able to absorb such complex, messy language and come out stronger? Part of it is the repetition they’re exposed to from handling these conversations every day, but it’s also because, unlike Alexa or Siri, they’re collaborating with humans.
In customer service, IVAs frequently work alongside human customer care teams, completing tasks in tandem with human operators without the customer ever noticing a break in the conversation. When the assistant faces a tough situation, it doesn’t simply hand the conversation over to a human agent. Instead, it receives human guidance to solve the problem itself. It’s educated through real-life situations.
As the old adage goes, “Give a man a fish and he’s fed for a day; teach a man to fish and he’s fed for life.” Customer service agents are teaching AI how to “fish.”
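It helps to see the shape of that loop in code. Below is a minimal sketch in Python; the intents, the confidence scorer, and the agent queue are all hypothetical stand-ins, not any vendor’s actual implementation.

```python
# A minimal sketch of the human-in-the-loop pattern described above.
# All names here are hypothetical stand-ins, not a real product's API.

RESPONSES = {
    "billing_refund": "I've started your refund. Anything else?",
    "change_booking": "Sure, let's update that booking.",
}

def toy_confidence(message, intent):
    """Keyword overlap as a stand-in for a real intent model's score."""
    words = intent.split("_")
    return sum(w in message.lower() for w in words) / len(words)

def ask_human_agent(message):
    """Stand-in for routing the message to a human operator's queue."""
    return input(f"[agent] correct intent for {message!r}? ")

TRAINING_DATA = []  # every resolved conversation becomes a future lesson

def respond(message, threshold=0.75):
    # Score every known intent and pick the assistant's best guess.
    intent = max(RESPONSES, key=lambda i: toy_confidence(message, i))
    if toy_confidence(message, intent) < threshold:
        # Tough situation: get human guidance instead of handing off.
        intent = ask_human_agent(message)
    TRAINING_DATA.append((message, intent))  # the assistant "learns to fish"
    return RESPONSES.get(intent, "Let me look into that for you.")
```

A production system would swap the keyword scorer for a trained model and the input() call for an agent-facing console, but the shape of the loop is the point: low confidence quietly triggers human guidance, the customer sees one continuous conversation, and every guided exchange becomes training data.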
Of course, our at-home and smartphone personal assistants haven’t adopted this model. For now, they’re capable enough for what people ask of them. But as we continue to push our virtual assistants to handle more complex tasks, we’ll need the level of comprehension that enterprise-level AI is already working toward.
What we’ve learned from virtual customer service assistants is that progress will only happen if we place the burden of communicating effectively on the virtual assistant rather than on the user. Once the industry at large makes this shift, people won’t need to restrain themselves when talking to virtual assistants.
So, speak naturally to those customer service bots. Give ‘em hell if you want! At the end of the day, you’ll be advancing AI — one conversation at a time.