I am in the customer service outsourcing business, so as a pithy Hacker News commenter once noted, I have a slightly biased opinion on these matters. That said, I am also an avid technologist, and I believe that progress doesn’t stop; it only outgrows our willingness to keep up with it. So I’ll be first in line to use this stuff when it passes muster! On to the story.
I was having a conversation the other day with a friend whose opinion I value. We had launched PeopleDelight, a customer service vendor, just a couple weeks prior, and my friend and I hadn’t had a chance to catch up.
“You know, your whole industry will be dead in a couple years, right?”
We hadn’t ordered yet, so I couldn’t really choke on my food. Still, I did a double take.
“How do you mean?”
“Well, AI is going to wipe out all those jobs. IBM is going to become the biggest provider of customer service outsourcing in the world. Watson just beat Ken Jennings at Jeopardy, and is getting smarter every day.”
Oh my god, I thought, had I chosen exactly the worst moment in the history of man to start a customer service business?
What a waste! We had such a great concept! A company differentiated by people, that would put human interaction first, delighting people! Aaaargh.
But…could it really be that bad? Is AI an existential threat to the customer service industry in the next couple of years? I don’t think so.
Is outsourcing customer service to AI in the near future even possible, really?
First off, we have to define what we mean by “AI is going to wipe out all those jobs.” When my friend said “jobs,” what he meant was live people answering incoming calls or emails from customers who need help: the person who picks up when you call the airline, or the website you ordered your shoes from, or technical support.
So, the question I’m asking is – how long will it take for AI to literally replace all those people? As in, for it to do everything a real person has to do in order to give great customer support today? I think it would have to be able to do four things:
- Flawlessly recognize the customer’s words – the Natural Language Processing, or NLP, part.
- Understand the problem in the question – I guess this is part of NLP but seems different to me.
- Evaluate the customer’s tone and react to it – Is anyone even doing this yet?
- Find the right answer – this seems doable if you understand the question and have the information.
If that’s the goal, what can AI do today? Are we really getting that close? I’ve been reading a lot about bots that are revolutionizing things…but are they?
Smarts and Speed – A Way of Understanding AI Capabilities
Information processing capability is always the product of two things: intelligence and speed. A human is very intelligent but can only process a few things at a time; a computer is not so intelligent about any given thing but is tremendously fast. A lot of the improvement in AI in the past ten years has come from improvements in speed – just handling massive amounts of data in a simple way. More recent advances seem to be improving on the intelligence bit.
Alexa and Google Assistant Aren’t It
The two big examples used when talking about AI today are Google’s Assistant and Amazon’s Alexa/Echo offering. For those of you who haven’t used either of them, it’s really quite amazing what they can do, especially compared to Siri (ugh, what a blown opportunity). Amazon just announced that Alexa, which launched with about 100 applications, can now handle instructions for over 3,000 “skills,” which means someone has taken the time to figure out how to map voice commands to specific functions in things like music players, software, and so on. Here is a selection from Engadget’s review of Amazon’s Echo Dot, which runs Alexa:
“The thousands of “Skills” (what Amazon calls third-party add-on features for the Alexa platform) are where Amazon has a distinct advantage over Google’s forthcoming speaker hub, called the Home. It already supports popular connected-home brands such as Nest, SmartThings, Philips Hue and IFTTT, as well as platforms from WeMo, Insteon, Lutron, Honeywell and Ecobee, among others. Plus it works with travel and recipe apps. You can order a pizza, flowers and a car with it. You can check bank balances and get news briefings from NPR, Fox and the AP. You can hear sports scores from ESPN, and you can even figure out how much gas is in your car using the “Automatic” Skill.”
The key thing here is “voice commands to specific functions.” That is basically what we are talking about. Figuring out that “turn on the radio” and “power up the radio” both refer to a single binary switch, the on/off circuit. Times however many commands an app or bit of hardware has. So a few hundred commands per, at most.
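To make that concrete, here is a toy sketch of that kind of voice-command-to-function mapping, with hypothetical names (this is not Amazon’s actual skill API, just an illustration of the idea): several phrasings all collapse to a single call on one binary switch.

```python
radio_on = False

def set_radio(state: bool) -> bool:
    """The single on/off 'circuit' that every phrasing maps to."""
    global radio_on
    radio_on = state
    return radio_on

# Each utterance is hand-mapped to a command -- multiplied by however
# many commands an app or device exposes, a few hundred entries at most.
INTENTS = {
    "turn on the radio": lambda: set_radio(True),
    "power up the radio": lambda: set_radio(True),
    "turn off the radio": lambda: set_radio(False),
}

def handle(utterance: str):
    action = INTENTS.get(utterance.lower().strip())
    if action is None:
        # Anything outside the predefined subset simply fails.
        return "Sorry, I don't know that one."
    return action()

handle("Turn on the radio")      # -> True (radio is now on)
handle("dispute my phone bill")  # -> the fallback apology
```

The point of the sketch is the failure case: anything that wasn’t mapped ahead of time falls straight through, which is exactly the limitation described above.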
Google Home, which runs Google Assistant, seems to be at a different level, having integrated with a variety of hardware and software makers as well as tapping Google’s own index of the web. Here is a selection from the Engadget review on them:
“The list of things you can ask the Google Assistant is limited only by your imagination, and that’s one of Home’s biggest strengths. Amazon’s Alexa assistant has gotten smarter, but Amazon still doesn’t have access to the same breadth of information as Google. Alexa doesn’t understand context the way the Assistant does either. By comparison, Home and the Assistant are far more conversational.”
The Google approach seems to be ahead in terms of a possible replacement of a human customer service rep. In other words, it seems to be improving on the intelligence part of the Smarts × Speed equation.
But think about it. When do you call customer service? It’s exactly when you have already searched online – probably using Google – and have been unable to solve your problem. So how would an AI customer service rep with that capability help you?
You could argue that Alexa and Google’s Assistant get as far as:
- Basic NLP (limited to a subset of words and commands per application)
- Matching the words to a predefined question or, in Google’s case, indexed knowledge.
That is a score of 2 out of 4 on our list. But the score only applies to a predefined subset of tasks mapped to integrations, or to information that a web search of our own could discover. And there is definitely no evaluation of tone.
Anyone who has ever called up their phone company with a billing question knows that this level of capability would be useless when you are disputing a phone bill.
Well, then what about Watson?
It’s a little bit harder to get information about IBM’s AI, Watson. Most of the information on Watson is from IBM itself – and I am not a big fan of trusting marketing copy. Interestingly, I couldn’t find any reviews from clients that had actually implemented Watson as a customer service agent. The product seems to be online only, so no voice support quite yet. You’d think IBM would trumpet that pretty loudly if they could.
Most of what I found is about Watson being used for analytics, which makes sense – a limited number of variables and a focus on speed rather than on intelligence. But that doesn’t sound like taking over for live customer service agents anytime soon. Not enough data, or not publicly at least.
What the experts say
Of course, my simplistic evaluation of what a few AI products can and can’t do isn’t enough to make a pronouncement on the fate of the entire customer service industry, so I went and asked another friend, an expert in machine learning and artificial intelligence, what he thinks. He told me not to worry. Here is why:
What he is working on is using Artificial Intelligence to spot intrusions on networks. In other words, he is working in a discrete system bounded entirely by ones and zeros to detect anomalies. He has all the data. And he can feed it all to his system so that it can learn. But by his own admission even in that application it will be years before the system can identify all the threats reliably. There are just too many variables…in a closed system!
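To illustrate what “learning to detect anomalies in a closed system” looks like, here is a deliberately tiny sketch (hypothetical numbers, one feature only – say, requests per minute): flag anything far from the learned baseline. Real intrusion detection tracks thousands of such variables at once, which is exactly why, by his own admission, it takes years to get reliable.

```python
from statistics import mean, stdev

def is_anomaly(history, observation, threshold=3.0):
    """Flag observations more than `threshold` std devs from the learned mean."""
    mu, sigma = mean(history), stdev(history)
    return abs(observation - mu) > threshold * sigma

# Baseline learned from past traffic (requests per minute).
normal_traffic = [98, 102, 100, 97, 103, 99, 101, 100]

is_anomaly(normal_traffic, 101)  # ordinary minute: not flagged
is_anomaly(normal_traffic, 500)  # sudden burst: flagged
```

If one variable in a bounded system is this fiddly to threshold well, an open-ended human conversation – tone, context, frustration – is many orders of magnitude harder.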
Life isn’t a closed system.
Don’t say goodbye to the humans yet
So whether it’s because of the limitations on the AIs themselves, or the fact that open-ended human interaction still has just way too many variables to account for digitally, you can plan on hearing a friendly human voice on the other end of the line for years to come. And I will be happy to be the one to employ those good folks on behalf of companies that care about great customer service.
That doesn’t mean, though, that in the short term a lot of companies aren’t going to try to save themselves money by shunting you to their bots for as long as possible, only to connect you to a human once the conversation hits a wall. They do that now, in the form of voice trees and community-moderated FAQs. It’s just that they will still have to have real humans on standby, as they always have, to solve your problems with that dratted gizmo or your lost luggage.
But wait, this is a question about the future, not the present!
Ok, that is a fair point – I have only considered the capabilities and approach of a few AI products in 2016. But saying that “bots will decimate the customer service industry” implies the end of the process, not the beginning of it. Clearly the beginning is happening. My point is that the state of the art today is still so rudimentary that we are looking at five-to-ten-year timelines, not one-to-three-year timelines, for the customer service industry to be turned on its head.
By the way, if you can’t afford to wait for Artificial Intelligence to get good enough to take care of your customers with the whitest of white gloves, please get in touch. We would love to help.