
Angry customers are shaping the future of AI


Have you ever yelled at a customer service agent over the phone? How about an AI-powered virtual customer service agent? If you answered yes to the latter, then thanks, you’ve made a significant contribution to the evolution of artificial intelligence.

That’s because virtual assistants, and the machine learning “brains” behind them, need exposure to natural human language to learn and adapt to the world around them. And how you speak to them matters. Picking up the intricacies of spoken language requires exposure to slang, back-and-forth conversation, figures of speech, new words, curses, and everything else that sounds natural to the human ear.

Although our everyday language is infused with these details, we leave them out when talking to mainstream virtual assistants like Alexa or Siri. So while these consumer virtual assistants remain our de facto AI mascots, it’s the assistants handling business interactions like customer service that are really pushing AI forward.

It’s not what you say, it’s how you say it

When we talk to consumer virtual assistants, we tend to change our speech patterns to fit the “formula” that works for the technology. Listen to any person ask their phone about state capitals or salmon recipes — they over-pronounce words, exaggerate consonants, and speak in short, concise sentences. It’s a form of human-to-machine “dialect” we’ve developed to guarantee the technology understands what we’re saying. In other words, rather than us teaching AI to understand us, AI is re-teaching us how to speak.

But in the customer service space, enterprise virtual assistants allow for a much more natural, open-ended way of speaking. It’s the difference between deciphering “Where is the closest Hyatt hotel?” and “So I want to stay at a Hyatt nearby for the next three nights. I need a king size bed with a view of the city. What’s the closest place I can get?” Ask your smartphone the second question and you’ll be lucky if it pulls up the Hyatt website. But the machine learning “brains” of enterprise virtual assistants are fed a higher volume and wider variety of input. Instead of simply hearing direct questions like “How much does the moon weigh?”, they get a huge range of requests and questions, phrased in all kinds of ways.
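
To make this concrete, here’s a minimal, rule-based sketch of the kind of slot-filling an enterprise assistant performs on an open-ended request. Production systems use trained statistical models rather than hand-written patterns, so everything below (the function name, the regexes, the brand list) is purely illustrative.

    import re

    def extract_slots(utterance: str) -> dict:
        """Pull booking details (brand, nights, bed type, view) out of free text."""
        slots = {}
        # Each pattern is a stand-in for what a trained NLU model would learn.
        brand = re.search(r"\b(Hyatt|Hilton|Marriott)\b", utterance, re.IGNORECASE)
        if brand:
            slots["brand"] = brand.group(1)
        nights = re.search(r"next (\w+) nights?", utterance, re.IGNORECASE)
        if nights:
            slots["nights"] = nights.group(1)
        bed = re.search(r"\b(king|queen|twin) size bed\b", utterance, re.IGNORECASE)
        if bed:
            slots["bed"] = bed.group(1)
        if re.search(r"view of the city", utterance, re.IGNORECASE):
            slots["view"] = "city"
        return slots

    request = ("So I want to stay at a Hyatt nearby for the next three nights. "
               "I need a king size bed with a view of the city.")
    print(extract_slots(request))
    # {'brand': 'Hyatt', 'nights': 'three', 'bed': 'king', 'view': 'city'}

The point isn’t the patterns themselves; it’s that the system pulls structured intent out of rambling speech rather than demanding a rigid query.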

They’re also becoming more comfortable with deciphering human emotion. Since customer service is inherently focused on problems, many customers start their interaction already frustrated. As a result, the assistant is left to work with an angry customer shouting out their issue in a jumble of sentence fragments, disorganized thoughts, and possible expletives — a firehose of information, some of it relevant, most of it not.

And this is exactly what AI needs to hear.
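
One way to picture that first step: before the assistant can act, it has to separate the emotional signal from the actual request. Here’s a toy sketch, with a made-up word list standing in for the trained sentiment models real contact-center systems rely on:

    FRUSTRATION_MARKERS = {"ridiculous", "unacceptable", "again", "terrible",
                           "worst", "never", "angry"}

    def frustration_score(utterance: str) -> float:
        """Crude score: share of words signaling frustration, boosted by shouting."""
        words = utterance.lower().split()
        hits = sum(1 for w in words if w.strip(".,!?") in FRUSTRATION_MARKERS)
        shouting = 0.2 if utterance.isupper() else 0.0
        return min(1.0, hits / max(len(words), 1) + shouting)

    print(frustration_score("This is RIDICULOUS, my bill is wrong AGAIN!"))  # 0.25

A real system would feed that score, along with the extracted request, into how it responds; the words and weights here are invented for illustration.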

Customer service assistants are team players

But how are customer service assistants able to absorb such complex, messy language and come out stronger? Part of it is the repetition they’re exposed to from handling these conversations every day, but it’s also because, unlike Alexa or Siri, they’re collaborating with humans.

In customer service, virtual assistants frequently work alongside human customer care teams. This means the assistant works in tandem with human operators to complete tasks without the customer noticing a break in conversation. When the assistant faces a tough situation, it doesn’t hand the conversation over to a human agent. Instead, it receives human guidance to solve the problem. It’s educated through real-life situations.
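
In code terms, that loop might look something like the sketch below. The interfaces are hypothetical (model.answer and ask_human_agent are stand-ins, not any vendor’s real API), but they capture the pattern: low-confidence turns are quietly answered by a human, and each correction is logged as a new training example.

    TRAINING_LOG = []           # (utterance, human reply) pairs for retraining
    CONFIDENCE_THRESHOLD = 0.8  # hypothetical cutoff for asking a human

    def ask_human_agent(utterance: str) -> str:
        # Stub: in production this would route the turn to a live agent's console.
        return "Let me fix that bill for you right away."

    def handle_turn(model, utterance: str) -> str:
        reply, confidence = model.answer(utterance)   # hypothetical model API
        if confidence < CONFIDENCE_THRESHOLD:
            # The customer never sees a hand-off; a human quietly supplies the
            # reply, and the corrected exchange becomes future training data.
            reply = ask_human_agent(utterance)
            TRAINING_LOG.append((utterance, reply))
        return reply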

Like the old adage “Give a man a fish, he’s fed for a day; teach a man to fish, he’ll be fed for life,” human customer service agents are teaching AI how to “fish.”

Of course, our at-home and smartphone personal assistants haven’t adopted this model. For now, they can handle most of what people ask of them. But as we push virtual assistants toward more complex tasks, they’ll need the deeper comprehension that enterprise AI is already developing.

What we’ve learned from virtual customer service assistants is that progress happens only when the burden of communicating effectively falls on the virtual assistant rather than on the user. Once the industry at large makes this shift, people won’t need to restrain themselves when talking to virtual assistants.

So, speak naturally to those customer service bots. Give ’em hell if you want! At the end of the day, you’ll be advancing AI — one conversation at a time.

Jay Wilpon is SVP of Natural Language Research at Interactions.

