In the AI era, many businesses are turning to chatbots and deep learning to enhance customer service. But it’s not always the best approach.

“There is too much hype surrounding deep learning, and it’s not the right tool to solve all problems,” said Fabio Cardenas, CEO of Sundown AI. Why? It’s expensive, requires huge data sets, and often leads to bias, he said. Instead of turning to deep learning, Sundown AI, established in 2014, has mastered automated customer interactions using a combination of machine learning and policy graph algorithms.

Chloe is Sundown AI’s question-and-answer platform, built on advanced natural language processing. It can be used in online chats, email, phone calls, or mobile apps. When a customer asks a question, Chloe finds the answer and serves it up; the answer can then be customized by an agent or sent directly to the customer. If changes are made to the answer, Chloe learns from them. The primary point, said Cardenas, is that the system must provide specific answers. To do this, it integrates with third-party data sources and can pull information relevant to the client to customize its answers.
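The serve-review-learn loop described above can be sketched in a few lines. This is a minimal illustration only; Sundown AI has not published its API, so every name here is hypothetical.

```python
# Hypothetical sketch of a Chloe-style human-in-the-loop Q&A assistant.
# The class and method names are illustrative, not Sundown AI's actual code.

class QAAssistant:
    def __init__(self):
        # The "corpus": known questions mapped to draft answers.
        self.corpus = {"where is my order?": "Your order ships within 2 days."}

    def draft_answer(self, question):
        """Serve up a draft answer for a human agent to review."""
        return self.corpus.get(question.lower(), "I'll escalate this to an agent.")

    def record_edit(self, question, edited_answer):
        """If the agent customizes the answer before sending, learn the correction."""
        self.corpus[question.lower()] = edited_answer

assistant = QAAssistant()
draft = assistant.draft_answer("Where is my order?")
# The agent tweaks the wording before sending; the system learns from the edit.
assistant.record_edit("Where is my order?", "Your order ships within 2 business days.")
```

The key design point is that the human edit feeds straight back into the corpus, so the next draft for the same question reflects the correction.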

Essentially, “Chloe works as an AI teammate to make humans more productive,” said Cardenas.

The way the platform works is through machine learning combined with policy graphs. (For a refresher, deep learning is a subset of machine learning.) Policy graphs are Sundown AI’s term for the “relationship between the contextual clues, questions, and answers within a dialog. The policy graph allows us to optimize outcomes for new customer service inquiries using machine learning and graph algorithms,” said Cardenas. The term derives from the concept of policies in reinforcement learning, combined with graph algorithms. The system can determine the relationship between the question being asked, the answer, and the follow-up reply. “We can reproduce some of what you get with deep learning, but we just do it in a way that’s more computationally efficient,” he said.
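To make the idea concrete, here is a toy version of such a structure: a weighted graph linking dialog states (contextual clues plus the question) to candidate answers, with edge weights reinforced by outcomes, as in a reinforcement-learning policy. This is a sketch under assumptions; Sundown AI’s actual algorithms are proprietary.

```python
# Toy "policy graph" sketch: dialog states mapped to candidate answers,
# with edge weights updated from observed outcomes. Illustrative only.
from collections import defaultdict

class PolicyGraph:
    def __init__(self):
        self.weights = defaultdict(float)   # (state, answer) -> learned score
        self.answers = defaultdict(set)     # state -> candidate answers

    def observe(self, state, answer, reward):
        """Reinforce edges that led to good customer-service outcomes."""
        self.answers[state].add(answer)
        self.weights[(state, answer)] += reward

    def best_answer(self, state):
        """Pick the highest-scoring answer for this dialog state."""
        return max(self.answers[state], key=lambda a: self.weights[(state, a)])

g = PolicyGraph()
g.observe(("billing", "refund?"), "Refunds take 5 days.", reward=1.0)
g.observe(("billing", "refund?"), "Please call us.", reward=0.2)
```

Compared with a deep network, this kind of tabular, graph-based policy is cheap to update and query, which is consistent with the computational-efficiency claim above.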

Chloe can initiate actions and send notifications, essentially serving as “a linguistic layer on top of these systems,” Cardenas said. And if there are multiple questions, Chloe uses a memory-like system to recall previous interactions. Cardenas contrasts this with systems like IBM Watson, which he says “has no memory and is dependent on the client application for providing memory.”
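The memory-like behavior might look something like the following: prior turns are kept so a vague follow-up can be interpreted against them. Again, this is a hypothetical sketch, not Sundown AI’s implementation.

```python
# Hypothetical per-conversation memory: earlier turns are retained so a
# follow-up question can be resolved in context. Names are illustrative.

class ConversationMemory:
    def __init__(self):
        self.turns = []  # list of (question, answer) pairs

    def add_turn(self, question, answer):
        self.turns.append((question, answer))

    def context_for(self, follow_up):
        """Attach recent history so a vague follow-up ('And to Canada?')
        can be interpreted against the previous questions."""
        history = " | ".join(q for q, _ in self.turns[-3:])
        return f"{history} -> {follow_up}" if history else follow_up

mem = ConversationMemory()
mem.add_turn("What is the shipping cost to the US?", "$5.")
enriched = mem.context_for("And to Canada?")
```

A client application using a memoryless service would have to maintain this state itself, which is the dependency Cardenas attributes to Watson.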

Cardenas said that Chloe’s ability to learn separates it from many other platforms. After it selects an answer, it adds it to a “corpus, which is all the information that the AI uses to determine how to provide the optimal response.” The system is continually evolving, Cardenas said.

So how does Sundown AI compare to IBM Watson? Cardenas said IBM’s system “is terrible.”

“IBM Watson for conversation is akin to a chatbot, and a very bad one at that,” he said. “We’re lower cost with lower data requirements than you get with deep learning. We can handle complex workflows and logic because of our policy graphs. We handle unprocessed, unstructured data, which is a huge differentiator. Large systems can’t do that, especially chatbots.”


With Chloe’s linguistic layer, he said, you can automate training without supervision by speaking into the system. IBM hasn’t built this into its system, he said, which means it isn’t self-learning. One thing deep learning can do is handle semantic variability in natural language. But Cardenas says that isn’t necessary here.

So why would companies rely on deep learning? Cardenas said that since it’s customized for each client, “the company thinks it’s great and will solve all these problems.” And sometimes, deep learning has great applications. Google’s speech-to-text system, for instance, is trained on a wide variety of tagged speech that gets transcribed, and can work on huge amounts of data. The algorithm can then train itself on new inputs and make predictions based on that data.

Deep learning, said Cardenas, is best when it’s not goal-oriented. When it’s “open to a wide variety of inputs and questions, and must answer almost anything that comes up. ‘Why is the sky blue?’ ‘Who was the president of Italy in 1984?’ Anything that’s very open-ended works well with these systems because of the way they were designed,” he said.

But customer service questions have a goal in mind, he said, and need to be answered quickly. Because of this, Cardenas focuses on specific questions and answers. “It really changes the focus of the system, and it makes our system particular with what functionality we provide. It’s great for resolving questions from customer service issues,” he said. “It’s not great for replacing Siri.”

Also, deep learning is “computationally expensive, there are a lot of algorithms that are involved, it requires a lot of computing power,” said Cardenas. “From a financial standpoint, that has an issue as well with the amount of resources that are required. Updating the knowledge bases. Our system is more data efficient. It costs less, and the system evolves.”

Deep learning requires huge datasets, with hundreds of thousands of records. If you want a system that identifies pictures of cars, you need millions of records, he said. “It’s insane to think that a company, unless you’re an enterprise, is going to have millions of records for you to actually put into your algorithms to generate results. It’s just not feasible.” Even when a company has 10,000 emails a day, it may not be enough for deep learning, said Cardenas.

Also, these datasets introduce bias. “When you train a deep learning algorithm, it’s dependent on the training set that was used,” he said. “In a customer service context, that means you have to retrain it each time you switch between clients. If you can apply even a portion of old data to a new client, it’s going to have bias,” said Cardenas. “It’s just impossible to use.”

That’s because there are “changes in content, syntax, morphology, grammar, all these things are related to language change, because the training set is different. When you try to ramp up deep learning algorithms and you have something open like Siri, the more data she gets the better she becomes. But that’s not feasible when you’re doing this work with limited companies on limited scales for customer service.”

Also, there’s the issue of the deep learning data scientist, who Cardenas says is “rare” and “coveted by all the top companies.” That makes them valuable, and also very expensive, which makes the overall product more expensive. Cardenas, instead, uses machine learning data scientists.

The cost savings can be huge, said Cardenas. Competitors might charge $150,000 for an implementation, and Watson can run up to $500,000. Sundown AI charges between $10,000 and $25,000. “We try to be about one-tenth of the price,” he said. After that, there’s a per-task rate for queries to the system.

“We’re not disagreeing that deep learning is a great tool,” said Cardenas. “We think deep learning is one of the best tools we have. It’s just not the best tool for this problem. What we’ve been able to do is solve this problem without it.”
