
Are humans' days on the helpdesk numbered?

Ten years of research into the human brain have reportedly gone into developing a new virtual assistant designed to watch how a person does a job and then replace them.

Next time your PC locks up, would you want another computer helping you out?

Automation company IPsoft is betting you would, or at least believes your boss will think automated assistants are good enough to replace humans on helpdesks.

The field of cognitive computing - getting machines to replicate the ability of humans to learn from the world around them - is being researched at the world's biggest tech companies. Perhaps the most famous system to emerge is IBM Watson, the machine that beat two Jeopardy champions at the quiz show in 2011. Google has also devoted much time to this area, with its Google X labs devising a deep-learning algorithm, running on a neural network, capable of detecting human faces in YouTube videos four times out of five.

The interest in the field is not surprising given the potential financial rewards. Any technology whose performance approaches that of a human in areas such as natural language processing or pattern recognition could help automate swathes of manual roles.

[Image: IPsoft's Amelia. Credit: IPsoft]

IPsoft's contribution to the field is Amelia, named after American aviator Amelia Earhart, a system it claims can learn how to do a job by 'reading' a manual or 'watching' humans carrying out that role, rather than needing instructions to be hand-coded.

IPsoft describes Amelia as a "cognitive agent", able to digest natural language, understand concepts and learn. That understanding of natural language is what IPsoft says lets Amelia learn by parsing a manual or a text-based interaction by someone currently carrying out the role you want the system to take on. Feed Amelia enough information about a certain topic and it will build a knowledge map that will allow it to answer related questions - a requirement of any helpdesk or advisory position.
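
IPsoft has not published Amelia's internals, so the Python sketch below is purely illustrative, with every name and structure hypothetical. It shows the general shape of such a knowledge map: statements reduced to subject-relation-object triples, from which related questions can be answered by lookup rather than keyword search.

from collections import defaultdict

# A toy illustration, not IPsoft's code: Amelia's internals aren't public.
# Statements from a manual are reduced to subject -> relation -> objects
# triples, which can then answer related questions by direct lookup.
knowledge_map = defaultdict(lambda: defaultdict(set))

def learn(subject, relation, objects):
    """Add a triple to the knowledge map."""
    knowledge_map[subject][relation].update(objects)

def answer(subject, relation):
    """Answer a question such as 'What are the components of a pump?'"""
    return sorted(knowledge_map[subject][relation])

# Triples as they might be extracted from a parsed manual sentence
learn("pump", "has component", {"drive shaft", "impeller", "casing",
                                "bearing", "suction nozzle", "discharge nozzle"})

print(answer("pump", "has component"))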

"It's been a long journey. We spent the first decade of our journey trying to do justice to the outcomes of human cognition. Looking at how are we able to do things like reasoning, understanding, learning, leveraging our posterior and anterior cortex. The science was complicated enough that we wrote no code for the first decade," said Ergun Ekici, VP of emerging technologies at IPsoft.

"We carried out research into semantic role understanding, first order logic, predicate logic, emotional quotient and emotional reasoning, problem solving, learning. That research has now translated into Amelia."

Analyst house Gartner predicts that by 2017 autonomic and cognitive platforms such as Amelia will drive down the cost of services by 60 percent.

Amelia has been trialled within a number of Fortune 1000 companies, working in areas such as IT help desks and financial trading operations support, and is in the early stages of being rolled out at various organisations.

During trials, IPsoft found Amelia was able to progress from answering very few queries independently to 64 percent of queries within two months.

To demonstrate Amelia's capabilities I was shown a 45-minute demo in which Amelia performed a variety of tasks asked of her by an IPsoft representative. However, I was not given control of the system or the opportunity to ask questions of it directly.

Amelia comprises eight software modules and can be distributed across a number of off-the-shelf servers, with the demo running on a mix of Dell PowerEdge R620 and R920 servers.

Each part of the demo was designed to show a different aspect of Amelia's capabilities - its ability to understand concepts, and to learn by observing humans or 'reading' manuals. Here are the claims IPsoft makes for Amelia and how it chose to illustrate them.

Ability to learn from manuals

Inspired by a trial of Amelia by a major oil supplier, this demo began with the system reading a manual about a centrifugal pump and answering questions about why it might be going wrong. The idea was for the system to answer questions spoken to it by an engineer fixing the pump on site, although in this instance the demo was text-based, with the operator typing questions and receiving written and spoken responses.

It took Amelia about 20 seconds to process the manual, with the screen showing the various concepts Amelia had extracted.

The operator then asked Amelia a series of questions designed to illustrate what it knew about the centrifugal pump:

Human: What are the components of a pump?
Amelia: Drive shaft, impeller, casing, bearing, suction nozzle and discharge nozzle.

Amelia was also able to answer the same question in French, one of more than 20 languages it can use, but as the answer was a simple list it gave little indication of the quality of translation Amelia is capable of.

The questioning continued:

Human: What causes high power demand?
Amelia: High power demand can be caused by a bent drive shaft or worn impeller.

Ekici said that what was on display was a result of the system 'learning, not keyword indexing', highlighting the fact that Amelia is able to determine the relationship between symptom and cause.

As proof, he cited Amelia determining that a worn impeller is a cause of high power demand, while the manual only states that 'possible symptoms of a worn impeller are insufficient discharge pressure and high power demand'.
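
Ekici didn't detail the mechanism, but the behaviour can be illustrated with a hypothetical inversion of the relation extracted from the manual: store the fault-to-symptom edges, then read them backwards to answer cause questions. The sketch below is an assumption about the approach, not IPsoft's code.

# A hypothetical sketch of the symptom/cause inversion described above.
# The manual states only the forward relation (fault -> symptoms); keeping
# the edges queryable in both directions lets the system answer
# "What causes high power demand?" although that sentence never appears.
fault_symptoms = {
    "worn impeller": ["insufficient discharge pressure", "high power demand"],
    "bent drive shaft": ["high power demand", "vibration and noise",
                         "short mechanical seal life", "short bearing life"],
}

def causes_of(symptom):
    """Invert the fault -> symptom relation to answer cause questions."""
    return [fault for fault, symptoms in fault_symptoms.items()
            if symptom in symptoms]

print(causes_of("high power demand"))  # ['worn impeller', 'bent drive shaft']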

However, most of the questions asked tracked close to the wording in the manual that Amelia was fed, so it's unclear how the system would cope with more complex queries that deviated further from the structure of the source material.

For instance, the manual included this line:

'Possible symptoms of a bent drive shaft are high power demand, vibration and noise, short mechanical seal life and short bearing life'

which Amelia used to then answer this question:

Human: What are the symptoms of a bent drive shaft?
Amelia: High power demand, vibration and noise, short mechanical seal life and short bearing life.

That said, this was an attempt to demonstrate learning based on a single manual digested in 20 seconds, and Ekici said that feeding Amelia with sufficient domain knowledge to take over the role of a frontline service representative generally takes more than one month of ingesting related text and observing humans carrying out a role.

Learning by observing others

A big selling point for Amelia in IPsoft's view is its ability to learn how to perform a role by watching others carrying it out.

As mentioned, in the demo's case 'watching' meant parsing text chats between agents and the people they were helping. Ekici said Amelia's ability to understand natural language allows it to build a map of what it needs to do at each stage of an interaction, an exercise IPsoft refers to as building a Process Ontology.

"These process diagrams are natural language instructions that guide Amelia on what she needs to do and they are inferred through watching an agent speak to a customer, they are learned through reading a manual," said Ekici.

In the demo Amelia observed the following interaction:

Customer service agent: What is your username?
Customer: Ergun
Customer service agent: Amelia, verify that the user's username is correct?

After parsing this interaction, Amelia created a simple process ontology for how to handle a request by someone to log them onto their account:

'Ask what the user's username is?' -> 'Verify that the username is correct?'
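
IPsoft hasn't described the format of these process ontologies beyond the demo, but the simplest reading, sketched hypothetically in Python below, is an ordered chain of the agent's actions recovered from the transcript.

# A hypothetical sketch of deriving a linear process ontology from an
# observed chat. Real exchanges branch and backtrack; this toy version
# simply records the order of the agent's actions as step -> step edges.
transcript = [
    ("agent", "What is your username?"),
    ("customer", "Ergun"),
    ("agent", "Amelia, verify that the user's username is correct"),
]

def build_process(turns):
    """Return consecutive agent steps as (step, next_step) edges."""
    steps = [utterance for speaker, utterance in turns if speaker == "agent"]
    return list(zip(steps, steps[1:]))

for step, next_step in build_process(transcript):
    print(step, "->", next_step)
# What is your username? -> Amelia, verify that the user's username is correct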

Of course, the observed exchange has little of the complexity of the rather more messy dialogue that characterises real-life customer service exchanges. But given the one to two months IPsoft says is needed for Amelia to become comfortable in a role, perhaps Amelia would be able to build a process map from more haphazard real-life interactions. Also, a customer service rep who is conscious that Amelia is learning from an interaction could purposefully tailor the exchange to provide clear pointers to how that process ontology should be structured.

"The key part is that Amelia is watching and observing, but more importantly she is learning through her understanding so that she can problem solve," said Ekici.

These process maps, beyond allowing instances of Amelia to perform the role, also create a record of how customer service agents are working within an organisation.

"She watches 5,000 - 10,000 instances of these conversation and can come back to a business and say this is the process representing what you actually do, not just the process that you have documented," said Ekici.

After the process map is generated, business owners can then tweak it to achieve the desired behaviour in Amelia.

Understanding of first order logic

Amelia was fed the following statements and questions:

Human: Nick bought IPsoft from Chetan yesterday
Human: Who owns IPsoft stock?
Amelia: Nick
Human: Who owned IPsoft stock last week?
Amelia: Chetan

"Ask any search engine in the world 'Who owns IPsoft stock?' and the answer isn't there," said Ekici.

Amelia, by contrast, knows both that buying implies a change of ownership, and that 'yesterday' places that change before today but not as far back as last week. This is an example of how Amelia can understand predicate, or first order, logic, he said.
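
Ekici's example is the sort of inference a small temporal fact base can capture. The sketch below is a hypothetical illustration, not IPsoft's implementation: a purchase event both transfers ownership and anchors the transfer in time, so the same question gets different answers for different dates.

from datetime import date, timedelta

# Hypothetical sketch of the temporal reasoning at work: 'bought' is
# treated as an event that transfers ownership at a point in time.
# Names and dates are illustrative.
transfers = []  # (buyer, seller, asset, transfer_date)

def record_purchase(buyer, seller, asset, when):
    transfers.append((buyer, seller, asset, when))

def owner_on(asset, when):
    """The owner is the latest buyer on or before the queried date; for
    dates before any transfer, it is the seller of the earliest one."""
    past = [t for t in transfers if t[2] == asset and t[3] <= when]
    if past:
        return max(past, key=lambda t: t[3])[0]
    later = [t for t in transfers if t[2] == asset and t[3] > when]
    if later:
        return min(later, key=lambda t: t[3])[1]
    return None

today = date(2014, 9, 15)
record_purchase("Nick", "Chetan", "IPsoft stock", today - timedelta(days=1))

print(owner_on("IPsoft stock", today))                       # Nick
print(owner_on("IPsoft stock", today - timedelta(weeks=1)))  # Chetan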

Inferring questions from context

In this instance Amelia wasn't given a direct question but rather an ambiguous 'problem statement', true to the way many people speak. The exchange opened with:

Human: Can't access my account

Amelia detected the nature of the problem (the person was unable to access their account) and replied with a question to clarify the issue, asking which account access level was the problem, followed by further questions to discern why they were unable to get access.
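
IPsoft didn't reveal how that triage works internally. A hypothetical slot-filling sketch along the lines below would produce the same behaviour; all the category names and questions here are invented for illustration.

# Hypothetical sketch of the clarification flow: an ambiguous problem
# statement is mapped to an issue category, and any details still missing
# are gathered through follow-up questions rather than guessed at.
INTENT_CUES = {
    "account_access": ["can't access", "cannot access", "locked out"],
}

REQUIRED_DETAILS = {
    "account_access": ["access_level", "error_seen"],
}

CLARIFYING_QUESTIONS = {
    "access_level": "Which account access level is giving you trouble?",
    "error_seen": "What happens when you try to log in?",
}

def detect_intent(utterance):
    """Map a free-form problem statement to a known issue category."""
    text = utterance.lower()
    for intent, cues in INTENT_CUES.items():
        if any(cue in text for cue in cues):
            return intent
    return None

def next_question(intent, known_details):
    """Ask for the first required detail not yet supplied."""
    for detail in REQUIRED_DETAILS[intent]:
        if detail not in known_details:
            return CLARIFYING_QUESTIONS[detail]
    return None

intent = detect_intent("Can't access my account")
print(intent)                     # account_access
print(next_question(intent, {}))  # Which account access level is giving you trouble?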

About Nick Heath

Nick Heath is chief reporter for TechRepublic. He writes about the technology that IT decision makers need to know about, and the latest happenings in the European tech scene.
