Big Data

Cognitive computing leads to the next level of big data queries

Cognitive computing models how the brain works to approximate human intelligence. Find out how businesses might benefit from this new frontier in data analytics.

Image: IBM

Over the past three years, data analytics has been a "digestive" exercise, with volumes of human- and machine-created data amalgamated, stored, indexed, and probed for gems of intelligence that can transform businesses. Success stories are mounting about how companies have turned these analytics into innovative business approaches.

But as analytics progresses, there will be a growing need to more closely approximate human intelligence and how the brain works. In technology lingo, this is referred to as cognitive computing.

"A cognitive computing platform is modeled along the lines of how the human brain works," said Ian Hersey, chief product officer of Saffron Technology, which has built a cognitive computing platform for the Internet of Things. "The technology actively learns 'patterns' of association and then reasons based upon what it has learned from these associational patterns."

An everyday example occurs when a person walks into an unfamiliar dark room. She wants to turn on the light, and it is her memory that tells her where a light switch for the room is most likely to be. That memory has been built through the repetition and association of many past incidents in which she turned on lights in darkened rooms. She reaches for the part of the room where the switch is most likely to be found, and she probably finds it.

"In computing, the cognitive platform is built from these associational experiences in much the same way," said Hersey. "The human brain is a collection of associational processes and we start with gathering associational attributes that connect different data elements, and we then assemble them into common contexts in which these associations appear."

Saffron's approach starts with getting raw data into the system through a process much like extract, transform, and load (ETL). Connectors link to multiple data sources, ranging from Hadoop files to web-based, machine-based, and other system inputs. "We then look at this data and begin to connect it at the data-entity level," Hersey said.
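Connecting records "at the data entity level" can be pictured as keying records from several sources by a shared entity identifier. This is a hypothetical sketch (the field names and sources are invented for illustration), not Saffron's pipeline:

```python
from collections import defaultdict

def ingest(*sources):
    """Merge records from multiple sources, keyed by entity_id."""
    entities = defaultdict(dict)
    for source in sources:
        for record in source:
            # Attributes from every source accumulate on one entity.
            entities[record["entity_id"]].update(record)
    return entities

# Toy inputs standing in for web events and Hadoop-resident rows.
web_events = [{"entity_id": "cust-42", "last_page": "/checkout"}]
hadoop_rows = [{"entity_id": "cust-42", "lifetime_spend": 1875.50}]

merged = ingest(web_events, hadoop_rows)
print(merged["cust-42"]["lifetime_spend"])  # 1875.5
```

Once records from different systems land on the same entity, the association-building step has a unified view to work from.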

In some cases, a single data element might have 10,000 different attributes, and the associations triggered among them begin to build connections between attributes and data elements in certain usage contexts. In this sense, the process resembles how the human brain processes associational aspects of information and cross-connects them into learning matrices that enable the brain to respond to certain situational contexts.

"Over time, we begin to build solid matrices between certain elements of data in specific behavioral contexts," said Hersey. "In other cases, the connections remain sparse, because there just aren't that many associations."

The lack of data associations begins to "rule out" certain contexts in which data is relevant. But the constant inflow of data, and the fact that circumstances are constantly changing, can also change the sparseness or density of associational patterns between data attributes over time. This is how cognitive computing engines continue to "learn."
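The dense-versus-sparse distinction can be sketched as a simple threshold over pair counts. The attribute names and threshold below are invented for illustration; the point is that sparse associations effectively rule a context out, while fresh data can densify it again later:

```python
# Hypothetical pair counts between an attribute and candidate contexts.
pair_counts = {
    ("fraud", "midnight_purchase"): 48,   # dense association
    ("fraud", "foreign_ip"): 35,          # dense association
    ("fraud", "grocery_store"): 1,        # sparse: effectively ruled out
}

def relevant_contexts(attribute, counts, threshold=5):
    """Keep only contexts whose association with `attribute` is dense."""
    return sorted(
        (other, n)
        for (a, other), n in counts.items()
        if a == attribute and n >= threshold
    )

print(relevant_contexts("fraud", pair_counts))
# [('foreign_ip', 35), ('midnight_purchase', 48)]

# New data keeps arriving, so a once-sparse pattern can densify
# and re-enter the relevant set -- this is how the engine "learns."
pair_counts[("fraud", "grocery_store")] = 9
```

The threshold here is a stand-in for whatever statistical test a real system would use to separate meaningful associations from noise.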

How does this factor into what businesses are trying to do?

In fraud detection, financial institutions could have cognitive tools that enable them to go beyond analyses of cardholders' credit transaction histories; cognitive computing might provide them with new "associational" intelligence, such as when an individual is most likely to make purchases, what he is likely to buy, and under what circumstances.

High-technology manufacturers with engineering and manufacturing facilities around the world frequently battle with consistency of execution, because processes and approaches to manufacturing vary from site to site. A system "learning" mechanism that can ingest and develop best practices from the most highly skilled people across all of these operations, and then infuse that intelligence into machine-aided decisioning and automation processes, can improve overall production results.

"The next level of big data queries will examine the associational connections between many different data elements that come from a diversity of sources, and they will use associational practices similar to those used by the human brain," said Hersey. "We've only begun to scratch the surface."


About Mary Shacklett

Mary E. Shacklett is president of Transworld Data, a technology research and market development firm. Prior to founding the company, Mary was Senior Vice President of Marketing and Technology at TCCU, Inc., a financial services firm; Vice President o...
