Big data and analytics are well established in organizations. Further, the principal means through which business insights are delivered is high-level dashboards that can tell a story at a glance, or spreadsheet-like reports that are both comfortable and familiar to end users.
The question is, do these types of reporting go far enough?
"It's a dilemma now for many companies," said Todd Blaschka, COO at TigerGraph, a company that provides graph analytics solutions. "They're grappling with blockchain, machine learning, and artificial intelligence. All give much more data, but this actually complicates how well companies can navigate through all of this data to arrive at meaningful business insights."
SEE: Quick glossary: Business intelligence and analytics (Tech Pro Research)
Blaschka references a copper mine shutdown as an example. "Copper is a critical ingredient needed in the manufacture of smartphones. So a CFO at a company might want to know what the possible repercussions of a copper shortage could be," said Blaschka. "Will there be a shortage of product, or an impact on the supply chain, or on the price of the company's stock? By using graph analytics instead of more standard database query languages like SQL, he or she can easily run multiple scenarios against a large body of information that can be analyzed in many ways."
Basically, instead of serially joining different datasets together one-by-one as in SQL, graph analytics uses a "hop" technique that quickly links disparate data sources without going through the complications of an SQL database JOIN. The result is faster performance and an ability to scale out rapidly to many different data sources.
"This broadens the reach of your analytics, and you are able to develop more algorithms and queries of new data combinations with faster time to market for your results," said Blaschka.
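To make the "hop" idea concrete, here is a minimal Python sketch (not TigerGraph's actual API, and the supply-chain data is invented for illustration). Relationships are stored as adjacency lists, so following a connection is a direct lookup rather than a table JOIN:

```python
# Hypothetical supply-chain edges: supplier -> material -> product.
# Each "hop" follows every outgoing edge -- the graph equivalent of
# one SQL JOIN, but done by lookup instead of table matching.
graph = {
    "CopperMineA": ["Copper"],
    "Copper":      ["Smartphone", "Wiring"],
    "Smartphone":  ["PhoneCo"],
    "Wiring":      ["AutoCo"],
}

def hop(nodes):
    """One hop: collect every neighbor of the given set of nodes."""
    return {nbr for n in nodes for nbr in graph.get(n, [])}

# Two hops from the mine: which products depend on its output?
affected_products = hop(hop({"CopperMineA"}))
print(affected_products)  # {'Smartphone', 'Wiring'}
```

Answering the CFO's "what if the mine shuts down?" question for companies rather than products is just one more `hop` call, which is the sense in which graph queries rerun and extend cheaply.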
The importance of reforming and retrying data algorithms
There is another reason why this ability to reform and retry data algorithms is so important. Consider a telecom company, which has hundreds of millions of subscribers and processes billions of calls each day. The goal is to know if or when new telephone numbers pop up, and whether they are fraudulent or legitimate.
SEE: Big data policy (Tech Pro Research)
"In a case like this, the data science team is likely to rerun and revise algorithms multiple times. They'll want to look at phone traffic patterns, such as whether there are constant calls back and forth between the subscriber and the caller, or whether a subscriber calls a number back later, or what the duration of a call is," said Blaschka. "They don't want to immediately make assumptions that a phone number should be blacklisted without executing enough queries against a variety of data."
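The traffic patterns Blaschka describes can each be expressed as a small, rerunnable query over the call records. A minimal sketch in plain Python (the records and phone labels below are invented, and a real deployment would run pattern queries against a graph store rather than a list):

```python
# Hypothetical call records: (caller, callee, duration_seconds).
calls = [
    ("alice",  "newnum", 5),
    ("newnum", "alice",  4),
    ("newnum", "bob",    3),
    ("bob",    "newnum", 600),
]

def mutual_contacts(number):
    """Parties with calls in BOTH directions to `number` --
    the 'constant calls back and forth' pattern."""
    outgoing = {callee for caller, callee, _ in calls if caller == number}
    incoming = {caller for caller, callee, _ in calls if callee == number}
    return outgoing & incoming

def avg_duration(number):
    """Average length of calls involving `number`."""
    durations = [d for caller, callee, d in calls if number in (caller, callee)]
    return sum(durations) / len(durations)

print(mutual_contacts("newnum"))  # {'alice', 'bob'}
print(avg_duration("newnum"))     # 153.0
```

Each function is one revisable hypothesis; the point of the interview passage is that analysts want to run several such queries, compare the evidence, and only then decide whether to blacklist a number.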
Performing graph analytics isn't for everyone, though. Data scientists, data engineers, and business intelligence specialists will most likely work with graph technology because they already have the knowledge of databases and languages like SQL. End business users don't.
Three do's for working with graphs
"There are three things you want to do when you start working with graphs," said Blaschka. "First, formulate the business questions that you want to derive an answer for. Very often, customers will start by just bringing in the data. They'll try to ask the questions later, but you want the questions first because the questions will provide guidance on which data you need to bring into the graph to perform the analytics.
"Second, you want to bring in the best data for the questions you want to answer. If you have a retail, commercial, and investment part of your company, you might have three different data repositories that you need to pull data from.
SEE: End user data backup policy (Tech Pro Research)
"Third, you should always perform your own benchmarks before purchasing a graph product. If you feel that you can perform the equivalent of two SQL JOINS in a single graph hop, and the cumulative effect of this is less need for storage and more processing throughput, you'll want to confirm that."
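A do-it-yourself benchmark along the lines Blaschka suggests can start very small. This sketch (with invented orders/customers data) contrasts an unindexed nested-loop join, the worst case a JOIN can degrade to, with a direct graph-style edge lookup; it is an illustration of the measurement habit, not a product comparison:

```python
import time

# Hypothetical data: 2,000 orders, each pointing at one of 1,000 customers.
orders = [(i, i % 1000) for i in range(2000)]        # (order_id, customer_id)
customers = {i: "cust%d" % i for i in range(1000)}   # indexed, graph-style

def join_scan():
    """Nested-loop join with no index: scan every customer row per order."""
    rows = list(customers.items())
    return [(o, name) for o, c in orders for cid, name in rows if cid == c]

def hop_lookup():
    """Graph-style hop: each order follows a direct edge to its customer."""
    return [(o, customers[c]) for o, c in orders]

def best_time(fn, repeats=3):
    """Best wall-clock time for fn over several runs."""
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - t0)
    return best

print("scan: %.4fs  hop: %.4fs" % (best_time(join_scan), best_time(hop_lookup)))
```

Whatever the absolute numbers on your hardware, the discipline is the same: confirm the claimed storage and throughput gains on your own data before buying.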
Finally, graph technology works best when IT and data science departments align it with other analytics tools such as dashboards, spreadsheets, etc. The key is to fit the right type of tool to the right type of job for best results.
"Graph excels in situations where large datasets and many different sources of data need to be rapidly pulled together into an analytics matrix that can be queried by algorithms," said Blaschka. "We've seen companies like Alibaba and Facebook build their businesses with graphs, so why not bring this capability to every enterprise?"
- Amazon Neptune is here: 6 ways customers use the AWS graph database (TechRepublic)
- Special report: Turning big data into business insights (free PDF) (TechRepublic)
- Transforming Graph Data for Statistical Relational Learning (TechRepublic)
- GraphQL for databases: A layer for universal database access? (ZDNet)
- Open source "Gandiva" project wants to unblock analytics (ZDNet)
- Intel and DARPA look to AI and machine learning to boost graph analytics in big data (ZDNet)
Mary E. Shacklett is president of Transworld Data, a technology research and market development firm. Prior to founding the company, Mary was Senior Vice President of Marketing and Technology at TCCU, Inc., a financial services firm; Vice President of Product Research and Software Development for Summit Information Systems, a computer software company; and Vice President of Strategic Planning and Technology at FSI International, a multinational manufacturing company in the semiconductor industry. Mary is a keynote speaker and has more than 1,000 articles, research studies, and technology publications in print.