Augmented analytics and artificial intelligence (AI) are among the top data and analytics technology trends that have the potential to significantly change business operations in the next three to five years, according to a presentation at the Gartner Data and Analytics Summit in Sydney this week.

Data and analytics leaders must examine the potential business impact of these technology trends and adjust business models accordingly, or risk losing competitive advantage to companies that do, Rita Sallam, research vice president at Gartner, said at the event and in a press release.

“The story of data and analytics keeps evolving, from supporting internal decision making to continuous intelligence, information products and appointing chief data officers,” Sallam said in the release. “It’s critical to gain a deeper understanding of the technology trends fueling that evolving story and prioritize them based on business value.”

SEE: Big data policy (Tech Pro Research)

With digital transformation efforts underway at most organizations, businesses are collecting more data than ever before, creating challenges but also major opportunities, Donald Feinberg, vice president and distinguished analyst at Gartner, said in the release. Large amounts of data, combined with powerful processing capabilities enabled by the cloud, make it possible to train and execute algorithms at the massive scale needed to realize the full potential of AI, he added.

“The size, complexity, distributed nature of data, speed of action and the continuous intelligence required by digital business means that rigid and centralized architectures and tools break down,” Feinberg said in the release. “The continued survival of any business will depend upon an agile, data-centric architecture that responds to the constant rate of change.”

Here are 10 data and analytics trends that data leaders and senior business leaders must explore in the coming years, according to Gartner:

1. Augmented analytics

Augmented analytics uses machine learning and AI techniques to change how analytics content is developed, consumed, and shared, according to the release.

“By 2020, augmented analytics will be a dominant driver of new purchases of analytics and BI, as well as data science and machine learning platforms, and of embedded analytics,” the release said. “Data and analytics leaders should plan to adopt augmented analytics as platform capabilities mature.”
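To make the idea concrete, here is a minimal sketch of the kind of automation augmented analytics implies: instead of an analyst deciding which chart to build, the tool scans the data for its strongest relationship and surfaces it. The dataset and metric names are hypothetical, and real products use far richer statistical tests.

```python
from itertools import combinations
from math import sqrt

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical monthly metrics for an online retailer.
metrics = {
    "ad_spend":    [10, 12, 15, 13, 18, 20],
    "site_visits": [500, 560, 700, 640, 860, 950],
    "returns":     [30, 29, 31, 30, 28, 29],
}

# Augmented-analytics style: test every metric pair and surface
# the strongest relationship automatically, rather than waiting
# for an analyst to choose which pair to chart.
best = max(
    combinations(metrics, 2),
    key=lambda pair: abs(pearson(metrics[pair[0]], metrics[pair[1]])),
)
r = pearson(metrics[best[0]], metrics[best[1]])
print(f"Strongest relationship: {best[0]} vs {best[1]} (r = {r:.2f})")
```

The sketch surfaces the ad-spend/site-visits relationship on its own; the "augmented" part is that the user never had to ask for that specific comparison.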

SEE: How to win with prescriptive analytics (ZDNet special report) | Download the free PDF ebook (TechRepublic)

2. Augmented data management

Augmented data management refers to converting metadata from a passive asset used for audit, lineage, and reporting into a driver of dynamic, AI- and machine learning-powered systems, the release said.

“Augmented data management leverages machine learning capabilities and AI engines to make enterprise information management categories including data quality, metadata management, master data management, data integration as well as database management systems (DBMSs) self-configuring and self-tuning,” the release said. “It is automating many of the manual tasks and allows less technically skilled users to be more autonomous using data. It also allows highly skilled technical resources to focus on higher value tasks.”
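One of the manual tasks the release mentions automating is basic data-quality assessment. The sketch below, with hypothetical records and simplistic type inference, shows the flavor: the system profiles each column itself instead of a data engineer inspecting it by hand.

```python
# Hypothetical records as they might arrive from an upstream
# system, before any manual cleanup.
records = [
    {"id": "1", "email": "a@example.com", "age": "34"},
    {"id": "2", "email": None,            "age": "41"},
    {"id": "3", "email": "c@example.com", "age": "n/a"},
]

NULLISH = (None, "", "n/a")

def profile(rows):
    """Auto-profile each column: null rate and a naive inferred type."""
    report = {}
    for col in rows[0]:
        values = [r[col] for r in rows]
        nulls = sum(v in NULLISH for v in values)
        # Infer "int" only if every non-null value is purely digits.
        numeric = all(str(v).isdigit() for v in values if v not in NULLISH)
        report[col] = {
            "null_rate": nulls / len(values),
            "inferred_type": "int" if numeric else "str",
        }
    return report

for col, stats in profile(records).items():
    print(col, stats)
```

A production system would layer ML on top of profiles like this (for example, learning which null rates are anomalous for a given feed), but the self-service principle is the same.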

3. Continuous intelligence

Continuous intelligence refers to a design pattern in which real-time analytics are integrated within a business operation, and can process current and past data to predict responses to events, which is useful for decision automation or support, the release noted.

“Continuous intelligence represents a major change in the job of the data and analytics team,” Sallam said in the release. “It’s a grand challenge–and a grand opportunity–for analytics and BI (business intelligence) teams to help businesses make smarter real-time decisions in 2019. It could be seen as the ultimate in operational BI.”
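The core pattern, scoring each event against recent history as it arrives rather than in a nightly batch, can be sketched in a few lines. The monitor, window size, and threshold here are all illustrative choices, not a reference design.

```python
from collections import deque

class ContinuousMonitor:
    """Keep a rolling window of past readings and score each new
    event against it in-line with the operation, instead of in a
    separate batch job."""

    def __init__(self, window=5, threshold=2.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def ingest(self, value):
        alert = False
        if len(self.history) == self.history.maxlen:
            mean = sum(self.history) / len(self.history)
            # Flag values far above the recent rolling average.
            alert = value > mean * self.threshold
        self.history.append(value)
        return alert

monitor = ContinuousMonitor()
stream = [100, 98, 103, 101, 99, 102, 340, 100]
alerts = [v for v in stream if monitor.ingest(v)]
print("flagged:", alerts)  # only the 340 spike is flagged
```

The point of the pattern is that the decision (flag or not) happens at ingest time, inside the operational flow, using both the current event and accumulated past data.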

4. Explainable AI

While more businesses are deploying AI models to aid in decision making, they must make these models more understandable to build trust among users, the release said.
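One common way to make an opaque model more understandable is occlusion: neutralize one input at a time and see how much the score moves. The toy risk model below is a stand-in for a black box, and all feature names are hypothetical; real explainability tooling (e.g. SHAP-style methods) is considerably more sophisticated.

```python
def risk_model(features):
    """Stand-in for an opaque model that scores loan risk.
    (A hypothetical black box, not a real product.)"""
    return (0.5 * features["debt_ratio"]
            + 0.3 * features["missed_payments"]
            - 0.2 * features["years_employed"])

def explain(model, features, baseline=0.0):
    """Occlusion-style explanation: set one feature at a time to a
    baseline value and report how much the score moves. Large
    shifts mark the features driving this particular decision."""
    base_score = model(features)
    impact = {}
    for name in features:
        occluded = dict(features, **{name: baseline})
        impact[name] = base_score - model(occluded)
    return impact

applicant = {"debt_ratio": 0.8, "missed_payments": 3.0, "years_employed": 2.0}
for name, delta in sorted(explain(risk_model, applicant).items(),
                          key=lambda kv: -abs(kv[1])):
    print(f"{name}: {delta:+.2f}")
```

The output reads as "this decision was driven mostly by missed payments", which is the kind of statement the release argues is needed to build user trust.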

SEE: Straight up: How the Kentucky bourbon industry is going high tech (TechRepublic cover story)

5. Graph

Graph analytics is a set of techniques that allows businesses to explore relationships between entities such as organizations, people, and transactions.

“Graph analytics will grow in the next few years due to the need to ask complex questions across complex data, which is not always practical or even possible at scale using SQL queries,” according to the release.
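A typical example of a question that is awkward in SQL but natural on a graph is variable-depth reachability, such as "who is within two payment hops of this account?" In SQL that requires recursive joins; on a graph it is a plain walk. The transaction graph below is hypothetical.

```python
from collections import deque

# Hypothetical transaction graph: who has sent money to whom.
edges = {
    "alice":     ["acme_corp"],
    "acme_corp": ["bob", "carol"],
    "bob":       ["dave"],
    "carol":     [],
    "dave":      [],
}

def within_hops(graph, start, max_hops):
    """Breadth-first walk: every party reachable from `start`
    in at most `max_hops` steps."""
    seen = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if seen[node] == max_hops:
            continue  # don't expand past the hop limit
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen[nxt] = seen[node] + 1
                queue.append(nxt)
    return {n for n, d in seen.items() if n != start}

print(within_hops(edges, "alice", 2))  # "dave" is 3 hops away, so excluded
```

Dedicated graph databases express this kind of query directly (e.g. variable-length path patterns), which is the practicality gap the release is pointing at.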

6. Data fabric

Data fabric enables a single, consistent data management framework that eases data access and sharing across a distributed environment, Gartner noted. The firm expects these designs to be deployed with increasing speed through 2022.

7. Natural language processing (NLP)/Conversational analytics

By 2020, 50% of analytical queries will be generated via search, NLP, or voice, Gartner predicted.

“The need to analyze complex combinations of data and to make analytics accessible to everyone in the organization will drive broader adoption, allowing analytics tools to be as easy as a search interface or a conversation with a virtual assistant,” the release said.
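At its simplest, conversational analytics means translating a natural-language question into a data query. The toy sketch below recognizes one hypothetical question shape ("total <measure> by <dimension>") and turns it into a group-by aggregation; real NLP layers handle far broader phrasing and intent.

```python
import re

# Hypothetical sales rows.
sales = [
    {"region": "east", "amount": 120},
    {"region": "west", "amount": 200},
    {"region": "east", "amount": 80},
]

def answer(question, rows):
    """Toy conversational-analytics layer: parse a
    'total <measure> by <dimension>' question and translate it
    into a group-by aggregation over the rows."""
    m = re.search(r"total (\w+) by (\w+)", question.lower())
    if not m:
        return None  # question shape not understood
    measure, dim = m.groups()
    totals = {}
    for row in rows:
        totals[row[dim]] = totals.get(row[dim], 0) + row[measure]
    return totals

print(answer("What is the total amount by region?", sales))
```

The user types a sentence rather than SQL, which is the accessibility shift Gartner's 50%-of-queries prediction describes.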

8. Commercial AI and machine learning

By 2022, 75% of new end-user solutions that use AI and machine learning will be built on commercial solutions rather than open-source platforms, Gartner predicted. This will help enterprises scale and democratize AI and machine learning.

9. Blockchain

Blockchain could significantly impact the use of analytics; however, it will be several years before these technologies become dominant, the release noted. In the meantime, the cost of integrating blockchain into existing data and analytics infrastructure may outweigh the benefits.

10. Persistent memory servers

Emerging persistent memory technologies will reduce the costs and complexity of adopting in-memory computing (IMC)-enabled architectures, according to Gartner. This has the potential to improve application performance, availability, boot times, clustering methods, and security practices, while keeping costs low.

“The amount of data is growing quickly and the urgency of transforming data into value in real-time is growing at an equally rapid pace,” Feinberg said in the release. “New server workloads are demanding not just faster CPU performance, but massive memory and faster storage.”

Image: iStockphoto/Pinkypills