Augmented analytics, continuous intelligence and explainable artificial intelligence (AI) are among the top trends in big data and analytics that have significant disruptive potential over the next three to five years, according to the latest worldwide market study by Gartner.
"The size, complexity, distributed nature of data, speed of action and the continuous intelligence required by digital business means that rigid and centralized architectures and tools break down,” said Donald Feinberg, vice president at Gartner. “The continued survival of any business will depend upon an agile, data-centric architecture that responds to the constant rate of change."
Gartner recommends that data and analytics leaders collaborate with senior business leaders on their critical business priorities and explore the top ten related trends.
Augmented Analytics
Augmented analytics is the next wave of disruption in the data and analytics market. It uses machine learning (ML) and AI techniques to transform how analytics content is developed, consumed and shared.
By 2020, augmented analytics will be a dominant driver of new purchases of analytics and business intelligence (BI), as well as data science and ML platforms, and of embedded analytics. Data and analytics leaders should plan to adopt augmented analytics as platform capabilities mature.
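To make the idea concrete, here is a minimal, illustrative sketch (not any vendor's implementation) of what automated insight generation can look like: a routine scans a dataset for its strongest pairwise correlations and narrates them in plain language, so findings surface without an analyst framing the query. The auto_insights helper and the sample data are hypothetical.

```python
# Minimal sketch of augmented-analytics-style automated insight discovery:
# scan a dataset for its strongest pairwise correlations and surface them
# as plain-language statements, rather than waiting for an analyst to ask.
import pandas as pd

def auto_insights(df: pd.DataFrame, top_n: int = 3) -> list[str]:
    """Surface the strongest pairwise correlations as plain-language insights."""
    corr = df.corr(numeric_only=True)
    pairs = corr.stack().reset_index()
    pairs.columns = ["a", "b", "r"]
    pairs = pairs[pairs["a"] < pairs["b"]]  # drop self- and mirror pairs
    pairs = pairs.reindex(pairs["r"].abs().sort_values(ascending=False).index)
    return [
        f"{row.a} and {row.b} are strongly "
        f"{'positively' if row.r > 0 else 'negatively'} related (r = {row.r:.2f})"
        for row in pairs.head(top_n).itertuples()
    ]

sales = pd.DataFrame({
    "ad_spend": [10, 20, 30, 40, 50],
    "visits":   [110, 205, 330, 390, 520],
    "returns":  [9, 7, 6, 4, 2],
})
for insight in auto_insights(sales):
    print(insight)
```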
Augmented Data Management
Augmented data management leverages ML capabilities and AI engines to make enterprise information management categories self-configuring and self-tuning, including data quality, metadata management, master data management, data integration and database management systems (DBMSs).
It automates many manual tasks, allowing less technically skilled users to work more autonomously with data and freeing highly skilled technical staff to focus on higher-value work.
Augmented data management also shifts metadata from supporting only audit, lineage and reporting to powering dynamic systems. Metadata is changing from passive to active, and is becoming a primary driver for AI and ML.
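As a rough sketch of what "active" metadata means in practice, the following hypothetical example profiles each column of a table, stores the statistics as metadata, and then lets rules act on that metadata automatically rather than leaving it in a static report. The profile and quality_actions helpers are illustrative, not a real product API.

```python
# Sketch of "active" metadata: profile each column, store the stats as
# metadata, then let rules fire on that metadata automatically.
import pandas as pd

def profile(df: pd.DataFrame) -> dict:
    """Collect per-column metadata (dtype, null rate, distinct count)."""
    return {
        col: {
            "dtype": str(df[col].dtype),
            "null_rate": float(df[col].isna().mean()),
            "distinct": int(df[col].nunique()),
        }
        for col in df.columns
    }

def quality_actions(meta: dict, null_threshold: float = 0.2) -> list[str]:
    """Turn metadata into actions instead of a static report."""
    actions = []
    for col, stats in meta.items():
        if stats["null_rate"] > null_threshold:
            actions.append(f"quarantine '{col}': {stats['null_rate']:.0%} nulls")
        if stats["distinct"] == 1:
            actions.append(f"flag '{col}' as constant; candidate for removal")
    return actions

customers = pd.DataFrame({
    "id": [1, 2, 3, 4],
    "email": ["a@x.com", None, None, "d@x.com"],
    "country": ["US", "US", "US", "US"],
})
print(quality_actions(profile(customers)))
```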
Through the end of 2022, manual data management tasks will be reduced by 45 percent through the addition of ML and automated service-level management.
Continuous Intelligence
By 2022, more than half of major new business systems will incorporate continuous intelligence that uses real-time context data to improve decisions.
Continuous intelligence is a design pattern in which real-time analytics are integrated within a business operation, processing current and historical data to prescribe actions in response to events. It provides decision automation or decision support.
Continuous intelligence leverages multiple technologies such as augmented analytics, event stream processing, optimization, business rule management and ML.
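A toy sketch of the pattern, in Python for brevity: each incoming event is evaluated against rolling historical state, and an action is prescribed inline with the operation rather than in a later batch report. A production system would delegate this to an event stream processing engine; the window size and threshold below are arbitrary.

```python
# Minimal sketch of the continuous-intelligence pattern: consume a stream
# of events, blend each one with rolling historical context, and prescribe
# an action immediately.
from collections import deque
from statistics import mean

WINDOW = deque(maxlen=50)  # rolling historical context

def on_event(latency_ms: float) -> str:
    """Inline decision support/automation: compare the current event
    against the rolling baseline (which includes the event itself)."""
    WINDOW.append(latency_ms)
    baseline = mean(WINDOW)
    if len(WINDOW) > 10 and latency_ms > 2 * baseline:
        return "scale_out"  # decision automation
    return "no_action"

for event in [100, 105, 98, 102, 97, 101, 99, 103, 100, 104, 98, 350]:
    action = on_event(event)
    if action != "no_action":
        print(f"event={event}ms -> {action}")
```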
Explainable AI
AI models are increasingly deployed to augment and replace human decision making. However, in some scenarios, businesses must justify how these models arrive at their decisions. To build trust with users and stakeholders, application leaders must make these models more interpretable and explainable.
Unfortunately, most of these advanced AI models are complex black boxes that cannot explain why they reached a specific recommendation or decision. Explainable AI capabilities in data science and ML platforms, for example, auto-generate natural-language explanations of models in terms of accuracy, attributes, model statistics and features.
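One widely used, model-agnostic technique behind such explanations is permutation importance, sketched below with scikit-learn: it estimates which inputs drive a black-box model by measuring how much shuffling each feature degrades predictions. Commercial explainability features typically layer natural-language narration on top of statistics like these; this example shows only the statistical layer.

```python
# Hedged sketch of one common explainability technique: model-agnostic
# permutation importance over a black-box classifier.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

data = load_breast_cancer()
model = RandomForestClassifier(random_state=0).fit(data.data, data.target)

result = permutation_importance(model, data.data, data.target,
                                n_repeats=5, random_state=0)
ranked = sorted(zip(data.feature_names, result.importances_mean),
                key=lambda p: p[1], reverse=True)
for name, score in ranked[:3]:
    print(f"Prediction relies heavily on '{name}' (importance {score:.3f})")
```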
Graph Analytics
Graph analytics is a set of analytic techniques that allow for the exploration of relationships between entities of interest such as organizations, people and transactions.
The application of graph processing and graph DBMSs will grow at 100 percent annually through 2022 to continuously accelerate data preparation and enable more complex and adaptive data science.
Graph data stores can efficiently model, explore and query data with complex interrelationships across data silos, but the need for specialized skills has limited their adoption to date.
Graph analytics will grow in the next few years due to the need to ask complex questions across complex data, which is not always practical or even possible at scale using SQL queries.
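For illustration, the sketch below uses the open-source networkx library to answer a path-style question ("how is this person connected to this organization?") that is awkward to express as a chain of SQL joins. The entities and relationships are invented; a graph DBMS would answer the same question declaratively.

```python
# A small sketch of a graph question that is awkward in SQL: how is
# "alice" connected to "AcmeCorp" through accounts and transfers?
import networkx as nx

g = nx.Graph()
g.add_edge("alice", "acct_1", kind="owns")
g.add_edge("acct_1", "acct_2", kind="transfer")
g.add_edge("acct_2", "bob", kind="owns")
g.add_edge("bob", "AcmeCorp", kind="director_of")

path = nx.shortest_path(g, "alice", "AcmeCorp")
print(" -> ".join(path))  # alice -> acct_1 -> acct_2 -> bob -> AcmeCorp
```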
Data Fabric
Data fabric enables frictionless access and sharing of data in a distributed data environment. It supports a single and consistent data management framework that allows seamless data access and processing by design across otherwise siloed storage.
Through 2022, bespoke data fabric designs will be deployed primarily as static infrastructure, forcing organizations into a new wave of cost to completely redesign them for more dynamic data mesh approaches.
NLP Conversational Analytics
By 2020, 50 percent of analytical queries will be generated via search, natural language processing (NLP) or voice, or will be automatically generated. The need to analyze complex combinations of data, and to make analytics accessible to everyone in the organization, will drive broader adoption, making analytics tools as easy to use as a search interface or a conversation with a virtual assistant.
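A deliberately tiny sketch of the idea: mapping a natural-language question onto a structured aggregation. The keyword matching below stands in for the full NLP pipeline a real conversational analytics product would use, and the sales table is invented.

```python
# Toy sketch: route a natural-language question to a structured query.
import pandas as pd

sales = pd.DataFrame({
    "region": ["east", "west", "east", "west"],
    "revenue": [120, 90, 150, 110],
})

def ask(question: str):
    """Keyword-based stand-in for an NL-to-query pipeline."""
    q = question.lower()
    if "by region" in q:
        return sales.groupby("region")["revenue"].sum()
    if "total" in q:
        return float(sales["revenue"].sum())
    raise ValueError("question not understood")

print(ask("What is revenue by region?"))
print(ask("Total revenue?"))
```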
Commercial AI and ML
Gartner predicts that by 2022, 75 percent of new end-user solutions leveraging AI and ML techniques will be built with commercial solutions rather than open source platforms.
Commercial vendors have now built connectors into the open-source ecosystem, and they provide the enterprise features necessary to scale and democratize AI and ML that open-source technologies lack, such as project and model management, reuse, transparency, data lineage, and platform cohesiveness and integration.
Blockchains
The core value proposition of blockchain and distributed ledger technologies is providing decentralized trust across a network of untrusted participants. The potential ramifications for analytics use cases are significant, especially those leveraging participant relationships and interactions.
It will be several years before four or five major blockchain technologies become dominant. Until then, technology end users will be forced to integrate with the blockchain technologies and standards dictated by their dominant customers or networks, including integration with their existing data and analytics infrastructure.
The costs of integration may outweigh any potential benefit. Blockchains are a data source, not a database, and will not replace existing data management technologies.
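To show the property that makes blockchains interesting as a data source, here is a minimal hash-chain sketch: each block commits to the previous block's hash, so any edit to history breaks every later link. This illustrates the tamper-evidence mechanism only, not a real distributed ledger protocol (there is no consensus, networking or signing here).

```python
# Minimal sketch of the property blockchains provide: a hash-chained,
# tamper-evident log. Analytics would read such a chain as a data
# source, not use it as a general-purpose database.
import hashlib
import json

def block(payload: dict, prev_hash: str) -> dict:
    body = {"payload": payload, "prev": prev_hash}
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

chain = [block({"tx": "genesis"}, "0" * 64)]
chain.append(block({"tx": "alice->bob:10"}, chain[-1]["hash"]))

def valid(chain: list) -> bool:
    """Every block must reference the previous block's actual hash."""
    return all(chain[i]["prev"] == chain[i - 1]["hash"]
               for i in range(1, len(chain)))

print(valid(chain))                  # True
chain[0]["payload"]["tx"] = "tampered"
chain[0]["hash"] = "recomputed?"     # even recomputing won't fix later links
print(valid(chain))                  # False
```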
Persistent Memory Servers
New persistent-memory technologies will help reduce costs and complexity of adopting in-memory computing (IMC)-enabled architectures. Persistent memory represents a new memory tier between DRAM and NAND flash memory that can provide cost-effective mass memory for high-performance workloads.
It has the potential to improve application performance, availability, boot times, clustering methods and security practices while keeping costs under control. It will also help organizations reduce the complexity of their application and data architectures by decreasing the need for data duplication.
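As a rough analogy for the programming model (not the hardware's latency), persistent memory is commonly exposed to applications as memory-mapped files in direct-access (DAX) mode. Python's mmap over an ordinary file mimics that shape: data is updated in place, byte-addressably, and persists without a separate serialization step.

```python
# Rough analogy only: update durable state in place through a memory
# mapping, the general shape of the persistent-memory programming model.
import mmap

with open("state.bin", "wb") as f:    # pre-size the backing file
    f.write(b"\x00" * 64)

with open("state.bin", "r+b") as f:
    with mmap.mmap(f.fileno(), 64) as mem:
        mem[0:5] = b"hello"           # in-place, byte-addressable update
        mem.flush()                   # persist the modified range

with open("state.bin", "rb") as f:
    print(f.read(5))                  # b'hello' survives the process
```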
"The size, complexity, distributed nature of data, speed of action and the continuous intelligence required by digital business means that rigid and centralized architectures and tools break down,” said Donald Feinberg, vice president at Gartner. “The continued survival of any business will depend upon an agile, data-centric architecture that responds to the constant rate of change."
Gartner recommends that data and analytics leaders collaborate with senior business leaders about their critical business priorities and explore the ten top related trends.
Augmented Analytics
Augmented analytics is the next wave of disruption in the data and analytics market. It uses machine learning (ML) and AI techniques to transform how analytics content is developed, consumed and shared.
By 2020, augmented analytics will be a dominant driver of new purchases of analytics and business intelligence (BI), as well as data science and ML platforms, and of embedded analytics. Data and analytics leaders should plan to adopt augmented analytics as platform capabilities mature.
Augmented Data Management
Augmented data management leverages ML capabilities and AI engines to make enterprise information management categories including data quality, metadata management, master data management, data integration as well as database management systems (DBMSs) self-configuring and self-tuning.
It is automating many of the manual tasks and allows less technically skilled users to be more autonomous using data. It also allows highly skilled technical resources to focus on higher value tasks.
Augmented data management converts metadata from being used for audit, lineage and reporting only, to powering dynamic systems. Metadata is changing from passive to active and is becoming the primary driver for all AI and ML.
Through to the end of 2022, data management manual tasks will be reduced by 45 percent through the addition of ML and automated service-level management.
Continuous Intelligence
By 2022, more than half of major new business systems will incorporate continuous intelligence that uses real-time context data to improve decisions.
Continuous intelligence is a design pattern in which real-time analytics are integrated within a business operation, processing current and historical data to prescribe actions in response to events. It provides decision automation or decision support.
Continuous intelligence leverages multiple technologies such as augmented analytics, event stream processing, optimization, business rule management and ML.
Explainable AI
AI models are increasingly deployed to augment and replace human decision making. However, in some scenarios, businesses must justify how these models arrive at their decisions. To build trust with users and stakeholders, application leaders must make these models more interpretable and explainable.
Unfortunately, most of these advanced AI models are complex black boxes that are not able to explain why they reached a specific recommendation or a decision. Explainable AI in data science and ML platforms, for example, auto-generates an explanation of models in terms of accuracy, attributes, model statistics and features in natural language.
Graph Analytics
Graph analytics is a set of analytic techniques that allow for the exploration of relationships between entities of interest such as organizations, people and transactions.
The application of graph processing and graph DBMSs will grow at 100 percent annually through 2022 to continuously accelerate data preparation and enable more complex and adaptive data science.
Graph data stores can efficiently model, explore and query data with complex interrelationships across data silos, but the need for specialized skills has limited their adoption to date.
Graph analytics will grow in the next few years due to the need to ask complex questions across complex data, which is not always practical or even possible at scale using SQL queries.
Data Fabric
Data fabric enables frictionless access and sharing of data in a distributed data environment. It enables a single and consistent data management framework, which allows seamless data access and processing by design across otherwise siloed storage.
Through 2022, bespoke data fabric designs will be deployed primarily as a static infrastructure, forcing organizations into a new wave of cost to completely re-design for more dynamic data mesh approaches.
NLP Conversational Analytics
By 2020, 50 percent of analytical queries will be generated via search, natural language processing (NLP) or voice, or will be automatically generated. The need to analyze complex combinations of data and to make analytics accessible to everyone in the organization will drive broader adoption, allowing analytics tools to be as easy as a search interface or a conversation with a virtual assistant.
Commercial AI and ML
Gartner predicts that by 2022, 75 percent of new end-user solutions leveraging AI and ML techniques will be built with commercial solutions rather than open source platforms.
Commercial vendors have now built connectors into the Open Source ecosystem and they provide the enterprise features necessary to scale and democratize AI and ML, such as project & model management, reuse, transparency, data lineage, and platform cohesiveness and integration that Open Source technologies lack.
Blockchains
The core value proposition of blockchain and distributed ledger technologies is providing decentralized trust across a network of untrusted participants. The potential ramifications for analytics use cases are significant, especially those leveraging participant relationships and interactions.
It will be several years before four or five major blockchain technologies become dominant. Until then, technology end users will be forced to integrate with the blockchain technologies and standards dictated by their dominant customers or networks. This includes integration with your existing data and analytics infrastructure.
The costs of integration may outweigh any potential benefit. Blockchains are a data source, not a database, and will not replace existing data management technologies.
Persistent Memory Servers
New persistent-memory technologies will help reduce costs and complexity of adopting in-memory computing (IMC)-enabled architectures. Persistent memory represents a new memory tier between DRAM and NAND flash memory that can provide cost-effective mass memory for high-performance workloads.
It has the potential to improve application performance, availability, boot times, clustering methods and security practices while keeping costs under control. It will also help organizations reduce the complexity of their application and data architectures by decreasing the need for data duplication.