Enterprise Demand for Agile, Data-Centric Architectures

Augmented analytics, continuous intelligence and explainable artificial intelligence (AI) are among the top trends in big data and analytics that have significant disruptive potential over the next three to five years, according to the latest worldwide market study by Gartner.

"The size, complexity, distributed nature of data, speed of action and the continuous intelligence required by digital business means that rigid and centralized architectures and tools break down," said Donald Feinberg, vice president at Gartner. "The continued survival of any business will depend upon an agile, data-centric architecture that responds to the constant rate of change."

Gartner recommends that data and analytics leaders collaborate with senior business leaders on their critical business priorities and explore the top ten related trends.

Augmented Analytics

Augmented analytics is the next wave of disruption in the data and analytics market. It uses machine learning (ML) and AI techniques to transform how analytics content is developed, consumed and shared.

By 2020, augmented analytics will be a dominant driver of new purchases of analytics and business intelligence (BI), as well as data science and ML platforms, and of embedded analytics. Data and analytics leaders should plan to adopt augmented analytics as platform capabilities mature.

Augmented Data Management

Augmented data management leverages ML capabilities and AI engines to make enterprise information management categories, including data quality, metadata management, master data management, data integration, and database management systems (DBMSs), self-configuring and self-tuning.

It automates many manual tasks and allows less technically skilled users to work more autonomously with data. It also frees highly skilled technical resources to focus on higher-value tasks.

Augmented data management converts metadata from being used for audit, lineage and reporting only, to powering dynamic systems. Metadata is changing from passive to active and is becoming the primary driver for all AI and ML.

Through the end of 2022, manual data management tasks will be reduced by 45 percent through the addition of ML and automated service-level management.
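To make the "active metadata" idea concrete, here is a minimal, purely illustrative sketch: query-log statistics (the metadata) drive an automatic index recommendation instead of a DBA reading reports by hand. The table names, columns, and threshold are invented for the example.

```python
# Hypothetical sketch: "active metadata" driving self-tuning.
# Query-log entries (illustrative) record which column each query filtered on;
# frequently filtered columns become index recommendations automatically.
query_log = [
    {"table": "orders", "filter_column": "customer_id"},
    {"table": "orders", "filter_column": "customer_id"},
    {"table": "orders", "filter_column": "status"},
    {"table": "users", "filter_column": "email"},
]

def recommend_indexes(query_log, min_hits=2):
    """Recommend an index for any (table, column) filtered at least min_hits times."""
    counts = {}
    for q in query_log:
        key = (q["table"], q["filter_column"])
        counts[key] = counts.get(key, 0) + 1
    return [f"CREATE INDEX ON {t} ({c})" for (t, c), n in counts.items() if n >= min_hits]

print(recommend_indexes(query_log))
```

In a real augmented data management product, the same loop runs continuously against live workload telemetry rather than a static list.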

Continuous Intelligence

By 2022, more than half of major new business systems will incorporate continuous intelligence that uses real-time context data to improve decisions.

Continuous intelligence is a design pattern in which real-time analytics are integrated within a business operation, processing current and historical data to prescribe actions in response to events. It provides decision automation or decision support.

Continuous intelligence leverages multiple technologies such as augmented analytics, event stream processing, optimization, business rule management and ML.
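The design pattern above can be sketched as a simple business rule evaluated against each incoming event, using a rolling window of historical readings as context. This is an illustrative toy, not a real event-stream-processing engine; the window size, threshold, and action names are assumptions.

```python
from collections import deque

# Hypothetical sketch of continuous intelligence: each incoming event is
# compared against a rolling historical baseline, and an action is prescribed
# in real time (decision automation).
class ContinuousIntelligenceRule:
    def __init__(self, window=5, threshold_ratio=1.5):
        self.history = deque(maxlen=window)  # recent historical readings
        self.threshold_ratio = threshold_ratio

    def on_event(self, value):
        """Return a prescribed action for the incoming reading."""
        if self.history:
            baseline = sum(self.history) / len(self.history)
            action = "throttle" if value > baseline * self.threshold_ratio else "proceed"
        else:
            action = "proceed"  # no historical context yet
        self.history.append(value)
        return action

rule = ContinuousIntelligenceRule(window=3, threshold_ratio=1.5)
actions = [rule.on_event(v) for v in [10, 11, 9, 30, 10]]
print(actions)  # the spike to 30 triggers "throttle"
```

Production systems replace the in-process loop with an event stream processor, but the shape of the logic, current event plus historical context yielding a prescribed action, is the same.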

Explainable AI

AI models are increasingly deployed to augment and replace human decision making. However, in some scenarios, businesses must justify how these models arrive at their decisions. To build trust with users and stakeholders, application leaders must make these models more interpretable and explainable.

Unfortunately, most of these advanced AI models are complex black boxes that are not able to explain why they reached a specific recommendation or a decision. Explainable AI in data science and ML platforms, for example, auto-generates an explanation of models in terms of accuracy, attributes, model statistics and features in natural language.
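A minimal sketch of what "auto-generating an explanation in natural language" can look like for an interpretable model: rank per-feature contributions to a linear score and phrase the largest one. The feature names and weights here are invented for illustration, and real explainable-AI tooling handles far more complex models.

```python
# Hypothetical sketch: explain a linear model's prediction by ranking each
# feature's contribution (weight * value) and describing the dominant one
# in plain language. Weights and features are illustrative only.
def explain_prediction(weights, features):
    contributions = {name: weights[name] * value for name, value in features.items()}
    ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    score = sum(contributions.values())
    top, impact = ranked[0]
    direction = "raised" if impact > 0 else "lowered"
    return (f"Predicted score {score:.2f}: '{top}' {direction} the score most "
            f"(contribution {impact:+.2f}).")

weights = {"income": 0.8, "debt": -1.2, "tenure": 0.3}
features = {"income": 2.0, "debt": 1.5, "tenure": 4.0}
print(explain_prediction(weights, features))
```

For genuinely black-box models, techniques such as surrogate models or perturbation-based attribution approximate the same kind of per-feature story.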

Graph Analytics

Graph analytics is a set of analytic techniques that allow for the exploration of relationships between entities of interest such as organizations, people and transactions.

The application of graph processing and graph DBMSs will grow at 100 percent annually through 2022, accelerating data preparation and enabling more complex and adaptive data science.

Graph data stores can efficiently model, explore and query data with complex interrelationships across data silos, but the need for specialized skills has limited their adoption to date.

Graph analytics will grow in the next few years due to the need to ask complex questions across complex data, which is not always practical or even possible at scale using SQL queries.
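As a concrete illustration of the kind of question that is awkward in SQL but natural in a graph: "what chain of relationships links two entities?" The sketch below models entities (people, an organization, a transaction, all invented names) as an adjacency dict and finds the shortest connecting path with a breadth-first search.

```python
from collections import deque

# Illustrative sketch: entities of interest (people, organizations,
# transactions) modeled as a graph; BFS finds the shortest relationship chain.
graph = {
    "alice": {"acme_corp", "txn_1"},
    "acme_corp": {"alice", "bob"},
    "txn_1": {"alice", "bob"},
    "bob": {"acme_corp", "txn_1", "carol"},
    "carol": {"bob"},
}

def connection_path(graph, start, goal):
    """Breadth-first search: shortest chain of relationships linking two entities."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for neighbor in graph.get(path[-1], ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no connection

print(connection_path(graph, "alice", "carol"))
```

Expressing the same variable-length traversal in SQL requires recursive joins that grow unwieldy as path length and data volume increase, which is the limitation the trend describes.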

Data Fabric

Data fabric enables frictionless access and sharing of data in a distributed data environment. It enables a single and consistent data management framework, which allows seamless data access and processing by design across otherwise siloed storage.

Through 2022, bespoke data fabric designs will be deployed primarily as a static infrastructure, forcing organizations into a new wave of cost to completely re-design for more dynamic data mesh approaches.

NLP Conversational Analytics

By 2020, 50 percent of analytical queries will be generated via search, natural language processing (NLP) or voice, or will be automatically generated. The need to analyze complex combinations of data and to make analytics accessible to everyone in the organization will drive broader adoption, allowing analytics tools to be as easy as a search interface or a conversation with a virtual assistant.
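A toy sketch of the idea, reducing the technique to keyword matching so it fits in a few lines: a natural-language question is mapped to a structured query. Real conversational analytics uses full NLP pipelines and semantic models; the metric and dimension vocabularies below are invented.

```python
# Hypothetical sketch: mapping a natural-language question to a structured
# query via simple keyword lookup. Vocabularies are illustrative only.
METRICS = {"revenue": "SUM(revenue)", "orders": "COUNT(order_id)"}
DIMENSIONS = {"region": "region", "month": "month"}

def question_to_query(question):
    """Return a SQL-like query for the question, or None if no metric is found."""
    words = question.lower().replace("?", "").split()
    metric = next((METRICS[w] for w in words if w in METRICS), None)
    dim = next((DIMENSIONS[w] for w in words if w in DIMENSIONS), None)
    if metric is None:
        return None
    query = f"SELECT {metric} FROM sales"
    if dim:
        query += f" GROUP BY {dim}"
    return query

print(question_to_query("What was revenue by region?"))
```

The point of the trend is that this translation layer, however it is implemented, lets a business user phrase the question instead of writing the query.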

Commercial AI and ML

Gartner predicts that by 2022, 75 percent of new end-user solutions leveraging AI and ML techniques will be built with commercial solutions rather than open source platforms.

Commercial vendors have now built connectors into the open-source ecosystem, and they provide the enterprise features needed to scale and democratize AI and ML that open-source technologies lack: project and model management, reuse, transparency, data lineage, and platform cohesiveness and integration.

Blockchains

The core value proposition of blockchain and distributed ledger technologies is providing decentralized trust across a network of untrusted participants. The potential ramifications for analytics use cases are significant, especially those leveraging participant relationships and interactions.

It will be several years before four or five major blockchain technologies become dominant. Until then, technology end users will be forced to integrate with the blockchain technologies and standards dictated by their dominant customers or networks. This includes integration with existing data and analytics infrastructure.

The costs of integration may outweigh any potential benefit. Blockchains are a data source, not a database, and will not replace existing data management technologies.

Persistent Memory Servers

New persistent-memory technologies will help reduce costs and complexity of adopting in-memory computing (IMC)-enabled architectures. Persistent memory represents a new memory tier between DRAM and NAND flash memory that can provide cost-effective mass memory for high-performance workloads.

It has the potential to improve application performance, availability, boot times, clustering methods and security practices while keeping costs under control. It will also help organizations reduce the complexity of their application and data architectures by decreasing the need for data duplication.
