Exploring the Big Data Challenges and Opportunities

The need for big data solutions is becoming a reality for more companies. As the market hype around big data subsides, a long-term adoption trend is emerging that will create many software application opportunities in the coming year.

As the broader big data and analytics (BDA) products and services market approaches $100 billion in revenues, software vendors are faced with the need to address an increasingly fragmented enterprise buyer population.

A new worldwide market study provides valuable insight into developer opinions about big data technologies, current database and analytics solutions, and the buyer motivations behind product and service choices.

Data quality and growing volume are the primary problems affecting software developers while exploring the evolving concept of big data, according to findings from the latest survey by Evans Data Corp.

This comprehensive market study measured and analyzed IT buyer and business decision-maker attitudes and perceptions, as well as the current and planned activities of developers involved in big data projects.

The ripple effect of poor data quality, which can cause problems in the processing phase and cost millions of dollars in lost revenue annually, was cited as the top problem by 21 percent of the developers surveyed.

Fourteen percent cited the volume of data being processed as a concern, and another 12 percent said their chief concern was more about the ongoing challenges of data storage.

Moreover, the inability to gain insight from big data was also cited, though every option drew a relatively low response rate, indicating that no single hurdle is to blame for big data project difficulties.

"Confidence in the veracity, precision, and comprehensiveness of the contents of an organization's data lake enables the kinds of complex exploratory analytics that can identify and deconstruct real-world trends as they emerge," said Charles Mander, analyst at Evans Data.

As such, Mander believes that the ability to capture and process big data in real-time can transform an organization's approach to strategic intelligence, adding tremendous value to data points that are already being generated in the course of normal business activity.

Evans Data's annual survey covers topics such as the key drivers, barriers, and risks for big data; machine and deep learning; real-time event processing; and advanced analytics tools and related services.

It also covers big data applications relative to the Internet of Things (IoT), data visualization and graphic modeling tool use, cloud computing, Hadoop and testing requirements.
