
GenAI Goes Everywhere via Device Chipsets

Soon you will experience the benefits of Artificial Intelligence (AI) everywhere -- in the public cloud, at the edge of mobile networks, and on many personal digital devices.

Generative AI (GenAI) workloads have moved beyond the bounds of cloud environments and can now run on-device, supported by heterogeneous AI chipsets.

Combined with an abstraction layer that efficiently distributes AI workloads across processing architectures, and with compressed Large Language Models (LLMs) of under 15 billion parameters, these advanced chipsets enable generative AI inferencing to run locally.
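To make the compression idea concrete, here is a minimal sketch of weight quantization, the kind of technique that helps sub-15-billion-parameter LLMs fit within device memory budgets. Production toolchains use far more sophisticated methods (per-channel scales, 4-bit formats, calibration data); the function names below are illustrative, not from any real SDK, and the example assumes simple symmetric int8 quantization.

```python
# Illustrative symmetric int8 weight quantization (stdlib only).
# Real on-device LLM runtimes use more advanced schemes; this just
# shows why quantized weights take ~4x less memory than float32.

def quantize_int8(weights):
    """Map float weights to int8 values plus one per-tensor scale."""
    # Largest magnitude maps to +/-127; guard against an all-zero tensor.
    scale = max(abs(w) for w in weights) / 127 or 1.0
    quantized = [round(w / scale) for w in weights]
    return quantized, scale

def dequantize_int8(quantized, scale):
    """Recover approximate float weights from int8 values and the scale."""
    return [q * scale for q in quantized]

weights = [0.42, -1.27, 0.05, 0.89]
q, scale = quantize_int8(weights)
approx = dequantize_int8(q, scale)
# Each recovered value is within one quantization step (scale) of the original.
```

Each int8 value occupies one byte instead of four, which is the basic trade that lets a compressed model's working set fit in an NPU's on-chip or shared memory.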

On-Device AI Market Development

ABI Research estimates that worldwide shipments of heterogeneous AI chipsets will exceed 1.8 billion by 2030, as personal computers, smartphones, and other form factors increasingly ship with on-device AI capabilities.

"Cloud deployment will act as a bottleneck for generative AI to scale due to data privacy, latency, and networking cost concerns. Solving these challenges requires moving AI inferencing closer to the end-user – this is where on-device AI has a clear value proposition as it eliminates these risks and can more effectively scale productivity-enhancing AI applications," said Paul Schell, industry analyst at ABI Research.

What's new are the benefits of generative AI workloads running on heterogeneous chipsets, which distribute workloads at the hardware level between the CPU, GPU, and NPU (Neural Processing Unit).
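The workload distribution described above can be sketched as a simple dispatcher that routes each operation to the most suitable compute unit, falling back to the CPU when an accelerator lacks capacity. All class names, op types, and memory heuristics here are hypothetical; real abstraction layers (vendor runtimes and execution providers) make these decisions with far richer cost models.

```python
# Hypothetical sketch of an abstraction layer routing AI workloads
# across heterogeneous compute units. The preference tables and memory
# bookkeeping are illustrative, not from a real vendor SDK.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    op_type: str    # e.g. "attention", "conv", "matmul"
    memory_mb: int  # working-set size in megabytes

class HeterogeneousDispatcher:
    """Routes each workload to the most suitable processing unit."""

    # Preference order per op type: try the NPU or GPU first,
    # then fall back to the CPU. Purely illustrative heuristics.
    PREFERENCES = {
        "attention": ["NPU", "GPU", "CPU"],
        "conv":      ["NPU", "GPU", "CPU"],
        "matmul":    ["GPU", "NPU", "CPU"],
    }

    def __init__(self, available_memory_mb):
        # e.g. {"NPU": 1024, "GPU": 2048, "CPU": 8192}
        self.available = dict(available_memory_mb)

    def dispatch(self, wl: Workload) -> str:
        for unit in self.PREFERENCES.get(wl.op_type, ["CPU"]):
            if self.available.get(unit, 0) >= wl.memory_mb:
                self.available[unit] -= wl.memory_mb
                return unit
        return "CPU"  # last-resort fallback

d = HeterogeneousDispatcher({"NPU": 1024, "GPU": 2048, "CPU": 8192})
d.dispatch(Workload("attn-block", "attention", 512))  # -> "NPU"
```

A second large attention block would fall back to the GPU once the NPU's budget is exhausted, which is the essence of what the hardware-level distribution buys: keeping every compute unit busy without the application needing to know the chip's topology.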

Qualcomm, MediaTek, and Google were the first movers in this space, as all three are producing chipsets running LLMs on-device. Intel and AMD lead in the PC space.

Hardware alone will not be enough. Building a solid on-device AI value proposition requires strong partnerships between hardware and software players to create unified solutions.

These collaborations will nurture the development of productivity-focused applications to be deployed on-device. ABI analysts expect this will spur demand and shorten replacement cycles of end-user devices like smartphones and PCs.

This will lead to accelerating shipment numbers between 2025 and 2028 as the software ecosystem matures, breathing new life into markets that have been stagnating. The automotive and edge server markets will also be affected, but to a lesser extent.

The productivity AI applications running on-device, powered by heterogeneous AI chipsets, will drive significant market growth in personal and work devices.

This is reflected in the increasing penetration of heterogeneous AI chipsets, which will eventually encompass most systems and devices toward the end of this decade.

According to the ABI assessment, chip vendors and OEMs should look to expand the productivity AI application ecosystem to attract more customers and mature the offering.

This will create opportunities similar to the growth previously spurred by the expansion of mobile and web-based applications and require reaching a critical mass that appeals to a broad range of customers in consumer and enterprise markets.

Outlook for On-Device AI Applications Growth

ABI analysts believe that success in creating popular and useful applications could make or break the transition to on-device AI.

That said, I can imagine practical use cases powered by on-device AI. Advanced chips could enable end-user computing apps that enhance our daily workflow. Here are some examples:

  • Smart dictation and note-taking: Capture your thoughts and ideas with AI-powered dictation that understands context and adapts to your voice. Even complex technical terms or industry jargon will be transcribed accurately.
  • Personalized language learning: New apps will leverage on-device AI to create a customized learning experience. The app can analyze your speech patterns, identify areas for improvement, and offer personalized skill exercises and feedback.
  • Enhanced accessibility features: These advancements can improve accessibility for users with disabilities. AI can power features like real-time captioning for video calls and voice-to-text conversion for documents, all enabled directly on the device.

These are just a few possibilities. The potential apps extend beyond productivity. As on-device AI matures, we can expect a future where our devices become even more intelligent and personalized, transforming the way we live, work, and play.
