As more CIOs and CTOs focus attention on selecting the best-fit IT infrastructure for their particular cognitive computing needs, vendors of semiconductor technologies are exploring new ways to optimize their investment in solutions at the edge of enterprise networks.
Revenue from the sale of Artificial Intelligence (AI) chipsets for edge inference and edge training will grow at 65 percent and 137 percent, respectively, between 2018 and 2023, according to the latest worldwide market study by ABI Research.
During 2018, shipment revenues from edge AI processing reached $1.3 billion, and by 2023 this figure is forecast to reach $23 billion. While that is a massive increase, it doesn't necessarily favor the current market leaders, Intel and NVIDIA.
AI Chipset Market Development
According to the ABI assessment, there will be intense competition for this revenue between established vendors and several prominent startups.
"Companies are looking to the edge because it allows them to perform AI inference without transferring their data. The act of transferring data is inherently costly and in business-critical use cases where latency and accuracy are key, and constant connectivity is lacking, applications can’t be fulfilled," said Jack Vernon, industry analyst at ABI Research.
Moreover, locating AI inference processing at the edge also means that companies don’t have to share private or sensitive data with public cloud service providers, a scenario that has proven to be problematic in the healthcare and consumer sectors.
That said, edge AI is going to have a significant impact on the semiconductor industry. The biggest winners from the growth in edge AI are going to be those vendors that either own or are currently building intellectual properties for AI-related Application-Specific Integrated Circuits (ASICs).
By 2023, it's predicted that ASICs could overtake GPUs as the architecture supporting AI inference at the edge, both in terms of annual vendor shipments and revenues.
In terms of market competition on the AI inference side, Intel will be competing with several prominent AI startups -- such as Cambricon Technology, Horizon Robotics, Hailo Technologies, and Habana Labs -- for dominance of this market segment.
NVIDIA, with its GPU-based AGX platform, has also been gaining momentum in industrial automation and robotics. FPGA leader Xilinx can likewise expect an uptick in revenues as companies use FPGAs to perform inference at the edge, while Intel, itself an FPGA vendor, is also pushing its Movidius and Mobileye chipsets.
Outlook for AI Chipset Applications Growth
For AI training, NVIDIA will hold on to its position as the market leader. However, other AI applications at the edge will likely favor alternative vendors.
"Cloud vendors are deploying GPUs for AI training in the cloud due to their high performance. However, NVIDIA will see its market share chipped away by AI-training-focused ASIC vendors like Graphcore, who are building high-performance, use-case-specific chipsets," concluded Vernon.