
Worldwide AI chips revenue to grow 33 percent to $71 billion in 2024: Report

The value of AI accelerators used in servers is expected to increase to $33 billion by 2028

Global revenue from AI semiconductors is expected to reach $71 billion in 2024, a 33 percent increase from 2023, a new report revealed.

According to the latest forecast from Gartner, Inc., the value of AI accelerators used in servers, which offload data processing from microprocessors, is projected to total $21 billion in 2024 and is expected to increase to $33 billion by 2028.

Gartner’s VP Analyst, Alan Priestley, noted that the growing demand for high-performance AI chips in data centers is being fueled by generative AI (GenAI). 

AI PCs to dominate enterprise purchases by 2026

Furthermore, Gartner forecasts that AI PC shipments will account for 22 percent of total PC shipments in 2024, and that by the end of 2026, 100 percent of enterprise PC purchases will be AI PCs. These machines include a neural processing unit (NPU), enabling longer battery life, quieter operation, and cooler temperatures, as well as the ability to run AI tasks continuously in the background, creating new opportunities for leveraging AI in everyday activities.

While AI semiconductor revenue is expected to experience double-digit growth throughout the forecast period, 2024 will see the highest growth rate during this time.

In 2024, AI chips revenue from the computer electronics segment is projected to total $33.4 billion, accounting for 47 percent of total AI semiconductors revenue. AI chips revenue from automotive electronics is expected to reach $7.1 billion, and revenue from consumer electronics $1.8 billion.


Tech giants investing in custom AI chip development

The report also highlights the fierce battle between semiconductor vendors and tech companies, as the major hyperscalers (AWS, Google, Meta, and Microsoft) are all investing in developing their own chips optimized for AI. While chip development is expensive, using custom-designed chips can improve operational efficiencies, reduce the costs of delivering AI-based services to users, and lower the costs for users to access new AI-based applications. Moreover, according to Priestley, “As the market shifts from development to deployment, we expect to see this trend continue.”

