Nvidia vs. AMD vs. Intel: Comparing AI Chip Sales



In its second-quarter earnings release, Nvidia reported record revenue, driven in large part by sales of AI chips. How do Nvidia's AI chip sales compare with those of its American rivals, AMD and Intel?


The graph below charts each company's revenue over time, using data from their earnings reports.

A clear leader emerges.

While the companies do not report revenue for AI chips specifically, they do disclose revenue for their data center segments.


The data center category includes chips such as central processing units (CPUs), data processing units (DPUs), and graphics processing units (GPUs). GPUs are the preferred choice for AI because they can perform many simple calculations efficiently and in parallel.
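To see why parallelism matters, note that a neural network's core operations reduce to many independent multiply-adds. The sketch below is ordinary Python, not GPU code; it only illustrates the data-parallel pattern a GPU exploits:

```python
# Each output element depends only on its own inputs, so a GPU can
# compute all of them simultaneously across thousands of cores.
def elementwise_multiply_add(a, b, c):
    # On a CPU this loop runs serially; on a GPU each iteration
    # would be handed to its own thread and executed in parallel.
    return [a[i] * b[i] + c[i] for i in range(len(a))]

print(elementwise_multiply_add([1, 2, 3], [4, 5, 6], [7, 8, 9]))
# [11, 18, 27]
```

Because no iteration depends on any other, the work scales across however many cores are available, which is exactly the workload shape AI training and inference produce.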


Below, we compare the growth in quarterly data center revenue for Nvidia, AMD, and Intel.


Quarters are measured by calendar year. Where revenue figures were later restated, we used the most recent revision available. In 2023, Intel merged its Accelerated Computing Systems and Graphics (AXG) division into its data center group. For quarters prior to 2023, we have included AXG revenue in data center revenue, except for Q1 and Q2 2022, for which Intel supplied restated data center figures.


Nvidia's data center revenue has doubled over the past two years, and the company is estimated to control more than 70% of the AI chip market.



Nvidia took the lead by spotting the AI trend early and positioning itself as a one-stop shop, offering chips, software, and access to specialized computers. Its stock has continued to climb since the company surpassed a $1 trillion market capitalization earlier in 2023.


Competition between Intel, AMD, and Nvidia
AMD has seen slower growth and lower revenue than Nvidia. In testing, its MI250 chip was found to be roughly 80% as fast as Nvidia's A100 chip.



However, AMD recently doubled down on AI, announcing its new MI300X GPU with 192GB of memory, compared to the 141GB offered by Nvidia's new GH200. More memory per chip means fewer GPUs are needed to run large models, which could strengthen AMD's competitive position.
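The memory argument can be made concrete with rough arithmetic. Assuming a hypothetical model whose weights occupy about 900 GB, the minimum accelerator count scales with per-GPU memory:

```python
import math

def gpus_needed(model_size_gb, memory_per_gpu_gb):
    # Minimum accelerators required just to hold the weights;
    # real deployments need extra headroom for activations and caches.
    return math.ceil(model_size_gb / memory_per_gpu_gb)

model_gb = 900  # hypothetical model footprint in GB, for illustration only
print(gpus_needed(model_gb, 192))  # 192GB per chip (MI300X-class): 5
print(gpus_needed(model_gb, 141))  # 141GB per chip (GH200-class):  7
```

The model size here is an assumption chosen for illustration, but the pattern holds generally: larger per-chip memory reduces the number of accelerators, and the interconnect overhead, needed for a given model.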


Intel, meanwhile, has seen year-over-year revenue declines and holds almost no share of the AI chip market. Better known for manufacturing conventional CPUs, the company has struggled to break into AI. Its Sapphire Rapids processor suffered years of delays due to a complicated design and numerous bugs.


All three companies say they plan to expand their AI offerings. It's easy to see why: ChatGPT reportedly runs on 10,000 Nvidia A100 chips, worth a total of nearly $100 million.
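That total is consistent with back-of-the-envelope arithmetic, assuming a rough unit price of about $10,000 per A100 (actual pricing is not public and varies by configuration):

```python
chips = 10_000
price_per_chip = 10_000  # assumed rough unit price in USD, not an official figure
total = chips * price_per_chip
print(f"${total:,}")  # $100,000,000
```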


As more AI models are built, the infrastructure that powers them represents a sizable revenue opportunity.
