AMD introduces AI processor to compete with Nvidia's Blackwell
- On Thursday, AMD unveiled a new AI chip that competes with Nvidia's data center GPUs.
- The Instinct MI325X is set to launch first, with Nvidia's Blackwell chips following early next year, setting up a battle for supremacy between the two.
- If developers and cloud giants view AMD's AI chips as a close substitute for Nvidia's products, it could put pricing pressure on Nvidia.
AMD launched a new artificial intelligence chip on Thursday that competes directly with Nvidia's data center graphics processors, known as GPUs.
AMD's Instinct MI325X chip will begin production before the end of 2024, according to the company. If developers and cloud giants see it as a close substitute for Nvidia's AI chips, the new product could put pressure on Nvidia's pricing.
Advanced AI applications such as ChatGPT require massive data centers full of GPUs, which has driven up demand for companies that can supply AI chips.
Nvidia has long held the top spot in the data center GPU market, but AMD is now challenging its dominance. The company aims to capture a significant share of a market projected to be worth $500 billion by 2028.
"The growth of AI demand has surpassed expectations, as stated by AMD CEO Lisa Su at the event. It's evident that the rate of investment in this field is increasing globally," rewritten sentence.
AMD did not unveil any significant new cloud or internet customers for its Instinct GPUs at the event, but it has previously disclosed that Meta and Microsoft use its AI GPUs, and OpenAI employs them for certain applications. Additionally, the company did not disclose the pricing for the Instinct MI325X, which is typically sold as part of a complete server.
With the MI325X, AMD is accelerating its product release schedule to better compete with Nvidia and capitalize on the growing demand for AI chips. The new chip succeeds the MI300X, which began shipping late last year, and will be followed by the MI350 and MI400.
The MI325X's rollout will pit it against Nvidia's Blackwell chips, which are set to ship early next year.
A strong launch for AMD's latest data center GPU could attract investors looking for additional companies positioned to benefit from the AI boom. AMD's stock is up about 20% in 2024, well behind Nvidia's 175% gain. Nvidia is estimated to hold roughly 90% of the market for data center AI chips.
AMD stock fell 3% during trading on Thursday.
One of AMD's biggest obstacles to gaining AI market share is Nvidia's proprietary programming language, CUDA, which has become standard among AI developers and effectively locks them into Nvidia's ecosystem.
In response, AMD announced that it has enhanced its competing software, ROCm, to make it easier for AI developers to move their models onto AMD's chips, which it calls accelerators.
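As a rough sketch of what that kind of portability can look like in practice (not AMD-provided code): ROCm builds of PyTorch expose the familiar torch.cuda device API on AMD accelerators, so device-selection code originally written for Nvidia GPUs often runs unchanged. The toy model and sizes below are placeholders for illustration only.

```python
import torch
import torch.nn as nn

# On a ROCm build of PyTorch, torch.cuda.is_available() reports True on AMD
# Instinct accelerators, so the same device-selection code runs on both
# Nvidia and AMD hardware. (The tiny model here is purely illustrative.)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(
    nn.Linear(1024, 4096),
    nn.ReLU(),
    nn.Linear(4096, 1024),
).to(device)

# Run a single inference pass on whatever accelerator was found.
with torch.no_grad():
    x = torch.randn(8, 1024, device=device)
    y = model(x)

print(f"ran on {device}, output shape {tuple(y.shape)}")
```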
AMD has positioned its AI accelerators as better suited to use cases in which AI models are generating content or making predictions, known as inference, rather than to processing huge volumes of data to train a model. The company attributes that partly to the advanced memory on its chips, which it says lets them serve Meta's Llama AI model faster than some Nvidia chips.
Su said the MI325 platform delivers up to 40% more inference performance than Nvidia's H200 on Llama 3.1, Meta's large language model.
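To make what "inference performance" means concrete, here is a minimal sketch of how throughput is commonly measured, assuming PyTorch and a toy model standing in for a real LLM; this is not the benchmark AMD ran, and the model, batch size, and iteration count are placeholder assumptions.

```python
import time
import torch
import torch.nn as nn

# Illustrative only: a toy stand-in for a large model. "Inference performance"
# here simply means forward passes (samples) processed per second.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = nn.Sequential(nn.Linear(4096, 4096), nn.GELU(), nn.Linear(4096, 4096)).to(device)
model.eval()

batch = torch.randn(16, 4096, device=device)
iterations = 100

with torch.no_grad():
    # Warm up so one-time setup costs don't skew the measurement.
    for _ in range(10):
        model(batch)
    if device.type == "cuda":
        torch.cuda.synchronize()

    start = time.perf_counter()
    for _ in range(iterations):
        model(batch)
    if device.type == "cuda":
        torch.cuda.synchronize()
    elapsed = time.perf_counter() - start

print(f"{iterations * batch.shape[0] / elapsed:.1f} samples/sec on {device}")
```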
Taking on Intel, too
While AI accelerators and GPUs get the attention, AMD's core business remains central processors, or CPUs, which sit at the heart of nearly every server in the world.
AMD's data center sales more than doubled from a year earlier to $2.8 billion in the June quarter, though AI chips accounted for only about $1 billion of that.
AMD holds about 34% of the market for data center CPUs, still trailing Intel, which leads with its Xeon line of chips. AMD aims to close that gap with the 5th Gen EPYC CPUs it also announced on Thursday.
The new CPUs come in a range of configurations, from a low-cost, low-power 8-core chip priced at $527 to 192-core processors built for supercomputers that cost $14,813 apiece.
AMD said the new CPUs are well suited to feeding data to AI workloads, noting that nearly every GPU needs a CPU in the same system to boot the machine.
Su said much of today's AI work still depends on CPU capability, pointing to data analytics and similar applications.
AMD CEO Lisa Su says that tech trends take time to unfold and that the industry is still learning with AI.