The launch is one of the most important in AMD’s five-decade history, setting up a showdown with Nvidia in the red-hot market for AI accelerators. Such chips help develop AI models by bombarding them with data, a task they handle more adeptly than traditional computer processors.
Building AI systems that rival human intelligence — considered the holy grail of computing — is now within reach, Su said in an interview. But deployment of the technology is still only just beginning. It will take time to assess the impact on productivity and other aspects of the economy, she said.
“The truth is we’re so early,” Su said. “This is not a fad. I believe it.”
AMD is showing increasing confidence that the MI300 lineup can win over some of the biggest names in technology, potentially diverting billions in spending toward the company. Customers using the processors will include Microsoft Corp, Oracle Corp and Meta Platforms, AMD said.
Nvidia shares dropped 2.3% to US$455.03 in New York on Dec 6, a sign investors see the new chip as a threat. Still, AMD shares did not see a commensurate increase. On a day when tech stocks were generally down, the shares fell 1.3% to US$116.82.
Surging demand for Nvidia chips by data centre operators helped propel that company’s shares this year, sending its market value past US$1.1 trillion. The big question is how long it will essentially have the accelerator market to itself.
AMD sees an opening: Large language models — used by AI chatbots such as OpenAI’s ChatGPT — need a huge amount of computer memory, and that is where the chipmaker believes it has an advantage.
The new AMD chip has more than 150 billion transistors and 2.4 times as much memory as Nvidia’s H100, the current market leader. It also has 1.6 times as much memory bandwidth, further boosting performance, AMD said.
Su said that the new chip is equal to Nvidia’s H100 in its ability to train AI software and much better at inference — the process of running that software once it is ready for real-world use.
While the company expressed confidence in its product’s performance, Su said it will not just be a competition between two companies. Many others will vie for market share too.
At the same time, Nvidia is developing its own next-generation chips. The H100 will be succeeded by the H200 in the first half of next year, which will give it access to a new high-speed type of memory. That should match at least some of what AMD is offering. Nvidia is then expected to introduce an entirely new processor architecture later in the year.
AMD’s prediction that AI processors will grow into a US$400 billion market underscores the boundless optimism in the artificial intelligence industry. That compares with US$597 billion for the entire chip industry in 2022, according to IDC.
As recently as August, AMD had offered a more modest forecast of US$150 billion over the same period. But it will take the company a while to grab a large piece of that market. AMD has said that its own revenue from accelerators will top US$2 billion in 2024, with analysts estimating that the chipmaker’s total sales will reach about US$26.5 billion.
The chips are based on a type of semiconductor called the graphics processing unit, or GPU, which has typically been used by video gamers seeking the most realistic experience. GPUs’ ability to perform certain kinds of calculations rapidly, by running many computations simultaneously, has made them the go-to choice for training AI software. — Bloomberg