Samsung Electronics' 12-layer, top-end HBM3E and other DDR modules are arranged in Seoul, South Korea, Thursday, April 4, 2024. Samsung's profits rebounded sharply in the first quarter of 2024, reflecting a turnaround in the company's pivotal semiconductor business and strong sales of Galaxy S24 smartphones. Photographer: Seung-jun Cho/Bloomberg via Getty Images
High-performance memory chips are likely to remain in limited supply this year, as growing demand for artificial intelligence leads to shortages of these chips, according to analysts.
SK Hynix and Micron – two of the world's largest memory chip suppliers – have sold out of high-bandwidth memory (HBM) chips for 2024, and their 2025 inventory is also nearly sold out, according to the companies.
“We expect the overall supply of memory to remain tight throughout 2024,” Kazunori Ito, director of equity research at Morningstar, said in a report last week.
Demand for AI chips has boosted the high-end memory chip market, greatly benefiting companies such as Samsung Electronics and SK Hynix, the world's two largest memory chip manufacturers. SK Hynix already supplies chips to Nvidia, which is reportedly considering Samsung as a potential supplier as well.
High-performance memory chips play a critical role in training large language models (LLMs) such as OpenAI's ChatGPT, whose popularity has caused AI adoption to skyrocket. LLMs need these chips to remember details and user preferences from previous conversations in order to generate human-like responses to queries.
“These chips are more complex to manufacture and ramping up production has been difficult. This will likely lead to shortages through the end of 2024 and through most of 2025,” said William Bailey, director of Nasdaq IR Intelligence.
Market intelligence firm TrendForce said in March that HBM production cycles are 1.5 to two months longer than those of the DDR5 memory chips typically found in PCs and servers.
To meet the growing demand, SK Hynix plans to expand production capacity by investing in advanced packaging facilities in Indiana, U.S., as well as in the M15X plant in Cheongju and the Yongin semiconductor cluster in South Korea.
Samsung said during its first-quarter earnings call in April that its HBM bit supply in 2024 “expanded more than three-fold compared to last year.” Bit capacity refers to the number of data bits that a memory chip can store.
“We have already completed discussions with our customers on this committed supply. In 2025, we will continue to expand supply by at least two times or more year-on-year, and we are already having smooth conversations with our customers on this supply,” Samsung said.
Micron did not respond to CNBC's request for comment.
Intense competition
Big tech companies Microsoft, Amazon and Google are spending billions to train their own LLMs to stay competitive, increasing demand for AI chips.
“Major buyers of AI chips – companies like Meta and Microsoft – have indicated that they plan to continue pouring resources into building AI infrastructure,” said Chris Miller, author of “Chip War,” a book about the semiconductor industry. “This means they will be purchasing large quantities of AI chips, including HBM, at least through 2024.”
Chipmakers are in a fierce race to make the most advanced memory chips on the market to capitalize on the artificial intelligence boom.
SK Hynix said in a press conference earlier this month that it would begin mass production of its latest-generation HBM chip, the 12-layer HBM3E, in the third quarter, while Samsung Electronics plans to do so during the second quarter, after becoming the first in the industry to ship samples of the latest chip.
“Samsung is currently progressing through the 12-layer HBM3E sampling process,” said SK Kim, executive director and analyst at Daiwa Securities. “If it can get the qualification earlier than its peers, I assume it can secure a majority share in late 2024 and 2025.”