Samsung establishes HBM team to boost AI chip production yields

“HBM leadership is coming to us,” says Kyung Kye-hyun, Samsung's semiconductor business chief

Samsung in February unveiled HBM3E 12H, the industry’s largest capacity HBM with a 12-layer stack (Courtesy of Samsung Electronics)
Jeong-Soo Hwang
2024-03-29 19:55:56 hjs@hankyung.com

Samsung Electronics Co., the world’s No. 1 memory chipmaker, recently set up a high bandwidth memory (HBM) team within its memory chip division to improve production yields as it develops HBM4, a sixth-generation AI memory chip, and the Mach-1 AI accelerator.

The new team will handle HBM within the division in charge of the development and sales of DRAM and NAND flash memory, according to industry sources on March 29.

Hwang Sang-joon, corporate executive vice president and head of DRAM Product and Technology at Samsung, will lead the new team. It has not yet been decided how many employees will be assigned to it.

It is Samsung’s second HBM-dedicated unit, following an HBM task force launched in January this year and composed of 100 staff from its Device Solutions division.

Samsung is ramping up efforts to overtake its local rival SK Hynix Inc., the dominant player in the advanced HBM segment. In 2019, Samsung disbanded its then-HBM team after concluding that the HBM market would not grow significantly, a painful mistake the company now regrets.

Hwang Sang-joon, corporate executive vice president and head of DRAM Product and Technology at Samsung

TWO-TRACK STRATEGY

To grab the lead in the AI chip market, Samsung will pursue a “two-track” strategy of simultaneously developing two types of cutting-edge AI chips: HBM and the Mach-1 accelerator.

It plans to mass-produce HBM3E in the second half of this year and its successor, HBM4, in 2025.

Currently, HBM3E is the best-performing DRAM for AI applications and a fifth-generation HBM, succeeding the previous generations: HBM, HBM2, HBM2E and HBM3.

“Customers who want to develop customized HBM4 will work with us,” Kyung Kye-hyun, head of Samsung's semiconductor business, said in a note posted on a social media platform on Friday.

“HBM leadership is coming to us thanks to the dedicated team’s efforts,” he added.

At Memcon 2024, a gathering of global chipmakers held in San Jose, California on Tuesday, Samsung's Hwang said he expects the company to increase its HBM chip production volume 2.9-fold this year from last year’s output.

Samsung Electronics' annual general meeting on March 20, 2024


HBM is a high-performance memory chip that stacks multiple DRAM dies vertically, an essential component of AI chips for processing large volumes of data.

According to Yole Group, a French IT research firm, the HBM market is forecast to expand to $19.9 billion in 2025 and $37.7 billion in 2029, up from an estimated $14.1 billion in 2024.

MACH-1

Last week, Kyung said at Samsung's annual general meeting that the Mach-1 AI chip is under development and the company plans to produce a prototype by year-end.

Mach-1 is a system-on-chip (SoC) designed to reduce the data bottleneck between the graphics processing unit (GPU) and HBM chips.

Samsung is also preparing to develop Mach-2, the next-generation model of Mach-1, its inference-dedicated AI accelerator.

“We need to accelerate the development of Mach-2, for which clients are showing strong interest,” Kyung said in the note on Friday.

Write to Jeong-Soo Hwang at hjs@hankyung.com
Yeonhee Kim edited this article. 
