Samsung accelerates foray into fastest AI chip market

The chipmaker is set to mass produce 16 GB and 24 GB HBM3 chips, the industry's fastest and slimmest models

Samsung recently shipped samples of its two memory chip models designed for AI applications

By Jeong-Soo Hwang (hjs@hankyung.com), June 26, 2023

South Korea’s Samsung Electronics Co. is speeding up its efforts to penetrate deeper into the high bandwidth memory 3 (HBM3) market, an area it had neglected relative to other high-performance chips because HBM accounts for only a tiny share of the overall memory chip market.

But the advent of generative AI such as ChatGPT is driving the world’s top memory chipmaker to ramp up production of HBM chips, which boast faster data-processing speeds and lower energy consumption than conventional DRAMs.

Samsung has recently shipped customers samples of a 16-gigabyte (GB) HBM3 product with the lowest energy consumption of its kind, according to industry sources on Monday.

Sixteen GB is currently the maximum capacity among existing HBM3 products, and the chip processes data at 6.4 gigabits per second per pin, the industry’s fastest.

Samsung also delivered samples of a 12-layer, 24 GB HBM3 chip, a fourth-generation HBM product and the industry’s slimmest of its kind. Its smaller domestic rival SK Hynix Inc. was the first in the world to unveil a chip of the same type, in April.

Samsung is now ready to mass produce both types of HBM3, the sources said. In the second half, it will launch an advanced model of HBM3 memory with higher performance and capacity.

An HBM is a product made by vertically stacking DRAM chips. It is mainly used in graphics processing units (GPUs) that power generative AI platforms such as ChatGPT.

SK Hynix's HBM3 DRAM chip


Samsung is understood to have begun the shipment of HBM3 products to its major customers.

Advanced Micro Devices Inc.’s (AMD) recently announced MI300 accelerated processing units are embedded with Samsung’s HBM3 memory. AMD is a fabless US semiconductor company, and its MI300 chips are used to power supercomputers.

The Aurora supercomputer, jointly developed by Intel Corp. and Argonne National Laboratory, is also equipped with Samsung chips, according to the sources.

IN EARLY STAGES OF GROWTH

HBM was not high on Samsung’s priority list. The company instead focused on mobile chips and high-performance computing technologies designed to improve data-processing capability and the performance of complex calculations.

The HBM market is still in its early stages of growth, accounting for less than 1% of the DRAM market.

But Samsung’s aggressive foray into the sector is expected to shake up the HBM market, giving a leg up to the sluggish DRAM market, industry watchers said.

The HBM market is forecast to grow by an annual average rate of 45% or more between this year and 2025, in tandem with AI market growth, according to TrendForce.

Samsung unveiled HBM-PIM (processing-in-memory) in 2021

SK Hynix controls half of the HBM market worldwide, trailed by Samsung with a 40% share and Micron Technology Inc. with a 10% share, the Taiwan-based research firm said.

In 2021, Samsung developed HBM-PIM (processing-in-memory), an HBM chip integrated with an AI accelerator. It improves the performance of a generative AI application by 3.4 times compared with an HBM-powered GPU accelerator, according to Samsung Electronics.

Samsung also unveiled CXL DRAM, which offers greater data-processing capacity than conventional DRAM, helping AI supercomputers avoid a memory bottleneck, or a slowdown in text generation and data movement.

Write to Jeong-Soo Hwang at hjs@hankyung.com
Yeonhee Kim edited this article.
