Tesla asks Samsung, SK Hynix to supply HBM4 chip samples

The US EV giant is expected to replace old HBM chips used in Dojo with the newer model under development

Tesla's Cybertruck on display at Korea's Future Innovation Tech Expo 2024 in Daegu on Oct. 23, 2024 (Courtesy of News1 Korea) 
Chae-Yeon Kim
2024-11-19 16:57:12 why29@hankyung.com

Samsung Electronics Co. and SK Hynix Inc. are said to each be developing a sixth-generation high-bandwidth memory (HBM4) chip prototype for Tesla Inc., which has joined its US Big Tech peers in a race to develop its own artificial intelligence chips, according to semiconductor industry sources on Tuesday.

A slew of next-generation HBM chip orders received by the South Korean memory chip cross-town rivals suggest the AI-driven HBM boom will continue through next year.

Industry sources said that the US EV giant has asked the Korean chip duo to supply HBM4 chips for general use, and it is expected to choose one of the two companies as its HBM4 supplier after testing their samples. 

The Korean chipmakers have been developing customized HBM4 chips for US Big Tech companies such as Google LLC, Meta Platforms Inc. and Microsoft Corp., seeking to lower their reliance on Nvidia Corp.'s AI chips.

Joining the Big Tech companies, Tesla is expected to use the next-generation HBM chip to enhance its AI chip capability.

SK Hynix HBM3E (Courtesy of News1 Korea) 

Tesla operates Dojo, its custom-built supercomputer designed to train its “Full Self-Driving” neural networks. This is also expected to be the cornerstone of Tesla’s AI ambitions beyond self-driving.

HBM chips are a key component of supercomputers that train AI models on massive datasets, and Tesla is expected to use the sixth-generation HBM chip in Dojo, which is also powered by the company's own D1 AI chip.

The HBM4 chip could also be used in Tesla’s AI data centers under development and its self-driving cars, which are currently fitted with HBM2E chips for pilot programs.

WHY HBM4?

More advanced HBM chips can improve efficiency in processing massive data and AI model training.  

The performance of the sixth-generation HBM chip is expected to be significantly improved compared with its predecessors, which were built with a base die method that connects the bottom layer of an HBM stack to the graphics processing unit (GPU).

Samsung's HBM3E (Courtesy of Samsung Electronics) 

HBM4 instead uses a logic die, which sits at the base of the die stack and serves as a core component of the chip.

According to SK Hynix, the HBM4 chip delivers a bandwidth that is 1.4 times faster than that of the fifth-generation HBM3E and consumes about 30% less power.

Since HBM3E delivers a bandwidth of 1.18 terabytes per second (TB/s), HBM4's bandwidth is expected to top 1.65 TB/s. The newer model's supply voltage (VDD) is also set to drop to 0.8 V from 1.1 V.
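The figures above can be sanity-checked with back-of-the-envelope arithmetic. The sketch below simply applies the 1.4x bandwidth multiplier and the roughly 30% power saving that SK Hynix has cited; the constants are taken from this article, not from a published specification:

```python
# Rough check of the HBM4 figures quoted in the article.
# Inputs (from SK Hynix via the article): HBM3E bandwidth of 1.18 TB/s,
# a 1.4x bandwidth gain, and ~30% lower power consumption for HBM4.

HBM3E_BANDWIDTH_TBPS = 1.18
BANDWIDTH_GAIN = 1.4
POWER_SAVING = 0.30

# Expected HBM4 bandwidth: 1.18 TB/s x 1.4 = ~1.65 TB/s
hbm4_bandwidth = HBM3E_BANDWIDTH_TBPS * BANDWIDTH_GAIN

# HBM4 power draw relative to HBM3E: 1 - 0.30 = 70%
power_ratio = 1 - POWER_SAVING

print(f"HBM4 expected bandwidth: {hbm4_bandwidth:.2f} TB/s")
print(f"HBM4 power relative to HBM3E: {power_ratio:.0%}")
```

This reproduces the article's ~1.65 TB/s figure; note the VDD drop from 1.1 V to 0.8 V is a separate spec and is not derived from the 30% power-saving number here.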

FIERCE HBM4 BATTLE

Samsung and SK Hynix, the world’s two biggest memory chipmakers, are going all-out to take the lead in the HBM4 market, poised to bloom later next year.

The HBM market is forecast to grow to $33 billion in 2027 from $4 billion in 2023, according to Morgan Stanley.

The HBM market is currently led by SK Hynix, a major HBM chip supplier for the global AI chip giant Nvidia, which controls more than 90% of the global AI chip market.


To catch up with SK Hynix, its bigger memory rival Samsung Electronics has even formed a partnership with foundry archrival Taiwan Semiconductor Manufacturing Co. (TSMC), under which TSMC will manufacture base dies for Samsung's HBM4 chips at the request of Samsung's customers.

Samsung Electronics is currently promoting turnkey HBM orders covering everything from memory architecture design to production and foundry services.

In July, the world’s top memory chipmaker said it will use its cutting-edge 4-nanometer (nm) foundry process to mass-produce the HBM4 chip.

Bagging an HBM4 order from Tesla after quality tests would allow Samsung to turn the tide in the global HBM market.

But SK Hynix is also expected to accelerate its HBM4 development to win orders from Tesla, which harbors high AI ambitions, a move that would cement its market leadership.

SK Hynix has been actively seeking to develop automotive HBM chips, considered among the next-generation memory chips.

Write to Chae-Yeon Kim at why29@hankyung.com

Sookyung Seo edited this article.

