Samsung to supply $752 million in Mach-1 AI chips to Naver, replace Nvidia

Samsung is also in talks with Big Tech firms such as Microsoft and Meta to supply its new AI accelerator

A researcher at a Samsung Electronics chip cleanroom
Jeong-Soo Hwang, Chae-Yeon Kim and Eui-Myung Park
2024-03-22 11:18:06 hjs@hankyung.com

Samsung Electronics Co., the world’s top memory chipmaker, will supply its next-generation Mach-1 artificial intelligence chips to Naver Corp. by the end of this year in a deal worth up to 1 trillion won ($752 million).

With the contract, Naver will significantly reduce its reliance on Nvidia Corp. for AI chips.

Samsung’s System LSI business division has agreed with Naver on the supply deal and the two companies are in final talks to fine-tune the exact volume and prices, people familiar with the matter said on Friday.

Samsung, South Korea’s tech giant, hopes to price the Mach-1 AI chip at around 5 million won ($3,756) apiece and Naver wants to receive between 150,000 and 200,000 units of the AI accelerator, sources said.
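As a rough check on how those figures fit together, here is a back-of-envelope sketch using only the per-unit price, the volume range and the won-to-dollar rate implied by the article's own conversions: 150,000 to 200,000 chips at about 5 million won apiece works out to roughly 0.75 trillion to 1 trillion won, matching the reported deal ceiling.

```python
# Back-of-envelope check of the reported deal size.
# Figures are taken from the article; the exchange rate is implied
# by its own conversion (1 trillion won = $752 million).
UNIT_PRICE_KRW = 5_000_000                      # ~5 million won per Mach-1 chip
UNITS_LOW, UNITS_HIGH = 150_000, 200_000        # volume range sought by Naver
KRW_PER_USD = 1_000_000_000_000 / 752_000_000   # ~1,330 won per dollar

for units in (UNITS_LOW, UNITS_HIGH):
    total_krw = UNIT_PRICE_KRW * units
    print(f"{units:,} units -> {total_krw / 1e12:.2f} trillion won "
          f"(~${total_krw / KRW_PER_USD / 1e6:,.0f} million)")
# 150,000 units -> 0.75 trillion won (~$564 million)
# 200,000 units -> 1.00 trillion won (~$752 million)
```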

Naver's headquarters

Naver, a leading Korean online platform giant, is expected to use Mach-1 chips in its servers for AI inference, replacing chips it has been procuring from Nvidia.

Leveraging its sale of Mach-1 chips to Naver as a stepping stone, Samsung plans to expand its client base to Big Tech firms. Samsung is already in supply talks with Microsoft Corp. and Meta Platforms Inc., sources said.

An accelerator is a special-purpose hardware device that combines multiple chips designed to speed up data processing and computing.

MACH-1, COMPETITIVE IN PRICING, PERFORMANCE & EFFICIENCY

Kyung Kye-hyun, head of Samsung's semiconductor business, said during the company’s annual general meeting on Wednesday that the Mach-1 AI chip is under development and that Samsung will begin mass production of a prototype by year-end.

Nvidia is the world's top AI chip designer

Mach-1 is an AI accelerator in the form of a system-on-chip (SoC) that reduces the bottleneck between the graphics processing unit (GPU) and high bandwidth memory (HBM) chips, according to Samsung.

Kyung said Mach-1 is a product designed specifically to fit transformer-based models.

“By using several algorithms, it can reduce the bottleneck phenomenon that occurs between memory and GPU chips to one-eighth of what we are witnessing today and improve the power efficiency by eight times,” he said. “It will enable large language model inference even with low-power memory instead of power-hungry HBM.”

Unlike Nvidia's AI accelerator, which consists of GPUs and HBM chips, Mach-1 combines Samsung’s proprietary processors and low-power (LP) DRAM chips.

With that design, Mach-1 suffers fewer data bottlenecks and consumes less power than Nvidia products, industry sources said.

In addition, the Mach-1 chip is priced at roughly one-tenth of Nvidia's chips, they said.

SK Hynix developed the industry's first HBM3 DRAM chip

NAVER TO WEAN ITSELF OFF NVIDIA

Nvidia, the world’s largest chip design firm and AI chip provider, posted an operating profit margin of 62% in the November-January quarter. Some $18.8 billion, or 40% of its server business revenue, came from AI inference chip sales last year.

Sources said Naver will use Samsung’s Mach-1 chips to power servers for its AI map service, Naver Place. Additional Mach-1 chip supply to Naver is possible if the first batch shows “good performance,” they said.

Naver has been reducing its reliance on Nvidia for AI chips.

Last October, Naver replaced Nvidia’s GPU-based server with Intel Corp.’s central processing unit (CPU)-based server.

Intel's fourth-generation Sapphire Rapids Xeon scalable processors

Naver’s AI server switch comes as global information technology firms are increasingly disgruntled with Nvidia’s GPU price hikes and a global shortage of its GPUs.

For Samsung, its deal with Naver would help it compete with crosstown rival SK Hynix Inc., the dominant player in the advanced HBM segment.

Kyung, chief executive of Samsung’s Device Solutions (DS) division, which oversees its chip business, said Samsung aims to use the Mach-1 chip to catch up to SK Hynix, which recently started mass production of its next-generation HBM chip.

HBM has become an essential part of the AI boom, as it delivers much faster data processing speeds than traditional memory chips.

A laggard in the HBM chip segment, Samsung has been investing heavily in HBM to rival SK Hynix and other memory players.

Last month, Samsung said it had developed HBM3E 12H, the industry's first 12-stack HBM3E DRAM and the highest-capacity HBM product to date, adding that it will start mass production of the chip in the first half of this year.

According to market research firm Omdia, the global inference AI accelerator market is forecast to grow from $6 billion in 2023 to $143 billion by 2030.
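For context, the growth rate implied by that forecast can be worked out from the two quoted figures alone; the quick sketch below puts it at roughly a 57% compound annual growth rate over 2023-2030 (a derived figure, not one stated by Omdia).

```python
# Implied compound annual growth rate (CAGR) of the inference AI
# accelerator market, using only the two Omdia figures quoted above.
start_value = 6e9        # $6 billion in 2023
end_value = 143e9        # $143 billion forecast for 2030
years = 2030 - 2023      # 7-year span

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # roughly 57% per year
```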

Write to Jeong-Soo Hwang, Chae-Yeon Kim and Eui-Myung Park at hjs@hankyung.com

In-Soo Nam edited this article.
