SK Hynix may cooperate with Kioxia to produce HBM in Japan

2024-06-14

According to a report by MoneyDJ, South Korean memory chip giant SK Hynix is exploring a partnership with Japanese NAND flash manufacturer Kioxia to produce high-bandwidth memory (HBM) for artificial intelligence applications. Production is expected to take place at the fabs Kioxia jointly operates with Western Digital in Japan. For its part, Kioxia will assess the proposed collaboration based on semiconductor market conditions and its relationship with Western Digital.

The report highlights that global demand for HBM, a type of DRAM used primarily in AI servers, is surging, driven by NVIDIA. Furthermore, according to a press release previously issued by TrendForce, the 2023 market shares of the top three HBM manufacturers were as follows: SK Hynix at 46-49%, Samsung also at around 46-49%, and Micron at approximately 4-6%.


For SK Hynix, using Kioxia's existing factories in Kitakami City, Iwate Prefecture, and Yokkaichi, Mie Prefecture, Japan, to produce HBM would allow it to rapidly expand its production capacity.

Meanwhile, the Kioxia-Western Digital joint-venture factories in Japan currently produce only NAND flash. If they begin producing cutting-edge DRAM in the future, it would also contribute to the plan to revitalize Japan's semiconductor industry.

The report further notes that SK Hynix has an indirect investment of about 15% in Kioxia through the U.S. investment firm Bain Capital. It is reported that Bain Capital is conducting behind-the-scenes negotiations with SK Hynix to seek a restart of the Kioxia/Western Digital merger. However, according to sources cited in a report by Jiji Press, "this cooperation and merger are two separate discussion items."

As previously reported by Asahi Shimbun, Kioxia and Western Digital are expected to resume merger talks by the end of April. Although their merger negotiations stalled last fall, both companies are under pressure to expand their scale in order to survive. It nevertheless remains uncertain whether the two can ultimately reach a merger agreement.

According to TrendForce's data for the third quarter of 2023, Samsung continues to maintain its position as the world's largest NAND flash memory manufacturer, holding a 31.4% market share. Following closely is the SK Group, ranking second with a 20.2% market share. Western Digital occupies the third position with a 16.9% market share, while Japan's Kioxia holds a 14.5% market share.

The takeoff of HBM is directly tied to the AIGC boom. The rise of large AI models has created massive demand for computing power, and the sharp increase in data volumes and transfer rates places higher demands on memory capacity and bandwidth in AI servers. With its high bandwidth, high capacity, low latency, and low power consumption, HBM has gradually become the standard configuration for GPUs in AI servers.

Currently, HBM products are being developed in the sequence of HBM (first generation), HBM2 (second generation), HBM2E (third generation), HBM3 (fourth generation), and HBM3E (fifth generation), with the latest HBM3E being an extended version of HBM3.

SK Hynix is undoubtedly the biggest beneficiary of this memory boom. In its 2023 annual financial report, SK Hynix stated that in DRAM it actively responded to customer demand with market-leading technology, and that revenues from its main products, DDR5 DRAM and HBM3, grew more than fourfold and fivefold respectively compared with 2022.

Recently, SK Hynix Vice President Kim Ki-tae stated in a blog post that although 2024 has only just begun, the company's entire HBM supply for this year has already sold out. To maintain its leading position in the market, SK Hynix has also begun preparations for 2025.

Kim Ki-tae explained that despite ongoing external uncertainties, the memory market is expected to gradually heat up this year, driven in part by a recovery in demand from major global technology customers. In addition, AI applications on devices such as PCs and smartphones will not only boost HBM3E sales but are also likely to increase demand for products such as DDR5 and LPDDR5T.

It is worth mentioning that at its earnings call at the end of last year, Micron CEO Sanjay Mehrotra revealed that, owing to the popularity of generative AI, demand for high-bandwidth memory (HBM) in high-performance cloud AI chips is strong, and Micron's HBM production capacity for 2024 is expected to be fully booked. Its HBM3E, which entered mass production at the beginning of 2024, is expected to generate hundreds of millions of dollars in revenue in the 2024 fiscal year.

Kim Ki-tae emphasized that HBM's sales competitiveness ultimately rests on technology: to respond to rapidly growing demand for AI memory, the first priority is meeting the specifications customers require, while anticipating market changes and preparing in advance is also highly effective.

In this light, the competition for high-end HBM has only just begun. Although HBM shipments still account for a very small share of the overall memory market, in the long run, as consumer electronics move toward AI, the main demands will be high computing power, large memory capacity, and low power consumption. HBM is therefore expected to become a key technological development direction for memory manufacturers.
