SK hynix Outlines "Full-Stack AI Memory Creator" Vision and Discloses Future Technology Roadmap

In an era where AI applications are rapidly gaining popularity and information traffic is growing explosively, the importance of semiconductor memory is rising fast. Kwak Noh-Jung, CEO of SK hynix, the major Korean memory manufacturer, announced the company's "Full-Stack AI Memory Creator" vision at the SK AI Summit 2025 held in Seoul on November 3. Under the new vision, SK hynix intends to deepen its role as the industry enters the next era of AI.

Kwak Noh-Jung pointed out that while AI adoption is accelerating and driving explosive growth in information traffic, the hardware that supports this growth, memory performance in particular, has not kept pace with processor advances, creating an obstacle known as the "memory wall". Against this backdrop, semiconductor memory is no longer just an ordinary component but is evolving into a core value product of the AI industry.
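The memory wall Kwak describes can be illustrated with a simple roofline-style estimate. The figures below are illustrative assumptions, not SK hynix or vendor data: attainable throughput is capped by whichever is lower, peak compute or memory bandwidth multiplied by a workload's arithmetic intensity.

```python
# Roofline-style sketch of the "memory wall": all numbers are illustrative.
def attainable_tflops(peak_tflops, bandwidth_tbps, flops_per_byte):
    """Attainable throughput = min(peak compute, bandwidth * arithmetic intensity)."""
    return min(peak_tflops, bandwidth_tbps * flops_per_byte)

# Hypothetical accelerator: 1000 TFLOPS peak compute, 3 TB/s memory bandwidth.
peak, bw = 1000.0, 3.0

# Memory-bound workload (e.g. large-model inference): ~2 FLOPs per byte moved.
low_intensity = attainable_tflops(peak, bw, 2.0)     # bandwidth-limited
# Compute-bound workload: ~500 FLOPs per byte moved.
high_intensity = attainable_tflops(peak, bw, 500.0)  # compute-limited

print(low_intensity)   # 6.0 -> a fraction of peak; memory is the wall
print(high_intensity)  # 1000.0 -> full peak is reachable
```

Under these assumed numbers, a memory-bound workload reaches well under 1% of peak compute, which is why faster memory, rather than faster processors, is framed as the bottleneck.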

So far, SK hynix has played the role of a "Full-Stack Memory Provider", focused on delivering products that meet customer needs in a timely manner. As memory's importance grows, however, the company believes a pure "supplier" role is no longer sufficient to meet market demand. Its new goal is therefore to become a "Full-Stack AI Memory Creator": SK hynix will act as a co-architect, partner, and ecosystem contributor, exceeding customer expectations in AI computing and jointly solving customers' challenges through active cooperation across the ecosystem.

To implement the new vision, Kwak Noh-Jung also revealed the product lineup: custom HBM (Custom HBM), AI DRAM (AI-D), and AI NAND (AI-N). This portfolio moves away from traditional compute-centric memory solutions toward diversified and expanded memory functions, aiming to use computing resources more efficiently and to structurally resolve the AI inference bottleneck.

On Custom HBM, SK hynix emphasized that the AI market is shifting from commoditized products toward inference efficiency and optimization, so HBM is evolving from a standard product into a customized one. Custom HBM integrates specific GPU and ASIC functions onto the HBM base die to reflect customer requirements. This maximizes GPU and ASIC performance while reducing the power consumed by data transfers through HBM, significantly improving system efficiency.

In response to market demand, SK hynix has further segmented AI DRAM (AI-D) and is preparing memory solutions tailored to the needs of each field.

1. AI-D O (Optimization): low-power, high-performance DRAM designed to reduce total cost of ownership (TCO) and improve operating efficiency. The lineup includes MRDIMM (Multiplexed Rank Dual Inline Memory Module, which boosts speed by operating two ranks simultaneously), SOCAMM (Small Outline Compression Attached Memory Module, a low-power DRAM module for AI servers), and LPDDR5R (low-power DDR5 with reliability, availability, and serviceability (RAS) features for mobile products).

2. AI-D B (Breakthrough): solutions for overcoming the "memory wall", featuring ultra-high-capacity memory and flexible memory allocation. This category includes CMM (CXL Memory Module, built on Compute Express Link, a next-generation interface that efficiently connects CPUs, GPUs, memory, and other components) and PIM (Processing-In-Memory, a next-generation technology that moves computation into memory to resolve the data-movement bottlenecks of AI and big-data workloads).

3. AI-D E (Expansion): aims to extend DRAM use cases beyond data centers into areas such as robotics, mobility, and industrial automation. This category also includes HBM.
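The rationale for PIM in the list above can be sketched with a back-of-the-envelope energy model. All energy figures here are rough illustrative assumptions, not measured device data: the point is that moving a byte off-chip from DRAM typically costs far more energy than computing on it, so cutting off-chip traffic dominates the savings.

```python
# Why PIM helps: off-chip data movement dominates energy cost.
# Both energy figures below are assumed illustrative values (picojoules).
DRAM_ACCESS_PJ_PER_BYTE = 20.0  # assumed cost to move one byte from DRAM
FLOP_PJ = 1.0                   # assumed cost of one arithmetic operation

def energy_nj(bytes_moved, flops):
    """Total energy in nanojoules for a kernel that moves data and computes."""
    return (bytes_moved * DRAM_ACCESS_PJ_PER_BYTE + flops * FLOP_PJ) / 1000.0

# A memory-bound kernel: 1 FLOP per 4 bytes moved (e.g. a streaming reduction).
conventional = energy_nj(bytes_moved=4096, flops=1024)
# PIM computes inside the memory device; assume it cuts off-chip traffic 8x.
pim = energy_nj(bytes_moved=4096 / 8, flops=1024)

print(conventional)  # 82.944 nJ, dominated by data movement
print(pim)           # 11.264 nJ
```

Under these assumptions the conventional path spends almost 99% of its energy on data movement, which is exactly the budget PIM targets.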

Beyond DRAM, SK hynix is also preparing three next-generation storage solutions under AI NAND (AI-N).

1. AI-N P (Performance): emphasizes ultra-high performance and is designed to efficiently handle the large volumes of data generated by large-scale AI inference workloads, dramatically improving processing speed and energy efficiency by minimizing bottlenecks between storage and AI operations. SK hynix plans to design the NAND and controller around a new architecture, targeting samples by the end of 2026.

2. AI-N B (Bandwidth): expands bandwidth by vertically stacking NAND dies, compensating for the limits on HBM capacity growth. The key is combining HBM's stacked structure with high-density, cost-effective NAND flash.

3. AI-N D (Density): achieves ultra-high capacity to enhance cost competitiveness. This high-density solution stores massive amounts of AI data at low power and low cost. SK hynix aims to raise density from the terabyte (TB) scale of today's QLC-based solid-state drives (SSDs) to the petabyte (PB) scale, delivering a mid-tier storage solution that combines the speed of SSDs with the cost-effectiveness of HDDs.
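The trade-off behind stacking NAND for bandwidth (AI-N B) versus DRAM stacks like HBM can be sketched with simple arithmetic. The die counts, capacities, and per-die bandwidths below are hypothetical placeholders, not SK hynix specifications; they only show how aggregate capacity and bandwidth each scale with the stack.

```python
# Stacking sketch: all figures are hypothetical, not product specs.
def stack_metrics(num_dies, die_capacity_gb, die_bandwidth_gbps):
    """Aggregate capacity (GB) and bandwidth (GB/s) of a vertical stack,
    assuming each die contributes an independent channel."""
    return num_dies * die_capacity_gb, num_dies * die_bandwidth_gbps

# A hypothetical 16-high NAND stack vs. a hypothetical 8-high DRAM stack.
nand_cap, nand_bw = stack_metrics(16, 512, 64)  # dense, slower per die
dram_cap, dram_bw = stack_metrics(8, 3, 256)    # fast, far less dense

print(nand_cap, nand_bw)  # 8192 1024 -> large capacity, usable bandwidth
print(dram_cap, dram_bw)  # 24 2048   -> high bandwidth, small capacity
```

The asymmetry is the motivation stated above: stacked NAND cannot match DRAM's bandwidth per die, but it offers orders of magnitude more capacity per stack, which is the gap AI-N B aims to bridge.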

Kwak Noh-Jung emphasized that in the AI era, the companies expected to succeed are those that create stronger synergies and superior products by collaborating with customers and partners. As part of its cooperation with global leaders, SK hynix not only works with NVIDIA on HBM but also uses NVIDIA Omniverse to improve fab productivity through "fab digital twins". The company also has a long-term partnership with OpenAI to supply high-performance memory, and is working closely with Taiwan Semiconductor Manufacturing Company (TSMC) on next-generation HBM base dies.

In storage technology, the company is working with Sandisk to jointly develop global standards for HBF (High Bandwidth Flash). SK hynix is also working with NAVER Cloud to optimize next-generation AI memory and storage products for real data center environments. Going forward, SK hynix will continue to prioritize customer satisfaction and work with partners to overcome limitations and create the future.


