SK hynix develops trio of AI-specific memory technologies

In an era where artificial intelligence (AI) is rapidly evolving, memory technology plays a crucial role in maximizing performance. With this in mind, SK hynix is not resting on its laurels after announcing new AI NAND technologies; the company is also developing advanced DRAM solutions tailored for AI applications. This initiative aims to solidify SK hynix's position as a full-stack provider of AI memory solutions.

Advancements in SK hynix's AI Memory Technology

At the recent “SK AI Summit 2025” in Seoul, SK hynix showcased its vision for future memory performance tailored to AI applications. The company's focus on bridging the gap between GPU capabilities and memory performance is critical, because memory bandwidth has struggled to keep pace with advances in GPU compute. This disparity is often referred to as the “Memory Wall.”
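The Memory Wall can be illustrated with a simple roofline-style estimate: when a workload performs few operations per byte fetched, attainable throughput is capped by memory bandwidth, not by the processor's peak compute. The sketch below uses purely illustrative numbers, not SK hynix or GPU vendor specifications.

```python
# Conceptual sketch of the "Memory Wall": attainable throughput is the
# minimum of the compute roof and (bandwidth x arithmetic intensity).
# All figures are illustrative assumptions for this sketch.

def attainable_tflops(peak_tflops, bandwidth_tbs, flops_per_byte):
    """Roofline-style bound: min(compute roof, bandwidth * intensity)."""
    return min(peak_tflops, bandwidth_tbs * flops_per_byte)

# Hypothetical accelerator: 1000 TFLOPS peak, 5 TB/s of HBM bandwidth.
peak, bw = 1000.0, 5.0

for intensity in (1, 10, 100, 1000):  # FLOPs performed per byte moved
    t = attainable_tflops(peak, bw, intensity)
    bound = "memory-bound" if t < peak else "compute-bound"
    print(f"intensity {intensity:>4} FLOP/B -> {t:7.1f} TFLOPS ({bound})")
```

At low arithmetic intensity the hypothetical chip delivers only a fraction of its peak, which is why faster memory, not just faster compute, determines real AI performance.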

Despite SK hynix's leadership in the high-bandwidth memory (HBM) market, the company recognizes that improving HBM alone is not sufficient to meet the demands modern AI workloads place on memory systems. To address this, SK hynix has announced plans to develop a full stack of AI memory technologies encompassing both DRAM and NAND solutions.

Key Product Innovations: Custom HBM and AI DRAM

Among the significant innovations introduced are two promising product lines: custom HBM and AI DRAM (AI-D). These products are engineered to enhance the performance of GPUs and application-specific integrated circuits (ASICs) while reducing overall data transfer power consumption.

  • Custom HBM: This product integrates customer-specific functions into the HBM itself to optimize GPU and ASIC performance. A notable design change relocates the HBM controller from the GPU die to the HBM base die, thereby improving efficiency.
  • AI DRAM (AI-D): This line is categorized into three specific technologies: Optimization (AI-D O), Breakthrough (AI-D B), and Expansion (AI-D E), each designed to tackle different challenges within the AI memory ecosystem.

Understanding AI-D: Types and Functionality

SK hynix is developing three distinct types of AI DRAM, each tailored for specific use cases:

  • AI-D O (Optimization): This low-power, high-performance DRAM is designed to reduce the total cost of ownership. It incorporates technologies such as MRDIMM, SOCAMM2, and LPDDR5R, which enhance memory access speed and reliability.
  • AI-D B (Breakthrough): Aimed at overcoming the memory wall, this product offers ultra-high-capacity memory with flexible allocation, using Compute Express Link (CXL) Memory Module (CMM) and Processing-In-Memory (PIM) technologies.
  • AI-D E (Expansion): This concept focuses on extending memory applications beyond data centers, targeting industries such as robotics, mobility, and industrial automation.

Technological Features of AI-D Innovations

Each type of AI-D features unique technologies designed to enhance performance:

  • MRDIMM: This module operates two ranks of memory simultaneously, effectively increasing memory data access speed.
  • SOCAMM2: A low-power Small Outline Compression Attached Memory Module for AI servers that adheres to an open industry standard.
  • LPDDR5R: An improved version of traditional LPDDR technology, it offers greater reliability through enhanced serviceability.
  • CMM: A memory module based on the Compute Express Link (CXL) interface, which links CPUs, GPUs, and memory to enable flexible, high-capacity memory expansion for high-performance computation.
  • PIM: This integration brings computational capabilities directly into memory, addressing data movement challenges in AI and big data processing.
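The data-movement argument behind PIM can be sketched with a toy model: instead of shipping every operand across the memory bus to the host, each memory "bank" reduces its own slice locally and only the small partial results travel. The bank count and element size below are illustrative assumptions, not actual PIM hardware parameters.

```python
# Toy model of Processing-In-Memory (PIM) for a dot product.
# Conventional path: all operands cross the bus; PIM path: each bank
# computes a partial sum in place and only partials cross the bus.

def host_dot(a, b):
    """Conventional path: every element of a and b traverses the bus."""
    bytes_moved = (len(a) + len(b)) * 4      # assume 4-byte elements
    return sum(x * y for x, y in zip(a, b)), bytes_moved

def pim_dot(a, b, banks=16):
    """PIM path: each bank reduces its slice; only partials move."""
    n = len(a)
    step = (n + banks - 1) // banks          # elements per bank
    partials = [sum(a[i] * b[i] for i in range(s, min(s + step, n)))
                for s in range(0, n, step)]
    bytes_moved = len(partials) * 4          # one partial per bank
    return sum(partials), bytes_moved

a = list(range(1024)); b = [2] * 1024
ref, host_bytes = host_dot(a, b)
pim, pim_bytes = pim_dot(a, b)
assert ref == pim                            # same result either way
print(f"host transfers {host_bytes} B, PIM transfers {pim_bytes} B")
```

The result is identical, but the PIM path moves orders of magnitude less data across the bus, which is precisely the bottleneck PIM targets in AI and big-data processing.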

Future Prospects: HBM4 and Beyond

SK hynix is actively developing advanced HBM technologies, with plans for HBM4 and HBM4E versions featuring up to 16-layer stacks. The company has also hinted at future HBM5 and HBM5E technologies, expected to arrive between 2029 and 2031. These roadmaps position the company as a leader in next-generation memory solutions.

Collaborative Efforts and Strategic Partnerships

To bolster its memory technology advancements, SK hynix is forging strategic partnerships with leading companies in the tech industry:

  • NVIDIA: Collaborating on HBM technology and enhancing fab productivity.
  • OpenAI: Engaging in long-term cooperation focused on high-performance memory solutions.
  • TSMC: Joint efforts are underway to develop next-generation HBM base dies.
  • NAVER Cloud: Working together to optimize AI memory and storage products for real-world environments.

Engagement and Resources

For a more detailed look at these advancements, SK hynix has made its presentations available online, including insights from President and CEO Noh-Jung Kwak, at the following link:

View the presentation video here. Please note that the video is in Korean.

In conclusion, SK hynix is positioning itself at the forefront of AI memory technology, with ambitious plans that could redefine the landscape of memory solutions in the era of artificial intelligence. Their innovative approach, underpinned by strategic partnerships and a clear understanding of market needs, sets a promising stage for the future of memory technology.
