
HBM And Emerging Memory Technologies For AI


Introduction to AI and Mobile Networks

During a congressional hearing before the House of Representatives’ Energy & Commerce Committee Subcommittee on Communications and Technology, Ronnie Vasishta, Senior VP of Telecom at Nvidia, said that mobile networks will be called upon to support a new kind of traffic: AI traffic. This AI traffic includes the delivery of AI services to the edge and inferencing at the edge. Such growth in AI data could reverse the general trend toward slower traffic growth on mobile networks.

The Rise of AI Traffic

Many AI-enabled applications will require mobile connectivity, including autonomous vehicles, smart glasses, generative AI services, and many others. Vasishta said that the transmission of this massive increase in data needs to be resilient, fit for purpose, and secure. Supporting this creation of data from AI will require large amounts of memory, particularly very-high-bandwidth memory such as HBM, resulting in strong demand for memory that supports AI applications.

Micron’s HBM4 Memory

Micron announced that it is now shipping HBM4 memory to key customers for early qualification efforts. Micron’s HBM4 provides up to 2.0 TB/s of bandwidth and 24 GB of capacity per 12-high die stack. The company says that its HBM4 uses its 1-beta DRAM node and advanced through-silicon via (TSV) technology, and has a highly capable built-in self-test.
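The headline figures above imply some simple per-die and per-package math. A minimal sketch, using the article’s 2.0 TB/s and 24 GB per 12-high stack; the 8-stack package count below is a hypothetical example for illustration, not a product specification:

```python
# Illustrative arithmetic on HBM4 headline numbers quoted in the article:
# 2.0 TB/s bandwidth and 24 GB capacity per 12-high die stack.
# The 8-stack package is a hypothetical example, not a product spec.

STACK_BANDWIDTH_TBPS = 2.0   # bandwidth per 12-high HBM4 stack
STACK_CAPACITY_GB = 24       # capacity per 12-high HBM4 stack
DIES_PER_STACK = 12

# Capacity contributed by each DRAM die in the stack
per_die_gb = STACK_CAPACITY_GB / DIES_PER_STACK   # 2.0 GB per die

# A hypothetical accelerator package carrying 8 HBM4 stacks
stacks = 8
total_bandwidth_tbps = stacks * STACK_BANDWIDTH_TBPS   # 16.0 TB/s aggregate
total_capacity_gb = stacks * STACK_CAPACITY_GB         # 192 GB aggregate

print(per_die_gb, total_bandwidth_tbps, total_capacity_gb)
```

This is the kind of aggregation that gives modern AI accelerators their memory bandwidth: many stacks, each with a massively wide interface, summed across the package.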

HBM Memory and AI Applications

HBM memory, which consists of stacks of DRAM die with massively parallel interconnects to provide high bandwidth, is combined with GPUs such as those from Nvidia. Placing this memory close to the processor enables training and inference of various AI models. The current generation of GPUs uses HBM3e memory. At the March 2025 GTC in San Jose, Jensen Huang said that Micron HBM memory was being used in some of Nvidia’s GPU platforms.

HBM Memory Manufacturers

The manufacturers of HBM memories are SK Hynix, Samsung, and Micron, with SK Hynix and Samsung providing the majority of supply and Micron coming in third. SK Hynix was the first to announce HBM memory in 2013, and it was adopted as an industry standard by JEDEC that same year. Samsung followed in 2016, and in 2020 Micron said that it would create its own HBM memory. All of these companies expect to be shipping HBM4 memories in volume sometime in 2026.

Emerging Memory Technologies

Numem, a company involved in magnetic random access memory (MRAM) applications, recently discussed how traditional memories used in AI applications, such as DRAM and SRAM, have limitations in power, bandwidth, and storage density. The company said that processing performance has skyrocketed by 60,000X over the past 20 years while DRAM bandwidth has improved only 100X, creating a “memory wall.”
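Those two growth figures can be compared directly. A back-of-the-envelope sketch of the gap they imply, using only the 60,000X and 100X numbers quoted above (the per-year rates are derived illustrations, not figures from the article):

```python
# The "memory wall" gap implied by the figures quoted by Numem:
# compute performance up ~60,000X over 20 years vs. ~100X for DRAM bandwidth.
compute_gain = 60_000
dram_bw_gain = 100

# How much the gap between compute and memory bandwidth has widened
gap = compute_gain / dram_bw_gain   # 600X

# Equivalent compound annual growth rates over 20 years (illustrative)
years = 20
compute_cagr = compute_gain ** (1 / years) - 1   # roughly 73% per year
dram_cagr = dram_bw_gain ** (1 / years) - 1      # roughly 26% per year

print(gap, round(compute_cagr, 2), round(dram_cagr, 2))
```

Even a modest-looking difference in annual growth rates, compounded over two decades, leaves memory bandwidth hundreds of times behind compute, which is the pressure driving interest in MRAM and other emerging memories.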

AI Memory Engine

The company says that its AI Memory Engine is a highly configurable memory-subsystem IP that enables significant improvements in power efficiency, performance, intelligence, and endurance. It supports not only Numem’s MRAM-based architecture but also third-party MRAM, RRAM, PCRAM, and flash memory.

Future of Memory Technologies

Numem said that it has developed next-generation MRAM supporting die densities up to 1GB that can deliver SRAM-class performance with up to 2.5X higher memory density in embedded applications and 100X lower standby power consumption. The company says its solutions are foundry-ready and production-capable today.

Projections for Emerging Memories

Coughlin Associates and Objective Analysis, in their Deep Look at New Memories report, predict that AI and other memory-intensive applications, including AI inference in embedded devices such as smart watches and hearing aids that already use MRAM, RRAM, and other emerging memory technologies, will decrease the costs and increase the production of these memories.

Conclusion

AI will generate increased demand for memory to support training and inference, and it will also increase the demand for data over mobile networks. This will drive demand for HBM memory and increase demand for new emerging memory technologies.

FAQs

Q: What is AI traffic?
A: AI traffic refers to the delivery of AI services to the edge, or inferencing at the edge, over mobile networks.
Q: What is HBM memory?
A: HBM (High-Bandwidth Memory) is a type of memory that provides high bandwidth and is used in applications such as AI and machine learning.
Q: Who are the manufacturers of HBM memory?
A: The manufacturers of HBM memories are SK Hynix, Samsung, and Micron.
Q: What are emerging memory technologies?
A: Emerging memory technologies include MRAM, RRAM, PCRAM, and Flash Memory, which offer improvements in power efficiency, performance, and storage density compared to traditional memories.
Q: What is the projected market size for emerging memories?
A: The projected market size for emerging memories is $100B, with NOR and SRAM expected to be replaced by new memories within the next decade.
