The Financial News 247

ICMSP Could Drive Additional 100EB Of AI Storage

By News Room | January 16, 2026 | 5 Mins Read

At CES 2026, Jensen Huang made many announcements around the introduction of Nvidia's Vera Rubin AI architecture and the open-source AI foundation models the company has created for various applications, including Alpamayo for automotive. One interesting announcement from Nvidia was an Inference Context Memory Storage Platform, or ICMSP, built on the Nvidia BlueField-4 intelligent network interface card (NIC). It is described as a new kind of AI-native storage infrastructure designed for gigascale inference, intended to accelerate and scale agentic AI. An image of the BlueField-4 NIC is shown below.

So, what is context memory? During an interaction with an AI system, data is generated about that interaction; this data is its context. If that context is saved, it can make future interactions with the AI more consistent, coherent, and personalized. It allows the AI to remember details across conversations, learn unique patterns, and understand complex, multi-turn interactions by keeping relevant data beyond the immediate prompt. By writing this context to long-term storage, it can be retained for future interactions.

Besides the usefulness of context memory for personalization, storing it can also reduce the computation an AI system must perform for individual queries, since data is recovered from storage rather than regenerated or kept in expensive and limited HBM. This saves energy and allows more efficient use of GPU compute and memory by reusing the context from prior interactions. This data takes the form of a key-value, or KV, cache.
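The saving described above can be sketched with a toy prefix-keyed store (an illustration of the KV-cache idea only, not Nvidia's actual API): once the key/value tensors for a conversation prefix have been computed and persisted, a later turn over the same prefix reads them back instead of redoing the GPU work.

```python
# Toy sketch of a context-memory tier for KV-cache reuse.
# Illustrative only; names and structure are assumptions, not Nvidia's API.
import hashlib

class ContextMemoryStore:
    """Maps a hash of a token prefix to its (mock) KV-cache blob."""
    def __init__(self):
        self._store = {}      # would be SSD/HDD-backed in an ICMSP
        self.recomputes = 0   # counts expensive "GPU" recomputations

    def _key(self, tokens):
        return hashlib.sha256(" ".join(tokens).encode()).hexdigest()

    def kv_for_prefix(self, tokens):
        k = self._key(tokens)
        if k not in self._store:
            self.recomputes += 1                       # expensive path
            self._store[k] = [f"kv({t})" for t in tokens]
        return self._store[k]

store = ContextMemoryStore()
turn = ["user:", "hello", "assistant:", "hi"]
store.kv_for_prefix(turn)   # computed once
store.kv_for_prefix(turn)   # served from the context store
print(store.recomputes)     # 1
```

The second lookup hits the store, so the model never recomputes the shared prefix; that is the energy and GPU-time saving the paragraph describes.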

I spoke with Kevin Deierling, an executive at Nvidia, about the BlueField smart Ethernet NIC, or data processing unit (DPU). He told me that the ICMSP is a network storage system, built from SSDs and/or HDDs, for storing and accessing context memory data. It thus forms a new tier of storage between traditional enterprise storage and the HBM DRAM that holds the data being processed by the GPUs.

The Nvidia ICMSP will provide 16TB of storage per GPU, which can enable petabytes of shared context across a GPU cluster to support very large workloads. Throughput is targeted at 800Gb/s through the BlueField-4 board. The ICMSP holds an unusual form of data in that, if lost, it can be regenerated, unlike the data typically stored in enterprise storage systems. This means traditional data retention requirements, such as redundancy, can be relaxed while still meeting the needs of context memory storage: four nines of reliability might be acceptable versus the nine nines required of conventional enterprise storage.
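A quick back-of-envelope calculation (my arithmetic, not a figure from Nvidia) shows why four nines is tolerable for regenerable data: at 99.99% reliability, the expected loss on 16TB of per-GPU context is only a couple of gigabytes, all of which can be recomputed.

```python
# Expected loss at a given number of "nines" of reliability.
# Back-of-envelope arithmetic; the 16TB/GPU figure is from the article.

def expected_loss(nines: int, capacity_tb: float) -> float:
    """Expected lost/unavailable capacity, in TB."""
    loss_fraction = 10 ** (-nines)   # e.g. 4 nines -> 1e-4
    return capacity_tb * loss_fraction

per_gpu_tb = 16.0
print(expected_loss(4, per_gpu_tb))   # 0.0016 TB (~1.6 GB), regenerable
print(expected_loss(9, per_gpu_tb))   # 1.6e-08 TB (~16 KB)
```

Relaxing from nine nines to four nines trades five orders of magnitude of expected loss for cheaper redundancy, which is acceptable only because the KV-cache data can be regenerated.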

Nvidia said that ICMSP products from AIC, Cloudian, DDN, Dell Technologies, HPE, Hitachi Vantara, IBM, Nutanix, Pure Storage, Supermicro, VAST Data and WEKA will be available in the second half of 2026.

In addition to Kevin, I also spoke with Phil Manez, VP of GTM Execution at VAST, and Jeremy Werner, SVP & GM of the Core Data Center Business Unit at Micron, about their plans and observations on memory, storage and the ICMSP. Phil spoke about this year's shortages across all types of memory and storage.

He also pointed out that 16TB per GPU in an ICMSP could easily result in an additional 100 exabytes of context memory data being stored, adding to the demand for storage, particularly solid-state NAND storage. He said that the ICMSP could provide a premium inference experience for customers. The image below shows a conceptual drawing of VAST's implementation of an ICMSP.
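The 100-exabyte figure is easy to sanity-check from the 16TB-per-GPU number: it corresponds to roughly six million GPUs (the GPU count is my inference from the two published numbers, not a quoted install base).

```python
# Sanity check of the ~100 EB figure from 16 TB of context per GPU.
# Decimal units throughout; the implied GPU count is an inference.

TB_PER_GPU = 16
EB_TARGET = 100
TB_PER_EB = 1_000_000   # 1 EB = 1,000,000 TB (decimal)

gpus_needed = EB_TARGET * TB_PER_EB / TB_PER_GPU
print(f"{gpus_needed:,.0f} GPUs")   # 6,250,000 GPUs
```

Given current AI data center buildout plans, an installed base on that order is plausible, which is why the projection translates into real pressure on NAND supply.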

VAST allows policies to be added to its system, such as providing premium user experiences. Phil said that VAST has an advantage for this type of storage through its use of erasure coding, with an overhead of only about 3% using n+4 redundancy. VAST also has extensive data-reduction capability, in which only the differences between otherwise very similar files are stored.

In addition to its ICMSP, VAST also has what it calls a flash reclamation program, which can reuse a company's existing SSD storage under VAST; this offering is currently in a soft launch.

Jeremy Werner of Micron said that this is a very good time to be in the memory and storage business and spoke about the company's investments in additional production capacity in Boise and in New York state, which should result in 3.6M square feet of DRAM fabrication space in the US. He said that the storage and memory hierarchy is gaining more specialized layers, and he also spoke about trends toward memory disaggregation.

He said that the company's Gen 6 NVMe SSD is in qualification, that at the 2025 Supercomputing Conference Micron demonstrated 230M IOPS from a single storage server, and that the company is working on further innovations such as Storage Next.

Micron is also looking at the large amount of storage required for context KV-cache storage. He mentioned the company's 245TB E3.L form-factor SSDs, which it is introducing this year. He foresaw overall DRAM supply growth in the high teens to 20% this year, but demand will be much higher in 2026, leading to higher prices and some constraints on AI data center buildouts.

Nvidia's inference context memory storage initiative, based on the BlueField-4 DPU, will drive even greater demand for storage to support higher-quality and more efficient AI inference experiences.

Tags: BlueField, context memory, Dell, HPE, IBM, Micron, Nvidia, Pure Storage, Supermicro, VAST