The Financial News 247

ICMSP Could Drive Additional 100EB Of AI Storage

By News Room | January 16, 2026 | 5 Mins Read

At CES 2026, Jensen Huang made many announcements tied to the introduction of Nvidia's Vera Rubin AI architecture and the open-source AI foundation models the company has created for various applications, including Alpamayo for automotive. One interesting announcement was an Inference Context Memory Storage Platform, or ICMSP, built around the Nvidia BlueField-4 intelligent network interface card (NIC). Nvidia describes it as a new kind of AI-native storage infrastructure designed for gigascale inference, intended to accelerate and scale agentic AI. An image of the BlueField-4 NIC is shown below.

So, what is context memory? During an interaction with an AI system, data is generated that describes the interaction itself: its context. If that context is saved, future interactions with the AI can be more consistent, coherent, and personalized. It allows the AI to remember details across conversations, learn a user's particular patterns, and handle complex, multi-turn interactions by retaining relevant data beyond the immediate prompt. Writing this context information to long-term storage preserves it for future interactions.

Beyond its usefulness for continuity, context memory storage can also reduce the computation an AI system must perform for individual queries: the data is retrieved from storage rather than regenerated or held in expensive, limited HBM. That saves energy and makes more efficient use of GPU compute and memory by reusing the context from prior interactions. This data takes the form of a key-value, or KV, cache.

I spoke with Kevin Deierling, an executive from Nvidia, about the BlueField smart Ethernet NIC, or data processing unit (DPU). He told me that the ICMSP is a network storage system, built from SSDs and/or HDDs, for storing and accessing context memory data. It thus forms a new tier of storage between traditional enterprise storage and the HBM DRAM that holds the data being processed by the GPUs.

The Nvidia ICMSP will provide 16TB of storage per GPU, which can add up to petabytes of shared context across a GPU cluster to support very large workloads. Throughput is targeted at 800Gb/s through the BlueField-4 board. The data the ICMSP holds is unusual in that, if needed, it can be regenerated, unlike the data typically stored in enterprise storage systems. That means traditional data-retention requirements, such as redundancy, can be relaxed and still meet the needs of context memory storage: four-nines reliability might be acceptable, versus the nine-nines required in conventional enterprise storage.
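The durability trade-off is easy to quantify. Treating each "nines" figure as an annual survival probability per object (my framing, not Nvidia's), the expected annual loss for a large cache of regenerable objects looks like this:

```python
# Back-of-envelope: what relaxed durability costs for regenerable KV-cache
# data. "Nines" here means annual durability, e.g. 4 nines = 99.99%.
def expected_loss_per_year(num_objects: int, nines: int) -> float:
    durability = 1 - 10 ** (-nines)
    return num_objects * (1 - durability)

objects = 1_000_000_000  # a billion cached context objects (illustrative)
print(expected_loss_per_year(objects, 4))  # ~100,000 lost, all regenerable
print(expected_loss_per_year(objects, 9))  # ~1 lost
```

Losing roughly one object in ten thousand per year is trivial when each lost object can simply be recomputed from the original conversation, which is why the relaxed tier can be built far more cheaply than nine-nines enterprise storage.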

Nvidia said that ICMSP products from AIC, Cloudian, DDN, Dell Technologies, HPE, Hitachi Vantara, IBM, Nutanix, Pure Storage, Supermicro, VAST Data and WEKA will be available in the second half of 2026.

In addition to Kevin, I also spoke with Phil Manez, VP of GTM Execution at VAST, and Jeremy Werner, SVP & GM of the Core Data Center Business Unit at Micron, about their plans and observations on memory, storage, and the ICMSP. Phil spoke about this year's shortages across all types of memory and storage.

He also pointed out that 16TB per GPU in an ICMSP could easily result in an additional 100 exabytes of context memory data being stored, placing additional demands on storage, particularly solid-state NAND storage. He said the ICMSP could provide a premium inference experience for customers. The image below shows a conceptual drawing of VAST's implementation of an ICMSP.
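The 100-exabyte estimate is straightforward to sanity-check against the 16TB-per-GPU figure (the arithmetic below is mine, not VAST's):

```python
# Sanity check on the ~100EB estimate: how many GPUs at 16 TB each?
TB = 10 ** 12
EB = 10 ** 18

per_gpu_bytes = 16 * TB
target_bytes = 100 * EB

gpus_needed = target_bytes / per_gpu_bytes
print(f"{gpus_needed:,.0f} GPUs")  # ~6.25 million GPUs at 16 TB each
```

A fleet of several million inference GPUs is plausible at industry scale, which is why the estimate translates into meaningful new NAND demand.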

VAST allows adding policies in its system, such as providing premium user experiences. Phil said that VAST has an advantage for this type of storage through its use of erasure coding, with an overhead of only about 3% using n+4 redundancy. VAST also has extensive data-reduction capability, storing only the differences between otherwise very similar files.
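Those two numbers together imply a very wide erasure-coding stripe. Taking overhead as parity shards divided by data shards (my reading of the figures; the shard counts below are hypothetical), 4 parity shards at ~3% overhead require on the order of 130+ data shards per stripe:

```python
# Erasure-coding overhead = parity shards / data shards. With 4 parity
# shards, ~3% overhead implies roughly 4 / 0.03 ~= 133 data shards per
# stripe, far wider than a classic RAID-6-style layout such as 8+2.
def ec_overhead(data_shards: int, parity_shards: int) -> float:
    return parity_shards / data_shards

print(ec_overhead(133, 4))  # ~0.03, i.e. about 3% capacity overhead
print(ec_overhead(8, 2))    # 0.25, i.e. 25% overhead
```

Wide stripes keep capacity overhead low while still tolerating four simultaneous device failures, a good match for a storage tier where the data is regenerable anyway.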

In addition to its ICMSP, VAST also has what it calls a flash reclamation program, which can reuse a company's existing SSD storage under VAST; the offering is currently in a soft launch.

Jeremy Werner from Micron said that this is a very good time to be in the memory and storage business and spoke about the company's investments in additional production capacity in Boise and New York State, which should result in 3.6M square feet of DRAM fabrication space in the US. He said the storage and memory hierarchy is gaining more specialized layers, and he also spoke about trends toward memory disaggregation.

He said that the company's Gen 6 NVMe SSD is in qualification, that at the 2025 Supercomputing conference Micron demonstrated 230M IOPS from a single storage server, and that the company is working on further innovations such as Storage Next.

Micron is also looking at the large amount of storage required for context KV-cache data. Werner mentioned the 245TB E3.L form-factor SSDs the company is introducing this year. He foresaw overall DRAM supply growth in the high teens to 20% this year, but said demand will be much higher in 2026, leading to higher prices and some constraints on AI data center buildouts.

Nvidia's inference context memory storage initiative, based on the BlueField-4 DPU, will drive even greater demand for storage to support higher-quality and more efficient AI inference experiences.

Tags: BlueField, context memory, Dell, HPE, IBM, Micron, Nvidia, Pure Storage, Supermicro, VAST
© 2026 The Financial 247. All Rights Reserved.