The Financial News 247
ICMSP Could Drive Additional 100EB Of AI Storage

By News Room | January 16, 2026

At CES 2026, Jensen Huang made many announcements tied to the introduction of Nvidia's Vera Rubin AI architecture and the open-source AI foundation models the company has created for various applications, including Alpamayo for automotive. One interesting announcement from Nvidia was an Inference Context Memory Storage Platform, or ICMSP, built on the Nvidia BlueField-4 intelligent network interface card (NIC). Nvidia describes this as a new kind of AI-native storage infrastructure designed for gigascale inference, intended to accelerate and scale agentic AI. An image of the BlueField-4 NIC is shown below.

So, what is context memory? During an interaction with an AI system, data is generated about that interaction; this is its context. If that context is saved, future interactions with the AI can be more consistent, coherent and personalized. It allows the AI to remember details across conversations, learn a user's unique patterns, and handle complex, multi-turn interactions by keeping relevant data beyond the immediate prompt. Writing this context to long-term storage retains it for future interactions.

Beyond its usefulness for continuity, context memory storage can also reduce the computation an AI system must perform for individual queries: data is recovered from storage rather than regenerated or held in expensive and limited HBM, saving energy and allowing more efficient use of GPU processing and memory. This data takes the form of a key-value (KV) cache.
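The compute saving described above can be illustrated with a toy sketch (all names here are hypothetical, and the real KV cache holds attention tensors, not strings): results computed once for a given prompt prefix are persisted in a store and fetched, rather than recomputed, when that prefix recurs.

```python
# Toy sketch of KV-cache reuse (illustrative only): instead of recomputing
# keys/values for a prompt prefix seen before, fetch them from a store
# keyed by a hash of the token prefix.
import hashlib

kv_store = {}  # stands in for the ICMSP storage tier


def prefix_key(tokens):
    """Stable lookup key for a token prefix."""
    return hashlib.sha256(" ".join(tokens).encode()).hexdigest()


def compute_kv(tokens):
    """Placeholder for the expensive GPU attention computation."""
    return [(t, f"kv({t})") for t in tokens]


def get_kv(tokens):
    """Return KV data for the prefix, computing only on a cache miss."""
    key = prefix_key(tokens)
    if key not in kv_store:
        kv_store[key] = compute_kv(tokens)  # miss: compute once, persist
    return kv_store[key]


first = get_kv(["how", "do", "I", "reset"])   # computed on the GPU
second = get_kv(["how", "do", "I", "reset"])  # served from the store
```

The second call returns the stored result without invoking the expensive computation, which is the essence of the energy and GPU-time saving the article describes.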

I spoke with Kevin Deierling, an executive at Nvidia, about the BlueField smart Ethernet NIC, or data processing unit (DPU). He told me that the ICMSP is a network storage system that can consist of SSDs and/or HDDs for storing and accessing context memory data. It thus forms a new tier of storage between traditional enterprise storage and the HBM DRAM that holds the data being processed by the GPUs.

The Nvidia ICMSP will provide 16TB of storage per GPU, which can add up to petabytes of shared context across a GPU cluster to support very large workloads. Throughput is targeted at 800Gb/s through the BlueField-4 board. The data the ICMSP retains is unusual in that, if needed, it can be regenerated, unlike the data typically held in enterprise storage systems. Traditional data-retention requirements, such as redundancy, can therefore be relaxed while still meeting the needs of context memory storage: four-nines reliability might be acceptable, versus the nine-nines required of conventional enterprise storage.
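To make the four-nines versus nine-nines comparison concrete, here is the standard arithmetic (our illustration, not an Nvidia specification): an N-nines level tolerates a failure fraction of 10 to the minus N.

```python
# "N nines" reliability: the tolerated failure fraction is 10**-N.
def nines(n):
    return 1 - 10 ** -n


four = nines(4)   # 99.99%
nine = nines(9)   # 99.9999999%

minutes_per_year = 365 * 24 * 60  # 525,600

# Expected exposure per year at each level:
print(f"4 nines: ~{(1 - four) * minutes_per_year:.0f} min/yr")
print(f"9 nines: ~{(1 - nine) * minutes_per_year * 60:.2f} s/yr")
```

Four nines allows roughly 53 minutes of loss exposure per year, nine nines only a few hundredths of a second, which is why relaxing the requirement for regenerable KV-cache data meaningfully cheapens the storage tier.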

Nvidia said that ICMSP products from AIC, Cloudian, DDN, Dell Technologies, HPE, Hitachi Vantara, IBM, Nutanix, Pure Storage, Supermicro, VAST Data and WEKA will be available in the second half of 2026.

In addition to Kevin, I spoke with Phil Manez, VP of GTM Execution at VAST, and Jeremy Werner, SVP & GM of the Core Data Center Business Unit at Micron, about their plans and observations on memory, storage and the ICMSP. Phil spoke about this year's shortages across all types of memory and storage.

He also pointed out that 16TB per GPU in an ICMSP could easily result in an additional 100 exabytes of context memory data being stored, placing additional demands on storage, particularly solid-state NAND storage. He said the ICMSP could provide a premium inference experience for customers. The image below shows a conceptual drawing of VAST's implementation of an ICMSP.
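A quick back-of-envelope check shows what deployment scale the 100EB figure implies (our arithmetic, using decimal units where 1 EB = 10^6 TB):

```python
# How many GPUs does 100 EB of context memory correspond to,
# at 16 TB of ICMSP storage per GPU?
TB_PER_GPU = 16
EB_TARGET = 100

tb_total = EB_TARGET * 10 ** 6   # 100 EB expressed in TB
gpus = tb_total / TB_PER_GPU
print(f"{gpus:,.0f} GPUs")       # 6,250,000 GPUs
```

That is on the order of six million deployed GPUs, a plausible multi-year installed base for AI data centers, which is why the projection translates into substantial new NAND demand.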

VAST allows policies to be added in its system, such as providing premium user experiences. Phil said that VAST has an advantage for this type of storage through its use of erasure coding, with an overhead of only about 3% using n+4 redundancy. VAST also has extensive data-reduction capabilities, storing only the differences between otherwise very similar files.
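The ~3% overhead and n+4 redundancy figures can be cross-checked with simple arithmetic (our consistency check, not VAST's published stripe geometry): with 4 parity shards over n data shards, the overhead is 4/n.

```python
# Erasure-coding overhead for an n+4 scheme: overhead = parity / data.
# Working backward from ~3% overhead gives the implied stripe width.
parity = 4
overhead_target = 0.03

n_data = parity / overhead_target            # implied data shards
stripe = round(n_data)
actual = parity / stripe
print(f"n ≈ {stripe} data shards per stripe")        # n ≈ 133
print(f"overhead at n={stripe}: {actual:.1%}")       # 3.0%
```

Very wide stripes like this keep redundancy cheap while still surviving four simultaneous shard failures, which fits the relaxed-durability profile of regenerable context data.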

In addition to its ICMSP, VAST has what it calls a flash reclamation program, which can reuse a company's existing SSD storage under VAST; the offering is currently in a soft launch.

Jeremy Werner of Micron said that this is a very good time to be in the memory and storage business and spoke about the company's investments in additional production capacity in Boise and New York state, which should result in 3.6M square feet of DRAM fabrication space in the US. He said the storage and memory hierarchy is gaining more specialized layers, and he also spoke about trends toward memory disaggregation.

He said that the company's Gen 6 NVMe SSD is in qualification, that at the 2025 Super Compute Conference Micron demonstrated 230M IOPS in a single storage server, and that the company is working on further innovations such as Storage Next.

Micron is also looking at the large amount of storage required for context KV caches. Werner mentioned the 245TB E3.L form factor SSDs the company is introducing this year. He foresaw overall DRAM supply growth in the high teens to 20% in 2026, but demand will be much higher, which will lead to higher prices and some constraints on AI data center buildouts.

Nvidia's inference context memory storage initiative, based on the BlueField-4 DPU, will drive even greater demand for storage to support higher-quality and more efficient AI inference experiences.

Tags: BlueField, context memory, Dell, HPE, IBM, Micron, Nvidia, Pure Storage, Supermicro, VAST