The Financial News 247

ICMSP Could Drive Additional 100EB Of AI Storage

By News Room | January 16, 2026

At CES 2026, Nvidia CEO Jensen Huang made many announcements tied to the introduction of Nvidia's Vera Rubin AI architecture and the open-source AI foundation models the company has created for various applications, including Alpamayo for automotive. One interesting announcement from Nvidia was an Inference Context Memory Storage Platform, or ICMSP, built on the Nvidia BlueField-4 intelligent network interface card (NIC). This is described as a new kind of AI-native storage infrastructure designed for gigascale inference, intended to accelerate and scale agentic AI. An image of the BlueField-4 NIC is shown below.

So, what is context memory? During an interaction with an AI system, data is generated about that interaction; this is its context. If that context information is saved, it can make future interactions with the AI more consistent, coherent, and personalized. It allows the AI to remember details across conversations, learn a user's unique patterns, and understand complex, multi-turn interactions by storing relevant data beyond the immediate prompt. By placing this information in long-term storage, the context can be retained for future interactions.
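As a conceptual illustration of retaining context beyond a single session, the idea can be sketched as a keyed persistence layer. The class and file layout below are entirely hypothetical, not Nvidia's ICMSP interface:

```python
import json
from pathlib import Path

class ContextStore:
    """Toy sketch of long-term context memory: persists per-session
    interaction context so a later session can reload it. Purely
    illustrative; not Nvidia's actual API."""

    def __init__(self, root: str):
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)

    def save(self, session_id: str, context: dict) -> None:
        # Persist the context beyond the immediate prompt.
        (self.root / f"{session_id}.json").write_text(json.dumps(context))

    def load(self, session_id: str) -> dict:
        # Reload prior context so the next interaction stays coherent.
        path = self.root / f"{session_id}.json"
        return json.loads(path.read_text()) if path.exists() else {}
```

The key property is that a session's context survives the session itself, so a later query can start from remembered state rather than a blank slate.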

Beyond its usefulness for continuity, context memory storage can also reduce the computation an AI system must perform for individual queries, since data is retrieved from storage rather than regenerated or held in expensive and limited HBM. This saves energy and allows more efficient use of GPU processing and memory by reusing the context from prior interactions. The data takes the form of a key-value, or KV, cache.

I spoke with Kevin Deierling, an executive at Nvidia, about the BlueField smart Ethernet NIC, or data processing unit (DPU). He told me that the ICMSP is a network storage system that can consist of SSDs and/or HDDs for storing and accessing context memory data. It thus constitutes a new tier of storage between traditional enterprise storage and the HBM DRAM that holds the data being actively processed by GPUs.

The Nvidia ICMSP will provide 16TB of storage per GPU, which can add up to petabytes of shared context across a GPU cluster to support very large workloads. Throughput is targeted at 800Gb/s through the BlueField-4 board. The ICMSP holds an interesting form of data in that, if needed, it can be regenerated, unlike the data typically kept in enterprise storage systems. This means traditional data-retention requirements, such as redundancy, can be relaxed while still meeting the needs of context memory storage. Thus four-nines reliability might be acceptable, versus the nine-nines required of conventional enterprise storage.
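The stated figures are easy to check with arithmetic. The 8,192-GPU cluster size below is an assumed example, not an Nvidia specification:

```python
# Back-of-envelope numbers for the ICMSP capacity and reliability figures.
TB = 10**12
PB = 10**15

GPU_COUNT = 8_192                 # assumed example cluster size
PER_GPU_BYTES = 16 * TB           # 16TB of context storage per GPU

total = GPU_COUNT * PER_GPU_BYTES
print(f"Cluster context storage: {total / PB:.1f} PB")   # 131.1 PB

# "Four nines" vs "nine nines" reliability, expressed as yearly downtime:
MINUTES_PER_YEAR = 365 * 24 * 60
for nines in (4, 9):
    unavailability = 10 ** -nines
    print(f"{nines} nines -> {MINUTES_PER_YEAR * unavailability:.4f} min/yr")
```

At four nines the system may be unavailable for roughly 53 minutes a year; since the cached context can simply be regenerated when missed, that relaxed target is tolerable in a way it would not be for primary enterprise data.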

Nvidia said that ICMSP products from AIC, Cloudian, DDN, Dell Technologies, HPE, Hitachi Vantara, IBM, Nutanix, Pure Storage, Supermicro, VAST Data and WEKA will be available in the second half of 2026.

In addition to Kevin, I also spoke with Phil Manez, VP of GTM Execution at VAST Data, and Jeremy Werner, SVP & GM of the Core Data Center Business Unit at Micron, about their plans and observations on memory, storage and the ICMSP. Phil spoke about this year's shortages across all types of memory and storage.

He also pointed out that 16TB per GPU in an ICMSP could easily result in an additional 100 exabytes of context memory data being stored, placing additional demands on storage, particularly solid-state NAND storage. He said the ICMSP could provide a premium inference experience for customers. The image below shows a conceptual drawing of VAST's implementation of an ICMSP.
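The 100EB figure can be sanity-checked against the 16TB-per-GPU allocation; the result is a rough back-of-envelope implication, not a sourced forecast:

```python
# How many GPUs would it take to generate 100EB of context memory
# at 16TB of context storage per GPU?
TB = 10**12
EB = 10**18

per_gpu = 16 * TB
gpus_for_100eb = (100 * EB) / per_gpu
print(f"{gpus_for_100eb:,.0f} GPUs")   # 6,250,000
```

In other words, 100EB of context data corresponds to ICMSP deployment across about 6.25 million GPUs, which gives a sense of the scale of inference infrastructure the projection assumes.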

VAST allows policies to be added in its system, such as providing premium user experiences. Phil said that VAST has an advantage for this type of storage through its use of erasure coding, with an overhead ratio of only about 3% with n+4 redundancy. VAST also has extensive data-reduction capabilities, in which only the differences between otherwise very similar files are stored.
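The ~3% overhead and n+4 redundancy figures together pin down the stripe width: 4 parity shards over n data shards gives an overhead of 4/n, so n works out to roughly 133. The exact width here is an inference from those two numbers, not a VAST-published figure:

```python
def overhead(n_data: int, n_parity: int) -> float:
    """Storage overhead ratio of an n_data + n_parity erasure code:
    extra parity bytes stored per byte of user data."""
    return n_parity / n_data

# With 4 parity shards, ~3% overhead implies a wide stripe (~133 data shards):
for n in (50, 100, 133):
    print(f"{n}+4 erasure code: {overhead(n, 4):.1%} overhead")
```

Wide stripes keep redundancy cheap in capacity terms, which matters when the protected data is a very large but regenerable context cache.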

In addition to its ICMSP, VAST has what it calls a flash reclamation program, which lets a company reuse its existing SSD storage under VAST; the offering is currently in a soft launch.

Jeremy Werner of Micron said that this is a very good time to be in the memory and storage business and spoke about the company's investments in additional production capacity in Boise and New York state, which should result in 3.6M square feet of DRAM fabrication space in the US. He said the storage and memory hierarchy is gaining more specialized layers, and he also spoke about trends toward memory disaggregation.

He said the company's Gen 6 NVMe SSD is in qualification, that at the 2025 Supercomputing Conference Micron demonstrated 230M IOPS in a single storage server, and that the company is working on additional innovations such as Storage Next.

Micron is also looking at the large amount of storage required for context KV cache. Werner mentioned the 245TB E3L form-factor SSDs the company is introducing this year. He foresaw overall DRAM supply growth in the high teens to 20% this year, but demand in 2026 will be much higher, leading to higher prices and some constraints on AI data center buildouts.

Nvidia's inference context memory storage initiative, based on the BlueField-4 DPU, will drive even greater demand for storage to support higher-quality and more efficient AI inference experiences.