- Solidigm’s new AI lab redefines what can be achieved with dense SSD storage
- Cluster delivers record performance, but questions remain about true scalability
- 23.6 petabytes packed into 16U challenges the way storage density is viewed
Solidigm has opened its central AI lab at the FarmGPU site near its headquarters in Rancho Cordova, California.
The facility is billed as a place to study how storage interacts with AI workloads using high-performance GPUs and dense storage arrays.
The company says the lab provides one of the most compact large-scale clusters in the industry, intended to replicate the conditions found in modern data centers.
Testing record performance claims
At the center of the announcement are Solidigm’s D7-PS1010 SSDs, which were used to achieve 116 GB/s per node in MLPerf Storage benchmarks.
This figure is described as a record result, although the value of such synthetic tests for real-world AI operations remains open to debate.
Benchmarks tend to highlight peak throughput, while real-world workload performance depends heavily on software and system integration.
Still, the lab offers a platform where vendors, developers and storage partners can conduct experiments under controlled but relevant conditions.
“Our Solidigm AI core lab combines today’s most powerful GPUs with leading storage infrastructure to unlock new levels of testing and co-innovation for our customers and developer community,” said Avi Shetty, senior director of AI ecosystems and partnerships at Solidigm.
“These capabilities were previously available only to select companies. Solidigm now makes them accessible and demonstrates the importance of having storage close to the GPU.”
Perhaps the most striking aspect is the claim of 23.6PB of storage packed into just 16U of rack space.
This was achieved using 192 Solidigm D5-P5336 SSDs, each with 122.88TB of capacity.
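As a rough sanity check of that figure, the arithmetic works out as sketched below; the calculation assumes the D5-P5336’s full 122.88TB marketed capacity and decimal terabytes and petabytes, details the announcement does not spell out.

```python
# Rough sanity check of Solidigm's density claim (assumes the D5-P5336's
# 122.88TB marketed capacity and decimal TB/PB conversions).
drives = 192
tb_per_drive = 122.88          # capacity per D5-P5336
rack_units = 16

total_tb = drives * tb_per_drive        # ~23,593 TB
total_pb = total_tb / 1_000             # ~23.6 PB
pb_per_u = total_pb / rack_units        # ~1.47 PB per rack unit

print(f"{total_pb:.1f} PB total, {pb_per_u:.2f} PB per U")
```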
By volume, it may represent one of the densest publicly deployed enterprise storage clusters.
The density claim also raises questions about how such hardware compares with the best HDD options, which continue to offer a lower cost per terabyte. Where performance demands outweigh cost considerations, Solidigm’s setup could be seen as a step toward larger SSD-based deployments in production environments.
Solidigm emphasized collaboration with partners such as Metrum AI, which says it reduced DRAM usage during retrieval-augmented generation (RAG) by up to 57% by offloading data to SSDs.
Such claims suggest potential benefits for memory management, although they also highlight the dependence of AI efficiency on tightly coupled hardware and software tuning.
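Neither company has published the implementation behind that number, but the underlying idea of keeping bulky vector data on NVMe rather than in DRAM can be illustrated with a minimal, hypothetical sketch: a memory-mapped embedding table searched in batches. The file path, array shape and brute-force search below are assumptions for illustration, not Metrum AI’s actual method.

```python
import numpy as np

# Hypothetical illustration only: keep a large embedding table on an NVMe
# SSD via a memory-mapped file instead of loading it fully into DRAM.
# Path, shape and the brute-force search are assumptions for this sketch.
N_VECTORS, DIM = 10_000_000, 1024

# Embeddings live in a file on the SSD; pages are faulted in on demand,
# so resident DRAM stays far below the ~40GB the full table would need.
embeddings = np.memmap("/mnt/nvme/embeddings.f32", dtype=np.float32,
                       mode="r", shape=(N_VECTORS, DIM))

def top_k(query: np.ndarray, k: int = 5, batch: int = 100_000):
    """Brute-force similarity search in batches, touching the SSD-backed
    array one slice at a time rather than holding it all in memory."""
    best_scores = np.full(k, -np.inf)
    best_ids = np.zeros(k, dtype=np.int64)
    for start in range(0, N_VECTORS, batch):
        chunk = embeddings[start:start + batch]      # read from SSD
        scores = chunk @ query                       # dot-product scores
        ids = np.argpartition(scores, -k)[-k:]
        merged_scores = np.concatenate([best_scores, scores[ids]])
        merged_ids = np.concatenate([best_ids, ids + start])
        keep = np.argpartition(merged_scores, -k)[-k:]
        best_scores, best_ids = merged_scores[keep], merged_ids[keep]
    return best_ids[np.argsort(-best_scores)]
```

In practice, this kind of SSD-resident layout would sit behind a vector database and caching layer, which is where the tightly coupled hardware and software tuning mentioned above comes in.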
While the best SSDs in the lab demonstrate impressive speed and density, the broader market will likely weigh those gains against practical considerations such as power usage, scalability and long-term cost.
Solidigm’s AI Central Lab is positioned as a space for both innovation and marketing demonstration.
“Running storage tests is no longer enough. In our core AI lab, we can run real-world AI workloads and use our cutting-edge telemetry capabilities to optimize system performance and efficiency and gain insight into the storage needs of emerging workloads,” Shetty said.