The storage industry is at a tipping point. For years, hard disks dominated high-capacity workloads, but shifting economics around power, space, and performance are tilting the scales heavily toward SSDs.
Few companies illustrate that shift as clearly as Solidigm. I recently spoke with Roger Corell, the firm's senior director of AI and leadership marketing, who told me why he believes QLC-based SSDs are poised to replace HDDs in AI and data-intensive environments.
Corell highlighted Solidigm's progress so far and revealed that its roadmap extends beyond 245 TB, targeting next-generation models by 2026.
Solidigm's message goes beyond sheer scale, however. Corell pointed to the company's innovations in rack-level designs, thermal efficiency, and edge-ready architectures as key to meeting AI's growing demand for low-latency, high-concurrency storage.
Corell sees HDDs relegated to archival roles. The future, he argues, belongs to NAND – stacked denser, deployed at scale, and designed for performance, efficiency, and sustainability.
- Four SSD vendors have confirmed that they will launch SSDs larger than 245TB in the near future: SanDisk, Samsung, Kioxia, and Micron. Is Solidigm going to launch such an SSD in 2025?
Solidigm is already leading the industry in SSD capacity with the D5-P5336, which delivers up to 122.88TB, more than double the previous generation, and is shipping to customers today.
In fact, we recently passed the milestone of having shipped more than 120 exabytes (EB) of QLC SSD capacity across our four generations of QLC technology, while others in the industry are just getting started.
Additionally, while we don't pre-announce unlaunched products, our roadmap is focused on pushing the boundaries of QLC density, thermal efficiency, and rack-scale integration to meet the demands of AI and data-centric workloads.
We're actively exploring next-generation capacities, form factors, and system-level innovations that will redefine how storage is deployed at scale. This includes our plans to ship 245+TB drives by the end of 2026.
We like to say: we're not just scaling capacity; we're redefining what's possible in enterprise storage.
We also work closely with our customers to ensure our solutions are tailored to their unique workloads, especially in AI and hyperscale environments.
- I'm surprised that QLC is rapidly becoming the dominant NAND flavor when quite a few experts predicted that PLC would replace it sooner rather than later. What has changed since those predictions were made?
QLC has matured into a proven, high-density solution that delivers exceptional value in real-world deployments. Solidigm's leadership in QLC is built on delivering SSDs that meet the performance, endurance, and efficiency needs of AI, big data, and cloud environments.
PLC remains an interesting concept, but it faces major technical hurdles in endurance and reliability that QLC has already overcome.
The industry has shifted from theoretical potential to practical impact, and QLC is delivering. We're focused on maintaining a leadership portfolio based on QLC NAND.
- Solidigm and other SSD vendors are adamant that high-capacity SSDs should (and will in time) displace hard disk drives for AI workloads. What's the argument for this?
AI workloads, and other data-intensive workloads needing fast access to large data sets, demand high throughput, low latency, and energy efficiency, all areas where SSDs, especially Solidigm's QLC-based drives, have a clear advantage.
Our D5-P5336, for example, enables storing 10PB of data using just 84 SSDs, compared to more than 400 HDDs. That math speaks for itself.
That's a dramatic reduction in power, cooling, and rack space, which translates directly into lower TCO and higher performance.
SSDs are no longer just faster; they're smarter, denser, and more sustainable. While HDDs still have a place in the data stack, SSDs are more effective at meeting the latency, throughput, and concurrency demands of AI.
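For readers who want to check the drive-count math, here is a minimal sketch, assuming 122.88 TB per SSD (the D5-P5336's top capacity) and a hypothetical 24 TB nearline HDD; the HDD figure is an assumption for illustration, not a number from the interview:

```python
import math

TB_PER_PB = 1000  # decimal units, as drive vendors quote capacity

def drives_needed(target_pb: float, drive_tb: float) -> int:
    """Whole drives required to hold target_pb petabytes of raw capacity."""
    return math.ceil(target_pb * TB_PER_PB / drive_tb)

# 10 PB on 122.88 TB SSDs vs. an assumed 24 TB nearline HDD
ssd_count = drives_needed(10, 122.88)  # 82 raw; the interview quotes 84,
                                       # presumably leaving a small margin
hdd_count = drives_needed(10, 24)      # 417 drives, i.e. "more than 400"
print(ssd_count, hdd_count)
```

The raw ceiling comes out at 82 SSDs versus 417 HDDs, consistent with the "84 versus more than 400" figures quoted above once a small allowance for formatting or redundancy is included.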
- So, does that mean that you no longer see HDDs playing a role in AI inference?
For AI inference, which requires rapid data access and consistent performance, SSDs are quickly becoming the standard. HDDs may still serve in archival or cold storage roles for pools of data that are less frequently tapped.
Solidigm's portfolio is optimized for every stage of the AI data pipeline, from ingest to inference, with products like the D7-PS1010 delivering up to 70% better IOPS per watt than leading competitors.
AI doesn't wait, and inference workloads don't tolerate lag – and neither should the infrastructure supporting them. Moreover, the real growth in AI inference will be at the edge, where constraints such as power, space, cooling, and even weight will pose even more challenges for HDDs.
One more point on HDDs in general: modern workloads combined with data center constraints have rapidly exposed HDD limitations.
HDD suppliers are addressing data storage growth with larger drives, but this comes at the expense of slower performance. Therefore, what some in the industry are calling nearline SSDs is a very relevant conversation to be having.
- Will we ever see large-capacity SSDs for prosumers or end users (i.e., M.2 format), or will such products remain exclusive to enterprise and hyperscalers?
Solidigm has strategically exited the consumer SSD market to focus on enterprise and hyperscaler needs, but we continue to support M.2 form factors for enterprise deployments.
While ultra-high-capacity SSDs are currently tailored for data centers, the innovations driving them – such as QLC density, thermal optimization, and rack-scale efficiency – will eventually influence broader markets.
We're building for the data center, but the innovation benefits everyone.
- Last question. SSD capacity doubles roughly every 18 months. So, the next step will be 512TB and possibly 1PB by 2028. Would the obvious move be rack-level SSDs? 1U or 4U populated with NAND chips and controllers only?
Absolutely. Solidigm is already enabling rack-scale innovation with high-density SSDs and new designs incorporating liquid cooling and advanced thermal management.
As AI workloads grow in complexity and scale, the need for integrated, rack-level NAND solutions becomes more urgent.
Our approach isn't just about filling racks with drives; it's about rethinking how to deliver the performance, efficiency, and sustainability demanded at scale.
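The 18-month doubling cadence raised in the question can be sketched as a back-of-the-envelope projection; this is purely illustrative compound growth, not a vendor roadmap:

```python
def projected_capacity_tb(start_tb: float, years_out: float,
                          doubling_months: float = 18.0) -> float:
    """Capacity after years_out years if it doubles every doubling_months."""
    return start_tb * 2 ** (years_out * 12 / doubling_months)

# Starting from a 245 TB drive: two years of the cadence lands near 617 TB,
# and 1 PB (1000 TB) arrives a little past the three-year mark.
print(round(projected_capacity_tb(245, 2)))  # 617
print(round(projected_capacity_tb(245, 3)))  # 980
```

Under this simple model, 245 TB drives shipping at the end of 2026 would put petabyte-class drives on the horizon around 2029–2030, roughly in line with the trajectory the question implies.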