DDN Gooses AI Storage Pipelines with Infinia 2.0


(spainter_vfx/Shutterstock)

AI’s insatiable demand for data has exposed a growing problem: storage infrastructure isn’t keeping up. From training foundation models to running real-time inference, AI workloads require high-throughput, low-latency access to massive amounts of data spread across cloud, edge, and on-prem environments. Traditional storage systems have often struggled under the weight of these demands, creating bottlenecks that can drastically delay innovation in the AI space.

Today, DDN unveiled Infinia 2.0, a major update to its AI-focused, software-defined data storage platform designed to eliminate the inefficiencies in AI storage and data management. The company says Infinia 2.0 acts as a unified, intelligent data layer that dynamically optimizes AI workflows.

“Infinia 2.0 is not just an upgrade; it’s a paradigm shift in AI data management,” DDN CEO Alex Bouzari says, emphasizing how Infinia builds on the company’s deep-rooted expertise in HPC storage to power the next generation of AI-driven data services.

A rendering of a large-scale Infinia 2.0 configuration from DDN’s Beyond Artificial virtual event.

As AI adoption grows, the challenges of scale, speed, and efficiency become more apparent. LLMs, generative AI applications, and inference systems require not only massive datasets but also the ability to access and process them faster than ever. Traditional storage solutions struggle with performance bottlenecks, making it difficult for GPUs to receive the data they need quickly enough and limiting overall training efficiency. At the same time, organizations must navigate the fragmentation of data across multiple locations, from structured databases to unstructured video and sensor data. Moving data between these environments creates inefficiencies, driving up operational costs and introducing latency issues that slow AI applications.

DDN claims Infinia 2.0 solves these challenges by integrating real-time AI data pipelines, dynamic metadata-driven automation, and multi-cloud unification, all optimized specifically for AI workloads. Rather than forcing enterprises to work with disconnected data lakes, Infinia 2.0 introduces a Data Ocean, a unified global view that eliminates redundant copies and lets organizations process and analyze their data wherever it resides. This is meant to reduce storage sprawl and to allow AI models to search and retrieve relevant data more efficiently using an advanced metadata tagging system. With virtually unlimited metadata capabilities, AI applications can associate vast amounts of metadata with each object, making search and retrieval operations dramatically faster.
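Infinia’s metadata engine is proprietary and its API is not described here, but the general idea of metadata-driven retrieval can be sketched with a toy in-memory index: objects are tagged with arbitrary key/value metadata, and lookups filter on those tags rather than scanning object contents. The class and object keys below are purely illustrative.

```python
# Illustrative sketch only: a toy in-memory metadata index demonstrating the
# concept of attaching rich metadata to objects and querying on it directly.
# This is NOT Infinia's actual API; names and keys here are hypothetical.
from collections import defaultdict


class MetadataIndex:
    """Maps (tag key, tag value) pairs to the object keys that carry them."""

    def __init__(self):
        self.index = defaultdict(set)   # (tag_key, tag_value) -> {object_key}
        self.objects = {}               # object_key -> metadata dict

    def put(self, object_key, metadata):
        """Register an object and index every one of its metadata tags."""
        self.objects[object_key] = metadata
        for key, value in metadata.items():
            self.index[(key, value)].add(object_key)

    def find(self, **tags):
        """Return object keys whose metadata matches every given tag."""
        matches = [self.index[(key, value)] for key, value in tags.items()]
        return set.intersection(*matches) if matches else set()


idx = MetadataIndex()
idx.put("train/img_001.jpg", {"modality": "image", "split": "train", "label": "cat"})
idx.put("train/img_002.jpg", {"modality": "image", "split": "val", "label": "cat"})
idx.put("train/clip_001.mp4", {"modality": "video", "split": "train", "label": "dog"})

print(idx.find(modality="image", split="train"))  # -> {'train/img_001.jpg'}
```

Because every lookup is a set intersection over pre-built indexes, query cost depends on the number of tags queried, not on object payload size, which is the property the metadata-tagging approach is after.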

Infinia 2.0 integrates with frameworks like TensorFlow and PyTorch, which the company says eliminates the need for complex format conversions, allowing AI execution engines to interact with data directly and significantly speed up processing times. The platform is also designed for extreme scalability, supporting deployments that range from a few terabytes to exabytes of storage, making it versatile enough to meet the needs of both startups and enterprise-scale AI operations.

Performance is another area where Infinia 2.0 could be a breakthrough. The platform boasts 100x faster metadata processing, reducing lookup times from over ten milliseconds to less than one. AI pipelines execute 25x faster, while the system can handle up to 600,000 object lists per second, surpassing the limits of even AWS S3. By leveraging these capabilities, DDN asserts, AI-driven organizations can ensure their models are trained, refined, and deployed with minimal lag and maximum efficiency.

(Source: DDN)

During a virtual launch event today called Beyond Artificial, DDN’s claims were reinforced by strong endorsements from industry leaders like Nvidia CEO Jensen Huang, who highlighted Infinia’s potential to redefine AI data management, emphasizing how metadata-driven architectures like Infinia’s transform raw data into actionable intelligence. Enterprise computing leader Lenovo also praised the platform, underscoring its ability to merge on-prem and cloud data for more efficient AI deployment.

Supermicro, another DDN partner, also endorses Infinia: “At Supermicro, we’re proud to partner with DDN to transform how organizations leverage data to drive business success,” said Charles Liang, founder, president, and CEO at Supermicro. “By combining Supermicro’s high-performance, energy-efficient hardware with DDN’s innovative Infinia platform, we empower customers to accelerate AI workloads, maximize operational efficiency, and reduce costs. Infinia’s seamless data unification across cloud, edge, and on-prem environments enables businesses to make faster, data-driven decisions and achieve measurable outcomes, aligning perfectly with our commitment to delivering optimized, sustainable infrastructure solutions.”

At the Beyond Artificial event, Bouzari and Huang sat down for a fireside chat to reflect on how an earlier idea, born from a 2017 meeting with Nvidia, evolved into the Infinia platform.

DDN had been asked to help build a reference architecture for AI computing, but Bouzari saw a much bigger opportunity. If Huang’s vision for AI was going to materialize, the world would need a fundamentally new data architecture, one that could scale AI workloads, eliminate latency, and transform raw information into actionable intelligence.

At the Beyond Artificial event, Huang and Bouzari sit down for a fireside chat about the bigger picture of storage and AI.

Infinia is more than just storage, Bouzari says, and fuels AI systems the way energy fuels a brain. And according to Huang, that distinction is critical.

“One of the most important things people forget is the importance of data that’s critical during application, not just during training,” Huang notes. “You have to train on an enormous amount of data for pretraining, but during use, the AI has to access information, and AI would like to access information, not in raw data form, but in informational form.”

This shift from traditional storage to AI-native data intelligence has profound implications, the CEOs say. Instead of treating storage as a passive repository, DDN and Nvidia are turning it into an active layer of intelligence, enabling AI to retrieve insights instantly.

“This is the reason why the reframing of the storage of objects and raw data into data intelligence is this new opportunity for DDN, providing data intelligence for all the world’s enterprises as AIs run on top of this fabric of information,” Huang says, calling it “an extraordinary reframing of computing and storage.”

Reframing certainly seems necessary: as AI continues to evolve, the infrastructure supporting it must evolve as well. DDN’s Infinia 2.0 could represent a major shift in how enterprises approach AI storage, not as a passive archive but as an active intelligence layer that fuels AI systems in real time. By eliminating traditional bottlenecks, unifying distributed data, and integrating seamlessly with AI frameworks, Infinia 2.0 aims to reshape how AI applications access, process, and act on information.

With endorsements from industry leaders like Nvidia, Supermicro, and Lenovo, and with its latest funding round of $300 million at a $5 billion valuation, DDN is positioning itself as a key player in the AI landscape. Whether Infinia 2.0 delivers on its ambitious promises remains to be seen, but one thing is clear: AI’s next frontier isn’t just about models and compute but about rethinking data itself. And with this launch, DDN is making the case that the future of AI hinges on new paradigms for data management.

Learn more about the technical aspects of Infinia 2.0 at this link, or watch a replay of Beyond Artificial here.

Related Items:

Feeding the Virtuous Cycle of Discovery: HPC, Big Data, and AI Acceleration

The AI Data Cycle: Understanding the Optimal Storage Mix for AI Workloads at Scale

DDN Cranks the Data Throughput with AI400X2 Turbo

Editor’s note: This article first appeared on AIWire.
