DDN Storage Solutions Deliver 700% Gains in AI and ML for Image Segmentation and Natural Language Processing

Inaugural AI storage benchmarks released by MLCommons Association

DDN (DataDirect Networks Inc.) announced performance results of its AI storage platform for the inaugural AI storage benchmarks released by the MLCommons Association.

The MLPerf Storage v0.5 benchmark results confirm the company’s storage solutions as the gold standard for AI and ML applications.

The firm’s entries cover the Image Segmentation and Natural Language Processing categories of the MLPerf Storage Benchmark. In both single- and multi-node GPU configurations, DDN’s A3I AI400X2 storage appliance scales to deliver faster and more reliable data access while maximizing GPU utilization and sustaining high efficiency for demanding AI workloads.

In the single compute node evaluation, one of the company’s AI400X2 NVMe appliances running the EXAScaler 6.2 parallel filesystem fully served 40 AI accelerators, delivering a throughput of 16.2GB/s.(1) In the multi-node configuration, the same AI400X2 NVMe appliance nearly quadrupled its output, serving 160 accelerators across 10 GPU compute nodes and achieving a throughput of 61.6GB/s.(2) On a per-storage-node basis, these results represent 700% better efficiency than competing on-premises solution submissions.
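As a rough sanity check on the scaling described above, the published figures can be normalized per accelerator and per storage appliance. The short Python sketch below does only that arithmetic using the numbers quoted in this article; the competitor submissions behind the 700% comparison are not reproduced here, so they are not included, and the variable names are illustrative rather than part of any MLPerf tooling.

```python
# Normalize the two published MLPerf Storage v0.5 results quoted above.
# Only DDN's own figures are used; competitor results are not listed in this article.

single_node = {"accelerators": 40,  "throughput_gbps": 16.2, "storage_appliances": 1}
multi_node  = {"accelerators": 160, "throughput_gbps": 61.6, "storage_appliances": 1}

for label, run in (("single compute node", single_node),
                   ("10 compute nodes", multi_node)):
    per_accelerator = run["throughput_gbps"] / run["accelerators"]
    per_appliance = run["throughput_gbps"] / run["storage_appliances"]
    print(f"{label}: {per_accelerator:.2f} GB/s per accelerator, "
          f"{per_appliance:.1f} GB/s per AI400X2 appliance")

# Both runs work out to roughly 0.4 GB/s of sustained throughput per accelerator,
# i.e. the single appliance scales near-linearly from 40 to 160 accelerators
# (16.2 GB/s -> 61.6 GB/s).
```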

“DDN’s cutting-edge data storage solutions fuel and accelerate GPUs in data centers and in the cloud, helping organizations develop better cancer detection methodologies, putting safe and reliable robotaxis on our roads and highways, and bringing to market more effective chatbots and virtual assistants to make our lives easier,” said Dr. James Coomer, SVP, products. “We’re proud to lead the way in safe and power-efficient AI adoption, setting new standards for innovation and performance in the industry.”

The ability to power AI, ML, and Large Language Model workloads at the highest levels of efficiency and scale, while minimizing power usage and data center footprint, is critical. With thousands of systems deployed on-premises and in the cloud, the company’s AI infrastructure storage systems power GPUs for some of the most demanding and innovative organizations in the world.

(1) MLPerf Storage v0.5 Closed. Result verified by MLCommons Association.
(2) MLPerf Storage v0.5 Closed. 11 September 2023, entry 0.5-0005. Result verified by MLCommons Association.
