PNY AI Storage Appliances With Flash Arrays for Nvidia GPU Servers
Solution focused on key Nvidia features, such as HDR/200GbE and GPUDirect, starting at 30TB and expandable to 360TB
This is a Press Release edited by StorageNewsletter.com on June 17, 2021 at 2:33 pm
PNY Technologies, Inc. has launched a range of AI storage appliances, redeveloped to deliver previously unseen price-to-performance ratios for an AI market that is seeing increasing numbers of smaller clusters of GPU servers.
The Nvidia DGX A100 has provided organisations and research institutions with new HPC capability, and as these projects have grown, so has the number of smaller clusters of DGX systems, which in turn places more demand on the storage.
While many storage vendors have raced to develop solutions for multi-petabyte super-pods, the company has focused on a solution for the average customer, engaging with an SDS team to develop a bespoke PNY solution focused purely on key Nvidia features, such as HDR/200GbE and GPUDirect, yet starting at 30TB. The solutions are designed to be affordable for new projects, while still delivering HDR/200GbE performance. The 1U is expandable to 150TB and the 2U to 360TB, with optional 1U/2U expansion boxes should projects scale.
The 1U is aimed at the growing POD/edge market, where fast storage is required for inferencing but cost and space are critical.
“Project funds are best spent on GPUs; it is these GPUs that provide the user value and ROI. Yet we need to ensure that the storage can keep the GPUs active and offer the quality to sustain such high levels of performance. Our gen 1 solution provided this, but with NVMe-oF as the connectivity it was mostly restricted to single servers. As projects grew, even if only to two servers, they needed more storage power and the ability to share data. This was the challenge and took considerable focus, investment and time, but the results, we believe, will change what a default AI POD solution looks like. If you are starting an AI project and need to factor in storage while ensuring your funds are mostly spent on GPUs, this provides a simple, plug-and-play appliance solution,” said Laurent Chapoulaud, DM, professional solutions, PNY EMEAI.
The solution is unique to the company, and although its primary focus is price, performance and ease of use, PNY recognises the growing challenges faced by isolated and edge-based deployments, so additional features are being developed to help unify the complete PNY POD. For example, with full Nvidia monitoring the firm’s storage will not only monitor itself but will also monitor the DGX and the Mellanox switch, creating a single unified support path that allows solution partners to provide full remote monitoring.
“PNY aims to provide partners with all the elements needed to create a full solution; adding unified PNY POD remote monitoring options is just an extension of PNY’s commitment to helping resellers deliver solutions.”
To help tune the solution, the company worked with Mark Klarzynski, a storage expert and a pioneer of the SDS movement and the AFA concept. “Clearly, the focus on performance has paid off. In our tests, even an entry-level 1U solution outperformed an enterprise all-flash array. In storage, we have many test methods to produce great benchmark results; commonly we will use multiple servers to drive the storage faster and achieve good-looking, marketable performance figures. However, with the PNY solutions, a single NVIDIA A100 server could easily saturate the HDR/200GbE link. Put simply, it outperformed most leading vendors at a fraction of the cost, without even trying hard,” he commented.
“Running real-life deep learning tests, we simply could not throw enough hardware at it; we had three DGX servers fully maxed out and the storage looking like it was hardly trying. The new design has made good use of the NVIDIA Mellanox RDMA strengths, building a new storage stack to take full advantage of its ultra-low latency and high bandwidth. But ultimately, I was most impressed with its ease: we simply plugged it in, and within minutes we were up and flying,” he added.