OCP Global Summit 2024: Wiwynn AI Data Center and Cooling Solutions

Liquid-cooled rack-level AI system developed in collaboration with Wistron, GS1400A 4U server, GS1300N 3U server, and Open IP SuperFluid cooling technology

Wiwynn Corp. is unveiling a suite of AI data center solutions and liquid cooling technologies, on display at OCP Global Summit 2024, October 15-17, at the San Jose Convention Center, San Jose, CA.

As one of the first companies ready for the Nvidia GB200 NVL72, Wiwynn is showcasing the liquid-cooled, rack-level AI system developed in collaboration with Wistron Corporation.

“The high power demands of AI are pushing the limits of data centers, and as the technology grows more complex, it’s essential to enhance both performance and sustainability,” noted William Lin, president. “We’re excited to showcase Wiwynn’s complete AI and advanced cooling solutions at OCP Global Summit 2024. This year, we’re demonstrating how our technology creates unprecedented performance and efficiency to empower data centers worldwide and unlock new possibilities in the AI era.”

AI solutions harnessing Nvidia accelerated computing
To drive innovation in data centers, the company has strengthened its cooperation with Wistron Corp., delivering complete AI acceleration platforms leveraging state-of-the-art chips – including the Nvidia GB200 Grace Blackwell Superchip.

Nvidia GB200 NVL72 rack solution:

  • One of the first Nvidia GB200 NVL72 platform-based solutions available on the market.
  • Supercharges training and inference for trillion-parameter-scale AI models while providing 25x lower TCO compared with the previous generation of GPUs.
  • The liquid-cooled rack connects 72 Nvidia Blackwell GPUs and 36 Nvidia Grace CPUs through 5th Gen Nvidia NVLink and NVLink Switch technologies, enabling AI acceleration in a single rack.
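
To put these rack-level figures in context, the back-of-envelope sketch below (Python) estimates the heat load such a configuration implies. Only the 72-GPU and 36-CPU counts come from the article; the per-chip and overhead wattages are illustrative assumptions, not published Wiwynn or Nvidia specifications.

    # Rough rack heat-load estimate for a 72-GPU / 36-CPU configuration.
    GPU_COUNT = 72              # Blackwell GPUs per rack (from the article)
    CPU_COUNT = 36              # Grace CPUs per rack (from the article)

    ASSUMED_GPU_TDP_W = 1000    # assumption: ~1kW-class accelerator
    ASSUMED_CPU_TDP_W = 300     # assumption: server-class Arm CPU
    ASSUMED_OVERHEAD_W = 10000  # assumption: NVLink switches, NICs, power conversion losses

    rack_heat_w = (GPU_COUNT * ASSUMED_GPU_TDP_W
                   + CPU_COUNT * ASSUMED_CPU_TDP_W
                   + ASSUMED_OVERHEAD_W)

    print(f"Estimated rack heat load: {rack_heat_w / 1000:.0f} kW")
    # ~93 kW under these assumptions, well beyond what a typical
    # air-cooled rack is provisioned for, hence the liquid-cooled design.

Even with conservative assumptions, the estimate lands far above common air-cooled rack power budgets, which is what motivates cooling the entire rack with liquid.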

Wiwynn is also offering updated AI solutions on the Nvidia HGX platform, including:

  • GS1400A: The Nvidia MGX 4U server leverages 8 Nvidia H200 Tensor Core GPUs interconnected with NVLink and NVSwitch to bring accelerated computing into any data center with modular server designs.

  • Compact GS1300N: Can be equipped with 8 Nvidia Hopper or Blackwell architecture GPUs in a 3U height and achieves over 90% heat dissipation with the firm’s DLC technology.
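
A simple way to read the “over 90% heat dissipation” figure is as the share of server heat captured by the liquid loop, with the remainder handled by air. A minimal sketch in Python, assuming a hypothetical 10kW 8-GPU chassis (the chassis power is an assumption for illustration, not a published GS1300N specification):

    # Split of server heat between the direct liquid cooling (DLC) loop
    # and residual air cooling. The 10kW chassis power is an assumption.
    chassis_power_w = 10_000     # assumed total heat output of the 3U server
    dlc_capture_ratio = 0.90     # "over 90% heat dissipation" from the article

    liquid_heat_w = chassis_power_w * dlc_capture_ratio   # 9,000 W removed by coolant
    air_heat_w = chassis_power_w - liquid_heat_w          # 1,000 W left for room air

    print(f"Liquid loop: {liquid_heat_w:.0f} W, residual air load: {air_heat_w:.0f} W")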

Liquid cooling
At the Summit, the company will introduce: 

  • Leading-edge 2-phase liquid cooling technologies that push chip thermal limits:

    • Through a partnership with ZutaCore, Inc., a provider of DLC and waterless liquid cooling solutions, the technology pushes the single-chip power limit by handling up to 2.5kW of thermal design power (TDP). A deep dive will be presented at the OCP Future Technologies Symposium.

    • The company is also demonstrating a validated 2-phase immersion cooling solution that achieves 2.8kW TDP, meeting future demands.

  • Open IP SuperFluid cooling technology: Partnering with Intel Corp., Wiwynn is tackling a major roadblock for DLC by replacing water with a novel dielectric fluid. This protects electric circuits from damage and loss caused by water leakage and reduces the risk of data center outages, while achieving over 1.5kW TDP cooling capacity with a single-phase solution.
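
The three approaches above each quote a per-chip cooling capacity. The short Python sketch below lines those figures up against a few illustrative chip heat loads; the capacity numbers come from the article, while the chip wattages are assumptions chosen only to show how the headroom compares.

    # Per-chip cooling capacity quoted in the article, in watts.
    cooling_capacity_w = {
        "2-phase DLC (ZutaCore partnership)": 2500,
        "2-phase immersion (validated demo)": 2800,
        "SuperFluid single-phase dielectric": 1500,
    }

    # Illustrative chip heat loads (assumptions for comparison only).
    chip_tdp_w = {
        "~700W-class GPU": 700,
        "~1,200W-class GPU": 1200,
        "hypothetical ~2,700W future chip": 2700,
    }

    for tech, capacity in cooling_capacity_w.items():
        covered = [name for name, tdp in chip_tdp_w.items() if tdp <= capacity]
        print(f"{tech}: up to {capacity} W per chip -> covers {', '.join(covered)}")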

For rapidly diversifying data centers, the company is focusing on developing the cooling technologies of the future and offering optimized solutions.
