ASUS’s new ESC AI Pod brings the latest NVIDIA processors to power generative AI
- by autobot
- June 6, 2024
At Computex 2024, NVIDIA founder and CEO Jensen Huang was on hand as ASUS unveiled its latest AI server built on the latest NVIDIA processors: the ESC AI Pod. The ASUS ESC AI Pod is a full-rack solution of GPUs, CPUs and switches. It is equipped with the NVIDIA GB200 Grace Blackwell Superchip and fifth-generation NVIDIA NVLink, and supports both liquid-to-air and liquid-to-liquid cooling solutions for optimal AI computing performance. As Kaustubh Sanghani, vice president of GPU product management at NVIDIA, said: “At Computex, ASUS will have on display a wide range of servers powered by NVIDIA’s AI platform that are able to handle the unique demands of enterprises.”

ASUS also presented a full lineup of servers based on the NVIDIA MGX architecture. These include the 2U ESC NM1-E1 and ESC NM2-E1 NVIDIA GB200 NVL2 servers, and the 1U ESR1-511N-M1. Using NVIDIA’s GH200 Grace Hopper Superchip, ASUS’s NVIDIA MGX-powered servers are designed to cater to large-scale AI and HPC applications by facilitating seamless and rapid data transfers, deep-learning (DL) training and inference, data analytics, and high-performance computing.

In the area of HPC, the latest HGX servers from ASUS, including the ESC N8 (powered by 5th Gen Intel Xeon Scalable processors and NVIDIA Blackwell Tensor Core GPUs) and the ESC N8A (powered by AMD EPYC 9004 processors and NVIDIA Blackwell GPUs), come with an enhanced thermal solution to ensure optimal performance and a lower PUE, and are designed for compute-heavy AI and data-science workloads.

ASUS’s AI server solutions also come with integrated NVIDIA BlueField-3 SuperNICs and DPUs to meet the standards set by NVIDIA’s Spectrum-X Ethernet networking platform, delivering what ASUS terms “best-of-breed” networking capabilities for generative AI infrastructures. ASUS says to visit its website for more information on its servers and to contact its local representatives for further details.