Supermicro AS-4126GS-NBR-LCC GPU A+ Server DP AMD 4U Liquid-Cooled System with NVIDIA HGX B200 8-GPU
High-performance 4U liquid-cooled Supermicro GPU A+ server with dual AMD processors and NVIDIA HGX B200 8-GPU system for AI, HPC, and enterprise workloads.

Unleash Extreme AI & HPC Performance
The Supermicro AS-4126GS-NBR-LCC is a high-performance 4U liquid-cooled system powered by dual AMD EPYC™ processors and the NVIDIA® HGX™ B200 8-GPU platform, delivering the compute power needed for the most demanding AI training, HPC, and data-intensive workloads.


Purpose-Built Design
Built for AI, deep learning, and large-scale simulations, the AS-4126GS-NBR-LCC combines efficiency and performance with liquid cooling and dense GPU support.
- Powered by dual AMD EPYC™ processors
- Supports the NVIDIA® HGX™ B200 8-GPU platform
- High-speed DDR5 memory and PCIe Gen5 architecture
- Optimized 4U design with direct-to-chip liquid cooling

Scalable Innovation
Designed for next-generation computing workloads, this system provides exceptional flexibility and scalability for data centers, enterprises, and research environments.
- GPU-accelerated performance for AI, ML, and HPC
- Advanced liquid-cooling technology for thermal efficiency
- Up to 32 DIMM slots for DDR5 memory expansion
- Enterprise-grade reliability with high-availability design
| MFG Number | AS-4126GS-NBR-LCC |
|---|---|
| Condition | Brand New |
| Product Card Description | The Supermicro AS-4126GS-NBR-LCC GPU A+ Server combines dual AMD processors with NVIDIA HGX B200 8-GPU acceleration in a 4U liquid-cooled design. Built for demanding AI, HPC, and deep learning workloads, it delivers scalable performance with DDR5 memory and PCIe Gen5 support. |
| Order Processing Guidelines | Inquiry First – Please reach out to our team to discuss your requirements before placing an order. |


