Unlock Unprecedented Performance with Supermicro GPU Systems
Unlock the full potential of AI with Supermicro’s cutting-edge AI-ready infrastructure solutions. From large-scale training to intelligent edge inferencing, our turn-key reference designs streamline and accelerate AI deployment. Empower your workloads with optimal performance and scalability while optimizing costs and minimizing environmental impact.
Supermicro has built a portfolio of workload-optimized systems that deliver maximum GPU performance and efficiency across this broad spectrum of workloads.
Large Scale AI Training
Large Language Models, Generative AI Training, Autonomous Driving, Robotics
Liquid Cooled AI Rack Solutions
- Large-Scale NVIDIA H100 AI Training Solution with Liquid Cooling
- NVIDIA HGX H100 SXM 8-GPU
- Up to 80 kW/Rack
8U 8 Gaudi2 HL-225H AI Training Server
- 8 Gaudi2 HL-225H AI accelerators
- Dual 3rd Gen Intel® Xeon® processors
- 32 DIMMs - up to 8TB registered ECC DDR4-3200MHz SDRAM
4U Dual Processor (Intel) GPU System
- NVIDIA HGX H100 SXM 4-GPU
- 6 U.2 NVMe Drives
- 8 PCIe 5.0 x16 networking slots
HPC/AI
Engineering Simulation, Scientific Research, Genomic Sequencing, Drug Discovery
- NVIDIA HGX H100 SXM 8-GPU
- 16 U.2 NVMe Drives
- 8 PCIe 5.0 x16 networking slots
- 1 or 2 H100 PCIe GPUs per blade
- 2 M.2 NVMe Drives per blade
- 2 E1.S Drives per blade
- Up to 10 H100 PCIe
- 8 NVMe + 8 SATA drives
- 4-5 PCIe 5.0 x16 networking slots
Enterprise AI Inferencing & Training
Generative AI Inference, AI-enabled Services/Applications, Chatbots, Recommender System, Business Automation
- 10 H100 PCIe
- 8 NVMe and 8 SATA Drives
- 32 DIMMs DDR5-4800
- Single Root, Dual Root or Direct Connect
- 2 NVIDIA H100 PCIe
- 2 U.2 NVMe Drives, 3 M.2 NVMe Drives
- 2 E1.S Drives
- 2x25GbE LOM
- 4 NVIDIA H100 PCIe or NVL
- 8 E1.S + 2 M.2 drives
- 16 DIMMs DDR5-4800
- 7 PCIe 5.0 x16 slots
Visualization & Design
Real-Time Collaboration, 3D Design, Game Development
- 4 NVIDIA L40 PCIe
- 8 NVMe drives
- 32 DIMMs DDR5-4800
- Optimized thermal capacity & airflow
- 4 NVIDIA L40S/L40 PCIe GPUs
- Dual 4th Gen Intel® Xeon® Scalable processors
- 16 DIMM slots DDR5-4800
- 8 3.5" hot-swap NVMe/SATA/SAS
- Up to 4 NVIDIA A100 PCIe 80GB GPUs at a noise level under 30 dBA
- 4th Gen Intel® Xeon® Scalable processor
- 6 PCIe 5.0 x16, 2 NVMe M.2 slots
Content Delivery & Virtualization
Content Delivery Networks (CDNs), Transcoding, Compression, Cloud Gaming/Streaming
- 1 NVIDIA L4 PCIe per node
- 6 2.5" NVMe drives per node
- 16 DIMMs DDR5-4800 per node
- Free-air cooling and liquid cooling options
2U CloudDC for Cloud Data Centers
- 2 NVIDIA L40 PCIe or 4 NVIDIA L4 PCIe
- 12 3.5" SATA drives
- 16 DIMMs DDR5-4800
- Up to 6 GPUs
- 3 NVIDIA H100 PCIe
- 6 NVMe drives
- 32 DIMM slots, up to 8TB
- Dual 4th Gen Intel® Xeon® processors
AI Edge
Edge Video Transcoding, Edge Inference, Edge Training
1U Compact Edge/5G Server
- 2 NVIDIA L4
- 2 Internal Drive Bays
- 8 DIMMs DDR5-4800
- Single-socket Intel® Xeon® Scalable processor, up to 32 cores
- 3 PCIe 4.0 x16 slots
- 2x 10GbE ports
Ultra-compact Fanless Edge Server
- Up to 64GB DDR5
- M.2 M/B/E-Key with Nano SIM Card Slot