BrainMax® DL-E420T

4U 20x Tesla T4 GPU Inference Server


  • Deep Learning Inference


  • Hyperscale Datacenters, Supercomputing Centers, Consumer Internet Companies, Higher Ed Research, Government, Healthcare, Financial Services, Retail, Manufacturing

Processor

  • Dual Socket P (LGA 3647)
  • 2nd Gen. Intel® Xeon® Scalable Processors (Cascade Lake/Skylake), 3 UPI links up to 10.4 GT/s
  • Supports CPU TDP up to 205W

Chipset

  • Intel® C621 Express Chipset

GPU Support & Quantity

  • 20 x Tesla T4

System Memory (Maximum)

  • 24 DDR4 DIMM slots up to 2933/2666/2400/2133 MHz
  • Supports up to 6TB ECC RDIMM

Expansion Slots

  • 20 x PCI-E 3.0 x16 slots, supporting up to 20 single-width GPUs
  • 1 x PCI-E 3.0 x8 (FH, FL in x16) slot

Connectivity

  • Dual Port 10GbE
  • 1 x RJ45 Dedicated IPMI LAN port

VGA

  • Aspeed AST2500 BMC

Management

  • Onboard BMC (Baseboard Management Controller)
  • Supports IPMI 2.0, media/KVM over LAN with dedicated LAN port for system management

Drive Bays

  • 24 x Hot-swap 3.5″ drive bays
  • 2 x Optional U.2 NVMe 2.5″ drives
  • 1 x M.2 connector

Power Supply

  • 2000W (2+2) redundant power supplies, Titanium Level (96%+ efficiency)

System Dimensions

  • 7.0″ x 17.2″ x 29″ / 178mm x 437mm x 737mm (H x W x D)

Optimized for Turnkey Solutions

Enable powerful design, training, and visualization with built-in software tools including TensorFlow, Caffe, Torch, Theano, BIDMach, cuDNN, the NVIDIA CUDA Toolkit, and NVIDIA DIGITS.
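As a quick sanity check after deployment, the GPU inventory can be verified from the output of NVIDIA's `nvidia-smi` utility (included with the CUDA Toolkit named above). The sketch below is a hypothetical helper, not part of the product software: it parses the CSV output of `nvidia-smi --query-gpu=name --format=csv,noheader` and confirms that all 20 Tesla T4 cards are visible to the driver.

```python
import subprocess

def count_t4_gpus(nvidia_smi_output: str) -> int:
    """Count the Tesla T4 entries in nvidia-smi CSV name output,
    one GPU name per line."""
    return sum(
        1 for line in nvidia_smi_output.splitlines()
        if line.strip() == "Tesla T4"
    )

def query_gpu_names() -> str:
    """Run nvidia-smi to list installed GPU names (requires the
    NVIDIA driver to be installed on the host)."""
    return subprocess.check_output(
        ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
        text=True,
    )

# Simulated output for a fully populated DL-E420T (20 x Tesla T4):
sample_output = "\n".join(["Tesla T4"] * 20)
assert count_t4_gpus(sample_output) == 20
```

On a live system, `count_t4_gpus(query_gpu_names())` should return 20 when every slot is populated and enumerated correctly.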