BrainMax™ DL-E28T

2U 8x Tesla T4 GPU Inference Server

 

Applications

  • Deep Learning Inference

Target Markets

  • Hyperscale Datacenters, Supercomputing Centers, Consumer Internet Companies, Higher Ed Research, Government, Healthcare, Financial Services, Retail, Manufacturing

Processor

  • Dual Socket P (LGA 3647)
  • 1st and 2nd Gen. Intel® Xeon® Scalable Processors (Skylake/Cascade Lake), 3 UPI links up to 10.4 GT/s
  • Supports CPU TDP up to 205W

Chipset

  • Intel® C621 Express Chipset

GPU Support & Quantity

  • 8 x Tesla T4 PCIe

System Memory (Maximum)

  • 24 DIMM slots
  • Up to 6TB 3DS ECC DDR4-2933/2666/2400 MHz RDIMM/LRDIMM
  • Supports Intel® Optane™ DCPMM

Expansion Slots

  • 8 x PCIe x16 (Gen3 x16 bus) connectors for 8 x double-width GPU cards
  • 2 x PCIe x16 (Gen3 x16 bus) half-length, low-profile slots

Connectivity

  • 2 x 10Gb/s BASE-T LAN ports (Intel® X550-AT2)
  • 1 x RJ45 Dedicated IPMI LAN port

VGA

  • Integrated in Aspeed AST2500 BMC

Management

  • Onboard Aspeed AST2500 management controller (BMC)
  • IPMI 2.0 with web-based remote management interface

Drive Bays

  • 8 x Hot-swap 3.5″ HDD Bays

Power Supply

  • 2 x 2200W redundant PSUs
  • 80 PLUS Platinum

System Dimensions

  • 3.5″ x 17.6″ x 31.5″ / 87.5mm x 448mm x 800mm (H x W x D)

Optimized for Turnkey Solutions

Enable powerful design, training, and visualization with built-in software tools including TensorFlow, Caffe, Torch, Theano, BIDMach, cuDNN, the NVIDIA CUDA Toolkit, and NVIDIA DIGITS.
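As a quick post-setup sanity check, the snippet below lists the GPUs visible to the framework. This is a minimal sketch, assuming TensorFlow 2.x with GPU support, the NVIDIA driver, and the CUDA Toolkit are installed; on this system each of the eight Tesla T4 cards should appear as a separate device.

```python
import tensorflow as tf

# List the CUDA-capable devices TensorFlow can see. With the driver and
# CUDA Toolkit installed, each Tesla T4 shows up as its own GPU entry.
gpus = tf.config.list_physical_devices("GPU")
print(f"Visible GPUs: {len(gpus)}")
for gpu in gpus:
    print(gpu)
```

A count of 8 confirms that all installed T4 cards are enumerated and available for inference workloads.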