AGX Orin Inference Server with 12x AGX Orin
GPU-Accelerated Machine Learning and Inferencing Servers Bring Flexibility and Cost Efficiency to High-Powered Computing.
The AGX Orin Inference Server is an extremely low-wattage, high-performance AI server powered by twelve NVIDIA Jetson AGX Orin 64GB modules. Running NVIDIA's most powerful deep-learning software libraries, this inference server solves the challenge of deploying edge solutions at scale. The twelve on-board Jetson AGX Orin modules are interconnected over a Gigabit Ethernet fabric through a dedicated managed Ethernet switch with 10G uplink capability.
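The Gigabit Ethernet fabric gives each of the twelve modules its own switched port, so software typically addresses them as ordinary IP hosts. A minimal sketch of enumerating the nodes follows; the subnet and the (carrier, slot) keying are illustrative assumptions, since the actual addressing scheme depends on how the switch and modules are configured.

```python
# Sketch: enumerating the 12 AGX Orin nodes on the internal switch fabric.
# The 10.0.0.0/24 subnet and host numbering are illustrative assumptions --
# the real addressing depends on the deployment's network configuration.
import ipaddress

CARRIERS = 3            # module carriers per chassis (per the spec table)
MODULES_PER_CARRIER = 4  # AGX Orin modules per carrier

def node_addresses(subnet="10.0.0.0/24"):
    """Assign one illustrative address per module, keyed by (carrier, slot)."""
    hosts = ipaddress.ip_network(subnet).hosts()
    return {
        (carrier, slot): next(hosts)
        for carrier in range(CARRIERS)
        for slot in range(MODULES_PER_CARRIER)
    }

nodes = node_addresses()
assert len(nodes) == 12  # matches the 12x AGX Orin modules
print(nodes[(0, 0)], nodes[(2, 3)])  # → 10.0.0.1 10.0.0.12
```

Keying nodes by carrier and slot rather than a flat index makes it easier to correlate a network address with a physical module when servicing the chassis.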
Processing Module | • 12x NVIDIA® Jetson AGX Orin™ • GPU: 2048-core NVIDIA Ampere architecture GPU with 64 Tensor Cores, 11 TFLOPS (FP16), 275 TOPS (INT8) • CPU: 12-core Arm Cortex-A78AE v8.2 64-bit CPU, 3MB L2 + 6MB L3 • Memory: 64GB 256-bit LPDDR5, 204.8 GB/s • Storage: 64GB eMMC 5.1 |
Out-of-band Management Module | • ARM-based OOBM • Provides serial console access, power status monitoring, and power control (ON/OFF) for all 12x AGX Orin modules • OOBM accessible via Ethernet or via its own integrated USB-to-serial console |
Processor Module Carriers | • Each module carrier accepts up to 4x NVIDIA Jetson AGX Orin modules • Up to 3x module carriers can be installed in the system, for a total of 12 modules |
Internal Embedded Ethernet Switch | • Vitesse/Microsemi SparX-5i VSC7558TSN-V/5CC Managed Ethernet Switch Engine (XDG205) • CPU: 1 GHz VCore • Memory: 8Gb DDR4 SDRAM • Storage: 1Gbit Serial NOR Flash • Multiple 10G uplinks with 12x 1G downstream ports • Complete TSN (Time-Sensitive Networking) feature set |
Internal Array Communication | • 12x Gigabit Ethernet / 1000BASE-T / IEEE 802.3ab channels • Every AGX Orin module can communicate with every other AGX Orin module |
External Uplink Connections | • 4x SFP+ 10G uplink |
Misc / Additional IO | • 1x 1GbE OOB management port (RJ-45) • 1x USB UART management port • Status LEDs |
Input Power | 100–240 VAC input, dual redundant power supplies, 1000 W output each |
Internal Storage | Each AGX Orin module has its own M.2 NVMe interface |
Operating Temperature | 0°C to +50°C (+32°F to +122°F) |
Dimensions | • Standard 2U rackmount: 88.9 mm (H) x 635 mm (D) • With mounting bracket: 163.6 x 146.1 x 99.4 mm • Without mounting bracket: 163.6 x 108.0 x 96.3 mm |
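The spec table above pairs 12x 1G downstream channels with 4x SFP+ 10G uplinks, which means the uplinks are never a bottleneck for the module array. The arithmetic can be checked directly:

```python
# Back-of-the-envelope bandwidth check from the spec table above.
# 12 downstream 1G links feed 4x SFP+ 10G uplinks.
DOWNSTREAM_GBPS = 12 * 1   # 12x 1000BASE-T channels to the modules
UPLINK_GBPS = 4 * 10       # 4x SFP+ 10G external uplinks

oversubscription = DOWNSTREAM_GBPS / UPLINK_GBPS
print(oversubscription)    # → 0.3
```

A ratio below 1.0 means the combined uplink capacity (40 Gbps) exceeds the total downstream demand (12 Gbps), so all twelve modules can saturate their 1G links toward the uplinks simultaneously.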
Model | Description |
UAGX2U-13 | NVIDIA Orin AGX 2U Inference Server with 12x AGX Orin 64GB Modules, 12x 1TB NVMe |
UAGX2U-16 | NVIDIA Orin AGX 2U Inference Server with 12x AGX Orin 64GB Modules, 12x 2TB NVMe |
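The dual 1000 W supplies can be sanity-checked against the module count. NVIDIA specifies a configurable power envelope of up to 60 W (MAXN) per Jetson AGX Orin 64GB module; the overhead figure for the switch, OOBM, fans, and NVMe drives below is an assumption for illustration only.

```python
# Rough chassis power budget. The 60 W per-module ceiling is NVIDIA's
# published MAXN figure for the Jetson AGX Orin 64GB; the overhead for
# the switch, OOBM, fans, and 12x NVMe drives is an assumed estimate.
MODULES = 12
MODULE_MAX_W = 60    # per-module MAXN power ceiling (NVIDIA spec)
OVERHEAD_W = 100     # assumed budget for switch, OOBM, fans, NVMe

worst_case = MODULES * MODULE_MAX_W + OVERHEAD_W
print(worst_case)    # → 820
```

Under these assumptions the worst-case draw (820 W) fits within a single 1000 W supply, consistent with the dual supplies providing true redundancy rather than load sharing.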