The Nvidia DGX-2 isn’t the first off-the-shelf Nvidia server aimed at AI. That honour goes to the DGX-1, which pairs Intel Xeon processors with Nvidia’s own AI-optimised Tesla V100 Volta-architecture GPUs. The DGX-2 continues that approach, but instead of eight Tesla V100s joined over Nvidia’s NVLink bus, the DGX-2 packs 16 of these formidable GPUs connected using its more scalable NVSwitch technology. According to Nvidia, this arrangement lets the DGX-2 handle deep learning and other demanding AI and HPC workloads up to 10 times faster than its smaller sibling.
Although Nvidia revealed the DGX-2 back in March 2018, it took a further six months for the larger model to appear. One of the first to make it to the UK was installed in the labs of Nvidia partner Boston Limited, who asked if we’d like to take a look. We did, and here’s what we found.
As well as performance, size is a big differentiator with the DGX-2, which has the same crackle-finish gold bezel as the DGX-1 but is physically a lot bigger, weighing in at 154.2kg (340lbs) compared to 60.8kg (134lbs) for the DGX-1 and consuming 10 rack units instead of three.
It’s also worth noting that the Nvidia DGX-2 requires a lot more power than its smaller sibling, drawing up to 10kW at full tilt, rising to 12kW for the recently announced DGX-2H model (more on which shortly). The photo below shows the power arrangements Boston needed to keep this little monster happy. Cooling, likewise, will require careful consideration, particularly where more than one DGX-2 is deployed or where it’s mounted alongside other hardware in the same rack.
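To put those power figures in context, virtually all of the electrical power a server draws ends up as heat that the rack cooling must remove. A minimal sketch of that conversion, using the 10kW and 12kW figures quoted above and the standard 1kW ≈ 3,412.14 BTU/hr conversion (the function name here is illustrative, not from any vendor tool):

```python
# Rough cooling-load estimate: assume all electrical power drawn
# becomes heat the rack cooling must reject.
BTU_PER_HR_PER_KW = 3412.14  # standard kW -> BTU/hr conversion factor

def cooling_btu_per_hr(power_kw: float) -> float:
    """Return the approximate heat load, in BTU/hr, for a given draw in kW."""
    return power_kw * BTU_PER_HR_PER_KW

print(f"DGX-2  at 10kW: {cooling_btu_per_hr(10):,.0f} BTU/hr")
print(f"DGX-2H at 12kW: {cooling_btu_per_hr(12):,.0f} BTU/hr")
```

At roughly 34,000 BTU/hr per DGX-2, a multi-unit deployment quickly exceeds what a typical single-rack cooling budget allows, which is why Nvidia and its partners stress planning this up front.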
Distributing that power is a set of six hot-swap, redundant PSUs that slide in at the rear of the chassis, alongside the various components that make up the rest of the system. Cooling, meanwhile, is handled by an array of ten fans located behind the front bezel, with room on either side for 16 2.5in storage devices in two banks of eight.
Nvidia includes eight 3.84TB Micron 9200 Pro NVMe drives as part of the base configuration, equating to just over 30TB of high-performance storage. This, however, is primarily to hold local data, with separate storage on the main motherboard for the OS and application code. It also leaves eight bays empty for adding more storage if needed. Additionally, the Nvidia DGX-2 bristles with high-bandwidth network interfaces for attaching yet more capacity and building server clusters if required.
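The capacity figures above can be sanity-checked with some quick arithmetic, assuming the eight fitted 3.84TB drives and eight empty bays described in the review:

```python
# Back-of-the-envelope check of the DGX-2's NVMe capacity,
# based on the drive count and per-drive size quoted in the review.
DRIVE_TB = 3.84   # Micron 9200 Pro capacity, TB
FITTED = 8        # drives in the base configuration
EMPTY_BAYS = 8    # spare 2.5in bays left for expansion

base_tb = DRIVE_TB * FITTED                    # "just over 30TB"
max_tb = DRIVE_TB * (FITTED + EMPTY_BAYS)      # all 16 bays populated

print(f"Base configuration: {base_tb:.2f} TB")
print(f"Fully populated:    {max_tb:.2f} TB")
```

So the base build delivers 30.72TB, and filling the remaining eight bays with the same drives would roughly double that to 61.44TB before any networked storage is considered.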