Nvidia showcases a 4x Tesla V100 Volta system at Computex

Nvidia has been showcasing its new DGX deep learning workstation at Computex, which features four PCIe Tesla V100 Volta GPUs connected using the latest version of Nvidia's NVLink interconnect. 
 
This system will contain a 20-core Intel Xeon E5-2698 v4 CPU, 256GB of ECC 2133MHz DDR4 memory, dual 10Gb LAN ports and four water-cooled Volta-based Tesla V100 GPUs. This gives the workstation a total of 20,480 CUDA cores and 2,560 Tensor cores, with 16GB of HBM2 memory per GPU.  
 
Storage-wise, the system will also contain a 1.92TB SSD for the operating system and a 3x 1.92TB SSD RAID-0 array for data storage, with the tower running Ubuntu Linux. 
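As a quick sanity check, the system totals above follow directly from Nvidia's published per-GPU Tesla V100 figures multiplied across the four cards, and the RAID-0 array simply sums its member drives:

```python
# Sanity-check the quoted system totals from per-GPU Tesla V100 specs.
NUM_GPUS = 4
CUDA_CORES_PER_GPU = 5120    # published V100 spec
TENSOR_CORES_PER_GPU = 640   # published V100 spec
HBM2_GB_PER_GPU = 16

total_cuda_cores = NUM_GPUS * CUDA_CORES_PER_GPU      # 20,480
total_tensor_cores = NUM_GPUS * TENSOR_CORES_PER_GPU  # 2,560
total_hbm2_gb = NUM_GPUS * HBM2_GB_PER_GPU            # 64GB across the system

# RAID-0 stripes capacity across all members: three 1.92TB SSDs.
data_array_tb = 3 * 1.92                              # 5.76TB of data storage

print(total_cuda_cores, total_tensor_cores, total_hbm2_gb, data_array_tb)
```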

  


(Images from Tweaktown)

This system will deliver a total of 60 TFLOPs of FP32 GPU performance, which does not even include the additional throughput provided by Nvidia's dedicated Tensor cores for AI calculations. This is more performance than is available in any other quad-GPU system, which is an astounding achievement by Nvidia. 
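That 60 TFLOPs figure lines up with the standard peak-throughput estimate of two operations per fused multiply-add per CUDA core per clock; the ~1,455MHz boost clock used here is an assumption based on V100 launch specs, not a figure from this article:

```python
# Rough FP32 peak estimate: 2 ops per FMA x CUDA cores x clock (GHz) = GFLOPs.
CUDA_CORES = 5120
BOOST_CLOCK_GHZ = 1.455  # assumed V100 boost clock, not quoted in the article

fp32_tflops_per_gpu = 2 * CUDA_CORES * BOOST_CLOCK_GHZ / 1000  # ~14.9 TFLOPs
system_fp32_tflops = 4 * fp32_tflops_per_gpu                   # ~59.6 TFLOPs

# Close to the quoted 60 TFLOPs for the four-GPU system.
print(round(fp32_tflops_per_gpu, 1), round(system_fp32_tflops, 1))
```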

 

  

 

Attendees at Computex have been told that this system will cost $69,000 when Nvidia officially releases it, making it more expensive than most cars. Would you pay that much for a system without tempered glass or RGB lighting? 

 

You can join the discussion on Nvidia’s Tesla V100 powered system on the OC3D Forums. 

 
