With the A100 already in full production, Nvidia is taking the GPU to market in multiple ways: with the eight-GPU DGX A100 deep learning system priced at $200,000, with the HGX A100 server ...
Combined, the eight A100s provide 320 GB of total GPU memory and 12.4 TB per second of aggregate memory bandwidth, while the DGX A100's six Nvidia NVSwitch interconnect fabrics, combined with the third-generation ...
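The aggregate figures above can be sanity-checked with a back-of-the-envelope calculation, assuming the publicly listed per-GPU specs for the 40 GB A100 (40 GB of HBM2 and roughly 1.555 TB/s of memory bandwidth per GPU); those per-GPU values are an assumption here, not stated in the excerpt:

```python
# Rough sanity check of the DGX A100 aggregate figures.
# Per-GPU values below are assumed from the 40 GB A100's public specs,
# not taken from the article text itself.
PER_GPU_MEMORY_GB = 40          # HBM2 capacity per A100 (assumed)
PER_GPU_BANDWIDTH_TBPS = 1.555  # memory bandwidth per A100 in TB/s (assumed)
NUM_GPUS = 8                    # GPUs in a DGX A100

total_memory_gb = PER_GPU_MEMORY_GB * NUM_GPUS
total_bandwidth_tbps = PER_GPU_BANDWIDTH_TBPS * NUM_GPUS

print(f"Total GPU memory: {total_memory_gb} GB")                 # 320 GB
print(f"Aggregate bandwidth: {total_bandwidth_tbps:.1f} TB/s")   # 12.4 TB/s
```

Multiplying the per-GPU numbers by eight reproduces both headline figures quoted for the system.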
By comparison, Nvidia's densest HGX/DGX A100 systems top out at eight GPUs ... a modest advantage in performance over an Nvidia A100 GPU, though it falls behind in both memory capacity and bandwidth.