NVIDIA · Ampere

A100 SXM (40GB)

Precision: BF16
Memory BW: 1555 GB/s
GPU Memory: 40 GiB
GPUs / Node: 8
Interconnect: NVLink 3
Scale: Node

Network devices: 8× ConnectX-6 (compute fabric)
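From the figures above, a useful back-of-envelope number is the minimum time to stream the full 40 GiB of HBM once at the listed 1555 GB/s. A minimal sketch (assuming GiB is binary, GB/s is decimal, as is conventional for these specs):

```python
GIB = 1024 ** 3            # binary gibibyte, matching "40 GiB"
GPU_MEM_BYTES = 40 * GIB   # A100 SXM 40GB capacity
MEM_BW_BYTES_PER_S = 1555e9  # 1555 GB/s, decimal gigabytes

# Lower bound on the time for one full pass over GPU memory,
# i.e. the floor for any bandwidth-bound kernel touching all of HBM.
t_full_pass_s = GPU_MEM_BYTES / MEM_BW_BYTES_PER_S
print(f"{t_full_pass_s * 1e3:.1f} ms per full memory sweep")
```

This works out to roughly 27.6 ms per sweep, a handy sanity check when profiling memory-bound workloads on this part.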