NVIDIA · Ampere

A100 SXM (80GB)

Memory BW: 2039 GB/s
GPU Memory: 80 GiB
GPUs / Node: 8
Interconnect: NVLink 3
Scale: Node

Network devices: 8× ConnectX-6 (compute)
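A quick sanity check one can derive from the two memory figures above: the minimum time to stream the full 80 GiB of HBM once at the listed 2039 GB/s. This is a back-of-envelope sketch, not an official NVIDIA figure; note the capacity is quoted in GiB (binary) while the bandwidth is in GB/s (decimal).

```python
# Back-of-envelope: time to read the A100 SXM's entire HBM once,
# using the listed specs (80 GiB capacity, 2039 GB/s bandwidth).
GIB = 1024 ** 3  # GiB in bytes (capacity is quoted in binary units)
GB = 1000 ** 3   # GB in bytes (bandwidth is quoted in decimal units)

memory_bytes = 80 * GIB
bandwidth_bytes_per_s = 2039 * GB

t = memory_bytes / bandwidth_bytes_per_s
print(f"{t * 1e3:.1f} ms")  # ~42 ms per full pass over HBM
```

This full-memory pass time is a useful lower bound for any memory-bound kernel that must touch every byte of GPU memory, such as a single decode step over model weights that fill the card.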