
A100 80GB PCIe

NVIDIA · Ampere · 80 GB HBM2e · 300 W TDP

VRAM: 80 GB
BF16 TFLOPS: 312
Memory Bandwidth: 1935 GB/s
From: $1.05/hr


Spec Sheet

VRAM: 80 GB HBM2e
Memory Bandwidth: 1935 GB/s
BF16 TFLOPS: 312
FP16 TFLOPS: 312
FP8 TFLOPS: - (not supported on Ampere)
INT8 TOPS: 624
TDP: 300 W
Interconnect: PCIe
Max per Node: 8
PCIe: Gen4
CUDA Compute Capability: 8.0
Tensor Cores: Yes
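Memory bandwidth, not TFLOPS, usually bounds single-stream LLM decoding: each generated token reads every weight once, so throughput is roughly bandwidth divided by model weight bytes. A minimal sketch of that estimate (the 1935 GB/s figure is from NVIDIA's A100 80GB PCIe datasheet; the bandwidth-bound assumption is an idealization, not a benchmark):

```python
# Rough upper bound on decode throughput for a bandwidth-bound workload.
# Assumption: batch-1 decoding streams all weights once per token, so
# tokens/s ~= memory bandwidth / weight bytes. Real throughput is lower.

BANDWIDTH_GBS = 1935  # A100 80GB PCIe HBM2e bandwidth (NVIDIA datasheet)

def decode_tokens_per_s(params_b: float, bytes_per_param: float) -> float:
    """Idealized tokens/s for a model of params_b billion parameters."""
    weight_gb = params_b * bytes_per_param
    return BANDWIDTH_GBS / weight_gb

print(f"{decode_tokens_per_s(7, 2):.0f}")   # 7B in BF16 (2 B/param): 138
print(f"{decode_tokens_per_s(70, 1):.0f}")  # 70B in INT8 (1 B/param): 28
```

This is why the efficiency numbers further down fall off so sharply between 7B- and 70B-class models on a single card.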

Pricing by Provider

Provider   | On-Demand | Reserved | Spot     | Badge
FluidStack | $1.49/hr  | -        | $1.05/hr | Cheapest
TensorDock | $1.59/hr  | -        | $1.09/hr |
Vast.ai    | $1.60/hr  | -        | $1.10/hr |
RunPod     | $2.29/hr  | -        | $1.79/hr |
Lambda     | $1.79/hr  | -        | -        |
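To turn the hourly rates above into a job cost, multiply rate by wall-clock hours, falling back to on-demand where a provider lists no spot price. A small sketch using the prices as shown on this page (they change frequently, so treat the numbers as examples only):

```python
# Job-cost comparison across the listed providers.
# Prices copied from the table above; None means no spot tier listed.

PRICES = {  # provider: (on_demand_per_hr, spot_per_hr)
    "FluidStack": (1.49, 1.05),
    "TensorDock": (1.59, 1.09),
    "Vast.ai":    (1.60, 1.10),
    "RunPod":     (2.29, 1.79),
    "Lambda":     (1.79, None),
}

def job_cost(provider: str, hours: float, spot: bool = False) -> float:
    """Cost in USD; falls back to on-demand when spot is unavailable."""
    on_demand, spot_rate = PRICES[provider]
    rate = spot_rate if spot and spot_rate is not None else on_demand
    return rate * hours

# A 72-hour run on the cheapest spot instance:
print(f"${job_cost('FluidStack', 72, spot=True):.2f}")  # $75.60
```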

Compatible Models (249)

Training Capabilities

Estimated GPU count for full fine-tuning (AdamW, BF16) and QLoRA

Model Size | Full Fine-Tune | QLoRA
7B model   | 2 GPUs         | 1 GPU
13B model  | 4 GPUs         | 1 GPU
70B model  | 17 GPUs        | 1 GPU
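The full fine-tune column follows from a standard mixed-precision memory accounting: roughly 2 bytes of weights, 2 of gradients, and 12 of AdamW state (fp32 master weights plus two moments) per parameter, plus headroom for activations. A back-of-envelope sketch (the 16 bytes/param breakdown and 20% overhead factor are common rules of thumb, not figures from this page):

```python
import math

# Estimate GPU count for full fine-tuning with AdamW in BF16.
# Assumed accounting (rule of thumb, not from the page): 2 B weights +
# 2 B grads + 12 B optimizer state = 16 bytes/param, plus ~20% overhead
# for activations, workspace, and fragmentation.

VRAM_GB = 80
BYTES_PER_PARAM = 16
OVERHEAD = 1.2

def gpus_needed(params_b: float) -> int:
    total_gb = params_b * BYTES_PER_PARAM * OVERHEAD
    return math.ceil(total_gb / VRAM_GB)

for size in (7, 13, 70):
    print(size, gpus_needed(size))  # 2, 4, 17 -- matching the table
```

QLoRA sidesteps most of this by freezing the base model in 4-bit and training only small adapters, which is why even 70B fits on one 80 GB card.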

Energy Efficiency

Estimated tokens/second per Watt for popular models

Model         | Tokens/s/W | Precision
Mistral 7B    | 0.93       | FP8
Qwen 2.5 7B   | 0.89       | FP8
Llama 3.1 8B  | 0.85       | FP8
Llama 3.1 70B | 0.10       | FP8
Qwen 2.5 72B  | 0.09       | FP8

Similar GPUs

GPU           | VRAM  | BF16 TFLOPS | BW (GB/s) | From
A100 80GB SXM | 80 GB | 312         | 2039      | $1.19/hr
A16           | 64 GB | 16.8        | 232       | $0.72/hr
RTX A6000     | 48 GB | 38.7        | 768       | $0.49/hr
A40           | 48 GB | 37.4        | 696       | $0.42/hr
A100 40GB SXM | 40 GB | 312         | 1555      | $0.85/hr
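A quick way to compare these cards is cost per unit of BF16 compute. A sketch using the "From" prices above (these are lowest/spot rates, so the ranking is indicative only, and it ignores memory capacity and bandwidth):

```python
# Price-performance: $ per 1000 BF16 TFLOP-hours, from the table above.
# "From" prices are spot/lowest rates, so treat this as indicative only.

GPUS = {  # name: (bf16_tflops, usd_per_hr)
    "A100 80GB PCIe": (312, 1.05),
    "A100 80GB SXM":  (312, 1.19),
    "RTX A6000":      (38.7, 0.49),
    "A40":            (37.4, 0.42),
}

for name, (tflops, price) in sorted(GPUS.items(),
                                    key=lambda kv: kv[1][1] / kv[1][0]):
    print(f"{name}: ${price / tflops * 1000:.2f} per 1000 TFLOP-hours")
```

Despite the higher hourly rate, the A100s come out far cheaper per TFLOP than the workstation-class cards; the cheaper cards win only when the job fits comfortably in their compute and memory budget.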