A100 40GB PCIe

nvidia · ampere · 40 GB HBM2 · 250W TDP

VRAM: 40 GB · BF16 TFLOPS: 312 · Bandwidth: 1555 GB/s · From $0.69/hr

Spec Sheet

| Spec | Value |
|---|---|
| VRAM | 40 GB HBM2 |
| Memory Bandwidth | 1555 GB/s |
| BF16 TFLOPS | 312 |
| FP16 TFLOPS | 312 |
| FP8 TFLOPS | Not supported (requires Hopper or newer) |
| INT8 TOPS | 624 |
| TDP | 250 W |
| Interconnect | PCIe |
| Max per Node | 8 |
| PCIe | Gen4 |
| CUDA Compute Capability | 8.0 |
| Tensor Cores | Yes (3rd gen) |
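The compute and bandwidth figures in the spec sheet determine which workloads this card can keep fed. A short sketch, using only the numbers above, of the roofline "ridge point" (FLOPs per byte at which a kernel stops being memory-bound) and the bandwidth-bound decode ceiling for a hypothetical 7B-parameter model:

```python
# Sketch: roofline ridge point and decode ceiling from the spec sheet above.
# The 7B model size is an illustrative assumption, not from the source.
BF16_TFLOPS = 312        # dense tensor-core throughput
BANDWIDTH_GBPS = 1555    # HBM2 bandwidth

# Arithmetic intensity below this value means a kernel is bandwidth-bound.
ridge_point = (BF16_TFLOPS * 1e12) / (BANDWIDTH_GBPS * 1e9)
print(f"Ridge point: {ridge_point:.0f} BF16 FLOPs per byte")

# Batch-1 LLM decoding reads every weight once per token (~2 bytes/param
# in BF16), far below the ridge point, so decode is bandwidth-bound:
params_b = 7  # illustrative 7B model -> 14 GB of weights read per token
tokens_per_s_bound = BANDWIDTH_GBPS / (params_b * 2)
print(f"Bandwidth-bound decode ceiling for a {params_b}B BF16 model: "
      f"~{tokens_per_s_bound:.0f} tokens/s")
```

This is why the energy-efficiency numbers further down fall off so sharply for 70B-class models: weight reads per token grow tenfold while bandwidth stays fixed.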

Pricing by Provider

| Provider | On-Demand | Reserved | Spot | Badge |
|---|---|---|---|---|
| tensordock | $0.99/hr | – | $0.69/hr | Cheapest |
| vast_ai | $1.10/hr | – | $0.75/hr | |
| runpod | $1.44/hr | – | $1.09/hr | |
| lambda | $1.10/hr | – | – | |
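For sustained workloads the hourly deltas above compound quickly. A minimal sketch, using the on-demand and spot prices from the table (the 730-hour month is an assumption for averaging):

```python
# Sketch: monthly cost per provider from the pricing table above (USD/hr).
prices = {
    "tensordock": {"on_demand": 0.99, "spot": 0.69},
    "vast_ai":    {"on_demand": 1.10, "spot": 0.75},
    "runpod":     {"on_demand": 1.44, "spot": 1.09},
    "lambda":     {"on_demand": 1.10, "spot": None},  # no spot price listed
}

HOURS_PER_MONTH = 730  # average hours in a month (assumption)

def monthly_cost(hourly):
    """Monthly cost in USD for a given hourly rate, or None if unpriced."""
    return None if hourly is None else round(hourly * HOURS_PER_MONTH, 2)

for provider, p in prices.items():
    od, sp = monthly_cost(p["on_demand"]), monthly_cost(p["spot"])
    if sp is not None:
        saving = 100 * (1 - p["spot"] / p["on_demand"])
        print(f"{provider}: ${od}/mo on-demand, ${sp}/mo spot ({saving:.0f}% off)")
    else:
        print(f"{provider}: ${od}/mo on-demand")
```

Note that spot instances can be preempted, so the ~30% discount is only realizable for checkpointable or interruption-tolerant jobs.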

Compatible Models (238)

Training Capabilities

Estimated GPU count for full fine-tuning (AdamW, BF16) and QLoRA

| Model Size | Full Fine-Tune | QLoRA |
|---|---|---|
| 7B | 4 GPUs | 1 GPU |
| 13B | 7 GPUs | 1 GPU |
| 70B | 33 GPUs | 2 GPUs |
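A rough estimator that approximately reproduces the table (this is an assumption-laden sketch, not the site's exact methodology). Full fine-tuning with AdamW in BF16 holds BF16 weights and gradients (2 + 2 bytes/param) plus FP32 optimizer moments and master weights (4 + 4 + 4 bytes/param), about 16 bytes/param; QLoRA holds 4-bit weights (~0.7 bytes/param including quantization constants) plus small adapters:

```python
import math

GPU_VRAM_GB = 40   # per the spec sheet above
OVERHEAD = 1.2     # assumed ~20% headroom for activations and buffers

def gpus_needed(params_b, bytes_per_param):
    """Estimate GPU count to hold the training state of a params_b-billion
    parameter model at the given bytes-per-parameter footprint."""
    total_gb = params_b * bytes_per_param * OVERHEAD
    return max(1, math.ceil(total_gb / GPU_VRAM_GB))

for size in (7, 13, 70):
    full = gpus_needed(size, 16.0)   # AdamW, BF16 full fine-tune
    qlora = gpus_needed(size, 0.7)   # 4-bit QLoRA
    print(f"{size}B: full fine-tune ~{full} GPUs, QLoRA ~{qlora} GPU(s)")
```

The estimate lands within one GPU of the table for every row; real counts also depend on sequence length, batch size, and sharding strategy (e.g. ZeRO/FSDP).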

Energy Efficiency

Estimated tokens/second per watt for popular models. (The source labels these runs as FP8, but Ampere has no native FP8 tensor cores, so the figures likely reflect 8-bit weight quantization rather than hardware FP8.)

| Model | Efficiency | Precision |
|---|---|---|
| Mistral 7B | 0.85 t/s/W | FP8 |
| Qwen 2.5 7B | 0.82 t/s/W | FP8 |
| Llama 3.1 8B | 0.77 t/s/W | FP8 |
| Llama 3.1 70B | 0.09 t/s/W | FP8 |
| Qwen 2.5 72B | 0.09 t/s/W | FP8 |
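Tokens/second per watt inverts directly into energy per token, which is often the more intuitive unit for capacity planning. A small sketch using the figures above:

```python
# Sketch: convert the t/s/W figures above into J/token and tokens/kWh.
efficiency = {
    "Mistral 7B": 0.85,
    "Qwen 2.5 7B": 0.82,
    "Llama 3.1 8B": 0.77,
    "Llama 3.1 70B": 0.09,
    "Qwen 2.5 72B": 0.09,
}

for model, tps_per_watt in efficiency.items():
    joules_per_token = 1.0 / tps_per_watt      # 1 W = 1 J/s
    tokens_per_kwh = tps_per_watt * 3_600_000  # 1 kWh = 3.6e6 J
    print(f"{model}: {joules_per_token:.2f} J/token, "
          f"{tokens_per_kwh:,.0f} tokens/kWh")
```

At these rates a 7B-class model yields roughly 3 million tokens per kWh, while the 70B-class models manage about a tenth of that, consistent with decode being bandwidth-bound (see the roofline note under the spec sheet).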

Similar GPUs

| GPU | VRAM | BF16 TFLOPS | BW (GB/s) | From |
|---|---|---|---|---|
| A100 40GB SXM | 40 GB | 312 | 1555 | $0.85/hr |
| RTX A6000 | 48 GB | 38.7 | 768 | $0.49/hr |
| A40 | 48 GB | 37.4 | 696 | $0.42/hr |
| A10G | 24 GB | 35 | 600 | $0.30/hr |
| A30 | 24 GB | 165 | 933 | $0.35/hr |
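One way to compare across this table is compute per dollar. A minimal sketch ranking the cards above (plus this page's A100 PCIe) by BF16 TFLOPS per dollar-hour, using the "From" prices shown; this ignores VRAM and bandwidth, which often matter more for LLM serving:

```python
# Sketch: rank GPUs by BF16 TFLOPS per $/hr, using the table above.
gpus = [
    ("A100 40GB PCIe", 312.0, 0.69),
    ("A100 40GB SXM",  312.0, 0.85),
    ("RTX A6000",       38.7, 0.49),
    ("A40",             37.4, 0.42),
    ("A10G",            35.0, 0.30),
    ("A30",            165.0, 0.35),
]

ranked = sorted(gpus, key=lambda g: g[1] / g[2], reverse=True)
for name, tflops, price in ranked:
    print(f"{name}: {tflops / price:.0f} BF16 TFLOPS per $/hr")
```

By this metric the A30 edges out even the A100 PCIe, but its 24 GB of VRAM rules out many of the workloads the A100's 40 GB can host.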