# B200 SXM vs B100 SXM
Side-by-side comparison of the NVIDIA B200 SXM and the NVIDIA B100 SXM for AI inference workloads.
## Specifications
| Spec | B200 SXM | B100 SXM |
|---|---|---|
| Generation | Blackwell | Blackwell |
| Memory Type | HBM3e | HBM3e |
| VRAM | 180 GB | 192 GB |
| Memory Bandwidth | 8,000 GB/s | 8,000 GB/s |
| BF16 TFLOPS | 2,250 | 1,750 |
| FP16 TFLOPS | 2,250 | 1,750 |
| FP8 TFLOPS | 4,500 | 3,500 |
| INT8 TOPS | 4,500 | 3,500 |
| TDP | 1,000 W | 700 W |
| Interconnect | NVLink | NVLink |
| NVLink Bandwidth | 1,800 GB/s | 1,800 GB/s |
| Max GPUs per Node | 8 | 8 |
| PCIe Gen | Gen 6 | Gen 6 |
| CUDA Compute Capability | 10 | 10 |
## Pricing

### B200 SXM
| Provider | On-Demand | Reserved | Spot |
|---|---|---|---|
| CoreWeave | $7.50/hr | $5.50/hr | - |
| Lambda | $5.99/hr | $4.49/hr | - |
| RunPod | $7.20/hr | - | - |
### B100 SXM
| Provider | On-Demand | Reserved | Spot |
|---|---|---|---|
| CoreWeave | $6.00/hr | $4.50/hr | - |
| Lambda | $4.99/hr | - | - |
Cheapest available rate: $4.49/hr for the B200 SXM (Lambda, reserved) vs $4.50/hr for the B100 SXM (CoreWeave, reserved) — effectively a tie, with the B200 SXM about 0.2% cheaper.
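The cheapest-rate comparison can be reproduced with a short script. This is a sketch: the `pricing` dict and `cheapest` helper are illustrative names, with the rates copied from the tables above (missing tiers are simply omitted).

```python
# Rates ($/hr) from the pricing tables above; tiers marked "-" are omitted.
pricing = {
    "B200 SXM": {
        "CoreWeave": {"on_demand": 7.50, "reserved": 5.50},
        "Lambda": {"on_demand": 5.99, "reserved": 4.49},
        "RunPod": {"on_demand": 7.20},
    },
    "B100 SXM": {
        "CoreWeave": {"on_demand": 6.00, "reserved": 4.50},
        "Lambda": {"on_demand": 4.99},
    },
}

def cheapest(gpu: str) -> float:
    """Lowest quoted $/hr across all providers and pricing tiers."""
    return min(rate for tiers in pricing[gpu].values() for rate in tiers.values())

b200, b100 = cheapest("B200 SXM"), cheapest("B100 SXM")
print(f"B200 SXM: ${b200:.2f}/hr, B100 SXM: ${b100:.2f}/hr")
print(f"B200 SXM is {100 * (b100 - b200) / b100:.1f}% cheaper")
```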
## Efficiency Metrics

| Metric | B200 SXM | B100 SXM |
|---|---|---|
| TFLOPS / Watt (BF16) | 2.3 | 2.5 |
| VRAM / Dollar (GB/$/hr) | 40.1 | 42.7 |
| Bandwidth / Watt (GB/s/W) | 8.0 | 11.4 |
| Models supported (FP16, 1 GPU) | 220 | 221 |
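These metrics follow directly from the spec and pricing tables. A minimal sketch (the `specs` dict just restates those tables; small rounding differences are possible — e.g. 2,250/1,000 is exactly 2.25, shown above as 2.3):

```python
# Specs restated from the tables above; cheapest_hr is the lowest quoted rate.
specs = {
    "B200 SXM": {"bf16_tflops": 2250, "tdp_w": 1000, "vram_gb": 180,
                 "bandwidth_gbs": 8000, "cheapest_hr": 4.49},
    "B100 SXM": {"bf16_tflops": 1750, "tdp_w": 700, "vram_gb": 192,
                 "bandwidth_gbs": 8000, "cheapest_hr": 4.50},
}

for name, s in specs.items():
    print(name)
    print(f"  TFLOPS/W (BF16):   {s['bf16_tflops'] / s['tdp_w']:.2f}")
    print(f"  VRAM/$  (GB/$/hr): {s['vram_gb'] / s['cheapest_hr']:.1f}")
    print(f"  BW/W    (GB/s/W):  {s['bandwidth_gbs'] / s['tdp_w']:.1f}")
```

Note the trade-off this makes visible: the B200's higher TDP buys raw throughput, while the B100 wins every per-watt and per-dollar ratio here.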
## Model Compatibility (FP16, Single GPU)

### Only on B200 SXM (0)
None
### Both (220)
- Yi 1.5 34B
- Yi 1.5 9B
- Yi Coder 9B
- Jamba 1.5 Mini
- GTE Qwen2 7B
- Marco O1
- Qwen 1.5 MoE A2.7B
- Qwen 2 Audio 7B
- Qwen 2.5 14B
- Qwen 2.5 32B
- Qwen 2.5 3B
- Qwen 2.5 Coder 32B
- OLMo 2 13B
- OLMo 2 7B
- Amazon Nova Lite
- Amazon Nova Pro
- OpenELM 3B
- BGE Large EN v1.5
- BGE M3
- Baichuan 2 13B
- +200 more
### Only on B100 SXM (1)
- Llama 3.2 90B Vision Instruct
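A back-of-the-envelope estimate suggests why a 90B-parameter model lands on the B100 only: FP16 weights take roughly 2 bytes per parameter, so the weights alone need about 180 GB — exactly the B200's capacity, leaving no headroom for activations or KV cache, while the B100's 192 GB leaves ~12 GB. This is an illustrative rule of thumb, not the exact compatibility rule behind the lists above.

```python
# Assumption: FP16 weight memory ~= 2 bytes per parameter (GB = billions * 2).
# Ignores activations, KV cache, and framework overhead.
def fp16_weight_gb(n_params_billion: float) -> float:
    return n_params_billion * 2

need = fp16_weight_gb(90)  # Llama 3.2 90B Vision Instruct
for gpu, vram_gb in [("B200 SXM", 180), ("B100 SXM", 192)]:
    headroom = vram_gb - need
    print(f"{gpu}: weights ~{need:.0f} GB, VRAM {vram_gb} GB, headroom {headroom:+.0f} GB")
```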
## Summary
The B200 SXM (Blackwell generation) offers 180 GB of HBM3e, 2,250 BF16 TFLOPS, and 8,000 GB/s of memory bandwidth at a 1,000 W TDP.
The B100 SXM (Blackwell generation) offers 192 GB of HBM3e, 1,750 BF16 TFLOPS, and 8,000 GB/s of memory bandwidth at a 700 W TDP.
The B100 SXM has about 7% more VRAM, letting it run slightly larger models without a multi-GPU setup — in this comparison, Llama 3.2 90B Vision Instruct.
On cost, the two are effectively tied: the cheapest available rates are $4.49/hr for the B200 SXM and $4.50/hr for the B100 SXM.