# GPU Management
Showing 58 of 60 GPUs
| ID | Name | Vendor | Gen | VRAM | BF16 TFLOPS | Bandwidth | TDP | CUDA CC | Tensor Cores | Pricing |
|---|---|---|---|---|---|---|---|---|---|---|
| nvidia-a100-40gb-pcie | A100 40GB PCIe | NVIDIA | Ampere | 40 GB | 312 | 1555 GB/s | 250W | 8.0 | Yes | 4 providers |
| nvidia-a100-40gb-sxm | A100 40GB SXM | NVIDIA | Ampere | 40 GB | 312 | 1555 GB/s | 400W | 8.0 | Yes | 6 providers |
| nvidia-a100-80gb-pcie | A100 80GB PCIe | NVIDIA | Ampere | 80 GB | 312 | 2039 GB/s | 300W | 8.0 | Yes | 5 providers |
| nvidia-a100-80gb-sxm | A100 80GB SXM | NVIDIA | Ampere | 80 GB | 312 | 2039 GB/s | 400W | 8.0 | Yes | 9 providers |
| nvidia-a10g | A10G | NVIDIA | Ampere | 24 GB | 35 | 600 GB/s | 150W | 8.6 | Yes | 3 providers |
| nvidia-a16 | A16 | NVIDIA | Ampere | 64 GB | 16.8 | 232 GB/s | 250W | 8.6 | Yes | 1 provider |
| nvidia-a2 | A2 | NVIDIA | Ampere | 16 GB | 18 | 200 GB/s | 60W | 8.6 | Yes | 1 provider |
| nvidia-a30 | A30 | NVIDIA | Ampere | 24 GB | 165 | 933 GB/s | 165W | 8.0 | Yes | 3 providers |
| nvidia-a40 | A40 | NVIDIA | Ampere | 48 GB | 37.4 | 696 GB/s | 300W | 8.6 | Yes | 4 providers |
| nvidia-a4000 | A4000 | NVIDIA | Ampere | 16 GB | 76 | 448 GB/s | 140W | 8.6 | Yes | 2 providers |
| nvidia-b100-sxm | B100 SXM | NVIDIA | Blackwell | 192 GB | 1750 | 8000 GB/s | 700W | 10.0 | Yes | 2 providers |
| nvidia-b200-sxm | B200 SXM | NVIDIA | Blackwell | 180 GB | 2250 | 8000 GB/s | 1000W | 10.0 | Yes | 3 providers |
| qualcomm-cloud-ai-100 | Cloud AI 100 | Other | Other | 32 GB | 150 | 134 GB/s | 75W | N/A | No | 1 provider |
| intel-gaudi-2 | Gaudi 2 | Intel | Gaudi | 96 GB | 432 | 2460 GB/s | 600W | N/A | No | 1 provider |
| intel-gaudi-3 | Gaudi 3 | Intel | Gaudi | 128 GB | 1835 | 3700 GB/s | 900W | N/A | No | 2 providers |
| intel-gaudi-3-hl325l | Gaudi 3 HL-325L | Intel | Gaudi | 128 GB | 1835 | 3700 GB/s | 900W | N/A | No | 1 provider |
| nvidia-gb200-nvl72 | GB200 NVL72 (per GPU) | NVIDIA | Blackwell | 192 GB | 2250 | 8000 GB/s | 1200W | 10.0 | Yes | 1 provider |
| nvidia-gb300-nvl72 | GB300 NVL72 (per GPU) | NVIDIA | Blackwell | 192 GB | 2500 | 8000 GB/s | 1200W | 10.0 | Yes | 1 provider |
| nvidia-gh200 | GH200 | NVIDIA | Hopper | 96 GB | 990 | 4000 GB/s | 900W | 9.0 | Yes | 2 providers |
| groq-lpu | Groq LPU | Other | Other | 230 MB (SRAM) | 188 | 80000 GB/s | 300W | N/A | No | 1 provider |
| nvidia-h100-nvl | H100 NVL | NVIDIA | Hopper | 94 GB | 835 | 3938 GB/s | 400W | 9.0 | Yes | 2 providers |
| nvidia-h100-nvl-94gb | H100 NVL 94GB (per GPU pair) | NVIDIA | Hopper | 188 GB | 1670 | 7876 GB/s | 800W | 9.0 | Yes | 2 providers |
| nvidia-h100-pcie | H100 PCIe | NVIDIA | Hopper | 80 GB | 756 | 2000 GB/s | 350W | 9.0 | Yes | 4 providers |
| nvidia-h100-sxm | H100 SXM | NVIDIA | Hopper | 80 GB | 990 | 3350 GB/s | 700W | 9.0 | Yes | 9 providers |
| nvidia-h20 | H20 | NVIDIA | Hopper | 96 GB | 148 | 4000 GB/s | 500W | 9.0 | Yes | 2 providers |
| nvidia-h200-sxm | H200 SXM | NVIDIA | Hopper | 141 GB | 990 | 4800 GB/s | 700W | 9.0 | Yes | 4 providers |
| amd-mi100 | Instinct MI100 | AMD | CDNA | 32 GB | 184.6 | 1229 GB/s | 300W | N/A | No | 1 provider |
| amd-mi210 | Instinct MI210 | AMD | CDNA 2 | 64 GB | 181 | 1638 GB/s | 300W | N/A | No | 1 provider |
| amd-mi250x | Instinct MI250X | AMD | CDNA 2 | 128 GB | 383 | 3277 GB/s | 560W | N/A | No | 2 providers |
| amd-mi300x | Instinct MI300X | AMD | CDNA 3 | 192 GB | 1307 | 5300 GB/s | 750W | N/A | No | 6 providers |
| amd-mi325x | Instinct MI325X | AMD | CDNA 3 | 256 GB | 1307 | 6000 GB/s | 750W | N/A | No | 3 providers |
| nvidia-l20 | L20 | NVIDIA | Ada | 48 GB | 239 | 864 GB/s | 275W | 8.9 | Yes | 1 provider |
| nvidia-l4 | L4 | NVIDIA | Ada | 24 GB | 121 | 300 GB/s | 72W | 8.9 | Yes | 6 providers |
| nvidia-l40 | L40 | NVIDIA | Ada | 48 GB | 362 | 864 GB/s | 300W | 8.9 | Yes | 4 providers |
| nvidia-l40s | L40S | NVIDIA | Ada | 48 GB | 362 | 864 GB/s | 350W | 8.9 | Yes | 8 providers |
| amd-w7900 | Radeon PRO W7900 | AMD | RDNA 3 | 48 GB | 122 | 864 GB/s | 295W | N/A | No | 1 provider |
| nvidia-rtx-3060 | RTX 3060 | NVIDIA | Ampere | 12 GB | 12.7 | 360 GB/s | 170W | 8.6 | Yes | 2 providers |
| nvidia-rtx-3070 | RTX 3070 | NVIDIA | Ampere | 8 GB | 20.4 | 448 GB/s | 220W | 8.6 | Yes | 2 providers |
| nvidia-rtx-3080 | RTX 3080 | NVIDIA | Ampere | 10 GB | 47 | 760 GB/s | 320W | 8.6 | Yes | 2 providers |
| nvidia-rtx-3090 | RTX 3090 | NVIDIA | Ampere | 24 GB | 35.6 | 936 GB/s | 350W | 8.6 | Yes | 3 providers |
| nvidia-rtx-4060 | RTX 4060 | NVIDIA | Ada | 8 GB | 30 | 272 GB/s | 115W | 8.9 | Yes | 1 provider |
| nvidia-rtx-4060-ti-16gb | RTX 4060 Ti 16GB | NVIDIA | Ada | 16 GB | 44 | 288 GB/s | 165W | 8.9 | Yes | 1 provider |
| nvidia-rtx-4070-super | RTX 4070 Super | NVIDIA | Ada | 12 GB | 55 | 504 GB/s | 220W | 8.9 | Yes | 2 providers |
| nvidia-rtx-4070-ti | RTX 4070 Ti | NVIDIA | Ada | 12 GB | 93 | 504 GB/s | 285W | 8.9 | Yes | 2 providers |
| nvidia-rtx-4080 | RTX 4080 | NVIDIA | Ada | 16 GB | 97 | 717 GB/s | 320W | 8.9 | Yes | 2 providers |
| nvidia-rtx-4090 | RTX 4090 | NVIDIA | Ada | 24 GB | 165 | 1008 GB/s | 450W | 8.9 | Yes | 5 providers |
| nvidia-rtx-5090 | RTX 5090 | NVIDIA | Blackwell | 32 GB | 210 | 1792 GB/s | 575W | 12.0 | Yes | 3 providers |
| nvidia-rtx-6000-ada | RTX 6000 Ada | NVIDIA | Ada | 48 GB | 91.1 | 960 GB/s | 300W | 8.9 | Yes | 4 providers |
| nvidia-rtx-a5000 | RTX A5000 | NVIDIA | Ampere | 24 GB | 27.8 | 768 GB/s | 230W | 8.6 | Yes | 3 providers |
| nvidia-a6000 | RTX A6000 | NVIDIA | Ampere | 48 GB | 38.7 | 768 GB/s | 300W | 8.6 | Yes | 4 providers |
| amd-rx-7900-xtx | RX 7900 XTX | AMD | RDNA 3 | 24 GB | 123 | 960 GB/s | 355W | N/A | No | 1 provider |
| nvidia-t4 | T4 | NVIDIA | Turing | 16 GB | 65 | 300 GB/s | 70W | 7.5 | Yes | 6 providers |
| google-tpu-v4 | TPU v4 | Google | TPU | 32 GB | 275 | 1200 GB/s | 175W | N/A | No | 1 provider |
| google-tpu-v5e | TPU v5e | Google | TPU | 16 GB | 200 | 820 GB/s | 200W | N/A | No | 1 provider |
| google-tpu-v6e | TPU v6e (Trillium) | Google | TPU | 32 GB | 460 | 1640 GB/s | 200W | N/A | No | 1 provider |
| aws-trainium2 | Trainium2 | Other | Other | 96 GB | 756 | 3200 GB/s | 600W | N/A | No | 1 provider |
| nvidia-v100-16gb | V100 16GB | NVIDIA | Volta | 16 GB | 28.3 | 900 GB/s | 250W | 7.0 | Yes | 5 providers |
| nvidia-v100-32gb | V100 32GB | NVIDIA | Volta | 32 GB | 28.3 | 900 GB/s | 300W | 7.0 | Yes | 4 providers |
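A common way to compare entries in this table is compute efficiency: BF16 TFLOPS divided by TDP. The sketch below computes that ratio for a handful of rows copied from the table; the tuples and the `tflops_per_watt` helper are illustrative, not part of the GPU database itself.

```python
# Minimal sketch: rank a few GPUs from the table above by BF16 TFLOPS per watt.
# (name, bf16_tflops, tdp_watts) values are taken directly from the table rows.

gpus = [
    ("H100 SXM", 990, 700),
    ("H200 SXM", 990, 700),
    ("B200 SXM", 2250, 1000),
    ("Instinct MI300X", 1307, 750),
    ("L40S", 362, 350),
    ("A100 80GB SXM", 312, 400),
]

def tflops_per_watt(entry):
    """Compute efficiency: dense BF16 TFLOPS per watt of TDP."""
    name, tflops, tdp = entry
    return tflops / tdp

# Sort from most to least efficient and print the ratio for each.
for name, tflops, tdp in sorted(gpus, key=tflops_per_watt, reverse=True):
    print(f"{name:>16}: {tflops / tdp:.2f} BF16 TFLOPS/W")
```

On these numbers the B200 SXM leads at 2.25 TFLOPS/W, roughly 60% above the H100/H200 SXM at about 1.41 TFLOPS/W; note the metric uses TDP as a proxy, not measured power draw.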