
E5 Mistral 7B

IntFloat · dense · 7.11B parameters · 32,768-token context

Quality: 50.0

Architecture Details

Type: Dense
Total Parameters: 7.11B
Active Parameters: 7.11B
Layers: 32
Hidden Dimension: 4,096
Attention Heads: 32
KV Heads: 8
Head Dimension: 128
Vocab Size: 32,000
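
A quick way to sanity-check these figures against the published config, assuming the card refers to the intfloat/e5-mistral-7b-instruct checkpoint on Hugging Face (the repo id is an assumption, not stated above):

```python
from transformers import AutoConfig

# Assumed checkpoint for this card.
cfg = AutoConfig.from_pretrained("intfloat/e5-mistral-7b-instruct")

print("layers:         ", cfg.num_hidden_layers)        # expected 32
print("hidden dim:     ", cfg.hidden_size)              # expected 4096
print("attention heads:", cfg.num_attention_heads)      # expected 32
print("kv heads:       ", cfg.num_key_value_heads)      # expected 8
print("head dim:       ", cfg.hidden_size // cfg.num_attention_heads)  # expected 128
print("vocab size:     ", cfg.vocab_size)               # expected 32000
print("context length: ", cfg.max_position_embeddings)  # expected 32768
```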

Memory Requirements

BF16 Weights: 14.2 GB
FP8 Weights: 7.1 GB
INT4 Weights: 3.6 GB
KV-Cache per Token: 131,072 bytes
Activation Estimate: 0.80 GB
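
These figures follow from the architecture numbers above. A minimal sketch of the arithmetic, assuming decimal gigabytes (10^9 bytes) and a BF16 (2-byte) KV cache, which is how the card's numbers appear to be computed:

```python
params = 7.11e9
layers, kv_heads, head_dim = 32, 8, 128

def weight_gb(bytes_per_param: float) -> float:
    # Weight memory = parameter count x bytes per parameter.
    return params * bytes_per_param / 1e9

print(f"BF16 weights: {weight_gb(2):.1f} GB")    # ~14.2 GB
print(f"FP8 weights:  {weight_gb(1):.1f} GB")    # ~7.1 GB
print(f"INT4 weights: {weight_gb(0.5):.1f} GB")  # ~3.6 GB

# Per-token KV cache: one K and one V vector per layer, per KV head,
# each head_dim elements wide, at 2 bytes per element (BF16).
kv_bytes_per_token = 2 * layers * kv_heads * head_dim * 2
print(f"KV cache per token: {kv_bytes_per_token} bytes")  # 131072
```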

Fits on (single-node)

B200 SXM (BF16), B100 SXM (BF16), GB200 NVL72 per GPU (BF16), GB300 NVL72 per GPU (BF16), H200 SXM (BF16), H100 SXM (BF16), H100 PCIe (BF16), H100 NVL (BF16)

GPU Recommendations

A10G (optimal) · BF16 · 1 GPU · vllm
Score: 100/100
Throughput: 227.8 tok/s
Cost/Month: $285
Cost/M Tokens: $0.48

A30 (optimal) · BF16 · 1 GPU · vllm
Score: 100/100
Throughput: 354.3 tok/s
Cost/Month: $332
Cost/M Tokens: $0.36

RTX 4090 (optimal) · BF16 · 1 GPU · vllm
Score: 100/100
Throughput: 382.8 tok/s
Cost/Month: $370
Cost/M Tokens: $0.37
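
The cost-per-million-tokens figures are consistent with the monthly cost divided by the tokens produced at the quoted throughput. A quick check, assuming a 730-hour month at full utilization (the utilization and month length are assumptions):

```python
HOURS_PER_MONTH = 730  # assumed 100% utilization

def cost_per_million(cost_per_month: float, tok_per_s: float) -> float:
    # Tokens produced in a month at sustained throughput, in millions.
    tokens_millions = tok_per_s * 3600 * HOURS_PER_MONTH / 1e6
    return cost_per_month / tokens_millions

print(f"A10G:     ${cost_per_million(285, 227.8):.2f}/M tok")  # ~$0.48
print(f"A30:      ${cost_per_million(332, 354.3):.2f}/M tok")  # ~$0.36
print(f"RTX 4090: ${cost_per_million(370, 382.8):.2f}/M tok")  # ~$0.37
```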

API Pricing Comparison

Provider    Input $/M    Output $/M    Badges
together    $0.02        $0.02         Cheapest
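
For a rough sense of scale, here is the cost of a fixed workload at the listed together API price versus the cheapest self-hosted figure above; the 100M-token volume is only an illustrative assumption:

```python
tokens_millions = 100  # example workload size (assumption)

api_cost = tokens_millions * 0.02          # together: $0.02/M for input and output alike
self_hosted_cost = tokens_millions * 0.36  # cheapest self-hosted figure (A30)

print(f"together API:      ${api_cost:,.2f}")
print(f"Self-hosted (A30): ${self_hosted_cost:,.2f}")
```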

Capabilities

Features

Tool Use, Vision, Code, Math, Reasoning, Multilingual, Structured Output

Supported Frameworks

vllm, sglang, tgi, tensorrt-llm

Supported Precisions

BF16 (default), FP8, INT4
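
A minimal vLLM launch matching the recommended single-GPU BF16 configuration above; the repo id and max_model_len value are assumptions to adjust for your checkpoint and available GPU memory:

```python
from vllm import LLM, SamplingParams

llm = LLM(
    model="intfloat/e5-mistral-7b-instruct",  # assumed checkpoint
    dtype="bfloat16",          # default precision listed above
    tensor_parallel_size=1,    # single GPU, per the recommendations
    max_model_len=32768,       # matches the 32,768-token context
)

outputs = llm.generate(
    ["Explain the difference between dense and MoE transformers."],
    SamplingParams(max_tokens=128, temperature=0.7),
)
print(outputs[0].outputs[0].text)
```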

Similar Models