Author's Description
A 7.3B parameter model that outperforms Llama 2 13B on all benchmarks, with optimizations for speed and context length.
Performance Summary
Mistral 7B Instruct v0.1, a 7.3B-parameter model from mistralai, consistently ranks among the fastest models evaluated and offers highly competitive pricing. Specific reliability data was not provided, but its per-category results give a clear picture of its capabilities. Its strongest results are in Ethics (Baseline), at 97.0% accuracy, suggesting a solid grasp of ethical principles, and in Hallucinations (Baseline), at 84.0% accuracy, indicating a good ability to recognize fictional concepts and acknowledge uncertainty. Performance elsewhere is markedly weaker: General Knowledge (Baseline) reaches a moderate 79.8% (24th percentile), while Email Classification (77.0%, 8th percentile), Reasoning (2.0%, 4th percentile), and Coding (1.0%, 8th percentile) are clear weak points. Most critically, the model scored 0.0% on Instruction Following, a severe limitation for any task that depends on executing directives precisely. In short, the model is fast and inexpensive, with respectable results on ethics and uncertainty acknowledgment, but its current iteration struggles badly with instruction adherence, complex reasoning, and coding.
Model Pricing
Current Pricing
| Feature | Price (per 1M tokens) |
|---|---|
| Prompt | $0.11 |
| Completion | $0.19 |
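At the listed rates, the cost of a request is just a linear function of its prompt and completion token counts. A minimal sketch of that arithmetic (the `request_cost` helper is illustrative, not part of any provider SDK):

```python
# Estimate the USD cost of one request at this model's listed rates:
# $0.11 per 1M prompt tokens, $0.19 per 1M completion tokens.
PROMPT_PRICE_PER_M = 0.11
COMPLETION_PRICE_PER_M = 0.19

def request_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Return the USD cost for one request at the listed per-1M-token rates."""
    return (prompt_tokens * PROMPT_PRICE_PER_M
            + completion_tokens * COMPLETION_PRICE_PER_M) / 1_000_000

# e.g. a 2,000-token prompt with a 500-token completion:
print(f"${request_cost(2_000, 500):.6f}")  # → $0.000315
```

Even a long 32K-token prompt costs well under a cent at these prices, which is consistent with the summary's point about cost-efficiency.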
Available Endpoints
| Provider | Endpoint Name | Context Length | Pricing (Input) | Pricing (Output) |
|---|---|---|---|---|
| Cloudflare | mistralai/mistral-7b-instruct-v0.1 | 2K | $0.11 / 1M tokens | $0.19 / 1M tokens |
| Together | mistralai/mistral-7b-instruct-v0.1 | 32K | $0.11 / 1M tokens | $0.19 / 1M tokens |
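The two endpoints differ mainly in context length (2K on Cloudflare vs. 32K on Together), so a caller may want to route requests by prompt size. A minimal sketch under the assumption that "2K"/"32K" mean roughly 2,000/32,000 tokens; the `pick_provider` helper is illustrative and not part of either provider's API:

```python
# Route a request to whichever listed endpoint can hold the prompt.
# Context lengths are taken from the endpoints table above (treated
# here as approximate token counts).
ENDPOINTS = {
    "Cloudflare": {"model": "mistralai/mistral-7b-instruct-v0.1", "context": 2_000},
    "Together":   {"model": "mistralai/mistral-7b-instruct-v0.1", "context": 32_000},
}

def pick_provider(prompt_tokens: int, reply_budget: int = 512) -> str:
    """Return the first provider whose context window fits prompt + reply."""
    needed = prompt_tokens + reply_budget
    for name, info in ENDPOINTS.items():
        if needed <= info["context"]:
            return name
    raise ValueError(f"{prompt_tokens} prompt tokens exceeds all listed context windows")

print(pick_provider(1_000))  # → Cloudflare (fits in the 2K window)
print(pick_provider(8_000))  # → Together (needs the 32K window)
```

Since both endpoints charge the same per-token rates, context length is the only routing criterion this table supports; latency or regional preferences would need data not listed here.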
Benchmark Results
| Benchmark | Category | Reasoning | Strategy | Free | Executions | Accuracy | Cost | Duration |
|---|---|---|---|---|---|---|---|---|
Other Models by mistralai
| Model | Released | Params | Context | Modalities | Speed | Ability | Cost |
|---|---|---|---|---|---|---|---|
| Mistral: Mistral Small Creative | Dec 16, 2025 | — | 32K | Text input, Text output | ★★★★★ | ★★ | $$ |
| Mistral: Devstral 2 2512 | Dec 09, 2025 | ~123B | 262K | Text input, Text output | ★★ | ★★★★ | $$ |
| Mistral: Ministral 3 14B 2512 | Dec 02, 2025 | 14B | 262K | Image input, Text input, Text output | ★★★★★ | ★★ | $$ |
| Mistral: Ministral 3 8B 2512 | Dec 02, 2025 | 8B | 262K | Image input, Text input, Text output | ★★★★★ | ★★ | $$ |
| Mistral: Ministral 3 3B 2512 | Dec 02, 2025 | 3B | 131K | Image input, Text input, Text output | ★★★★★ | ★★ | $$ |
| Mistral: Mistral Large 3 2512 | Dec 01, 2025 | ~32B | 262K | Image input, Text input, Text output | ★★★ | ★★★ | $$$ |
| Mistral: Voxtral Small 24B 2507 | Oct 30, 2025 | 24B | 32K | Text input, Audio input, Text output | ★★★★★ | ★★★ | $$ |
| Mistral: Mistral Medium 3.1 | Aug 13, 2025 | — | 131K | Image input, Text input, Text output | ★★★ | ★★★★ | $$$$ |
| Mistral: Codestral 2508 | Aug 01, 2025 | ~2.5B | 256K | Text input, Text output | ★★★ | ★★ | $$$ |
| Mistral: Devstral Medium | Jul 10, 2025 | — | 131K | Text input, Text output | ★★★★ | ★★★★ | $$$ |
| Mistral: Devstral Small 1.1 | Jul 10, 2025 | ~24B | 131K | Text input, Text output | ★★★★★ | ★★★ | $$ |
| Mistral: Mistral Small 3.2 24B | Jun 20, 2025 | 24B | 131K | Image input, Text input, Text output | ★★★★ | ★★★ | $$ |
| Mistral: Magistral Small 2506 (Unavailable) | Jun 10, 2025 | ~24B | 40K | Text input, Text output | ★★★ | ★★ | $$$$ |
| Mistral: Magistral Medium 2506 (Unavailable) | Jun 07, 2025 | — | 40K | Text input, Text output | ★★★★ | ★★★★ | $$$$$ |
| Mistral: Devstral Small 2505 (Unavailable) | May 21, 2025 | ~24B | 128K | Text input, Text output | ★★★★★ | ★★★★ | $ |
| Mistral: Mistral Medium 3 | May 07, 2025 | — | 131K | Image input, Text input, Text output | ★★★ | ★★★★ | $$$$ |
| Mistral: Mistral Small 3.1 24B | Mar 17, 2025 | 24B | 131K | Image input, Text input, Text output | ★★★★ | ★★★ | $$ |
| Mistral: Saba | Feb 17, 2025 | ~24B | 32K | Text input, Text output | ★★ | ★★★★ | $$ |
| Mistral: Mistral Small 3 | Jan 30, 2025 | 24B | 32K | Text input, Text output | ★★★★★ | ★★★★★ | $ |
| Mistral: Codestral 2501 (Unavailable) | Jan 14, 2025 | ~2.5B | 256K | Text input, Text output | ★★★ | ★★★ | $$$ |
| Mistral Large 2411 | Nov 18, 2024 | ~32B | 131K | Text input, Text output | ★★★★ | ★★★★ | $$$$ |
| Mistral Large 2407 | Nov 18, 2024 | ~32B | 131K | Text input, Text output | ★★★★ | ★★★★ | $$$$ |
| Mistral: Pixtral Large 2411 | Nov 18, 2024 | ~124B | 131K | Image input, Text input, Text output | ★★★ | ★★★★ | $$$$$ |
| Mistral: Ministral 3B | Oct 16, 2024 | 3B | 131K | Text input, Text output | ★★★★★ | ★★ | $ |
| Mistral: Ministral 8B | Oct 16, 2024 | 8B | 131K | Text input, Text output | ★★★★ | ★★★ | $$ |
| Mistral: Pixtral 12B | Sep 09, 2024 | 12B | 32K | Image input, Text input, Text output | ★★★★★ | ★ | $$ |
| Mistral: Mistral Nemo | Jul 18, 2024 | ~12B | 131K | Text input, Text output | ★★★★ | ★ | $ |
| Mistral: Mistral 7B Instruct | May 26, 2024 | 7B | 32K | Text input, Text output | ★★★★★ | ★★ | $$ |
| Mistral: Mistral 7B Instruct v0.3 | May 26, 2024 | 7B | 32K | Text input, Text output | ★★★★★ | ★★ | $$ |
| Mistral: Mixtral 8x22B Instruct | Apr 16, 2024 | 22B | 65K | Text input, Text output | ★★★★★ | ★★★ | $$$ |
| Mistral Large | Feb 25, 2024 | ~32B | 128K | Text input, Text output | ★★★★ | ★★★★ | $$$$ |
| Mistral Medium (Unavailable) | Jan 09, 2024 | — | 32K | Text input, Text output | — | — | $$$$$ |
| Mistral Tiny | Jan 09, 2024 | ~7B | 32K | Text input, Text output | ★★★★ | ★★ | $$ |
| Mistral Small (Unavailable) | Jan 09, 2024 | ~22B | 32K | Text input, Text output | ★★★★★ | ★★ | $$ |
| Mistral: Mistral 7B Instruct v0.2 | Dec 27, 2023 | 7B | 32K | Text input, Text output | ★★★★★ | ★★ | $$$ |
| Mistral: Mixtral 8x7B Instruct | Dec 09, 2023 | 56B | 32K | Text input, Text output | ★★★★★ | ★★ | $$ |