Mistral AI, in collaboration with All Hands AI, has released Devstral Small 1.1 (model ID: Devstral‑Small‑2507), a 24-billion-parameter LLM designed specifically for agentic software-engineering tasks (Hugging Face). Built on Mistral Small 3.1, the model offers a large 128K-token context window and is released under the Apache 2.0 license (Hugging Face).
🚀 Key Highlights
- State‑of‑the‑Art on SWE‑Bench: Achieves 53.6% on SWE‑Bench Verified, surpassing its predecessor by +6.8% and outperforming larger closed‑source peers (Hugging Face).
- Agentic Workflow Design: Tailored to tool‑based workflows such as exploring files, editing multiple files, and writing outputs, making it ideal for VS Code, Cline, and local dev environments (Hugging Face).
- Portable & Efficient: With only 24B parameters, it can run on a single RTX 4090 GPU or a Mac with 32 GB of RAM, making local deployment accessible (Hugging Face).
- Enhanced Generalization: The 1.1 update improves generalization across a wider range of prompts and environments, adds support for Mistral function calling, and integrates seamlessly with scaffolding tools like OpenHands (Hugging Face).
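To make the function-calling support concrete, here is a minimal sketch of how agentic file tools might be declared in the OpenAI-style JSON-schema format that Mistral function calling accepts. The tool names (`read_files`, `edit_files`) echo the patterns users describe but are illustrative, not an official Devstral API:

```python
# Hypothetical sketch: declaring agentic file tools in the JSON-schema
# function-calling format. Tool names and parameters are illustrative.

def make_tool(name, description, properties, required):
    """Build one tool definition in the function-calling schema."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }

tools = [
    make_tool(
        "read_files",
        "Read the contents of one or more repository files.",
        {"paths": {"type": "array", "items": {"type": "string"}}},
        ["paths"],
    ),
    make_tool(
        "edit_files",
        "Apply a textual patch to a single file.",
        {"path": {"type": "string"}, "patch": {"type": "string"}},
        ["path", "patch"],
    ),
]
```

A list like this would be passed alongside the chat messages so the model can emit structured tool calls instead of free-form text.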
Availability & Usage
API Access
- Listed as `devstral-small-2507` on Mistral’s API portal, at the same pricing as Mistral Small 3.1:
  - $0.10 per million input tokens
  - $0.30 per million output tokens (Mistral AI)
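Those per-token rates make cost estimation a one-line calculation. A quick sketch (token counts in the example are made up for illustration):

```python
# Sketch: estimating API cost from the published per-million-token rates.
INPUT_RATE = 0.10 / 1_000_000   # $ per input token
OUTPUT_RATE = 0.30 / 1_000_000  # $ per output token

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the dollar cost of a request at the listed rates."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# A hypothetical long agentic session: 500k tokens in, 50k tokens out.
cost = estimate_cost(500_000, 50_000)  # $0.05 + $0.015 = $0.065
```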
Local Deployment Options
- vLLM (recommended):
  - Use vLLM ≥ 0.9.1 and `mistral_common` ≥ 1.7.0.
  - Run: `vllm serve mistralai/Devstral-Small-2507 ...`
  - Configure client calls with Hugging Face Hub downloads plus a SYSTEM_PROMPT file (Hugging Face).
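As a rough sketch of the client side, the served model exposes an OpenAI-compatible chat endpoint, so a request body can be built like the following. The endpoint URL, the placeholder system-prompt string, and the temperature value are assumptions for illustration; the model card's actual SYSTEM_PROMPT file should be loaded in place of the placeholder:

```python
# Sketch: building a chat request for a local vLLM server exposing the
# OpenAI-compatible API (assumed at http://localhost:8000/v1). The system
# prompt here is a placeholder for the SYSTEM_PROMPT file from the model card.
import json

def build_request(system_prompt: str, user_message: str) -> dict:
    """Assemble an OpenAI-style chat-completions payload."""
    return {
        "model": "mistralai/Devstral-Small-2507",
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        "temperature": 0.15,  # illustrative value; tune per task
    }

payload = build_request(
    "<contents of SYSTEM_PROMPT>",
    "Which files would you read first to fix the failing test?",
)
body = json.dumps(payload)  # POST this to /v1/chat/completions
```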
- Mistral‑inference / llama.cpp / LM Studio:
  - Available in GGUF format, including Q4_K_M, Q5_K_M, Q8_0, and BF16 quantizations (Hugging Face).
  - Ideal for lightweight deployments on limited compute.
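The quantization choice largely determines whether the model fits a given machine. A back-of-the-envelope sketch of the weight-memory footprint at 24B parameters (the bits-per-weight figures are approximate averages; real GGUF files add metadata and mix precisions across layers):

```python
# Rough sketch: approximate weight memory for a 24B-parameter model under
# common GGUF quantizations. Bits-per-weight values are approximate.
PARAMS = 24e9
BITS_PER_WEIGHT = {"Q4_K_M": 4.8, "Q5_K_M": 5.7, "Q8_0": 8.5, "BF16": 16.0}

def approx_gib(quant: str) -> float:
    """Approximate weight footprint in GiB for a given quantization."""
    return PARAMS * BITS_PER_WEIGHT[quant] / 8 / 2**30

for quant in BITS_PER_WEIGHT:
    print(f"{quant}: ~{approx_gib(quant):.1f} GiB")
```

Under these assumptions, Q4_K_M lands around 13 GiB of weights, which is why a single 24 GB RTX 4090 or a 32 GB Mac is workable, while BF16 needs several times that.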
Community Feedback
On r/LocalLLaMA, users praised Devstral’s “agentic/tool use patterns”—a clearer improvement over Codestral’s simpler copilot‑style systems. One redditor noted:
“Devstral for sure. It was trained specifically to follow the agentic / tool use patterns (… read_files, then edit_files, then write_files …)” (Reddit)
Why It Matters
The Devstral 2507 release highlights a growing wave of domain-specialized open‑source models that match or outperform proprietary giants while remaining cost-efficient and license-friendly (apidog). Its results on SWE‑Bench and smooth local usage make it a standout choice for both enterprises and solo developers.