On Wednesday, AI startup Mistral unveiled a new AI model focused on coding: Devstral.
According to Mistral, Devstral, developed in collaboration with AI company All Hands AI, is openly available under the Apache 2.0 license, which means it can be used commercially without restrictions. Mistral claims that Devstral outperforms other open models, such as Google’s Gemma 3 27B and Chinese AI lab DeepSeek’s V3, on SWE-Bench, a benchmark that measures coding skills.
“Devstral excels at using tools to explore codebases, editing multiple files and power[ing] software engineering agents,” Mistral wrote in a blog post provided to TechCrunch. “[I]t runs over code agent scaffolds such as OpenHands and SWE-Agent that define the interface between the model and the test cases (…) Devstral is light enough to run on a single [Nvidia] RTX 4090 or a Mac with 32GB RAM, making it an ideal choice for local deployment and on-device use.”
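For readers curious about what that local setup might look like, here is a minimal sketch of loading the weights from Hugging Face with the transformers library. The model identifier, prompt, and memory settings are assumptions for illustration, not confirmed details from Mistral’s announcement; in practice, quantization would likely be needed to fit a 24-billion-parameter model on a single RTX 4090 or a 32GB Mac.

```python
# Sketch: running a Devstral-class model locally via Hugging Face transformers.
# NOTE: "mistralai/Devstral-Small" is a placeholder model ID (an assumption);
# check the official model card for the real identifier and hardware guidance.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "mistralai/Devstral-Small"  # hypothetical identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # half precision; 4-bit quantization may be required on 24GB/32GB hardware
    device_map="auto",           # place layers on GPU/CPU automatically
)

prompt = "Fix the off-by-one error in this function:\n\ndef last(xs):\n    return xs[len(xs)]\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```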

Devstral arrives as AI coding assistants, and the models powering them, grow increasingly popular. Last month, JetBrains, the company behind a range of popular app development tools, released its first “open” AI coding model. Over the past few months, AI outfits such as Google, Windsurf, and OpenAI have also announced models optimized for programming tasks, some of them openly available.
AI models still struggle to produce high-quality software. Code-generating AI tends to introduce security vulnerabilities and bugs, owing to weaknesses in areas such as understanding programming logic. Yet the promise of boosting coding productivity is driving rapid adoption among businesses and developers. A recent poll found that 76% of developers used or planned to use AI tools in their development processes last year.
Mistral previously waded into the assistive programming space with Codestral, a code-generating model. However, Codestral was not released under a license that allowed developers to use the model in commercial applications. The license expressly prohibited “[a]ny internal usage by employees in the context of the company’s business activities.”
Devstral, which Mistral calls a “research preview,” can be downloaded from AI development platforms such as Hugging Face and tapped through Mistral’s API. It is priced at $0.1 per million input tokens and $0.3 per million output tokens, tokens being the raw bits of data that AI models work with. (A million tokens is equivalent to about 750,000 words, or roughly 163,000 words more than “War and Peace.”)
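As a quick illustration of that pricing, here is a small sketch that estimates the cost of a single API request from token counts at the rates quoted above; the example token counts are made up for the sake of the arithmetic.

```python
# Sketch: estimating the cost of a Devstral API call from the per-million-token
# prices quoted above ($0.1 per million input tokens, $0.3 per million output tokens).
# The token counts in the example are illustrative, not real usage figures.

PRICE_PER_M_INPUT = 0.10   # USD per million input tokens
PRICE_PER_M_OUTPUT = 0.30  # USD per million output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost in USD for one API request."""
    return (input_tokens * PRICE_PER_M_INPUT + output_tokens * PRICE_PER_M_OUTPUT) / 1_000_000

# Example: a prompt containing a medium-sized source file plus a generated patch.
print(f"${request_cost(input_tokens=8_000, output_tokens=2_000):.6f}")  # -> $0.001400
```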
Mistral says it is working on a larger agentic coding model that will be available in the coming weeks. Devstral itself isn’t a small model per se, but at 24 billion parameters, it sits on the smaller side. (Parameters roughly correspond to a model’s problem-solving skills, and models with more parameters generally perform better than those with fewer.)
Founded in 2023, Mistral is a frontier model lab that aims to build a range of AI-powered services, including its chatbot platform, Le Chat, and mobile apps. It is backed by VCs including General Catalyst and has raised more than 1.1 billion euros (approximately $1.24 billion) to date. Mistral’s customers include BNP Paribas, AXA, and Mirakl.
Devstral is Mistral’s third release this month. A few weeks ago, the company launched Mistral Medium 3, an efficient general-purpose model. Around the same time, it rolled out Le Chat Enterprise, a corporate-focused chatbot service that offers tools such as an AI “agent” builder and integrates Mistral’s models with third-party services such as Gmail, Google Drive, and SharePoint.