Mistral Releases an Open Source Model that Outperforms Gemma 3 and GPT-4o Mini

IBL News | New York

Paris-based Mistral AI unveiled Mistral Small 3.1, a new multimodal open-source model. According to the company, it is “the best model in its weight class” and “outperforms comparable models like Gemma 3 and GPT-4o Mini.”

Released under an Apache 2.0 license, Mistral Small 3.1 offers an expanded context window of up to 128k tokens and inference speeds of 150 tokens per second.
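For developers, the model can be queried through Mistral’s hosted API or downloaded and run locally. The snippet below is a minimal sketch, assuming the mistralai Python SDK (v1) and that the “mistral-small-latest” alias resolves to Small 3.1; neither detail comes from the announcement itself.

```python
# Minimal sketch: querying Mistral Small through the hosted API.
# Assumes the mistralai Python SDK (v1) and that the
# "mistral-small-latest" alias points at Small 3.1.
import os
from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

response = client.chat.complete(
    model="mistral-small-latest",
    messages=[
        {"role": "user", "content": "Summarize the Apache 2.0 license in two sentences."}
    ],
)

print(response.choices[0].message.content)
```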

Experts say that Mistral Small 3 is competitive with larger models such as Llama 3.3 70B or Qwen 32B and can serve as an open replacement for opaque proprietary models like GPT-4o Mini.

Mistral Small 3 can be fine-tuned to specialize in specific domains, creating highly accurate experts. This is particularly useful in fields like legal advice, medical diagnostics, and technical support, where domain-specific knowledge is essential.
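Because the weights are openly licensed, that kind of specialization can be attempted with standard parameter-efficient fine-tuning tools. The sketch below uses Hugging Face transformers and peft to attach LoRA adapters; the checkpoint name and hyperparameters are illustrative assumptions rather than details from the release.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

# The checkpoint name below is an assumption; substitute the actual
# Mistral Small repository you intend to fine-tune.
MODEL_ID = "mistralai/Mistral-Small-3.1-24B-Instruct-2503"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # requires the accelerate package
)

# Attach low-rank adapters so only a small fraction of the weights
# is trained for the target domain (e.g. legal Q&A).
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# From here, train on a domain-specific instruction dataset with the
# standard Hugging Face Trainer or trl's SFTTrainer.
```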

This model sets the stage for increased competition in a market dominated by U.S. tech giants. Mistral’s open-source approach highlights a growing divide in the AI industry between closed, proprietary systems and open, accessible alternatives.

Founded in 2023 by former researchers from Google DeepMind and Meta, and having raised $1.04 billion, Mistral AI has rapidly established itself as Europe’s leading AI startup, with a valuation of approximately $6 billion. While impressive for a European startup, that valuation remains a fraction of OpenAI’s reported $80 billion.

[Chart: Mistral Small 3 Human Evals]

Mistral Small 3.1 joins the company’s rapidly expanding suite of AI products.

Earlier this month, the company introduced Mistral OCR, an optical character recognition API that converts PDF documents into AI-ready Markdown files. This addresses a critical need for enterprises seeking to make document repositories accessible to AI systems.
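As a rough illustration of that workflow, the sketch below assumes the mistralai Python SDK exposes the OCR endpoint roughly as shown; the ocr.process method, the “mistral-ocr-latest” alias, and the response fields are assumptions based on the product description, so the official API reference should be checked before use.

```python
# Hypothetical sketch of converting a PDF to Markdown with Mistral OCR.
# The ocr.process call, "mistral-ocr-latest" alias, and response shape
# are assumptions; consult the official API reference before relying on them.
import os
from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

ocr_response = client.ocr.process(
    model="mistral-ocr-latest",
    document={
        "type": "document_url",
        "document_url": "https://example.com/quarterly-report.pdf",  # placeholder URL
    },
)

# Each page is assumed to come back with its extracted Markdown.
markdown = "\n\n".join(page.markdown for page in ocr_response.pages)
print(markdown)
```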

These specialized tools complement Mistral’s broader portfolio, which includes Mistral Large 2 (their flagship large language model), Pixtral (for multimodal applications), Codestral (for code generation), and “Les Ministraux,” a family of models optimized for edge devices.