Text Generation Model

Mistral Large 2

Single-node inference model with a 128k-token context window, supporting dozens of natural languages and 80+ programming languages.

Publisher Mistral
Type Text
Context Window 128,000 tokens
Training Data n/a
Input $2.00/MTok
Output $6.00/MTok

128k context across dozens of languages and code

Mistral Large 2 is a text generation model developed by Mistral, a French AI company. It has 123 billion parameters and a 128,000-token context window, making it well suited to long-document processing and extended conversations within a single inference session. The model supports dozens of natural languages, including French, German, Spanish, Italian, Portuguese, Arabic, Hindi, Russian, Chinese, Japanese, and Korean.
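To get a feel for what fits in a 128,000-token window, here is a minimal sketch using the common rough heuristic of ~4 characters per token for English text. The heuristic and the reserved output budget are assumptions; exact counts come from the model's own tokenizer and vary by language.

```python
# Rough check of whether a document fits in Mistral Large 2's 128k context.
# Assumes the common ~4 characters-per-token heuristic for English text;
# real counts come from the model's tokenizer and vary by language.
CONTEXT_WINDOW = 128_000

def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Crude token estimate from character count."""
    return max(1, round(len(text) / chars_per_token))

def fits_in_context(text: str, reserved_for_output: int = 16_000) -> bool:
    """Leave headroom for the response (16,000 tokens max here)."""
    return estimate_tokens(text) + reserved_for_output <= CONTEXT_WINDOW

doc = "word " * 100_000           # ~500,000 characters
print(estimate_tokens(doc))       # ~125,000 estimated tokens
print(fits_in_context(doc))       # False: too close to the 128k limit
```

In practice you would tokenize with the model's tokenizer before deciding whether to chunk a document; the heuristic above only gives a quick first-pass estimate.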

One of the defining characteristics of Mistral Large 2 is that it is designed to run on a single node despite its large parameter count, enabling high-throughput deployment without multi-node infrastructure. It also supports over 80 programming languages, including Python, Java, C, C++, JavaScript, and Bash, making it applicable to code generation and analysis tasks. These properties make it a practical choice for multilingual applications, long-context document workflows, and coding assistants.

What Mistral Large 2 supports

Long Context Window

Processes up to 128,000 tokens in a single request, enabling analysis of lengthy documents, codebases, or extended conversations without truncation.

Multilingual Text Generation

Generates and understands text in dozens of languages including French, German, Spanish, Arabic, Hindi, Chinese, Japanese, and Korean.

Code Generation

Supports code generation and comprehension across 80+ programming languages, including Python, Java, C, C++, JavaScript, and Bash.

Single-Node Inference

Designed to run at large throughput on a single node despite having 123 billion parameters, reducing infrastructure complexity for deployment.

Instruction Following

Responds to complex, multi-step instructions in natural language, supporting task completion across writing, summarization, and question answering.

Function Calling

Supports function calling and tool use, allowing the model to interact with external APIs and structured workflows in agentic applications.
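As an illustration of what a function-calling request looks like, here is a sketch of the OpenAI-style tool schema that Mistral's chat API accepts. The `get_weather` tool and the `mistral-large-latest` model identifier are illustrative assumptions; the payload is only constructed locally, not sent anywhere.

```python
import json

# Sketch of a function-calling request body in the OpenAI-style tool
# format. get_weather is a hypothetical example tool; the payload is
# built here for illustration only and no API request is made.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

payload = {
    "model": "mistral-large-latest",  # assumed model identifier
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"}
    ],
    "tools": [weather_tool],
    "tool_choice": "auto",  # let the model decide whether to call a tool
}

print(json.dumps(payload, indent=2))
```

When the model decides to use a tool, its response contains the function name and JSON arguments; your application executes the function and returns the result in a follow-up message.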

Ready to build with Mistral Large 2?

Get Started Free

Benchmark scores

Scores represent accuracy — the percentage of questions answered correctly on each test.

Benchmark What it tests Score
MMLU-Pro Expert knowledge across 14 academic disciplines 69.7%
GPQA Diamond PhD-level science questions (biology, physics, chemistry) 48.6%
MATH-500 Undergraduate and competition-level math problems 73.6%
AIME 2024 Problems from the American Invitational Mathematics Examination 11.0%
LiveCodeBench Real-world coding tasks from recent competitions 29.3%
HLE (Humanity's Last Exam) Questions that challenge frontier models across many domains 4.0%
SciCode Scientific research coding and numerical methods 29.2%
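For quick comparisons, the accuracies in the table above can be captured in a small script. The numbers are taken directly from the table; the ranking logic is a trivial sketch.

```python
# Benchmark accuracies for Mistral Large 2 from the table above
# (percentage of questions answered correctly).
scores = {
    "MMLU-Pro": 69.7,
    "GPQA Diamond": 48.6,
    "MATH-500": 73.6,
    "AIME 2024": 11.0,
    "LiveCodeBench": 29.3,
    "HLE": 4.0,
    "SciCode": 29.2,
}

best = max(scores, key=scores.get)      # strongest benchmark
hardest = min(scores, key=scores.get)   # weakest benchmark
print(best, scores[best])               # MATH-500 73.6
print(hardest, scores[hardest])         # HLE 4.0
```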

Common questions about Mistral Large 2

What is the context window size for Mistral Large 2?

Mistral Large 2 has a context window of 128,000 tokens, allowing it to process long documents or extended conversations in a single request.

How many parameters does Mistral Large 2 have?

Mistral Large 2 has 123 billion parameters. It is designed to run on a single node at high throughput despite this scale.

What languages does Mistral Large 2 support?

The model supports dozens of natural languages including French, German, Spanish, Italian, Portuguese, Arabic, Hindi, Russian, Chinese, Japanese, and Korean, as well as 80+ programming languages such as Python, Java, C, C++, JavaScript, and Bash.

What is the knowledge cutoff date for Mistral Large 2?

A specific training cutoff date is not listed in the available metadata for Mistral Large 2. For the most accurate information, consult Mistral's official documentation.

What types of tasks is Mistral Large 2 best suited for?

Based on its design, Mistral Large 2 is well suited for long-context document processing, multilingual text generation, code generation across 80+ languages, and single-node deployments requiring high throughput.

What people think about Mistral Large 2

Available community threads do not contain discussions specific to Mistral Large 2; they focus on other models such as Llama 4, GPT-5.2, and Google Gemini.

As a result, no community sentiment or use-case patterns specific to Mistral Large 2 can be reported here.


Parameters & options

Max Temperature 1.0
Max Response Size 16,000 tokens
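A minimal sketch of how a client might validate request parameters against the limits listed above (temperature capped at 1 and responses capped at 16,000 tokens). The clamping helper is illustrative, not part of any official SDK.

```python
# Clamp request parameters to the limits listed above:
# temperature in [0, 1] and responses capped at 16,000 tokens.
MAX_TEMPERATURE = 1.0
MAX_RESPONSE_TOKENS = 16_000

def clamp_params(temperature: float, max_tokens: int) -> dict:
    """Return request parameters clamped to this deployment's limits."""
    return {
        "temperature": min(max(temperature, 0.0), MAX_TEMPERATURE),
        "max_tokens": min(max(max_tokens, 1), MAX_RESPONSE_TOKENS),
    }

print(clamp_params(1.4, 32_000))  # {'temperature': 1.0, 'max_tokens': 16000}
```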

Start building with Mistral Large 2

No API keys required. Create AI-powered workflows with Mistral Large 2 in minutes — free.