This will help you get started with Mistral chat models. For detailed documentation of all ChatMistralAI features and configurations, head to the API reference. The ChatMistralAI class is built on top of the Mistral API. For a list of all the models supported by Mistral, check out this page.
Overview
Integration details
| Class | Package | Serializable | JS support | Downloads | Version |
|---|---|---|---|---|---|
| ChatMistralAI | langchain-mistralai | beta | ✅ | | |
Model features
| Tool calling | Structured output | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
|---|---|---|---|---|---|---|---|---|
| ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ | ❌ |
Setup
To access ChatMistralAI models you’ll need to create a Mistral account, get an API key, and install the langchain-mistralai integration package.
Credentials
A valid API key is needed to communicate with the API. Once you’ve done this, set the MISTRAL_API_KEY environment variable:
Installation
The LangChain Mistral integration lives in the langchain-mistralai package:
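A typical install command, assuming you are using pip (use the equivalent for your package manager):

```shell
pip install -U langchain-mistralai
```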
Instantiation
Now we can instantiate our model object and generate chat completions:
Invocation
API reference
Head to the API reference for detailed documentation of all attributes and methods.

