# Mistral

## Features 🔥
- Fully generated C# SDK based on the official Mistral OpenAPI specification using AutoSDK
- Same-day updates to support new features
- Updated and supported automatically if there are no breaking changes
- All modern .NET features - nullability, trimming, NativeAOT, etc.
- Supports .NET Framework / .NET Standard 2.0
- Microsoft.Extensions.AI `IChatClient` support
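
To get started, install the package from NuGet. The package id below is an assumption based on the repository name; verify the exact id on nuget.org:

```shell
# Add the Mistral SDK to your project (package id assumed; check nuget.org)
dotnet add package Mistral
```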
## Usage

```csharp
using Mistral;

using var client = new MistralClient(apiKey);

ChatCompletionResponse response = await client.Agents.AgentsCompletionAsync(
    agentId: "Test",
    messages: new List<OneOf<UserMessage, AssistantMessage, ToolMessage>>
    {
        new UserMessage
        {
            Content = "Hello",
        },
    });
```
## Microsoft.Extensions.AI

The SDK implements `IChatClient` for seamless integration with the .NET AI ecosystem:

```csharp
using Mistral;
using Meai = Microsoft.Extensions.AI;

Meai.IChatClient chatClient = new MistralClient(apiKey);

var response = await chatClient.GetResponseAsync(
    [new Meai.ChatMessage(Meai.ChatRole.User, "Hello!")],
    new Meai.ChatOptions { ModelId = "mistral-large-latest" });

Console.WriteLine(response.Text);
```

Note: Use the `Meai` alias because the Mistral SDK has its own generated `IChatClient` interface.
### Chat Client Five Random Words Streaming

```csharp
using var client = new MistralClient(apiKey);
Meai.IChatClient chatClient = client;

var updates = chatClient.GetStreamingResponseAsync(
    [
        new Meai.ChatMessage(Meai.ChatRole.User, "Generate 5 random words.")
    ],
    new Meai.ChatOptions
    {
        ModelId = "mistral-small-latest",
    });

var deltas = new List<string>();
await foreach (var update in updates)
{
    if (!string.IsNullOrWhiteSpace(update.Text))
    {
        deltas.Add(update.Text);
    }
}
```
### Chat Client Five Random Words

```csharp
using var client = new MistralClient(apiKey);
Meai.IChatClient chatClient = client;

var response = await chatClient.GetResponseAsync(
    [
        new Meai.ChatMessage(Meai.ChatRole.User, "Generate 5 random words.")
    ],
    new Meai.ChatOptions
    {
        ModelId = "mistral-small-latest",
    });
```
### Chat Client Get Service Returns Metadata

```csharp
// CreateTestClient() is a helper from the test suite; any MistralClient works here.
using var client = CreateTestClient();
Meai.IChatClient chatClient = client;

var metadata = Meai.ChatClientExtensions.GetService<Meai.ChatClientMetadata>(chatClient);
```
### Chat Client Get Service Returns Null For Unknown Key

```csharp
using var client = CreateTestClient();
Meai.IChatClient chatClient = client;

var result = Meai.ChatClientExtensions.GetService<Meai.ChatClientMetadata>(chatClient, serviceKey: "unknown");
```
### Chat Client Get Service Returns Self

```csharp
using var client = CreateTestClient();
Meai.IChatClient chatClient = client;

var self = Meai.ChatClientExtensions.GetService<MistralClient>(chatClient);
```
### Chat Client Tool Calling

```csharp
using var client = new MistralClient(apiKey);

var getWeatherTool = Meai.AIFunctionFactory.Create(
    (string location) => $"The weather in {location} is 72°F and sunny.",
    name: "get_weather",
    description: "Gets the current weather for a given location.");

Meai.IChatClient chatClient = client;

var messages = new List<Meai.ChatMessage>
{
    new(Meai.ChatRole.User, "What is the weather in Paris?"),
};

// First turn: model requests a tool call
var response = await chatClient.GetResponseAsync(
    messages,
    new Meai.ChatOptions
    {
        ModelId = "mistral-small-latest",
        Tools = [getWeatherTool],
    });

var functionCall = response.Messages
    .SelectMany(m => m.Contents)
    .OfType<Meai.FunctionCallContent>()
    .FirstOrDefault();

// Add the assistant message with the function call, then the tool result
messages.AddRange(response.Messages);
var toolResult = await getWeatherTool.InvokeAsync(
    functionCall!.Arguments is { } args ? new Meai.AIFunctionArguments(args) : null);
messages.Add(new Meai.ChatMessage(Meai.ChatRole.Tool,
[
    new Meai.FunctionResultContent(functionCall.CallId, toolResult),
]));

// Second turn: model should produce a final text response
var finalResponse = await chatClient.GetResponseAsync(
    messages,
    new Meai.ChatOptions
    {
        ModelId = "mistral-small-latest",
        Tools = [getWeatherTool],
    });
```
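
The manual two-turn loop above can be automated with Microsoft.Extensions.AI's function-invocation middleware, which invokes matching tools and feeds their results back to the model for you. A minimal sketch, assuming the `Microsoft.Extensions.AI` package is referenced and `apiKey` holds a valid key:

```csharp
using Mistral;
using Meai = Microsoft.Extensions.AI;

using var client = new MistralClient(apiKey);

// Wrap the client so FunctionCallContent is resolved automatically.
Meai.IChatClient chatClient = new Meai.ChatClientBuilder(client)
    .UseFunctionInvocation()
    .Build();

var getWeatherTool = Meai.AIFunctionFactory.Create(
    (string location) => $"The weather in {location} is 72°F and sunny.",
    name: "get_weather",
    description: "Gets the current weather for a given location.");

// Single call: the middleware runs the tool and returns the final text.
var response = await chatClient.GetResponseAsync(
    [new Meai.ChatMessage(Meai.ChatRole.User, "What is the weather in Paris?")],
    new Meai.ChatOptions
    {
        ModelId = "mistral-small-latest",
        Tools = [getWeatherTool],
    });

Console.WriteLine(response.Text);
```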
### Chat Client Tool Call

```csharp
using var client = new MistralClient(apiKey);

var getWeatherTool = Meai.AIFunctionFactory.Create(
    (string location) => $"The weather in {location} is 72°F and sunny.",
    name: "get_weather",
    description: "Gets the current weather for a given location.");

Meai.IChatClient chatClient = client;

var response = await chatClient.GetResponseAsync(
    [
        new Meai.ChatMessage(Meai.ChatRole.User, "What is the weather in Paris?")
    ],
    new Meai.ChatOptions
    {
        ModelId = "mistral-small-latest",
        Tools = [getWeatherTool],
    });

// The model's requested tool call, if any
var functionCall = response.Messages
    .SelectMany(m => m.Contents)
    .OfType<Meai.FunctionCallContent>()
    .FirstOrDefault();
```
### Chat Client Tool Calling Streaming

```csharp
using var client = new MistralClient(apiKey);

var getWeatherTool = Meai.AIFunctionFactory.Create(
    (string location) => $"The weather in {location} is 72°F and sunny.",
    name: "get_weather",
    description: "Gets the current weather for a given location.");

Meai.IChatClient chatClient = client;

var updates = chatClient.GetStreamingResponseAsync(
    [
        new Meai.ChatMessage(Meai.ChatRole.User, "What is the weather in Paris?")
    ],
    new Meai.ChatOptions
    {
        ModelId = "mistral-small-latest",
        Tools = [getWeatherTool],
    });

// Collect tool-call requests emitted across the stream
var functionCalls = new List<Meai.FunctionCallContent>();
await foreach (var update in updates)
{
    functionCalls.AddRange(update.Contents.OfType<Meai.FunctionCallContent>());
}
```
## Test

```csharp
using var client = new MistralClient(apiKey);

ChatCompletionResponse response = await client.Chat.ChatCompletionAsync(
    model: "mistral-small-latest",
    messages: new List<MessagesItem>
    {
        new UserMessage
        {
            Content = "Hello",
        },
    });
```
## Support

- Priority place for bugs: https://github.com/tryAGI/Mistral/issues
- Priority place for ideas and general questions: https://github.com/tryAGI/Mistral/discussions
- Discord: https://discord.gg/Ca2xhfBf3v

## Acknowledgments

This project is supported by JetBrains through the Open Source Support Program.