- Introduction
- Quickstart
- Documentation Overview
- TIPS for using the tutorial effectively
- DelphiMistralAI functional coverage
- Conversations vs. Chat Completions
- Tips and tricks
- Contributing
- License
Built with Delphi 12 Community Edition (v12.1 Patch 1)
The wrapper itself is MIT-licensed.
You can compile and test it free of charge with Delphi CE; any recent commercial Delphi edition works as well.
Core capabilities
- Unified access to text, vision, and audio endpoints
- Agentic workflows via the `v1/conversations` endpoint, with built-in tools: Web search, Code Interpreter, Image generation, and Document library
- Support for state-of-the-art models, including voxtral, devstral, mistral-ocr, and the reasoning-centric magistral
Developer tooling
- Ready-made `Sync`, `Async`, and `Await` code snippets (TutorialHub compatible)
- Batch processing, function calling, file management, and content moderation out of the box
- Built-in DUnit test helpers and a modular JSON configuration for streamlined setup
- Mock-friendly design: the HTTP layer is injected via dependency injection, so you can swap in stubs or fakes for testing
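To illustrate that mock-friendly design, the sketch below shows the general shape of substituting a stub transport in a test. The `IHttpClientAPI` interface name and its `Post` signature are assumptions made for this sketch, not the wrapper's confirmed abstraction; check the library source for the actual injection point it exposes.

```pascal
//Hypothetical sketch: the interface and method names below are
//assumptions, not the wrapper's confirmed API.
type
  TStubHttpClient = class(TInterfacedObject, IHttpClientAPI)
  public
    //Return a canned JSON payload instead of touching the network.
    function Post(const Url, Body: string): string;
  end;

function TStubHttpClient.Post(const Url, Body: string): string;
begin
  Result := '{"choices":[{"message":{"content":"stubbed"}}]}';
end;
```

A unit test can then hand this stub to the client in place of the real HTTP layer, keeping the test fast, deterministic, and offline.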
Important
This is an unofficial library; MistralAI does not provide an official library for Delphi.
This repository contains a Delphi implementation of the MistralAI public API.
To get started, head over to https://console.mistral.ai and either create a new Mistral account or sign in if you already have one. Once you’re in, open your Organization settings at https://admin.mistral.ai – this is where you’ll centralize billing and access control. In the Administration tab, locate the Billing section and enter your payment details to activate billing. From here you can manage all of your Workspaces as well as the broader Organization.
To initialize the API instance, you need to obtain an API key from MistralAI.
Once you have a key, you can initialize the `IMistralAI` interface, which is the entry point to the API.
Note
```pascal
//uses MistralAI, MistralAI.Types;

//Declare
//  Client: IMistralAI;
//  ClientCoding: IMistralAI;

Client := TMistralAIFactory.CreateInstance(api_key);
ClientCoding := TMistralAIFactory.CreateInstance(Codestral_api_key);
```
To streamline the use of the API wrapper, the process for declaring units has been simplified. Regardless of the methods being utilized, you only need to reference the two core units `MistralAI` and `MistralAI.Types`.
Tip
To use the examples in this tutorial effectively, particularly when working with asynchronous methods, it is recommended to give the client interfaces the broadest possible scope. Ideally, declare these clients as form fields and initialize them in the application's `OnCreate` method.
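In practice, that tip amounts to declaring the client as a form field and creating it once at startup. A minimal sketch, assuming a VCL form named `TForm1` and using only the factory shown above:

```pascal
//uses MistralAI, MistralAI.Types;

type
  TForm1 = class(TForm)
    procedure FormCreate(Sender: TObject);
  private
    //Broad-scope client: lives as long as the form does,
    //so asynchronous callbacks can safely refer to it.
    FClient: IMistralAI;
  end;

procedure TForm1.FormCreate(Sender: TObject);
begin
  //In production, read the key from secure storage or an
  //environment variable rather than a string literal.
  FClient := TMistralAIFactory.CreateInstance(GetEnvironmentVariable('MISTRAL_API_KEY'));
end;
```

With this layout, every example in the tutorial can use `FClient` without worrying about the client being freed while an asynchronous request is still in flight.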
Mistral AI’s API lets you plug the models into your apps and production pipelines in just a few lines of code. It’s currently accessible via La Plateforme; just make sure you’ve activated billing on your account so your API keys will work. Within moments of enabling payments, you’ll be able to call the chat endpoint:
```pascal
//uses MistralAI, MistralAI.Types;

var API_Key := 'MISTRAL_API_KEY';
var MyClient := TMistralAIFactory.CreateInstance(API_Key);

//Synchronous example
var Chat := MyClient.Chat.Create(
  procedure (Params: TChatParams)
  begin
    Params
      .Model('mistral-tiny')
      .Messages([Payload.User('Explain to me what joual is for Quebecers.')])
      .MaxTokens(1024);
  end);
try
  for var Item in Chat.Choices do
    Memo1.Text := Item.Message.Content[0].Text;
finally
  Chat.Free;
end;
```
```pascal
var MyClient: IMistralAI;

procedure TForm1.Test;
begin
  var API_Key := 'MISTRAL_API_KEY';
  MyClient := TMistralAIFactory.CreateInstance(API_Key);

  //Asynchronous example
  MyClient.Chat.AsyncCreate(
    procedure (Params: TChatParams)
    begin
      Params
        .Model('mistral-tiny')
        .Messages([Payload.User('Explain to me what joual is for Quebecers.')])
        .MaxTokens(1024);
    end,
    function : TAsynChat
    begin
      Result.OnStart :=
        procedure (Sender: TObject)
        begin
          Memo1.Lines.Text := 'Please wait...';
        end;
      Result.OnSuccess :=
        procedure (Sender: TObject; Value: TChat)
        begin
          for var Item in Value.Choices do
            Memo1.Lines.Text := Item.Message.Content[0].Text;
        end;
      Result.OnError :=
        procedure (Sender: TObject; Error: string)
        begin
          Memo1.Lines.Text := Error;
        end;
    end);
end;
```
Comprehensive Project Documentation Reference
- Standard Completions
- Specialized Completions
- Experimental Conversations
- Files and Libraries
- Vector Representations
To streamline the implementation of the code examples provided in this tutorial, two support units are included in the source code: `MistralAI.Tutorial.VCL` and `MistralAI.Tutorial.FMX`. Depending on the platform selected for testing the examples, initialize either the `TVCLTutorialHub` or the `TFMXTutorialHub` class within the application's OnCreate event, as illustrated below:
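As an illustration, the VCL initialization could look like the following sketch. The constructor arguments shown are assumptions: the actual signature is defined in `MistralAI.Tutorial.VCL`, so check that unit for the exact parameters.

```pascal
//uses MistralAI.Tutorial.VCL;

procedure TForm1.FormCreate(Sender: TObject);
begin
  //Hypothetical arguments: the hub typically needs the controls it
  //writes to; consult the tutorial unit for the real constructor.
  TutorialHub := TVCLTutorialHub.Create(Memo1, Button1);
end;
```

The FMX variant is symmetrical: create a `TFMXTutorialHub` in the same event with the corresponding FMX controls.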
Important
In this repository, the `sample` folder contains two ZIP archives, each holding a template that makes it easy to test all the code examples provided in this tutorial.
Extract the `VCL` or `FMX` version depending on your target platform.
Next, add the path to the DelphiMistralAI library in your project’s options, then copy and paste the code examples for immediate execution.
These two archives have been designed to fully leverage the TutorialHub middleware and enable rapid upskilling with DelphiMistralAI.
- `VCL` support with TutorialHub: TestMistralAI_VCL.zip
- `FMX` support with TutorialHub: TestMistralAI_FMX.zip
This project, built with DelphiGenAI, allows you to consult the MistralAI documentation and code side by side, in order to streamline and accelerate your upskilling.
The table below succinctly summarizes all MistralAI endpoints supported by DelphiMistralAI.
| Description | Endpoint | Supported |
|---|---|---|
| Chat Completion API | v1/chat/completions | ● |
| Fill-in-the-middle API | v1/fim/completions | ● |
| Agents completions API | v1/agents/completions | ● |
| Embeddings API | v1/embeddings | ● |
| Classifiers API | v1/moderations | ● |
| | v1/chat/moderations | ● |
| | v1/classifications | |
| | v1/chat/classifications | |
| Files API | v1/files | ● |
| Fine-tuning API | v1/fine_tuning/jobs | ● |
| Model Management API | v1/models | ● |
| | v1/fine_tuning/models | ● |
| Batch API | v1/batch/jobs | ● |
| OCR API | v1/ocr | ● |
| (beta) Agents API | v1/agents | ● |
| (beta) Conversations API | v1/conversations | ● |
| (beta) Libraries API - create and manage libraries | v1/libraries | ● |
| (beta) Libraries API - manage documents in a library | v1/libraries/{library_id}/documents | ● |
| (beta) Libraries API - manage access to a library | v1/libraries/{library_id}/share | ● |
| Audio transcriptions | v1/audio/transcriptions | ● |
The `v1/conversations` API is the new core API, designed as an agentic primitive that combines the simplicity of chat completions with the power of action execution. It natively includes several built-in tools:
- Reasoning
- Web search
- Code interpreter
- Image generation
- Document library
With these integrated capabilities, you can build more autonomous, agent‑oriented applications that not only generate text but also interact with their environment.
The `v1/conversations` endpoint is intended to gradually replace `v1/chat/completions`, as it embodies a synthesis of current best practices in AI, especially for those looking to adopt an agentic approach.
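For illustration only, an agentic request against `v1/conversations` might be sketched as below, following the fluent-parameter style used by the chat examples earlier. The identifiers `Conversations.Create`, `TConversationsParams`, `Inputs`, and `Tools` are assumptions made for this sketch rather than the wrapper's confirmed surface; consult the Conversations documentation for the actual signatures.

```pascal
//uses MistralAI, MistralAI.Types;

//Hypothetical sketch: method and type names are assumptions,
//not the wrapper's confirmed API.
var Conversation := MyClient.Conversations.Create(
  procedure (Params: TConversationsParams)
  begin
    Params
      .Model('mistral-medium-latest')
      .Inputs('Summarize the latest news about Delphi.')
      //Built-in tools such as web search are enabled per request.
      .Tools([web_search]);
  end);
try
  //Inspect the returned outputs (text plus any tool results) here.
finally
  Conversation.Free;
end;
```

The point of the sketch is the shape of the call: one request that both generates text and lets the model invoke a built-in tool, instead of orchestrating those steps yourself around `v1/chat/completions`.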
To help you get up to speed on both endpoints, the two following documents provide detailed documentation, complete with numerous request examples and use cases:
Note
If you're a new user, we recommend starting with the Conversations API.
| Capabilities | Chat Completions API | Conversations API | Agents & connectors API |
|---|---|---|---|
| Text generation | ● | ● | |
| Vision | ● | | |
| Audio | ● | | |
| Function calling | ● | ● | |
| Structured Outputs | ● | ● | |
| Reasoning | ● | ● | ● |
| Web search | ● | ● | |
| Image generation | ● | ● | |
| Document library | ● | ● | |
| Code interpreter | ● | ● | |
Important
Note: Agents and connectors work in conjunction with `v1/conversations`, meaning they are simply tools invoked within the context of those conversations.
Starting from version 1.3.0 of DelphiMistralAI, the `MistralAI.Monitoring` unit is responsible for monitoring ongoing HTTP requests.
The `Monitoring` interface is accessible by including the `MistralAI.Monitoring` unit in the `uses` clause.
Alternatively, you can access it via the `HttpMonitoring` function, declared in the `MistralAI` unit.
Usage Example
```pascal
//uses MistralAI;

procedure TForm1.FormCloseQuery(Sender: TObject; var CanClose: Boolean);
begin
  CanClose := not HttpMonitoring.IsBusy;
  if not CanClose then
    MessageDlg(
      'Requests are still in progress. Please wait for them to complete before closing the application.',
      TMsgDlgType.mtInformation, [TMsgDlgBtn.mbOK], 0);
end;
```
Pull requests are welcome. If you're planning to make a major change, please open an issue first to discuss your proposed changes.
This project is licensed under the MIT License.