The MistralAI API wrapper for Delphi uses the various advanced models developed by Mistral to provide robust capabilities for chat interactions, string embeddings, precise code generation with Codestral, batch processing, and moderation.

Delphi MistralAI API


Delphi async/await supported · Available on GetIt


Introduction

Built with Delphi 12 Community Edition (v12.1 Patch 1)
The wrapper itself is MIT-licensed.
You can compile and test it free of charge with Delphi CE; any recent commercial Delphi edition works as well.


Core capabilities

  • Unified access to text, vision and audio endpoints
  • Agentic workflows via the v1/conversations endpoint, with built-in tools: web search, code interpreter, image generation, and document library
  • Support for state-of-the-art models, including voxtral, devstral, mistral-ocr, and the reasoning-centric magistral

Developer tooling

  • Ready-made Sync, Async, and Await code snippets (TutorialHUB compatible)
  • Batch processing, function calling, file management, and content moderation out of the box
  • Built-in DUnit test helpers and a modular JSON configuration for streamlined setup
  • Mock-friendly design: the HTTP layer is injected via dependency injection, so you can swap in stubs or fakes for testing

Important

This is an unofficial library. MistralAI does not provide an official library for Delphi; this repository contains a Delphi implementation built on top of the MistralAI public API.



Quickstart


Account setup

To get started, head over to https://console.mistral.ai and either create a new Mistral account or sign in if you already have one. Once you’re in, open your Organization settings at https://admin.mistral.ai – this is where you’ll centralize billing and access control. In the Administration tab, locate the Billing section and enter your payment details to activate billing. From here you can manage all of your Workspaces as well as the broader Organization.


Obtain an API key

To initialize the API instance, you need to obtain an API key from MistralAI.

Once you have a key, you can initialize the IMistralAI interface, which is the entry point to the API.

Note

//uses MistralAI, MistralAI.Types;

//Declare
//  Client: IMistralAI;
//  ClientCoding: IMistralAI;

 Client := TMistralAIFactory.CreateInstance(api_key);
 ClientCoding := TMistralAIFactory.CreateInstance(Codestral_api_key);

Obtain the Codestral_api_key

To streamline the use of the API wrapper, the process for declaring units has been simplified: regardless of the methods you use, you only need to reference two core units, MistralAI and MistralAI.Types.


Tip

To use the examples in this tutorial effectively, particularly those relying on asynchronous methods, it is recommended to declare the client interfaces with the broadest possible scope and to initialize them in the application's OnCreate event, as sketched below.
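
A minimal sketch of that setup is shown here. It reuses the MyClient variable that appears in the asynchronous example further down; reading the key from an environment variable is purely illustrative, so load your key however your project requires.

//uses System.SysUtils, MistralAI;

var
  MyClient: IMistralAI;   //broad scope: the interface reference stays alive for asynchronous callbacks

procedure TForm1.FormCreate(Sender: TObject);
begin
  //Initialize once at startup; asynchronous calls need the client to outlive
  //the method that starts them, which a form-lifetime reference guarantees.
  MyClient := TMistralAIFactory.CreateInstance(GetEnvironmentVariable('MISTRAL_API_KEY'));
end;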


Getting started with Mistral AI API

Mistral AI’s API lets you plug the models into your applications and production pipelines in just a few lines of code. It is currently accessible via La Plateforme; just make sure billing is activated on your account so that your API keys will work. Within moments of enabling payments, you will be able to call the chat endpoint:


Synchronous code example

//uses MistralAI, MistralAI.Types;

  var API_Key := 'MISTRAL_API_KEY';
  var MyClient := TMistralAIFactory.CreateInstance(API_KEY);

  //Synchronous example
  var Chat := MyClient.Chat.Create(
    procedure (Params: TChatParams)
    begin
      Params
        .Model('mistral-tiny')
        .Messages([Payload.User('Explain to me what joual is for Quebecers.')])
        .MaxTokens(1024);
    end);
  try
    for var Item in Chat.Choices do
      Memo1.Text := Item.Message.Content[0].Text;
  finally
    Chat.Free;
  end;


Asynchronous code example

var MyClient: IMistralAI;

procedure TForm1.Test;
begin
  var API_Key := 'MISTRAL_API_KEY';
  MyClient := TMistralAIFactory.CreateInstance(API_KEY);

  //Asynchronous example
  MyClient.Chat.AsyncCreate(
    procedure (Params: TChatParams)
    begin
      Params
        .Model('mistral-tiny')
        .Messages([Payload.User('Explain to me what joual is for Quebecers.')])
        .MaxTokens(1024);
    end,
    function : TAsynChat
    begin
      Result.OnStart :=
        procedure (Sender: TObject)
        begin
          Memo1.Lines.Text := 'Please wait...';
        end;

      Result.OnSuccess :=
        procedure (Sender: TObject; Value: TChat)
        begin
          for var Item in Value.Choices do
            Memo1.Lines.Text := Item.Message.Content[0].Text;
        end;

      Result.OnError :=
        procedure (Sender: TObject; Error: string)
        begin
          Memo1.Lines.Text := Error;
        end;
    end);
end;


Documentation Overview

Comprehensive Project Documentation Reference

1. Model Discovery & Metadata

2. Generation / Completions

3. Orchestration & Agents

4. Data & Persistence

5. Safety & Filtering

6. Customization / Tuning

7. Multimodal / Non-text Input



TIPS for using the tutorial effectively


Strategies for quickly using the code examples

To streamline the implementation of the code examples provided in this tutorial, two support units are included in the source code: MistralAI.Tutorial.VCL and MistralAI.Tutorial.FMX. Depending on the platform you choose for testing, initialize either the TVCLTutorialHub or the TFMXTutorialHub class in the application's OnCreate event, as illustrated below:
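
The snippet below is a minimal sketch of that initialization for the VCL case. The constructor arguments shown (the controls the examples write to) are hypothetical; check MistralAI.Tutorial.VCL, or MistralAI.Tutorial.FMX for FireMonkey, for the exact signature.

//uses MistralAI.Tutorial.VCL;

procedure TForm1.FormCreate(Sender: TObject);
begin
  //Hypothetical argument list: the real constructor may expect a different
  //set of controls; see the MistralAI.Tutorial.VCL unit for the exact signature.
  TutorialHub := TVCLTutorialHub.Create(Memo1, Image1, Button1);
end;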


Important

In the sample folder of this repository you will find two ZIP archives, each containing a template that makes it easy to test all the code examples provided in this tutorial. Extract the VCL or FMX version depending on your target platform for testing. Next, add the path to the DelphiMistralAI library in your project’s options, then copy and paste the code examples for immediate execution.

These two archives have been designed to fully leverage the TutorialHub middleware and enable rapid upskilling with DelphiMistralAI.

  • VCL support with TutorialHUB: TestMistralAI_VCL.zip

  • FMX support with TutorialHUB: TestMistralAI_FMX.zip


This project, built with DelphiGenAI, allows you to consult the MistralAI documentation and code in order to streamline and accelerate your upskilling.

Preview



DelphiMistralAI functional coverage

The list below summarizes all MistralAI endpoints supported by DelphiMistralAI, each entry giving the API area followed by its endpoint(s).

Chat Completion API: v1/chat/completions
Fill-in-the-middle API: v1/fim/completions
Agents completions API: v1/agents/completions
Embeddings API: v1/embeddings
Classifiers API: v1/moderations, v1/chat/moderations, v1/classifications, v1/chat/classifications
Files API: v1/files
Fine-tuning API: v1/fine_tuning/jobs
Model Management API: v1/models, v1/fine_tuning/models
Batch API: v1/batch/jobs
OCR API: v1/ocr
(beta) Agents API: v1/agents
(beta) Conversations API: v1/conversations
(beta) Libraries API to create and manage libraries: v1/libraries
(beta) Libraries API - manage documents in a library: v1/libraries/{library_id}/documents
(beta) Libraries API - manage access to a library: v1/libraries/{library_id}/share
Audio transcriptions: v1/audio/transcriptions
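
Most of these endpoints follow the same parameter-builder pattern as the chat example above. Purely as an illustration, the sketch below shows what an embeddings request might look like; the method and type names used here (Embeddings.Create, TEmbeddingParams, Input, Data, Embedding) are assumptions and should be checked against the wrapper's source.

//uses System.SysUtils, MistralAI, MistralAI.Types;

  //Assumed names for illustration: verify Embeddings.Create and TEmbeddingParams
  //against the MistralAI unit before relying on this snippet.
  var Embeddings := MyClient.Embeddings.Create(
    procedure (Params: TEmbeddingParams)
    begin
      Params
        .Model('mistral-embed')
        .Input(['The quick brown fox', 'jumps over the lazy dog']);
    end);
  try
    for var Item in Embeddings.Data do
      Memo1.Lines.Add(Format('Vector of %d dimensions', [Length(Item.Embedding)]));
  finally
    Embeddings.Free;
  end;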


Conversations vs. Chat Completions

The v1/conversations API is the new core API, designed as an agentic primitive that combines the simplicity of chat completions with the power of action execution. It natively includes several built‑in tools:

  • Reasoning
  • Web search
  • Code interpreter
  • Image generation
  • Document library

With these integrated capabilities, you can build more autonomous, agent‑oriented applications that not only generate text but also interact with their environment.

The v1/conversations endpoint is intended to gradually replace v1/chat/completions, as it embodies a synthesis of current best practices in AI—especially for those looking to adopt an agentic approach.

To help you get up to speed on both endpoints, the two following documents provide detailed documentation, complete with numerous request examples and use cases:

Note

If you're a new user, we recommend starting with the Conversations API.
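
To make the pattern concrete, here is a purely illustrative sketch of a conversations request. It mirrors the Chat.Create builder style shown earlier; the interface, parameter, and result names (Conversations.Create, TConversationsParams, Inputs, Text) are assumptions, so refer to the Conversations documentation in this repository for the actual API.

//uses MistralAI, MistralAI.Types;

  //Illustrative only: names below are assumptions modeled on the chat example.
  var Conversation := MyClient.Conversations.Create(
    procedure (Params: TConversationsParams)
    begin
      Params
        .Model('mistral-medium-latest')
        .Inputs('Give me a two-sentence overview of the Voxtral models.');
      //Built-in tools (web search, code interpreter, image generation,
      //document library) are enabled through additional parameters.
    end);
  try
    Memo1.Lines.Add(Conversation.Text);  //hypothetical accessor for the output text
  finally
    Conversation.Free;
  end;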


Functional differences between the three endpoints

The Chat Completions API, the Conversations API, and the Agents & connectors API are compared on the following capabilities: text generation, vision, audio, function calling, structured outputs, reasoning, web search, image generation, document library, and code interpreter. Refer to the endpoint documentation referenced above for each API's per-capability support.

Important

Note: Agents and Connectors work in conjunction with v1/conversations, meaning they are simply tools invoked within the context of those conversations.



Tips and tricks


How to prevent an error when closing an application while requests are still in progress?

Starting from version 1.3.0 of DelphiMistralAI, the MistralAI.Monitoring unit is responsible for monitoring ongoing HTTP requests.

The Monitoring interface is accessible by including the MistralAI.Monitoring unit in the uses clause.
Alternatively, you can access it via the HttpMonitoring function, declared in the MistralAI unit.

Usage Example

//uses MistralAI;

procedure TForm1.FormCloseQuery(Sender: TObject; var CanClose: Boolean);
begin
  CanClose := not HttpMonitoring.IsBusy;
  if not CanClose then
    MessageDlg(
      'Requests are still in progress. Please wait for them to complete before closing the application.',
      TMsgDlgType.mtInformation, [TMsgDlgBtn.mbOK], 0);
end;

Contributing

Pull requests are welcome. If you're planning to make a major change, please open an issue first to discuss your proposed changes.



License

This project is licensed under the MIT License.

