azureaifoundry

package module
v1.1.4
Published: Dec 15, 2025 License: Apache-2.0 Imports: 16 Imported by: 0

README

Azure AI Foundry Plugin for Genkit Go

A comprehensive Azure AI Foundry plugin for Genkit Go that provides text generation, chat, embeddings, image generation, text-to-speech, and speech-to-text capabilities using Azure OpenAI and other models available through Azure AI Foundry.

Features

  • Text Generation: Support for GPT-5, GPT-5 mini, GPT-4o, GPT-4o mini, GPT-4 Turbo, GPT-4, and GPT-3.5 Turbo models
  • Embeddings: Support for text-embedding-ada-002, text-embedding-3-small, and text-embedding-3-large models
  • Image Generation: Support for creating images from text prompts
  • Text-to-Speech: Convert text to natural-sounding speech with multiple voices
  • Speech-to-Text: Transcribe audio to text, with subtitle (SRT/VTT) support
  • Streaming: Full streaming support for real-time responses
  • Tool Calling: Complete function calling capabilities for GPT-5, GPT-4, and GPT-3.5 Turbo models
  • Multimodal Support: Support for text + image inputs (vision models like GPT-5, GPT-4o and GPT-4 Turbo)
  • Multi-turn Conversations: Full support for chat history and context management
  • Type Safety: Robust type conversion and schema validation
  • Flexible Authentication: Support for API keys, Azure Default Credential, and custom token credentials

Supported Models

Text Generation Models (with Tool Calling Support)
  • GPT-5: Latest advanced model (check Azure for availability)
  • GPT-5 mini: Smaller, faster version of GPT-5
  • GPT-4o: Multimodal model with vision capabilities
  • GPT-4o mini: Smaller, faster version of GPT-4o
  • GPT-4 Turbo: High-performance GPT-4 with vision support
  • GPT-4: Standard GPT-4 model
  • GPT-3.5 Turbo: Fast and cost-effective model

All GPT-5, GPT-4, and GPT-3.5 Turbo models support function calling (tools).

Installation

go get github.com/xavidop/genkit-azure-foundry-go

Quick Start

Initialize the Plugin
package main

import (
	"context"
	"log"
	"os"

	"github.com/firebase/genkit/go/ai"
	"github.com/firebase/genkit/go/genkit"
	azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)

func main() {
	ctx := context.Background()

	// Initialize Azure AI Foundry plugin
	azurePlugin := &azureaifoundry.AzureAIFoundry{
		Endpoint: os.Getenv("AZURE_OPENAI_ENDPOINT"),
		APIKey:   os.Getenv("AZURE_OPENAI_API_KEY"),
	}

	// Initialize Genkit
	g := genkit.Init(ctx,
		genkit.WithPlugins(azurePlugin),
		genkit.WithDefaultModel("azureaifoundry/gpt-5"),
	)

	// Optional: Define common models for easy access
	azureaifoundry.DefineCommonModels(azurePlugin, g)

	log.Println("Starting basic Azure AI Foundry example...")

	// Example: Generate text (basic usage)
	response, err := genkit.Generate(ctx, g,
		ai.WithPrompt("What are the key benefits of using Azure AI Foundry?"),
	)
	if err != nil {
		log.Printf("Error: %v", err)
	} else {
		log.Printf("Response: %s", response.Text())
	}
}
Define Models and Generate Text
package main

import (
	"context"
	"log"

	"github.com/firebase/genkit/go/ai"
	"github.com/firebase/genkit/go/genkit"
	azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)

func main() {
	ctx := context.Background()

	azurePlugin := &azureaifoundry.AzureAIFoundry{
		Endpoint: "https://your-resource.openai.azure.com/",
		APIKey:   "your-api-key",
	}

	g := genkit.Init(ctx,
		genkit.WithPlugins(azurePlugin),
	)

	// Define a GPT-5 model (use your deployment name)
	gpt5Model := azurePlugin.DefineModel(g, azureaifoundry.ModelDefinition{
		Name:          "gpt-5", // Your deployment name in Azure
		Type:          "chat",
		SupportsMedia: true,
	}, nil)

	// Generate text
	response, err := genkit.Generate(ctx, g,
		ai.WithModel(gpt5Model),
		ai.WithMessages(ai.NewUserMessage(
			ai.NewTextPart("Explain quantum computing in simple terms."),
		)),
	)

	if err != nil {
		log.Fatal(err)
	}

	log.Println(response.Text())
}

Configuration Options

The plugin supports various configuration options:

azurePlugin := &azureaifoundry.AzureAIFoundry{
	Endpoint:   "https://your-resource.openai.azure.com/",
	APIKey:     "your-api-key",              // Use API key
	// OR use an Azure credential (see Authentication Methods below)
	// Credential: credential, // e.g. from azidentity.NewDefaultAzureCredential(nil)
	APIVersion: "2024-02-15-preview",        // Optional
}
Available Configuration

| Option     | Type                   | Default                     | Description                               |
|------------|------------------------|-----------------------------|-------------------------------------------|
| Endpoint   | string                 | required                    | Azure OpenAI endpoint URL                 |
| APIKey     | string                 | ""                          | API key for authentication                |
| Credential | azcore.TokenCredential | nil                         | Azure credential (alternative to API key) |
| APIVersion | string                 | Latest (2024-12-01-preview) | API version to use                        |

Azure Setup and Authentication

Getting Your Endpoint and API Key
  1. Go to Azure Portal
  2. Navigate to your Azure OpenAI resource
  3. Go to "Keys and Endpoint" section
  4. Copy your endpoint URL and API key
Authentication Methods

The plugin supports multiple authentication methods to suit different deployment scenarios:

1. API Key Authentication (Quick Start)

Best for: Development, testing, and simple scenarios

export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"
export AZURE_OPENAI_API_KEY="your-api-key"
import (
	"os"
	azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)

azurePlugin := &azureaifoundry.AzureAIFoundry{
	Endpoint: os.Getenv("AZURE_OPENAI_ENDPOINT"),
	APIKey:   os.Getenv("AZURE_OPENAI_API_KEY"),
}

2. DefaultAzureCredential (Recommended for Production)

Best for: Production deployments, Azure-hosted applications

DefaultAzureCredential automatically tries multiple authentication methods in the following order:

  1. Environment variables (AZURE_CLIENT_ID, AZURE_CLIENT_SECRET, AZURE_TENANT_ID)
  2. Managed Identity (when deployed to Azure)
  3. Azure CLI credentials (for local development)
  4. Azure PowerShell credentials
  5. Interactive browser authentication
# Required environment variables
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"
export AZURE_TENANT_ID="your-tenant-id"

# Optional: For service principal authentication
export AZURE_CLIENT_ID="your-client-id"
export AZURE_CLIENT_SECRET="your-client-secret"
import (
	"fmt"
	"os"
	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
	azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)

func main() {
	endpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")
	tenantID := os.Getenv("AZURE_TENANT_ID")

	// Create DefaultAzureCredential
	credential, err := azidentity.NewDefaultAzureCredential(&azidentity.DefaultAzureCredentialOptions{
		TenantID: tenantID,
	})
	if err != nil {
		fmt.Fprintf(os.Stderr, "ERROR: %s\n", err)
		return
	}

	// Initialize plugin with credential
	azurePlugin := &azureaifoundry.AzureAIFoundry{
		Endpoint:   endpoint,
		Credential: credential,
	}

	// Use the plugin with Genkit...
}
3. Managed Identity (Azure Deployments)

Best for: Applications deployed to Azure (App Service, Container Apps, VMs, AKS)

When deployed to Azure, Managed Identity provides authentication without storing credentials:

import (
	"os"
	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
	azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)

func main() {
	endpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")

	// Use Managed Identity
	credential, err := azidentity.NewManagedIdentityCredential(nil)
	if err != nil {
		panic(err)
	}

	azurePlugin := &azureaifoundry.AzureAIFoundry{
		Endpoint:   endpoint,
		Credential: credential,
	}
}
4. Client Secret Credential (Service Principal)

Best for: CI/CD pipelines, automated deployments

export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"
export AZURE_TENANT_ID="your-tenant-id"
export AZURE_CLIENT_ID="your-client-id"
export AZURE_CLIENT_SECRET="your-client-secret"
import (
	"os"
	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
	azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)

func main() {
	endpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")
	tenantID := os.Getenv("AZURE_TENANT_ID")
	clientID := os.Getenv("AZURE_CLIENT_ID")
	clientSecret := os.Getenv("AZURE_CLIENT_SECRET")

	credential, err := azidentity.NewClientSecretCredential(tenantID, clientID, clientSecret, nil)
	if err != nil {
		panic(err)
	}

	azurePlugin := &azureaifoundry.AzureAIFoundry{
		Endpoint:   endpoint,
		Credential: credential,
	}
}
5. Azure CLI Credential (Local Development)

Best for: Local development with Azure CLI installed

# Login to Azure CLI first
az login

export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"
import (
	"os"
	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
	azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)

func main() {
	endpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")

	// Use Azure CLI credentials
	credential, err := azidentity.NewAzureCLICredential(nil)
	if err != nil {
		panic(err)
	}

	azurePlugin := &azureaifoundry.AzureAIFoundry{
		Endpoint:   endpoint,
		Credential: credential,
	}
}
Model Deployments

Important: The Name in ModelDefinition should match your deployment name in Azure, not the model name. For example (see the sketch after this list):

  • If you deployed gpt-5 with deployment name my-gpt5-deployment, use "my-gpt5-deployment"
  • If you deployed gpt-4o with deployment name gpt-4o, use "gpt-4o"
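A minimal sketch of registering a deployment under its Azure deployment name, assuming the plugin, Genkit instance, and context from the Quick Start are in scope (the deployment name my-gpt5-deployment is a placeholder):

// Register the deployment under its Azure deployment name, not the model name.
model := azurePlugin.DefineModel(g, azureaifoundry.ModelDefinition{
	Name:          "my-gpt5-deployment", // deployment name shown in the Azure portal
	Type:          "chat",
	SupportsMedia: true, // the underlying model accepts image input
}, nil)

response, err := genkit.Generate(ctx, g,
	ai.WithModel(model),
	ai.WithPrompt("Hello from my deployment!"),
)
if err != nil {
	log.Fatal(err)
}
log.Println(response.Text())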

Examples Directory

The repository includes comprehensive examples:

  • examples/basic/ - Simple text generation
  • examples/streaming/ - Real-time streaming responses
  • examples/chat/ - Multi-turn conversation with context
  • examples/embeddings/ - Text embeddings generation
  • examples/tool_calling/ - Function calling with multiple tools
  • examples/vision/ - Multimodal image analysis
  • examples/image_generation/ - Generate images
  • examples/text_to_speech/ - Convert text to speech
  • examples/speech_to_text/ - Transcribe audio to text
Running Examples
# Set environment variables
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"
export AZURE_OPENAI_API_KEY="your-api-key"

# Run basic example
cd examples/basic
go run main.go

# Run streaming example
cd ../streaming
go run main.go

# Run chat example
cd ../chat
go run main.go

# Run tool calling example
cd ../tool_calling
go run main.go

# Run vision example
cd ../vision
go run main.go

# Run image generation example
cd ../image_generation
go run main.go

# Run text-to-speech example
cd ../text_to_speech
go run main.go

# Run speech-to-text example (requires audio files)
cd ../speech_to_text
go run main.go

Features in Detail

🔧 Tool Calling (Function Calling)
// Define a tool
weatherTool := genkit.DefineTool(g, "get_weather",
	"Get current weather",
	func(ctx *ai.ToolContext, input struct {
		Location string `json:"location"`
		Unit     string `json:"unit,omitempty"`
	}) (string, error) {
		return getWeather(input.Location, input.Unit)
	},
)

// Use the tool
response, err := genkit.Generate(ctx, g,
	ai.WithModel(gpt4Model),
	ai.WithTools(weatherTool),
	ai.WithPrompt("What's the weather in San Francisco?"),
)
🖼️ Multimodal Support (Vision)

GPT-5 and GPT-4o support image inputs:

response, err := genkit.Generate(ctx, g,
	ai.WithModel(gpt5Model),
	ai.WithMessages(ai.NewUserMessage(
		ai.NewTextPart("What's in this image?"),
		ai.NewMediaPart("image/jpeg", imageDataURL),
	)),
)
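The imageDataURL above is typically a base64 data URL. A small helper sketch, using only the standard library, for building one from a local file (the path photo.jpg is a placeholder):

import (
	"encoding/base64"
	"fmt"
	"os"
)

// buildImageDataURL reads a local image and encodes it as a data URL
// suitable for ai.NewMediaPart.
func buildImageDataURL(path, mimeType string) (string, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return "", err
	}
	return fmt.Sprintf("data:%s;base64,%s", mimeType, base64.StdEncoding.EncodeToString(data)), nil
}

// Usage: imageDataURL, err := buildImageDataURL("photo.jpg", "image/jpeg")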
📡 Streaming
streamCallback := func(ctx context.Context, chunk *ai.ModelResponseChunk) error {
	for _, part := range chunk.Content {
		if part.IsText() {
			fmt.Print(part.Text)
		}
	}
	return nil
}

response, err := genkit.Generate(ctx, g,
	ai.WithModel(gpt4Model),
	ai.WithPrompt("Tell me a story"),
	ai.WithStreaming(streamCallback),
)
💬 Multi-turn Conversations
// First message
response1, _ := genkit.Generate(ctx, g,
	ai.WithModel(gpt4Model),
	ai.WithMessages(
		ai.NewSystemMessage(ai.NewTextPart("You are a helpful assistant.")),
		ai.NewUserTextMessage("What is Azure?"),
	),
)

// Follow-up message with context
response2, _ := genkit.Generate(ctx, g,
	ai.WithModel(gpt4Model),
	ai.WithMessages(
		ai.NewSystemMessage(ai.NewTextPart("You are a helpful assistant.")),
		ai.NewUserTextMessage("What is Azure?"),
		response1.Message, // Previous assistant message
		ai.NewUserTextMessage("What are its key services?"),
	),
)
🔢 Embeddings
import (
	"github.com/firebase/genkit/go/ai"
	azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)

// Define an embedder (use your deployment name)
embedder := azurePlugin.DefineEmbedder(g, "text-embedding-3-small")

// Or use common embedders helper
embedders := azureaifoundry.DefineCommonEmbedders(azurePlugin, g)

// Generate embeddings
response, err := genkit.Embed(ctx, g,
	ai.WithEmbedder(embedder),
	ai.WithEmbedText("Azure AI Foundry provides powerful AI capabilities"),
)

if err != nil {
	log.Fatal(err)
}

// Access the embedding vector
embedding := response.Embeddings[0].Embedding // []float32
log.Printf("Embedding dimensions: %d", len(embedding))
🎨 Image Generation

Generate images with DALL-E models using the standard genkit.Generate() method:

// Define DALL-E model
dallE3 := azurePlugin.DefineModel(g, azureaifoundry.ModelDefinition{
	Name: azureaifoundry.ModelDallE3,
	Type: "chat",
}, nil)

// Generate image
response, err := genkit.Generate(ctx, g,
	ai.WithModel(dallE3),
	ai.WithPrompt("A serene landscape with mountains at sunset"),
	ai.WithConfig(map[string]interface{}{
		"quality": "hd",
		"size":    "1024x1024",
		"style":   "vivid",
	}),
)

if err != nil {
	log.Fatal(err)
}

log.Printf("Image URL: %s", response.Text())
🗣️ Text-to-Speech

Convert text to speech using the standard genkit.Generate() method:

import "encoding/base64"

// Define TTS model
ttsModel := azurePlugin.DefineModel(g, azureaifoundry.ModelDefinition{
	Name: azureaifoundry.ModelTTS1HD,
	Type: "chat",
}, nil)

// Generate speech
response, err := genkit.Generate(ctx, g,
	ai.WithModel(ttsModel),
	ai.WithPrompt("Hello! Welcome to Azure AI Foundry."),
	ai.WithConfig(map[string]interface{}{
		"voice":           "nova",
		"response_format": "mp3",
		"speed":           1.5,
	}),
)

if err != nil {
	log.Fatal(err)
}

// Decode base64 audio and save file
audioData, _ := base64.StdEncoding.DecodeString(response.Text())
os.WriteFile("output.mp3", audioData, 0644)
🎙️ Speech-to-Text

Transcribe audio to text using the standard genkit.Generate() method:

import "encoding/base64"

// Define Whisper model with media support (required for audio input)
whisperModel := azurePlugin.DefineModel(g, azureaifoundry.ModelDefinition{
	Name:          azureaifoundry.ModelWhisper1,
	Type:          "chat",
	SupportsMedia: true, // Required for media parts (audio)
}, nil)

// Read and encode audio file
audioData, _ := os.ReadFile("audio.mp3")
base64Audio := base64.StdEncoding.EncodeToString(audioData)

// Transcribe audio
response, err := genkit.Generate(ctx, g,
	ai.WithModel(whisperModel),
	ai.WithMessages(ai.NewUserMessage(
		ai.NewMediaPart("audio/mp3", "data:audio/mp3;base64,"+base64Audio),
	)),
	ai.WithConfig(map[string]interface{}{
		"language": "en",
	}),
)

if err != nil {
	log.Fatal(err)
}

log.Printf("Transcription: %s", response.Text())

Troubleshooting

Common Issues
  1. "Endpoint is required" Error

    • Verify AZURE_OPENAI_ENDPOINT is set correctly
    • Ensure the endpoint URL includes https:// and a trailing /
  2. "Deployment not found" Error

    • Check that the deployment name in your code matches the actual deployment name in Azure
    • Verify the model is deployed in your Azure OpenAI resource
  3. Authentication Errors

    • Ensure your API key is correct
    • Check that your Azure subscription is active
    • Verify network connectivity to Azure
  4. Rate Limit Errors

    • Implement exponential backoff retry logic (see the sketch below)
    • Consider upgrading to higher rate limits
    • Distribute requests across time
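A minimal retry sketch with exponential backoff around genkit.Generate, assuming it returns (*ai.ModelResponse, error) as its usage elsewhere in this README suggests; the retry policy and error handling are illustrative, not part of the plugin:

import (
	"context"
	"time"

	"github.com/firebase/genkit/go/ai"
	"github.com/firebase/genkit/go/genkit"
)

// generateWithRetry retries a generation request with exponential backoff.
// A production version should also inspect the error and only retry on
// rate-limit (HTTP 429) or other transient failures.
func generateWithRetry(ctx context.Context, g *genkit.Genkit, prompt string) (*ai.ModelResponse, error) {
	var lastErr error
	delay := time.Second
	for attempt := 0; attempt < 5; attempt++ {
		resp, err := genkit.Generate(ctx, g, ai.WithPrompt(prompt))
		if err == nil {
			return resp, nil
		}
		lastErr = err
		select {
		case <-time.After(delay):
			delay *= 2 // back off: 1s, 2s, 4s, 8s
		case <-ctx.Done():
			return nil, ctx.Err()
		}
	}
	return nil, lastErr
}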

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Follow Conventional Commits format
  4. Commit your changes (git commit -m 'feat: add amazing feature')
  5. Push to the branch (git push origin feature/amazing-feature)
  6. Open a Pull Request

License

Apache 2.0 - see LICENSE file for details.

Acknowledgments

  • Genkit team for the excellent Go framework
  • Azure AI team for the comprehensive AI platform
  • The open source community for inspiration and feedback

Built with ❤️ for the Genkit Go community

Documentation

Overview

Package azureaifoundry provides a comprehensive Azure AI Foundry plugin for Genkit Go. This plugin supports text generation and chat capabilities using Azure OpenAI and other models available through Azure AI Foundry.

Constants

const (
	ModelDallE2       = "dall-e-2"
	ModelDallE3       = "dall-e-3"
	ModelGPTImageBeta = "gpt-image-1"
)

Common model names for image generation

const (
	ModelTTS1         = "tts-1"
	ModelTTS1HD       = "tts-1-hd"
	ModelGPT4oMiniTTS = "gpt-4o-mini-tts"
)

Common model names for text-to-speech

const (
	ModelWhisper1               = "whisper-1"
	ModelGPT4oMiniTranscribe    = "gpt-4o-mini-transcribe"
	ModelGPT4oTranscribe        = "gpt-4o-transcribe"
	ModelGPT4oTranscribeDiarize = "gpt-4o-transcribe-diarize"
)

Common model names for speech-to-text

Variables

This section is empty.

Functions

func DefineCommonEmbedders

func DefineCommonEmbedders(a *AzureAIFoundry, g *genkit.Genkit) map[string]ai.Embedder

DefineCommonEmbedders is a helper to define commonly used Azure OpenAI embedding models

func DefineCommonModels

func DefineCommonModels(a *AzureAIFoundry, g *genkit.Genkit) map[string]ai.Model

DefineCommonModels is a helper to define commonly used Azure OpenAI models
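A short sketch of using the returned map; the loop only relies on the map keys, since the exact names registered by the helper are determined by its implementation:

// Register the commonly used models in one call and log what was defined.
models := azureaifoundry.DefineCommonModels(azurePlugin, g)
for name := range models {
	log.Printf("defined model: %s", name)
}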

func Embedder

func Embedder(g *genkit.Genkit, name string) ai.Embedder

Embedder returns the Embedder with the given name.

func IsDefinedEmbedder

func IsDefinedEmbedder(g *genkit.Genkit, name string) bool

IsDefinedEmbedder reports whether an embedder is defined.

func IsDefinedModel

func IsDefinedModel(g *genkit.Genkit, name string) bool

IsDefinedModel reports whether a model is defined.

func Model

func Model(g *genkit.Genkit, name string) ai.Model

Model returns the Model with the given name.
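A sketch of looking up a model by its registered name; the "azureaifoundry/gpt-5" name mirrors the WithDefaultModel value used in the Quick Start, but the exact prefix convention is an assumption here:

// Look up a previously defined model by name before using it.
const modelName = "azureaifoundry/gpt-5" // assumed naming: provider prefix + deployment name

if azureaifoundry.IsDefinedModel(g, modelName) {
	m := azureaifoundry.Model(g, modelName)
	response, err := genkit.Generate(ctx, g,
		ai.WithModel(m),
		ai.WithPrompt("Say hello."),
	)
	if err != nil {
		log.Fatal(err)
	}
	log.Println(response.Text())
}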

Types

type AzureAIFoundry

type AzureAIFoundry struct {
	Endpoint   string                 // Azure AI Foundry endpoint URL (required)
	APIKey     string                 // API key for authentication (required if not using DefaultAzureCredential)
	APIVersion string                 // Azure OpenAI API version (e.g., "2024-12-01-preview", "2024-02-01"). Defaults to "2024-12-01-preview" if not specified
	Credential azcore.TokenCredential // Optional: Use Azure DefaultAzureCredential instead of API key
	// contains filtered or unexported fields
}

AzureAIFoundry provides configuration options for the Azure AI Foundry plugin.

func (*AzureAIFoundry) DefineEmbedder

func (a *AzureAIFoundry) DefineEmbedder(g *genkit.Genkit, modelName string) ai.Embedder

DefineEmbedder defines an embedder in the registry.

func (*AzureAIFoundry) DefineModel

func (a *AzureAIFoundry) DefineModel(g *genkit.Genkit, model ModelDefinition, info *ai.ModelInfo) ai.Model

DefineModel defines a model in the registry.

func (*AzureAIFoundry) Init

func (a *AzureAIFoundry) Init(ctx context.Context) []api.Action

Init initializes the Azure AI Foundry plugin.

func (*AzureAIFoundry) Name

func (a *AzureAIFoundry) Name() string

Name returns the provider name.

type GeneratedImage added in v1.1.0

type GeneratedImage struct {
	URL           string // URL of the generated image (if response_format=url)
	B64JSON       string // Base64-encoded image data (if response_format=b64_json)
	RevisedPrompt string // The revised prompt used for this image
}

GeneratedImage represents a generated image

type ImageGenerationRequest added in v1.1.0

type ImageGenerationRequest struct {
	Prompt         string // The text prompt to generate images from
	N              int    // Number of images to generate (1-10)
	Size           string // Size: "256x256", "512x512", "1024x1024", "1792x1024", "1024x1792"
	Quality        string // Quality: "standard" or "hd" (DALL-E 3 only)
	Style          string // Style: "vivid" or "natural" (DALL-E 3 only)
	ResponseFormat string // Format: "url" or "b64_json"
}

ImageGenerationRequest represents a request to generate images

type ImageGenerationResponse added in v1.1.0

type ImageGenerationResponse struct {
	Images        []GeneratedImage // Generated images
	RevisedPrompt string           // The revised prompt used (DALL-E 3)
}

ImageGenerationResponse represents the response from image generation

type ModelDefinition

type ModelDefinition struct {
	Name          string // Model deployment name in Azure AI Foundry
	Type          string // Type: "chat", "text"
	MaxTokens     int32  // Maximum tokens the model can handle (optional)
	SupportsMedia bool   // Whether the model supports media (images, audio) (optional)
}

ModelDefinition represents a model with its name and type.
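For instance, a definition for a vision-capable chat deployment with an optional token limit (the deployment name and limit are placeholders):

def := azureaifoundry.ModelDefinition{
	Name:          "gpt-4o", // your deployment name in Azure
	Type:          "chat",
	MaxTokens:     4096, // optional: maximum tokens the model can handle
	SupportsMedia: true, // enable image/audio parts
}
model := azurePlugin.DefineModel(g, def, nil)
// model can now be passed to ai.WithModel(...)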

type STTRequest added in v1.1.0

type STTRequest struct {
	Audio          []byte  // The audio file content
	Filename       string  // Filename with extension (e.g., "audio.mp3", "audio.wav") - required for format detection
	Language       string  // Language code (e.g., "en", "es")
	Prompt         string  // Optional text to guide the model's style
	ResponseFormat string  // Format: "json", "text", "srt", "verbose_json", "vtt"
	Temperature    float64 // Temperature (0 to 1)
}

STTRequest represents a speech-to-text request

type STTResponse added in v1.1.0

type STTResponse struct {
	Text     string  // Transcribed text
	Language string  // Detected language
	Duration float64 // Duration in seconds
}

STTResponse represents the speech-to-text response

type TTSRequest added in v1.1.0

type TTSRequest struct {
	Input          string  // The text to synthesize
	Voice          string  // Voice: "alloy", "echo", "fable", "onyx", "nova", "shimmer"
	ResponseFormat string  // Format: "mp3", "opus", "aac", "flac", "wav", "pcm"
	Speed          float64 // Speed (0.25 to 4.0)
}

TTSRequest represents a text-to-speech request

type TTSResponse added in v1.1.0

type TTSResponse struct {
	Audio []byte // The audio data
}

TTSResponse represents the text-to-speech response

Directories

Path Synopsis
examples
basic command
Package main demonstrates basic usage of the Azure AI Foundry plugin
chat command
Package main demonstrates multi-turn chat conversation with Azure AI Foundry
common
Package common provides shared utilities for Azure AI Foundry examples
embeddings command
Package main demonstrates embeddings generation with Azure AI Foundry
image_generation command
Package main demonstrates image generation using genkit.Generate()
speech_to_text command
Package main demonstrates speech-to-text using genkit.Generate()
streaming command
Package main demonstrates streaming text generation with Azure AI Foundry
text_to_speech command
Package main demonstrates text-to-speech using genkit.Generate()
tool_calling command
Package main demonstrates tool calling (function calling) with Azure AI Foundry
vision command
Package main demonstrates vision (multimodal) capabilities with Azure AI Foundry