[Go] Streaming responses contain empty content parts that break OpenAI serialization #3606

@weeco

Description

Describe the bug

When using ai.WithStreaming(), response messages contain empty content parts that cause OpenAI API failures when the response is reused as conversation history with ai.WithMessages(). The empty parts are serialized as null content fields, resulting in: "Invalid value for 'content': expected a string, got null."
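
The failure mode can be illustrated in isolation. The sketch below uses hypothetical stand-in types (Part, toOpenAIContent), not Genkit's actual internals: an empty part maps to a nil string pointer, which encoding/json serializes as null, exactly the value the OpenAI API rejects.

```go
package main

import (
    "encoding/json"
    "fmt"
)

// Part is a hypothetical stand-in for a message content part;
// only the text field matters for this illustration.
type Part struct {
    Text string `json:"text,omitempty"`
}

// toOpenAIContent sketches how a compatibility layer might map a part to an
// OpenAI message content value: an empty part yields nil, which JSON
// serializes as null.
func toOpenAIContent(p Part) *string {
    if p.Text == "" {
        return nil // empty part -> null content
    }
    return &p.Text
}

func main() {
    msg := map[string]any{
        "role":    "assistant",
        "content": toOpenAIContent(Part{}), // empty part from a streaming response
    }
    b, _ := json.Marshal(msg)
    fmt.Println(string(b)) // "content" comes out as null, which OpenAI rejects
}
```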

To Reproduce

package main

import (
    "context"
    "os"

    "github.com/firebase/genkit/go/ai"
    "github.com/firebase/genkit/go/genkit"
    "github.com/firebase/genkit/go/plugins/compat_oai/openai"
)

func main() {
    g := genkit.Init(context.Background(),
        genkit.WithPlugins(&openai.OpenAI{APIKey: os.Getenv("OPENAI_API_KEY")}),
        genkit.WithDefaultModel("openai/gpt-4o-mini"),
    )

    // Step 1: Streaming request (works fine).
    resp, _ := genkit.Generate(context.Background(), g,
        ai.WithMessages(ai.NewUserTextMessage("My favorite color is blue")),
        ai.WithStreaming(func(ctx context.Context, chunk *ai.ModelResponseChunk) error {
            return nil
        }),
    )

    // resp.Message.Content now contains empty parts at [0] and [last].

    // Step 2: Use the response as conversation history (fails).
    history := []*ai.Message{
        ai.NewUserTextMessage("My favorite color is blue"),
        resp.Message, // Contains empty content parts.
    }

    // This fails with 400 Bad Request:
    // "Invalid value for 'content': expected a string, got null."
    _, err := genkit.Generate(context.Background(), g,
        ai.WithMessages(history...),
        ai.WithPrompt("What is my favorite color?"),
    )
    _ = err
}

Expected behavior

Streaming responses should not contain empty content parts, or the OpenAI compatibility layer should strip them before serialization. Using a streaming response message as conversation history should work seamlessly.

Screenshots
N/A

Runtime (please complete the following information):

  • OS: macOS
  • Version: Sequoia 15.6.1

Go version

go version go1.25.1 darwin/arm64

Additional context

This affects a common use case: storing LLM responses as conversation history in chatbots/agents. A workaround is to filter out empty parts before storing the message:

// Keep only parts with non-empty text.
var cleanParts []*ai.Part
for _, part := range resp.Message.Content {
    if part.Text != "" {
        cleanParts = append(cleanParts, part)
    }
}
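
The workaround above can be wrapped in a small helper. This is a self-contained sketch using a hypothetical Part stand-in rather than ai.Part, so it compiles on its own; the logic is the same filter applied to resp.Message.Content.

```go
package main

import "fmt"

// Part is a hypothetical stand-in for ai.Part; only the text field matters here.
type Part struct {
    Text string
}

// filterEmptyParts returns a copy of parts with nil entries and empty text
// parts removed, so the message is safe to reuse as conversation history.
func filterEmptyParts(parts []*Part) []*Part {
    clean := make([]*Part, 0, len(parts))
    for _, p := range parts {
        if p != nil && p.Text != "" {
            clean = append(clean, p)
        }
    }
    return clean
}

func main() {
    // Mimics a streaming response: empty parts at [0] and [last].
    parts := []*Part{{Text: ""}, {Text: "Noted, your favorite color is blue."}, {Text: ""}}
    fmt.Println(len(filterEmptyParts(parts))) // only the non-empty part remains
}
```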

Metadata

  • Assignees: no one assigned
  • Labels: bug (Something isn't working), go
  • Status: Done
  • Milestone: no milestone