
feat: translate response_format to Anthropic output_config for structured output #5640

Open
03-CiprianoG wants to merge 1 commit into Helicone:main from 03-CiprianoG:fix/anthropic-structured-output

Conversation

@03-CiprianoG

Fixes #5639

Problem

When using the AI Gateway with Anthropic models, response_format: { type: "json_schema" } is passed through untranslated. Anthropic ignores this OpenAI-specific parameter, so the model returns prose instead of JSON, breaking Output.object(), generateObject(), and any other structured output workflow.

Solution

Map OpenAI's response_format to Anthropic's native output_config.format in toAnthropic().

Translation

OpenAI (input to gateway):

```
response_format: {
  type: "json_schema",
  json_schema: { schema: {...}, strict: true, name: "response" }
}
```

Anthropic (output from gateway):

```
output_config: {
  format: {
    type: "json_schema",
    json_schema: { schema: {...}, name: "response" }
  }
}
```

Note: strict is omitted — it's OpenAI-specific and not part of Anthropic's API.
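The translation above can be sketched roughly as follows. This is an illustrative, self-contained version of the mapping, not the exact code in toAnthropic(); the type and function names here are assumptions for the sketch.

```typescript
// Illustrative types for the two shapes involved in the translation.
type OpenAIResponseFormat = {
  type: "json_schema";
  json_schema: {
    schema: Record<string, unknown>;
    name?: string;
    strict?: boolean; // OpenAI-specific; dropped during translation
  };
};

type AnthropicOutputConfig = {
  format: {
    type: "json_schema";
    json_schema: { schema: Record<string, unknown>; name?: string };
  };
};

// Hypothetical helper mirroring the PR's mapping: copy `schema` and `name`,
// intentionally omit `strict`, and pass through nothing for other format types.
function translateResponseFormat(
  rf: OpenAIResponseFormat | undefined
): AnthropicOutputConfig | undefined {
  if (!rf || rf.type !== "json_schema") return undefined;
  const { schema, name } = rf.json_schema;
  return {
    format: {
      type: "json_schema",
      json_schema: { schema, ...(name !== undefined ? { name } : {}) },
    },
  };
}
```

Keeping the helper a pure function of the response_format field makes the omission of strict explicit and easy to unit-test in isolation.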

Changes

  • packages/llm-mapper/transform/types/anthropic.ts: Added output_config and AnthropicOutputConfig type to AnthropicRequestBody
  • packages/llm-mapper/transform/providers/openai/request/toAnthropic.ts: Added the response_format → output_config translation in toAnthropic()

Testing

The fix can be verified by sending a request with response_format: { type: "json_schema", json_schema: { schema: { type: "object", properties: { name: { type: "string" } } } } } to an Anthropic model through the gateway. The model should now return valid JSON matching the schema instead of prose.
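A minimal sketch of the verification request body described above. The model id is a placeholder and the conforming reply is a hypothetical example, not output captured from the gateway:

```typescript
// OpenAI-style request body sent to the gateway; after this PR, the
// gateway should translate `response_format` into Anthropic `output_config`.
const requestBody = {
  model: "claude-sonnet-4-5", // placeholder Anthropic model id
  messages: [{ role: "user", content: "Return the user's name as JSON." }],
  response_format: {
    type: "json_schema",
    json_schema: {
      schema: {
        type: "object",
        properties: { name: { type: "string" } },
      },
    },
  },
};

// With the fix applied, the model's reply should parse as JSON matching
// the schema, e.g. (hypothetical conforming response):
const reply = '{"name": "Ada"}';
const parsed = JSON.parse(reply);
```

Without the fix, JSON.parse on the reply would typically throw, since the model answers in prose.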

…ured output

The toAnthropic() request translator now maps OpenAI's
response_format: { type: "json_schema", json_schema: { schema, name } }
to Anthropic's native output_config: { format: { type: "json_schema", json_schema: { schema, name } } }.

Previously, response_format was ignored during translation, causing
Anthropic to return prose instead of structured JSON. This broke
Output.object(), generateObject(), and any structured output workflow
when using Anthropic models through the AI Gateway.

Fixes Helicone#5639



Development

Successfully merging this pull request may close these issues:

AI Gateway: Anthropic structured output (response_format → output_config) not translated
