Description
The non-streaming OpenAI-to-GCP Vertex AI translator returns reasoning_content as a nested dict instead of a plain string for Gemini 3 models. This breaks OpenAI-compatible clients (e.g. litellm's pydantic validation rejects the response).
The streaming path already handles this correctly using StreamReasoningContent. Gemini 2.5 models are unaffected.
Actual response:
"reasoning_content": {"reasoningContent": {"reasoningText": {"text": "...", "signature": "..."}}}
Expected:
"reasoning_content": "the thinking text..."
Root cause:
geminiCandidatesToOpenAIChoices in internal/translator/gemini_helper.go wraps the value in Bedrock types.
Suggested fix:
set Value to the string directly (3 occurrences):
// before
Value: &openai.ReasoningContent{ReasoningContent: &awsbedrock.ReasoningContentBlock{...}}
// after
Value: thoughtSummary
Error (client-side):
pydantic_core._pydantic_core.ValidationError: 1 validation error for Message
reasoning_content
Input should be a valid string [type=string_type, input_value={'reasoningContent': {'re...}}, input_type=dict]