feat: add ReasoningItem output type for Responses API #324

Open

robinnarsinghranabhat wants to merge 2 commits into llamastack:main
Conversation
The `Output` discriminated union was missing a `ReasoningItem` variant, causing `type="reasoning"` output items from the Responses API to fall back to `OutputOpenAIResponseMessageOutput`, with Pydantic warnings and broken content access. This PR adds `OutputOpenAIResponseReasoningItem` (with `Content` and `Summary` subtypes) to both the non-streaming and streaming response types.
Summary
The LlamaStack client's `Output` union was missing a `ReasoningItem` variant, causing `type: "reasoning"` output items from the Responses API to be deserialized as the generic `OutputOpenAIResponseMessageOutput` instead of a dedicated type. This adds `OutputOpenAIResponseReasoningItem` to both non-streaming and streaming response types.

Problem
When a server returns reasoning output (from reasoning-capable models), the response includes items with `type: "reasoning"`. The client's discriminated `Output` union had no variant for this type, so Pydantic fell back to `OutputOpenAIResponseMessageOutput`.

Why it appears to "work" but is actually broken
The client's `BaseModel` is configured with `extra: 'allow'`, so Pydantic silently accepts unknown fields like `summary` and `encrypted_content` as untyped extras on the wrong class. This means:

- `content` is `None` — `resp.output[0].content[0].text` raises `TypeError: 'NoneType' object is not subscriptable`
- `summary` exists but as raw dicts — no `.text` attribute, no type validation, no IDE autocompletion
- `role` is `None` — the class expects it as a required `Literal["system", "developer", "user", "assistant"]`, but Pydantic's `extra: 'allow'` lets it slide
- `isinstance(item, ReasoningItem)` would fail

Setup to reproduce
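The fallback can be reproduced in isolation with a minimal sketch, assuming pydantic v2. The class below is hypothetical, not the actual generated client model; it only mirrors the relevant config (`extra="allow"`) so a `type="reasoning"` payload is absorbed by a message-shaped class:

```python
# Hypothetical minimal reproduction (NOT the actual client classes):
# a message-shaped model with extra="allow" silently absorbs a
# type="reasoning" payload, leaving the typed fields unset.
from typing import List, Optional
from pydantic import BaseModel, ConfigDict

class MessageOutput(BaseModel):
    # Mirrors the client BaseModel config that permits unknown fields.
    model_config = ConfigDict(extra="allow")
    type: str
    content: Optional[List[dict]] = None
    # Required Literal in the real class; relaxed here for illustration.
    role: Optional[str] = None

reasoning_payload = {
    "type": "reasoning",
    "summary": [{"type": "summary_text", "text": "thought about the question"}],
    "encrypted_content": "gAAAAB...",  # placeholder value
}

item = MessageOutput.model_validate(reasoning_payload)
print(item.content)           # None -> item.content[0] raises TypeError
print(type(item.summary[0]))  # plain dict, not a typed Summary model
```

Validation succeeds, so nothing fails loudly; the breakage only surfaces later, at attribute access time.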
Before (broken) — non-streaming
Using OpenAI client as reference (correct behavior):
Same call with LlamaStack client (broken):
After (fixed) — non-streaming
After (fixed) — streaming
Changes
- `src/llama_stack_client/types/response_object.py`: added `OutputOpenAIResponseReasoningItem`, `OutputOpenAIResponseReasoningItemContent`, and `OutputOpenAIResponseReasoningItemSummary`; added the new item to the `Output` discriminated union
- `src/llama_stack_client/types/response_object_stream.py`: added the new item to the `OutputItemAdded` and `OutputItemDone` item unions

Test plan
- Verified `type: "reasoning"` correctly deserializes into `OutputOpenAIResponseReasoningItem` in both streaming and non-streaming modes
- Compared against the OpenAI client's output with a reasoning-capable model (`gpt-oss:20b`) as ground truth
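The non-streaming half of the check can be sketched without a server. This is a hypothetical reduction of the fix, assuming pydantic v2; the class names follow the PR description, but the actual generated code in `response_object.py` may differ:

```python
# Hypothetical sketch of the fixed discriminated union (names follow the
# PR description; the real generated models carry more fields).
from typing import Annotated, List, Literal, Union
from pydantic import BaseModel, Field, TypeAdapter

class OutputOpenAIResponseReasoningItemSummary(BaseModel):
    type: Literal["summary_text"]
    text: str

class OutputOpenAIResponseReasoningItem(BaseModel):
    type: Literal["reasoning"]
    id: str
    summary: List[OutputOpenAIResponseReasoningItemSummary]

class OutputOpenAIResponseMessageOutput(BaseModel):
    type: Literal["message"]
    role: str
    content: List[dict]

# With the reasoning variant in the union, the "type" discriminator
# routes type="reasoning" payloads to the dedicated class.
Output = Annotated[
    Union[OutputOpenAIResponseMessageOutput, OutputOpenAIResponseReasoningItem],
    Field(discriminator="type"),
]

item = TypeAdapter(Output).validate_python({
    "type": "reasoning",
    "id": "rs_1",
    "summary": [{"type": "summary_text", "text": "reasoned about it"}],
})
print(type(item).__name__)  # OutputOpenAIResponseReasoningItem
print(item.summary[0].text)  # typed access now works
```

With the variant present, `isinstance` checks and `.text` access behave as expected, and unknown-field warnings disappear for reasoning items.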