
feat: add spec-compliant streaming error event for Responses API#5224

Open
gyliu513 wants to merge 1 commit into llamastack:main from gyliu513:error

Conversation


@gyliu513 gyliu513 commented Mar 19, 2026

Summary

Replace the ad-hoc OpenAIErrorResponse dict-based error event in SSE streaming with a spec-compliant OpenAIResponseObjectStreamError Pydantic model.

Problem: When errors occur mid-stream (e.g., ValueError from a provider), the SSE generator previously emitted an OpenAIErrorResponse — which is the OpenAI HTTP error response format ({"error": {"message": "...", "code": "..."}}). This is the wrong format for streaming. OpenAI's streaming API uses a distinct ResponseErrorEvent with type: "error" as a top-level SSE event (openai-python SDK reference).

Fix: Introduce OpenAIResponseObjectStreamError matching OpenAI's ResponseErrorEvent schema:

{"type": "error", "code": "400", "message": "not found", "sequence_number": 5}

This allows OpenAI-compatible client SDKs to correctly deserialize streaming errors.
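A minimal sketch of what such a model could look like, assuming Pydantic v2; the field names come from the example payload above, and the actual class in the PR may differ in details:

```python
from typing import Literal, Optional

from pydantic import BaseModel


class OpenAIResponseObjectStreamError(BaseModel):
    """Streaming error event shaped like OpenAI's ResponseErrorEvent (sketch)."""

    # Discriminator so clients can route this event alongside other
    # response.* stream events.
    type: Literal["error"] = "error"
    code: Optional[str] = None
    message: str
    sequence_number: int


# Build the event from the example in the PR description.
event = OpenAIResponseObjectStreamError(
    code="400", message="not found", sequence_number=5
)

# On the wire this would be emitted as a top-level SSE event, e.g.:
#   data: {"type":"error","code":"400","message":"not found","sequence_number":5}
print(event.model_dump_json())
```

Because `type` is a `Literal` with a default, every serialized event carries the `"error"` discriminator without callers having to set it.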

meta-cla bot added the CLA Signed label on Mar 19, 2026
@gyliu513 gyliu513 marked this pull request as draft March 19, 2026 21:51

github-actions bot commented Mar 19, 2026

✱ Stainless preview builds

This PR will update the llama-stack-client SDKs with the following commit message.

feat: add spec-compliant streaming error event for Responses API

Edit this comment to update it. It will appear in the SDK's changelogs.

llama-stack-client-openapi studio · code · diff

generate ⚠️: your SDK build had at least one "warning" diagnostic, but this did not represent a regression.

llama-stack-client-go studio · conflict

Your SDK build resulted in a merge conflict between your custom code and the newly generated changes, but this did not represent a regression.

llama-stack-client-node studio · conflict

Your SDK build resulted in a merge conflict between your custom code and the newly generated changes, but this did not represent a regression.

llama-stack-client-python studio · conflict

Your SDK build resulted in a merge conflict between your custom code and the newly generated changes, but this did not represent a regression.


This comment is auto-generated by GitHub Actions and is automatically kept up to date as you push.
If you push custom code to the preview branch, re-run this workflow to update the comment.
Last updated: 2026-03-26 12:48:03 UTC


cdoern commented Mar 19, 2026

can you please add a description to this PR

gyliu513 (author) commented

@cdoern Done. This is a draft PR, so I did not add a description when I submitted it. It is still a draft; I may need more testing, and the description may be updated later.

@gyliu513 gyliu513 marked this pull request as ready for review March 23, 2026 16:25
gyliu513 force-pushed the error branch 5 times, most recently from 84b1d89 to a0883a5 on March 25, 2026 17:21

Labels

CLA Signed (this label is managed by the Meta Open Source bot)
