Fix: Strip unsupported JSON Schema keywords for structured outputs#2733
Changes being requested
Fixes #2718 - `responses.parse()` now handles `Decimal` fields correctly with GPT-5 models.

The issue was that `Decimal` fields in Pydantic models weren't being properly serialized during JSON Schema generation, causing 500 errors when using structured outputs with certain GPT-5 models (specifically `gpt-5-nano`).

**Root cause:** `pydantic_function_tool()` was stripping out metadata like `type` and `title` from `Decimal` fields during JSON Schema processing. This made the schema invalid for the API.

**Fix:** Check whether a field's metadata is numeric (using `is_numeric_type()`) before stripping out the `type` key. For numeric types like `Decimal`, the `type` field is preserved so the schema remains valid.

**Changed in:**
- `src/openai/_utils/_transform.py` - added a numeric type check before removing metadata
- `tests/test_transform.py` - added test cases for `Decimal` field handling

Additional context & links
- Related to #2718 - multiple users reporting issues with `Decimal` fields
- Affects: GPT-5 models using structured outputs with Pydantic models containing `Decimal` types
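The numeric-type check described above can be sketched roughly as follows. Note that `prune_keywords` is an illustrative name for this sketch, not the library's actual internals; the real code in `src/openai/_utils/_transform.py` operates on Pydantic-generated schemas, and the exact set of stripped keywords may differ:

```python
from typing import Any


def is_numeric_type(schema: dict[str, Any]) -> bool:
    # Pydantic renders Decimal fields as {"type": "number", ...} (or
    # "integer" for constrained ints) in the generated JSON Schema.
    return schema.get("type") in ("number", "integer")


def prune_keywords(schema: dict[str, Any]) -> dict[str, Any]:
    # Drop keywords unsupported by structured outputs, but keep "type"
    # on numeric fields so the schema remains valid for the API.
    pruned = dict(schema)
    pruned.pop("title", None)  # presentational metadata, safe to drop
    if not is_numeric_type(schema):
        # Before the fix, "type" was stripped unconditionally here,
        # which broke Decimal fields.
        pruned.pop("type", None)
    return pruned


# A Decimal field's schema keeps its "type" after pruning:
print(prune_keywords({"type": "number", "title": "Price"}))  # {'type': 'number'}
```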