
Update OpenAPI specification to use gpt-5 model and replace max_tokens with max_completion_tokens #131

Merged
prabhash-varma merged 2 commits into master from update/max-completion-tokens-and-gpt5-models on Mar 25, 2026
Conversation

@prabhash-varma commented Mar 25, 2026

Update OpenAPI specification to use gpt-5 model and replace max_tokens with max_completion_tokens in examples and descriptions.


Copilot AI left a comment


Pull request overview

Updates the Portkey OpenAPI specification to reflect newer OpenAI/Portkey chat completion conventions, including GPT‑5 model examples and the shift from max_tokens to max_completion_tokens.

Changes:

  • Updated /chat/completions code samples to use model: gpt-5 and include max_completion_tokens.
  • Deprecated max_tokens in CreateChatCompletionRequest and introduced max_completion_tokens with updated descriptions.
  • Extended CompletionUsage with optional token breakdown fields (completion_tokens_details, prompt_tokens_details).
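The core of the change is a field rename in the request body. As a minimal sketch (the helper name and payload shape are illustrative, not part of the PR), the migration the updated examples reflect looks like this:

```python
def migrate_chat_payload(payload: dict) -> dict:
    """Return a copy of a chat-completions request body with the
    deprecated "max_tokens" key renamed to "max_completion_tokens"."""
    migrated = dict(payload)
    if "max_tokens" in migrated and "max_completion_tokens" not in migrated:
        migrated["max_completion_tokens"] = migrated.pop("max_tokens")
    return migrated

# A legacy-style payload, as in the spec's old examples:
legacy = {
    "model": "gpt-5",
    "messages": [{"role": "user", "content": "Hello"}],
    "max_tokens": 250,
}
print(migrate_chat_payload(legacy))
```

The value carries over unchanged; only the key differs, which is why the PR can update the code samples without touching the example numbers.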


Comment on lines 1861 to 1864
  },
- "max_tokens": 250,
+ "max_completion_tokens": 250,
  "presence_penalty": 0.2
  }'

Copilot AI Mar 25, 2026


The /prompts/{promptId}/completions request schema description above still references max_tokens in its hyperparameters note, but the examples here were updated to use max_completion_tokens. Please update that note to mention max_completion_tokens (and/or note max_tokens is deprecated) to keep the documentation consistent.

Comment on lines 2015 to 2019
"user_input": "Hello world"
},
"max_tokens": 250,
"max_completion_tokens": 250,
"presence_penalty": 0.2
}'

Copilot AI Mar 25, 2026


The /prompts/{promptId}/render request schema description above still references max_tokens in its hyperparameters note, but the examples here were updated to use max_completion_tokens. Please update that note to mention max_completion_tokens (and/or note max_tokens is deprecated) to keep the documentation consistent.
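Both review comments hinge on how the schema itself marks the old field. A minimal sketch of what the deprecation in CreateChatCompletionRequest might look like (the field names come from the PR; the exact descriptions and nullability are assumptions):

```yaml
CreateChatCompletionRequest:
  type: object
  properties:
    max_tokens:
      type: integer
      nullable: true
      deprecated: true
      description: >
        Deprecated in favor of max_completion_tokens.
    max_completion_tokens:
      type: integer
      nullable: true
      description: >
        Upper bound on the number of tokens that can be generated
        for a completion.
```

Marking the old property with `deprecated: true` rather than deleting it keeps existing clients valid against the spec while steering new integrations toward the replacement field.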


Copilot AI left a comment


Pull request overview

Copilot reviewed 1 out of 1 changed files in this pull request and generated no new comments.



prabhash-varma merged commit d210e2b into master on Mar 25, 2026
4 of 5 checks passed
narengogi deleted the update/max-completion-tokens-and-gpt5-models branch on March 25, 2026 at 09:38
3 participants