
[BUG] Azure Foundry with OpenAI compatibility #10771

@NWessel

Description


Problem (one or two sentences)

It seems Azure is migrating to the new OpenAI endpoint, which is now "responses" instead of "completions":

https://{our instance url}.openai.azure.com/openai/responses?api-version=2025-04-01-preview

In the request body, "messages" has been renamed to "input", and
"max_completion_tokens" to "max_output_tokens".

I think Roo needs to be updated to support this.
See link: https://platform.openai.com/docs/api-reference/responses/create
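For illustration, here is a minimal sketch of the body change the error message describes. This is not Roo's actual code; the field names ("messages" → "input", "max_completion_tokens" → "max_output_tokens") come from the error log below and the linked API reference, and the model name is just the deployment from this report:

```python
def to_responses_body(chat_body: dict) -> dict:
    """Translate a Chat Completions-style request body into the
    Responses API shape, per the renames reported in this issue."""
    body = dict(chat_body)
    if "messages" in body:
        body["input"] = body.pop("messages")  # "messages" -> "input"
    if "max_completion_tokens" in body:
        # "max_completion_tokens" -> "max_output_tokens"
        body["max_output_tokens"] = body.pop("max_completion_tokens")
    return body

# Old-style body that currently triggers the 400 error:
chat_body = {
    "model": "gpt-5.2-codex",
    "messages": [{"role": "user", "content": "Hello"}],
    "max_completion_tokens": 256,
}
print(to_responses_body(chat_body))
```

The translated body keeps all other fields untouched, so parameters the Responses API shares with Chat Completions pass through as-is.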


Context (who is affected and when)

People using Azure Foundry with newer models, I think.

Reproduction steps

Set up Azure Foundry with a deployment model: gpt-5.2-codex

Fill in the data as shown in the image above

Expected result

The model works and can be prompted

Actual result

The API call to the model fails because the request body is wrong

Variations tried (optional)

No response

App Version

Version: 3.41.1 (d49bd9a)

API Provider (optional)

None

Model Used (optional)

gpt-5.2-codex

Roo Code Task Links (optional)

No response

Relevant logs or errors (optional)

time: 2026-01-16T08:38:06.712Z
Extension version: 3.41.1
Provider: openai (proxy)
Model: gpt-5.2-codex

400
OpenAI completion error: 400 Unsupported parameter: 'messages'. In the Responses API, this parameter has moved to 'input'. Try again with the new parameter. See the API documentation for more information: https://platform.openai.com/docs/api-reference/responses/create.

Metadata


Assignees

No one assigned

    Labels

    bug (Something isn't working)
