Summary
Using VS Code Copilot Chat BYOK with an Azure Foundry OpenAI-compatible endpoint and explicit /responses path can fail with a request validation error indicating an invalid/missing input[n].type.
Environment
- VS Code: latest stable (repro observed in May 2026)
- Extension: GitHub Copilot Chat (BYOK custom model)
- Provider in chatLanguageModels.json: azure
- Endpoint style: Azure Foundry project OpenAI-compatible endpoint
Config Snippet
{
  "models": {
    "provider": "azure",
    "url": "https://<foundry-host>.services.ai.azure.com/api/projects/<project>/openai/v1/responses",
    "id": "<model-id>",
    "name": "<display-name>",
    "toolCalling": true,
    "vision": false,
    "maxInputTokens": 64000,
    "maxOutputTokens": 8000
  }
}
Steps to Reproduce
- Configure a BYOK Azure custom model in chatLanguageModels.json with an explicit /responses URL (Foundry project endpoint).
- Select that model in Copilot Chat.
- Send a simple prompt (for example: Hello).
- Observe that the request fails before a response is streamed.
Actual Result
Request fails with an HTTP 400 invalid_request_error (sanitized):
{
  "error": {
    "message": "Invalid value: ''. Supported values are: 'input_text', 'input_image', 'output_text', 'refusal', 'input_file', 'computer_screenshot', and 'summary_text'.",
    "type": "invalid_request_error",
    "param": "input[1]",
    "code": "invalid_value"
  }
}
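To illustrate the payload-shape constraint behind the error, here is a minimal sketch of a Responses-style input array. The set of supported content-part types is taken from the error message above; the build_input and validate helpers are hypothetical illustrations, not the extension's actual serializer.

```python
# Content-part types accepted by the Responses API, per the error message above.
SUPPORTED_TYPES = {
    "input_text", "input_image", "output_text", "refusal",
    "input_file", "computer_screenshot", "summary_text",
}

def build_input(role: str, text: str) -> dict:
    # Each content part must carry an explicit "type"; an empty or missing
    # value produces the invalid_value error shown in Actual Result.
    return {
        "role": role,
        "content": [{"type": "input_text", "text": text}],
    }

def validate(payload: dict) -> list:
    """Return input[n] paths whose content parts have a missing/unsupported type."""
    bad = []
    for i, item in enumerate(payload.get("input", [])):
        for part in item.get("content", []):
            if part.get("type", "") not in SUPPORTED_TYPES:
                bad.append(f"input[{i}]")
    return bad

payload = {"model": "<model-id>", "input": [build_input("user", "Hello")]}
print(validate(payload))  # [] — a well-formed payload passes
```

A payload in which some input item's type is an empty string would be flagged here, matching the server-side rejection observed above.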
Expected Result
The BYOK request payload generated for /responses should be compatible with Azure Foundry OpenAI-compatible Responses API expectations.
Notes
- Using /chat/completions is more reliable in this setup.
- This may be a payload-shape compatibility issue in the BYOK Responses path for Azure Foundry endpoints.
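As a workaround sketch based on the note above (the /chat/completions route for the same Foundry project endpoint is an assumption; only the url differs from the failing config):

```json
{
  "models": {
    "provider": "azure",
    "url": "https://<foundry-host>.services.ai.azure.com/api/projects/<project>/openai/v1/chat/completions",
    "id": "<model-id>",
    "name": "<display-name>",
    "toolCalling": true,
    "vision": false,
    "maxInputTokens": 64000,
    "maxOutputTokens": 8000
  }
}
```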