mirror of https://github.com/symfony/ai.git, synced 2026-03-23 23:42:18 +01:00
The Azure OpenAI platform previously used the Chat Completions API
(`/openai/deployments/{deployment}/chat/completions`) via `CompletionsModel`.
Azure now supports the unified Responses API (`/openai/v1/responses`), which
is consistent with how the standard OpenAI bridge already works.
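The endpoint change can be sketched as follows. This is a minimal illustration, not the bridge's code: the resource and deployment names are hypothetical, and only the URL paths and the `api-key` header are taken from the description above.

```python
AZURE_RESOURCE = "my-resource"  # hypothetical Azure OpenAI resource name
BASE = f"https://{AZURE_RESOURCE}.openai.azure.com"

def chat_completions_url(deployment: str) -> str:
    # Legacy Chat Completions API: one endpoint per deployment.
    return f"{BASE}/openai/deployments/{deployment}/chat/completions"

def responses_url() -> str:
    # Unified Responses API: a single endpoint, matching the standard OpenAI bridge.
    return f"{BASE}/openai/v1/responses"

# Azure authenticates with an `api-key` header rather than a bearer token.
headers = {"api-key": "<AZURE_API_KEY>", "Content-Type": "application/json"}
```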
- Remove `CompletionsModelClient` and replace it with a `Responses\ModelClient`
  that authenticates via the `api-key` header and targets `/openai/v1/responses`
- Update `ModelCatalog` to map `Gpt` models to `ResponsesModel` instead of
`CompletionsModel`, enabling file input support and unified streaming
- Update `PlatformFactory` to use the new `Responses\ModelClient` together
with `OpenAiContract` (which includes `DocumentNormalizer`) and the
`OpenResponses\ResultConverter`
- Fix `DocumentNormalizer::supportsModel()` to use capability-based check
(`Capability::INPUT_PDF`) instead of `$model instanceof Gpt`, so it works
for any model that declares PDF support (including Azure's `ResponsesModel`)
- Register `DocumentNormalizer` in `OpenAiContract::create()` so PDF documents
are serialized correctly when sent via agents
- Add `OpenResponsesPlatform` to `AzurePlatform` allowed deps in deptrac.yaml
- Add `.codespellrc` to suppress false positive on `Vektor` (intentional name
of the Vektor vector store bridge)
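The `DocumentNormalizer::supportsModel()` fix above swaps a type check for a capability check. A rough sketch of the idea, in Python for illustration (the class and enum names loosely mirror, but are not, the actual PHP API):

```python
from enum import Enum

class Capability(Enum):
    # Only the capability relevant here; the real catalog defines more.
    INPUT_PDF = "input-pdf"

class Model:
    def __init__(self, name: str, capabilities: set):
        self.name = name
        self.capabilities = capabilities

    def supports(self, capability: Capability) -> bool:
        return capability in self.capabilities

class Gpt(Model): ...
class ResponsesModel(Model): ...

# Before: a type check that only matches one concrete class.
def supports_model_before(model: Model) -> bool:
    return isinstance(model, Gpt)

# After: any model declaring PDF input support qualifies,
# including Azure's ResponsesModel.
def supports_model_after(model: Model) -> bool:
    return model.supports(Capability.INPUT_PDF)

azure = ResponsesModel("gpt-4o", {Capability.INPUT_PDF})
```

With the type check, `azure` is rejected despite declaring PDF support; with the capability check it passes, which is the behavior the commit describes.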
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
.codespellrc:
[codespell]
ignore-words-list = vektor