Files
archived-ai/.codespellrc
paul 1e2df898a5 [Platform][Azure] Replace Chat Completions client with Responses API client
The Azure OpenAI platform previously used the Chat Completions API
(`/openai/deployments/{deployment}/chat/completions`) via `CompletionsModel`.
Azure now supports the unified Responses API (`/openai/v1/responses`), which
is consistent with how the standard OpenAI bridge already works.
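The shape of the change can be sketched roughly as follows (Python here purely for illustration, standing in for the bridge's PHP; the resource name, deployment name, and API version are hypothetical placeholders, not values from this commit):

```python
# Sketch of the two Azure OpenAI endpoint shapes discussed above.
# Resource, deployment, and api-version values are hypothetical placeholders.
RESOURCE = "https://my-resource.openai.azure.com"

# Old: Chat Completions API, scoped per deployment.
def chat_completions_url(deployment: str, api_version: str) -> str:
    return (
        f"{RESOURCE}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={api_version}"
    )

# New: unified Responses API, a single endpoint for all models.
def responses_url() -> str:
    return f"{RESOURCE}/openai/v1/responses"

# Azure authenticates with an `api-key` header rather than OpenAI's
# `Authorization: Bearer` header.
def auth_headers(key: str) -> dict:
    return {"api-key": key, "Content-Type": "application/json"}
```

The unified endpoint is what lets the Azure bridge share the standard OpenAI bridge's request/response handling instead of carrying its own deployment-scoped client.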

- Remove `CompletionsModelClient` and replace with `Responses\ModelClient`
  that authenticates via `api-key` header and targets `/openai/v1/responses`
- Update `ModelCatalog` to map `Gpt` models to `ResponsesModel` instead of
  `CompletionsModel`, enabling file input support and unified streaming
- Update `PlatformFactory` to use the new `Responses\ModelClient` together
  with `OpenAiContract` (which includes `DocumentNormalizer`) and the
  `OpenResponses\ResultConverter`
- Fix `DocumentNormalizer::supportsModel()` to use capability-based check
  (`Capability::INPUT_PDF`) instead of `$model instanceof Gpt`, so it works
  for any model that declares PDF support (including Azure's `ResponsesModel`)
- Register `DocumentNormalizer` in `OpenAiContract::create()` so PDF documents
  are serialized correctly when sent via agents
- Add `OpenResponsesPlatform` to `AzurePlatform` allowed deps in deptrac.yaml
- Add `.codespellrc` to suppress false positive on `Vektor` (intentional name
  of the Vektor vector store bridge)
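The capability-based check in the `DocumentNormalizer` fix can be illustrated with a minimal sketch (again Python standing in for the bridge's PHP; the `Capability` enum values and `Model` class here are simplified assumptions, not the actual symfony/ai types):

```python
from enum import Enum, auto

class Capability(Enum):
    # Simplified stand-ins for the bridge's capability flags.
    INPUT_PDF = auto()
    INPUT_IMAGE = auto()

class Model:
    def __init__(self, name: str, capabilities: set[Capability]):
        self.name = name
        self.capabilities = capabilities

    def supports(self, capability: Capability) -> bool:
        return capability in self.capabilities

# Before: the normalizer was tied to one concrete class
# (roughly `isinstance(model, Gpt)` in Python terms).
# After: any model that declares PDF input qualifies,
# including Azure's ResponsesModel.
def supports_model(model: Model) -> bool:
    return model.supports(Capability.INPUT_PDF)
```

With this shape the normalizer no longer needs to know concrete model classes: Azure's `ResponsesModel` passes the check as soon as it declares PDF support.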

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-04 20:54:13 +01:00


[codespell]
ignore-words-list = vektor