
Add chat_azure_anthropic() for Azure Foundry Anthropic Models #960

Open

dareneiri wants to merge 7 commits into tidyverse:main from dareneiri:585-add-azure-chat-anthropic

Conversation

@dareneiri dareneiri commented Apr 10, 2026

Summary

A new function, chat_azure_anthropic(), allows users to access Anthropic models made available through Azure AI Foundry, which uses a different endpoint from the existing Azure OpenAI implementation in chat_azure_openai().

Azure AI Foundry exposes Anthropic models via a different endpoint format than chat_azure_openai(), which targets *.openai.azure.com and uses the OpenAI chat completions format. This new function targets the Foundry endpoint and uses the standard Anthropic Messages API format (x-api-key, anthropic-version: 2023-06-01), meaning it inherits all existing ProviderAnthropic behaviour — streaming, tool calling, structured data extraction, prompt caching, batch chat, and image/PDF support — with no additional implementation required.
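To illustrate the request shape described above, here is a minimal httr2 sketch of a Messages API call against a Foundry endpoint. The endpoint URL, deployment name, and environment variable names are placeholders for illustration, not necessarily the names used in this PR:

```r
library(httr2)

# Hypothetical endpoint; the real value is user-specific.
endpoint <- "https://my-resource.services.ai.azure.com/anthropic"

req <- request(endpoint) |>
  req_url_path_append("v1/messages") |>
  req_headers(
    # Same auth scheme and version header as api.anthropic.com
    `x-api-key` = Sys.getenv("AZURE_ANTHROPIC_API_KEY"),
    `anthropic-version` = "2023-06-01"
  ) |>
  req_body_json(list(
    model = "claude-sonnet-4-6",  # Azure deployment name, not a canonical model id
    max_tokens = 1024,
    messages = list(list(role = "user", content = "Hello"))
  ))

# resp <- req_perform(req)
```

Because the wire format is identical to api.anthropic.com, ProviderAnthropic's existing request/response handling can be reused unchanged.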

Changes

  • New function: chat_azure_anthropic(), which connects to Anthropic models hosted at *.services.ai.azure.com/anthropic
  • Adds a NEWS.md entry describing the new function

Implementation details

  • ProviderAzureAnthropic inherits from ProviderAnthropic with no additional properties — the Foundry Anthropic endpoint uses the same API format as api.anthropic.com
  • model is required with no default, since the value is a user-specific Azure deployment name rather than a canonical model identifier, matching chat_azure_openai()
  • Trailing slashes on endpoint are normalised to avoid double-slash URLs
  • default_azure_anthropic_credentials() mirrors default_azure_credentials() with the same priority order
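The trailing-slash normalisation mentioned above is a small pure transformation; a sketch in R (the function name here is illustrative, not necessarily the one used in the PR):

```r
# Strip any trailing slashes so paths appended later don't produce
# double-slash URLs like ".../anthropic//v1/messages".
normalise_endpoint <- function(endpoint) {
  sub("/+$", "", endpoint)
}

normalise_endpoint("https://my-resource.services.ai.azure.com/anthropic/")
#> [1] "https://my-resource.services.ai.azure.com/anthropic"
```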

Testing

  • ✅ New test added: tests/testthat/test-provider-azure-anthropic.R

```r
devtools::test(filter = "provider-azure-anthropic")
# [ FAIL 0 | WARN 0 | SKIP 1 | PASS 15 ]
# SKIP: connectcreds not installed (expected)
```

  • ✅ Manual check against a live Foundry deployment:

```r
> chat <- chat_azure_anthropic(model = "claude-sonnet-4-6")
> chat$chat("What is the capital of France?")
The capital of France is **Paris**.
```

@dareneiri (Author)

I previously wrote that this addresses #585, but that issue specifically mentions the ability to call Mistral models through Azure Foundry. This new function only concerns Anthropic models; Mistral models are served from *.services.ai.azure.com/openai/v1/.

