feat(provider): add LiteLLM provider support #22179
tobias-weiss-ai-xr wants to merge 1 commit into anomalyco:dev from
Conversation
Add a `litellm` entry to the `custom()` provider map. Uses the bundled `@ai-sdk/openai-compatible` SDK with a user-configured `baseURL`. Supports the `LITELLM_API_KEY` env var and `apiKey` in provider options. Autoloads when `baseURL` is configured in `opencode.json` under `provider.litellm`.
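For reference, a minimal `opencode.json` fragment that would trigger the autoload might look like the following. This is a sketch based on the description above: the exact nesting and the URL are illustrative, and the API key could instead come from the `LITELLM_API_KEY` env var.

```json
{
  "provider": {
    "litellm": {
      "options": {
        "baseURL": "http://localhost:4000/v1",
        "apiKey": "sk-litellm-example"
      }
    }
  }
}
```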
The following comment was made by an LLM; it may be inaccurate:

Based on my search, I found several related PRs that are potentially duplicates or closely related.

Potential duplicates:
Related PRs (not direct duplicates but same area):
Most likely duplicate: PR #14468 appears to be doing the exact same thing, adding LiteLLM provider support. You should check whether it was previously closed or merged before proceeding.
This PR doesn't fully meet our contributing guidelines and PR template. What needs to be fixed:
Please edit this PR description to address the above within 2 hours, or it will be automatically closed. If you believe this was flagged incorrectly, please let a maintainer know.
This pull request has been automatically closed because it was not updated to meet our contributing guidelines within the 2-hour window. Feel free to open a new pull request that follows our guidelines.
Closes #22212
Adds LiteLLM as a provider using the existing `@ai-sdk/openai-compatible` adapter. Users running LiteLLM can point opencode at it instead of configuring each provider individually.

Verified: 19 packages typecheck clean; tested against a local LiteLLM proxy.
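The resolution behavior described above (options `apiKey` with an env-var fallback, autoload only when `baseURL` is set) can be sketched roughly as follows. All names here are hypothetical; the actual `custom()` provider-map code may differ, and real code would read from `process.env` rather than a passed-in map.

```typescript
// Hypothetical sketch of how a litellm provider entry could resolve its
// settings. Field names are illustrative, not opencode's actual schema.
interface LitellmOptions {
  baseURL?: string
  apiKey?: string
}

function resolveLitellm(
  opts: LitellmOptions,
  // In practice this would be process.env; passed in here to stay self-contained.
  env: Record<string, string | undefined> = {},
) {
  // apiKey in provider options takes precedence; fall back to the env var.
  const apiKey = opts.apiKey ?? env["LITELLM_API_KEY"]
  // The provider autoloads only when a baseURL is configured.
  if (!opts.baseURL) return undefined
  return { name: "litellm", baseURL: opts.baseURL, apiKey }
}
```

With this shape, an unconfigured provider (no `baseURL`) simply resolves to `undefined` and stays disabled, matching the autoload condition in the PR description.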