fix(provider): improve local provider connection error messages #22178
Closed
tobias-weiss-ai-xr wants to merge 1 commit into anomalyco:dev from
Conversation
Detect ECONNREFUSED/ETIMEDOUT/ENOTFOUND errors to localhost URLs and append provider-specific hints. Ollama, LiteLLM, LM Studio, and LocalAI all get actionable messages like 'Is Ollama running? Start it with: ollama serve'.
Contributor
Thanks for your contribution! This PR doesn't have a linked issue. All PRs must reference an existing issue. Please:
See CONTRIBUTING.md for details.
Contributor
This PR doesn't fully meet our contributing guidelines and PR template. What needs to be fixed:
Please edit this PR description to address the above within 2 hours, or it will be automatically closed. If you believe this was flagged incorrectly, please let a maintainer know.
Contributor
This pull request has been automatically closed because it was not updated to meet our contributing guidelines within the 2-hour window. Feel free to open a new pull request that follows our guidelines.
Closes #22190
Issue for this PR

Closes #

### Type of change

- [x] Bug fix

### What does this PR do?

When a local provider (like Ollama or LM Studio) is unreachable, the error is a raw ECONNREFUSED that gives no hint about what is wrong. Users do not know if it is their server, their config, or a network issue.

This adds provider-specific error context for connection failures to localhost/private IPs. It detects local provider URLs and appends actionable messages like suggesting to check if the server is running or if the port is correct.

### How did you verify your code works?

- All 19 packages typecheck clean
- Tested with local providers going offline

### Checklist

- [x] I have tested my changes locally
- [x] I have not included unrelated changes in this PR
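The detection described above can be sketched as follows. This is a minimal illustration, not the PR's actual code: the helper name `enrichConnectionError`, the local-host pattern, and the port-to-provider hint table are assumptions (the default ports for Ollama, LiteLLM, LM Studio, and LocalAI are commonly 11434, 4000, 1234, and 8080, but a real implementation would read them from the provider config).

```typescript
// Error codes that indicate the local server is unreachable.
const CONNECTION_ERROR_CODES = new Set(["ECONNREFUSED", "ETIMEDOUT", "ENOTFOUND"]);

// Hosts treated as "local": loopback plus common private ranges.
const LOCAL_HOST_PATTERN =
  /^(localhost|127\.0\.0\.1|0\.0\.0\.0|\[::1\]|192\.168\.\d{1,3}\.\d{1,3}|10\.\d{1,3}\.\d{1,3}\.\d{1,3})$/;

// Hypothetical port-to-provider hint table (default ports assumed).
const PROVIDER_HINTS: Record<string, string> = {
  "11434": "Is Ollama running? Start it with: ollama serve",
  "4000": "Is LiteLLM running? Start it with: litellm --config config.yaml",
  "1234": "Is LM Studio running? Enable its local server in the Developer tab.",
  "8080": "Is LocalAI running? Start it with: local-ai run",
};

// Append a provider-specific hint to a raw connection error message
// when the failing URL points at a local host; otherwise return it as-is.
function enrichConnectionError(code: string, baseUrl: string, message: string): string {
  if (!CONNECTION_ERROR_CODES.has(code)) return message;

  let url: URL;
  try {
    url = new URL(baseUrl);
  } catch {
    return message; // unparseable URL: leave the original error alone
  }

  if (!LOCAL_HOST_PATTERN.test(url.hostname)) return message;

  const hint =
    PROVIDER_HINTS[url.port] ??
    "Is your local server running and listening on this port?";
  return `${message}\n${hint}`;
}
```

With this shape, a raw `ECONNREFUSED` against `http://localhost:11434` gains the "Is Ollama running?" hint, while errors against remote hosts pass through unchanged.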