Install: Add Helm Variables for LLM API Proxy Settings #736
Comments
Hi, I am interested in contributing to this.
@jinjiaKarl Of course, you're very welcome! The helm repo is here. If you need anything, ping me at any time.
@elliotxx Hi, thanks for the quick response. Is it OK to proceed, or do you have a better idea?
@jinjiaKarl Hi, I think you're on the right track with the proxy settings. Just a quick note on the structure:

```yaml
server:
  ai:
    proxy:
      enabled: true
      httpProxy: "http://proxy.example.com:8080"
      httpsProxy: "https://proxy.example.com:8080"
      noProxy: "localhost,127.0.0.1,example.com"
```

Sorry for the late reply, it was the Spring Festival holiday!
When I tried to build the image, deploy it locally, and test the functionality, the UI component failed to build. I suspect it's a dependency issue?
Also, it seems isRegExp is deprecated in Node 23, see https://nodejs.org/api/deprecations.html#DEP0055. Downgrading to Node 20 fixed this issue.
@jinjiaKarl Thanks for the feedback! On my machine isRegExp only produces a warning and doesn't block compilation; my Node version is v22, so this looks like a compatibility issue across Node versions. We'll fix it. cc @hai-tian
What would you like to be added?
Add proxy configuration parameters to the Helm chart to support OpenAI and other LLM API calls in internal network environments.
Why is this needed?
Many enterprise users deploy Karpor in internal networks where direct internet access is restricted. This feature will make it easier to deploy Karpor in enterprise environments where a proxy is required for external API calls.
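For illustration, a user deploying behind a corporate proxy might supply an override file like the one below, assuming the `server.ai.proxy` layout proposed in the discussion above (the keys are not yet part of the released chart, and the proxy hostname is made up):

```yaml
# values-proxy.yaml - hypothetical override for a proxied environment,
# using the proposed server.ai.proxy keys.
server:
  ai:
    proxy:
      enabled: true
      httpProxy: "http://proxy.corp.internal:3128"
      httpsProxy: "http://proxy.corp.internal:3128"
      noProxy: "localhost,127.0.0.1,.svc,.cluster.local"
```

This could then be passed at install time with `-f values-proxy.yaml` or the equivalent `--set` flags.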