
I have added offline support for LMStudio in the code! Can we update the repository directly? #24


Open
bbcoll5219 opened this issue Mar 29, 2025 · 4 comments

bbcoll5219 commented Mar 29, 2025

I have added offline support for LMStudio in the code. LMStudio has a visual interface that is more convenient than Ollama's. Can I update the code directly, or do we still need permission from the original author?

(screenshots attached)

2. Added a Chinese language option.

(screenshots attached)

3. A request timeout needs to be added; otherwise long requests eventually time out. Does anyone know where to add it? (A rough sketch of what I mean follows below.)
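For reference, LM Studio exposes an OpenAI-compatible HTTP server (by default at http://localhost:1234/v1), so a client only needs to point at that base URL and pass an explicit request timeout. The snippet below is a minimal sketch in Python, assuming LM Studio's default port and a placeholder model name of `qwen2.5-coder-7b-instruct` (both assumptions, not taken from this project); the plugin itself would do the equivalent in its own language.

```python
import requests

# Assumptions: LM Studio's local server is running on its default port (1234)
# and a model is already loaded. The model name below is only a placeholder.
LMSTUDIO_BASE_URL = "http://localhost:1234/v1"
MODEL_NAME = "qwen2.5-coder-7b-instruct"  # hypothetical; use whatever model is loaded

def chat(prompt: str, timeout_seconds: float = 120.0) -> str:
    """Send one chat request to LM Studio's OpenAI-compatible endpoint.

    The explicit timeout is the point raised above: without it, a slow
    local generation can leave the caller hanging indefinitely.
    """
    response = requests.post(
        f"{LMSTUDIO_BASE_URL}/chat/completions",
        json={
            "model": MODEL_NAME,
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=timeout_seconds,  # raise if connect or read exceeds this many seconds
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("Write a short hello-world in Pascal."))
```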

@bbcoll5219 (Author) commented

However, I still haven't figured out how to use the code completion feature. Could anyone explain it with a screenshot?

(screenshot attached)

@bbcoll5219 (Author) commented

Smart CodeInsight only allows offline Ollama to be used. I wish LMStudio could be added as an option as well (see the endpoint sketch below).

(screenshot attached)
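A likely reason an Ollama-only option cannot simply be pointed at LM Studio is that the two local servers speak different APIs: Ollama's native chat endpoint lives at /api/chat on port 11434, while LM Studio serves the OpenAI-style /v1/chat/completions on port 1234. The sketch below contrasts the two request shapes; the host/port values are the tools' defaults, the model names are placeholders, and whether Smart CodeInsight could be given an alternative base URL is only my assumption.

```python
import requests

# Default local endpoints for the two servers (typical setup, not project config):
OLLAMA_URL = "http://localhost:11434/api/chat"               # Ollama's native chat API
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"   # LM Studio, OpenAI-style

messages = [{"role": "user", "content": "Explain generics in Delphi briefly."}]

# Ollama's native endpoint: non-OpenAI request/response shape.
ollama_reply = requests.post(
    OLLAMA_URL,
    json={"model": "qwen2.5-coder", "messages": messages, "stream": False},
    timeout=120,
).json()["message"]["content"]

# LM Studio's endpoint follows the OpenAI chat-completions schema instead.
lmstudio_reply = requests.post(
    LMSTUDIO_URL,
    json={"model": "qwen2.5-coder-7b-instruct", "messages": messages},
    timeout=120,
).json()["choices"][0]["message"]["content"]

print(ollama_reply)
print(lmstudio_reply)
```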

@bbcoll5219 (Author) commented

A timeout still needs to be added; otherwise long requests eventually time out. Does anyone know where to add it?

@Cesar4D (Contributor) commented Apr 7, 2025

Hello @bbcoll5219, your contributions are really cool. Please open a pull request with them so we can add them to the official project. Thank you.
