Replies: 1 comment
-
Hey @Hdz, thanks for the message. To help narrow it down, could you share a bit more detail? I'm not sure if you've seen our guide to using Ollama with Continue, but this may be helpful as well.
-
Hey, sorry if this question has already been answered.
I've installed qwen3-coder:30b with Ollama, and your extension auto-detected it. https://docs.continue.dev/customization/models says it's one of the most recommended open models, yet instead of the command actually being run, I just get output like this in the chat:
"<function=run_terminal_command> <parameter=command> docker ps -a </tool_call>"
I've read the docs but couldn't find an answer. Is there any extra config I should add somewhere?
I tried setting tools to both automatic and ask first, but both give the same behavior.
My config is plainly just that, pointed at my qwen3-coder:30b.
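To be concrete, it's roughly the following (a sketch rather than my exact file; field names follow Continue's config.yaml docs and may differ by version):

```yaml
# ~/.continue/config.yaml -- minimal setup pointing Continue at the local Ollama model
name: Local Assistant
version: 1.0.0
schema: v1
models:
  - name: qwen3-coder
    provider: ollama
    model: qwen3-coder:30b
    roles:
      - chat
      - edit
      - apply
    # The docs also mention a capabilities field; I'm not sure whether it's needed here:
    # capabilities:
    #   - tool_use
```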
Would you mind telling me what's wrong?