It would be highly beneficial to support Groq's API as a provider. I attempted to use https://api.groq.com/openai/v1 as an OpenAI-compatible endpoint, but function_tools are not working properly and are not fully supported: at times I got timeouts, loops, and/or plain JSON text instead of structured tool calls.
GroqCloud offers impressive speed and supports multimodal models, making it a compelling option to integrate. Adding support for Groq's API would allow users to take advantage of its performance benefits.
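For reference, the kind of call involved looks roughly like the sketch below, assuming the official `openai` Node client pointed at Groq's OpenAI-compatible base URL; the model name and the `get_weather` tool are illustrative only, not part of the original report.

```ts
import OpenAI from 'openai';

// Groq exposes an OpenAI-compatible surface at this base URL.
const client = new OpenAI({
  baseURL: 'https://api.groq.com/openai/v1',
  apiKey: process.env.GROQ_API_KEY,
});

async function main() {
  const completion = await client.chat.completions.create({
    model: 'llama-3.3-70b-versatile', // illustrative model name
    messages: [{ role: 'user', content: 'What is the weather in Paris?' }],
    tools: [
      {
        type: 'function',
        function: {
          name: 'get_weather', // hypothetical tool
          description: 'Get the current weather for a city',
          parameters: {
            type: 'object',
            properties: { city: { type: 'string' } },
            required: ['city'],
          },
        },
      },
    ],
  });

  // The reported failure mode: tool_calls is sometimes missing and the
  // arguments arrive as plain JSON text in message.content instead.
  console.log(completion.choices[0].message.tool_calls);
}

main();
```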
You experienced issues with function_tools (timeouts, loops, or JSON formatting problems)
You note Groq's impressive speed and multimodal model support as compelling reasons for integration
Technical Assessment
Based on my analysis of the MyCoder codebase:
Feasibility: This integration is technically feasible
MyCoder already supports multiple providers including OpenAI, Anthropic, Ollama, and XAI
Groq provides an OpenAI-compatible API that could be leveraged
Implementation Approach:
Add a new provider entry in packages/agent/src/core/llm/provider.ts
Likely extend or reuse the OpenAI provider implementation, similar to how XAI is implemented (a registration sketch follows this list)
Add specific handling for Groq's function calling implementation differences
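A minimal sketch of what that reuse could look like, assuming the provider wraps the `openai` client the way the existing OpenAI-compatible providers presumably do; the `GroqProvider` class, option names, and registry shape here are hypothetical and would need to be adapted to the actual interface in provider.ts.

```ts
import OpenAI from 'openai';

// Hypothetical option and provider shapes -- not MyCoder's actual interfaces.
interface GroqProviderOptions {
  apiKey?: string;
  baseUrl?: string;
  model: string;
}

class GroqProvider {
  private client: OpenAI;

  constructor(private options: GroqProviderOptions) {
    this.client = new OpenAI({
      // Default to Groq's OpenAI-compatible endpoint, following the XAI pattern.
      baseURL: options.baseUrl ?? 'https://api.groq.com/openai/v1',
      apiKey: options.apiKey ?? process.env.GROQ_API_KEY,
    });
  }

  async chat(messages: OpenAI.Chat.ChatCompletionMessageParam[]) {
    return this.client.chat.completions.create({
      model: this.options.model,
      messages,
    });
  }
}

// Hypothetical registry entry mirroring how other providers could be keyed.
const providerFactories = {
  groq: (options: GroqProviderOptions) => new GroqProvider(options),
};

export { GroqProvider, providerFactories };
```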
Potential Challenges:
Function tools support appears to be problematic with Groq's API
Special handling may be required to ensure proper function calling behavior (a defensive parsing sketch follows)
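One possible defensive step, sketched below under the assumption that the reported "json text" failures mean the tool call occasionally arrives as raw JSON in the assistant message content rather than as a structured tool_calls entry; the shape checks are illustrative, not a confirmed Groq behavior contract.

```ts
// Hypothetical recovery helper for the "json text" failure mode: if a reply
// that should have been a structured tool call arrives as raw JSON in the
// assistant message content, try to parse it before treating it as prose.
type RecoveredToolCall = { name: string; arguments: Record<string, unknown> };

export function recoverToolCall(content: string | null): RecoveredToolCall | null {
  if (!content) return null;
  try {
    const parsed = JSON.parse(content);
    if (
      parsed &&
      typeof parsed.name === 'string' &&
      parsed.arguments &&
      typeof parsed.arguments === 'object'
    ) {
      return { name: parsed.name, arguments: parsed.arguments };
    }
  } catch {
    // Not JSON at all -- fall through and treat it as a normal text reply.
  }
  return null;
}
```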
Recommendation
This feature request is approved for implementation. Adding Groq support would enhance MyCoder's capabilities by providing:
Faster response times for users
Additional multimodal model options
More provider choices for users
Next Steps
Add Groq provider to the provider registry
Implement proper function calling support for Groq
Add documentation for Groq provider usage
Add tests for the Groq provider implementation (a test sketch follows)
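A test sketch for the recovery helper above, assuming a vitest-style runner; the module path is a placeholder.

```ts
import { describe, expect, it } from 'vitest';
import { recoverToolCall } from './groqToolCalls'; // placeholder module path

describe('recoverToolCall', () => {
  it('recovers a tool call delivered as plain JSON text', () => {
    const content = '{"name":"get_weather","arguments":{"city":"Paris"}}';
    expect(recoverToolCall(content)).toEqual({
      name: 'get_weather',
      arguments: { city: 'Paris' },
    });
  });

  it('returns null for an ordinary assistant reply', () => {
    expect(recoverToolCall('The weather in Paris is sunny.')).toBeNull();
  });
});
```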
I'll apply the appropriate labels to this issue. If you have any additional information about the specific function_tools issues you encountered, please share them to help with the implementation.