
Using paper-qa with GROQ LLMs? #862

Open
jbdatascience opened this issue Feb 12, 2025 · 2 comments
Labels
question Further information is requested

Comments


jbdatascience commented Feb 12, 2025

I would like to use paper-qa with GROQ LLMs. Is this possible, and if so, how exactly do I do that?

For reference, see https://console.groq.com/docs/models.

Python code examples using GROQ LLMs would also be welcome.


dosubot bot commented Feb 12, 2025

Hey @jbdatascience! I'm here to help you with any bugs, questions, or contributions you have. Let's tackle this together!

I couldn't find specific information on integrating paper-qa with GROQ LLMs. However, paper-qa is designed to be configurable with different language models through its `Settings` class. You can customize the LLM configuration by setting parameters like `llm`, `llm_config`, `summary_llm`, and `summary_llm_config` to specify the models and configurations you want to use [1].

To integrate a non-default LLM like GROQ, you would typically need to ensure that the LLM is compatible with the interfaces expected by paper-qa. This might involve implementing or adapting an interface that allows paper-qa to communicate with the GROQ LLM, similar to how it interacts with other LLMs via LiteLLM.

If GROQ LLMs have a specific API or SDK, you would need to configure paper-qa to use that API for embedding, summarization, and answer generation tasks. This might require custom code to handle the specifics of the GROQ LLM API.
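
For illustration, here is a minimal sketch of such a configuration, assuming paper-qa passes LiteLLM-style model strings through to LiteLLM (the `groq/` prefix follows LiteLLM's provider naming; the specific model names, the placeholder key, and the `my_papers` directory are assumptions, not tested against Groq):

```python
import os

from paperqa import Settings, ask

# LiteLLM reads GROQ_API_KEY to authenticate "groq/..." model strings.
os.environ["GROQ_API_KEY"] = "gsk-..."  # your Groq API key (placeholder)

settings = Settings(
    # Answer generation and evidence summarization via Groq (assumed model name).
    llm="groq/llama-3.1-70b-versatile",
    summary_llm="groq/llama-3.1-70b-versatile",
    # Groq does not serve embedding models, so embeddings still need another
    # provider (here OpenAI's, which requires OPENAI_API_KEY to be set).
    embedding="text-embedding-3-small",
    # Directory of PDFs to index (hypothetical path).
    paper_directory="my_papers",
)

answer = ask("How does retrieval-augmented generation work?", settings=settings)
```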


dosubot added the question label on Feb 12, 2025
jamesbraza (Collaborator) commented

Hi @jbdatascience, yeah, we use LiteLLM as our LLM layer. Check this: https://docs.litellm.ai/docs/providers/groq

Our docs here are pretty suboptimal, but hopefully you can get 90% of the way there by reading our issues and docs. Basically, you'll want to configure the LLM settings here to align with Groq.
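
As a quick sanity check that your key and model name work before wiring them into paper-qa, you can call Groq directly through LiteLLM (the model name below is one example from the LiteLLM Groq docs; swap in whichever Groq model you want):

```python
import litellm

# Requires GROQ_API_KEY in the environment; the "groq/" prefix selects
# LiteLLM's Groq provider.
response = litellm.completion(
    model="groq/llama3-8b-8192",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```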

Please feel free to let us know where you get stuck, and we can try to help.
