
Local LLM using Ollama without OpenAI #731

Answered by jamesbraza
lindamathez asked this question in Q&A

Hi @lindamathez, yes, this is fixed by #728 on our current main branch. It will be released soon in v5.5.1.

In the meantime, you can either:

pip install git+https://github.com/Future-House/paper-qa.git
# or
pip install "fhaviary<0.10.2" "paper-qa>=5.5"
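(Note the quotes around the version specifiers; without them, most shells treat < and > as redirections.)

For context, once you have a working install, pointing paper-qa at a local Ollama model with no OpenAI key goes through a LiteLLM router config passed to Settings. Here is a minimal sketch assuming an Ollama server on the default http://localhost:11434 with llama3.2 and mxbai-embed-large already pulled; the question string and model names are placeholders, so swap in whatever you run locally:

from paperqa import Settings, ask

# Route all LLM calls through a locally hosted Ollama server
# (assumed to be on the default port; no OpenAI credentials required).
local_llm_config = {
    "model_list": [
        {
            "model_name": "ollama/llama3.2",
            "litellm_params": {
                "model": "ollama/llama3.2",
                "api_base": "http://localhost:11434",
            },
        }
    ]
}

answer = ask(
    "What manufacturing challenges are unique to bispecific antibodies?",
    settings=Settings(
        llm="ollama/llama3.2",  # main answer-generation model
        llm_config=local_llm_config,
        summary_llm="ollama/llama3.2",  # model used to summarize evidence
        summary_llm_config=local_llm_config,
        embedding="ollama/mxbai-embed-large",  # local embedding model
    ),
)

If your Ollama server runs on a different host or port, change api_base accordingly.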
