This project contains a Flask server that connects to Ollama running the Llama 3 model (other models can also be used).
Run the following commands to install Ollama, pull the model, and start the Ollama service:

curl -fsSL https://ollama.com/install.sh | sh
ollama run llama3
ollama serve
Then install the Python dependencies and start the Flask server:

pip install -r requirements.txt
python ollama.py
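The server code itself (ollama.py) is not shown here, but a minimal sketch of such a Flask-to-Ollama bridge might look like the following. The `/chat` route name and port are illustrative assumptions; the request payload follows Ollama's documented `/api/generate` REST endpoint.

```python
from flask import Flask, jsonify, request
import requests

app = Flask(__name__)

# Ollama's default local REST endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

@app.route("/chat", methods=["POST"])  # route name is an illustrative choice
def chat():
    prompt = request.get_json(force=True).get("prompt", "")
    # Forward the prompt to the local Ollama instance (non-streaming mode).
    resp = requests.post(
        OLLAMA_URL,
        json={"model": "llama3", "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    # Ollama returns the generated text in the "response" field.
    return jsonify({"reply": resp.json().get("response", "")})

if __name__ == "__main__":
    app.run(port=5000)
```

Swapping the `"model"` value (e.g. to another model you have pulled with `ollama run`) is all that is needed to use a different model.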