The Ollama Zig library provides the easiest way to integrate Zig 0.13+ projects with Ollama.
- Ollama should be installed and running.
- Pull a model to use with the library: `ollama pull <model>`, e.g. `ollama pull llama3.2`
- See [Ollama.com](https://ollama.com) for more information on the available models.
```sh
zig fetch --save git+https://github.com/dravenk/ollama-zig.git
```
Add the dependency in your `build.zig`:

```zig
// Resolve the ollama-zig package fetched into build.zig.zon.
const ollama = b.dependency("ollama-zig", .{
    .target = target,
    .optimize = optimize,
});
// Make it available to your code as @import("ollama").
exe.root_module.addImport("ollama", ollama.module("ollama"));
```
Import it in your code:

```zig
const ollama = @import("ollama");
```
See types.zig for more information on the response types.
Response streaming can be enabled by setting `.stream = true` in the request options.
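A minimal sketch of a streamed request, assuming `.stream` is a boolean option on the chat request that mirrors the REST API's `stream` parameter (how the streamed chunks are then consumed depends on the library):

```zig
// Hypothetical: request a streamed chat completion.
const message = &[_]Ollama.chatOptions.message{
    .{ .role = "user", .content = "Why is the sky blue?" },
};
const response = try ollama.chat(.{ .model = "llama3.2", .messages = message, .stream = true });
```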
The Ollama Zig library's API is designed around the [Ollama REST API](https://github.com/ollama/ollama/blob/main/docs/api.md).
```zig
const message = &[_]Ollama.chatOptions.message{
    .{ .role = "user", .content = "Why is the sky blue?" },
};
const response = try ollama.chat(.{ .model = "llama3.2", .messages = message });
```
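Assuming the chat response mirrors the REST API payload (the `message.content` field name below comes from the REST API, not from types.zig), the reply text can be printed like this:

```zig
// Hypothetical: print the assistant's reply.
// Assumes: const std = @import("std");
std.debug.print("{s}\n", .{response.message.content});
```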
The remaining endpoints are shown below in the same style as `chat`. The exact Zig signatures may differ from these sketches; consult types.zig for the real option and response types.

Generate:

```zig
const response = try ollama.generate(.{ .model = "llama3.2", .prompt = "Why is the sky blue?" });
```

List:

```zig
const models = try ollama.list();
```

Show:

```zig
const info = try ollama.show("llama3.2");
```

Create:

```zig
const modelfile =
    \\FROM llama3.2
    \\SYSTEM You are mario from super mario bros.
;
const created = try ollama.create(.{ .model = "example", .modelfile = modelfile });
```

Copy:

```zig
try ollama.copy("llama3.2", "user/llama3.2");
```

Delete:

```zig
try ollama.delete("llama3.2");
```

Pull:

```zig
try ollama.pull("llama3.2");
```

Push:

```zig
try ollama.push("user/llama3.2");
```

Embed:

```zig
const embedding = try ollama.embed(.{ .model = "llama3.2", .input = "The sky is blue because of rayleigh scattering" });
```

Embed (batch):

```zig
// Assumes `input` also accepts a slice of strings for batch requests.
const inputs = &[_][]const u8{
    "The sky is blue because of rayleigh scattering",
    "Grass is green because of chlorophyll",
};
const embeddings = try ollama.embed(.{ .model = "llama3.2", .input = inputs });
```

Ps (list running models):

```zig
const running = try ollama.ps();
```
Errors are returned as Zig error values if a request returns an error status or if an error is detected while streaming.
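For example, a failed request can be handled with `catch` (a sketch; the concrete error set is defined by the library):

```zig
// Hypothetical: handle a request against a model that is not installed.
// Assumes: const std = @import("std");
const response = ollama.chat(.{ .model = "does-not-exist", .messages = message }) catch |err| {
    // `err` is a Zig error value, e.g. produced by a non-2xx HTTP status.
    std.debug.print("request failed: {}\n", .{err});
    return err;
};
_ = response;
```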