---
comments: true
layout: post
tags: [AI, Coding, Emacs, LLM]
date: 2024/07/16
title: "Exploring self-hosted AI-augmented coding"
permalink: /blog/2024/07/16/exploring-selfhosted-ai-augmented-coding
---
In my ambitious quest to leverage AI for 80% of my work by year's end, I've been exploring alternatives to GitHub Copilot. This post summarizes my recent discoveries and experiments in the realm of AI-augmented coding.
## Local LLM Setup
At the core of my setup is [Ollama](https://ollama.com), serving as my local Large Language Model (LLM) provider. This approach offers greater control and privacy compared to cloud-based solutions.
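As a quick sanity check from inside Emacs, I can ask the local server which models it exposes. The snippet below is a minimal sketch that assumes Ollama is listening on its default port (11434); the helper name is my own.

{% highlight emacs-lisp %}
;; List the models the local Ollama server exposes.
;; Assumes Ollama is running on its default port (11434).
(require 'url)
(require 'json)

(defun my/ollama-list-models ()
  "Show the model names available on the local Ollama instance."
  (interactive)
  (with-current-buffer
      (url-retrieve-synchronously "http://localhost:11434/api/tags")
    ;; Skip past the HTTP headers, then parse the JSON body.
    (goto-char url-http-end-of-headers)
    (let ((data (json-read)))
      (message "Local models: %s"
               (mapcar (lambda (m) (alist-get 'name m))
                       (alist-get 'models data))))))
{% endhighlight %}

Calling `M-x my/ollama-list-models` echoes the models that the integrations below can use.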
## User Interfaces and Integrations
1. **Open WebUI**: Initially, I experimented with [Open WebUI](https://github.com/Open-WebUI) as a frontend for Ollama, providing a user-friendly interface for interactions.
2. **Ellama for Emacs**: The game-changer in my workflow has been integrating [Ellama](https://github.com/s-kostyaev/ellama) into my Emacs configuration. This allows me to access a custom AI assistant directly within Emacs, keeping my hands firmly on the keyboard.
Here's a snippet of my Ellama configuration:
{% highlight emacs-lisp %}
(setopt ellama-providers
        '(("standup" . (make-llm-ollama
                        :chat-model "standup:latest"
                        :embedding-model "nomic-embed-text"
                        :default-chat-non-standard-params '(("num_ctx" . 8192))))))
{% endhighlight %}
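With that provider registered, `M-x ellama-provider-select` lets me switch to it on the fly, and the usual commands such as `ellama-chat` or `ellama-code-review` then run against the local model.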
## Additional Tools and Experiments
- **AI Renamer**: I tested [AI Renamer](https://github.com/ozgrozer/ai-renamer), a Node.js CLI tool for AI-assisted file renaming. Integrating it with Emacs' dired mode could be an interesting project; a rough sketch follows this list. This experiment reminded me of my earlier tests with [LlamaFS](https://github.com/iyaja/llama-fs), an intriguing file system interface for LLMs.
- **Tabby**: My latest addition is [Tabby](https://tabby.tabbyml.com/), a self-hosted alternative to GitHub Copilot. It can leverage Ollama or its built-in llama.cpp backend. I plan to test it today using [tabby-mode.el](https://github.com/ragnard/tabby-mode) in Emacs.
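Here is a rough sketch of that dired idea. It simply shells out to the CLI for the directory shown in the current dired buffer; the function name is my own, and I'm assuming `npx ai-renamer <directory>` is a valid invocation, as the tool's README suggests.

{% highlight emacs-lisp %}
;; Rough sketch: run the ai-renamer CLI on the directory of the current
;; dired buffer. Assumes `npx ai-renamer <directory>` works as invoked
;; here; check the tool's README before relying on it.
(require 'dired)

(defun my/dired-ai-rename-directory ()
  "Run ai-renamer asynchronously on the directory shown in dired."
  (interactive)
  (let ((dir (expand-file-name (dired-current-directory))))
    (async-shell-command
     (concat "npx ai-renamer " (shell-quote-argument dir))
     "*ai-renamer*")))
{% endhighlight %}

Once the process finishes, pressing `g` in the dired buffer refreshes the listing to show the new names.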
## Reflections and Future Directions
The sheer number of tools developed by the community is astounding. My current focus is on fine-tuning these tools so they respond intuitively, ideally anticipating my needs before I even articulate them, or on building my own.
This journey into AI-augmented coding is not just about replacing manual work; it's about enhancing creativity, efficiency, and the overall coding experience. As I continue to explore and integrate these tools, I'm excited about the potential for AI to transform not just how we write code, but how we think about programming itself.
Stay tuned for more updates on my AI integration journey, and feel free to share your own experiences or suggestions in the comments below.
