
feat: AI code review using ollama and qwen3:14b #2


Open
wants to merge 3 commits into base: init-proj

Conversation

EuclidStellar
Collaborator

PR: Ollama Model Setup and AI Code Review with qwen3:14b using GitHub Actions

This workflow automates code review for pull requests using the Ollama runtime and the qwen3:14b model. It fetches the PR diff, sends it to the LLM for review, and posts the review as a comment on the PR.


📋 Workflow Overview

| Step | Description |
|------|-------------|
| 1 | Checkout Repository: Fetches the full repo history for accurate diffs. |
| 2 | Install Ollama: Installs the Ollama LLM runtime. |
| 3 | Start Ollama & Pull Model: Starts Ollama and downloads the qwen3:14b model. |
| 4 | Get PR Diff: Computes the code changes between the PR branch and its base. |
| 5 | Review with Ollama: Sends the diff and a review prompt to the LLM. |
| 6 | Remove LLM Thinking Process: Filters out non-actionable LLM output. |
| 7 | Set Review Output: Prepares the review for posting. |
| 8 | Create PR Comment: Posts the review as a comment on the PR. |
| 9 | Cleanup: Stops the Ollama service. |
| 10 | Show Diff File: Prints the diff for debugging. |

🔍 Example PR Review

| PR Title | Model Used | Link |
|----------|------------|------|
| Example PR reviewed | qwen3:14b | View PR |

🛠️ Key Features

  • Automated LLM code review for every PR.
  • Removes LLM "thinking process" from comments for clean, actionable feedback.
  • Easy to extend for other models or prompts.

🧩 Workflow Table

| Name | Purpose | Key Command/Action |
|------|---------|--------------------|
| Checkout Repository | Fetch full git history for accurate diffs | `actions/checkout@v4` |
| Install Ollama | Install Ollama LLM runtime | `curl -fsSL https://ollama.com/install.sh \| sh` |
| Start Ollama & Pull Model | Start Ollama server and pull the model | `ollama serve`, `ollama pull qwen3:14b` |
| Get PR Diff | Generate diff between PR and base branch | `git diff origin/${{ github.base_ref }}...HEAD` |
| Review PR with Ollama | Send diff and prompt to LLM, save output | `ollama run qwen3:14b` |
| Remove LLM thinking process from review | Filter out "Thinking..." sections from LLM output | `awk` script |
| Set review output as environment var | Prepare review for posting | `cat ollama_clean_review.txt` |
| Create PR Comment | Post review as PR comment | `actions/github-script@v6` |
| Stop Ollama Service (Cleanup) | Kill Ollama server process | `kill $OLLAMA_PID` |
| Show diff file | Print diff for debugging | `cat pr_diff.txt` |
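
For reference, the first few rows of the table above would correspond to job steps roughly like the sketch below. Step names and file names (e.g. `pr_diff.txt`) follow the table; details such as backgrounding `ollama serve`, the startup sleep, and storing the server PID in `OLLAMA_PID` via `$GITHUB_ENV` are assumptions, not the exact workflow contents.

```yaml
# Sketch only: illustrative of the setup and diff steps described above.
- name: Checkout Repository
  uses: actions/checkout@v4
  with:
    fetch-depth: 0                      # full history so the three-dot diff is accurate

- name: Install Ollama
  run: curl -fsSL https://ollama.com/install.sh | sh

- name: Start Ollama & Pull Model
  run: |
    ollama serve &                      # assumption: server runs in the background
    echo "OLLAMA_PID=$!" >> "$GITHUB_ENV"
    sleep 5                             # assumption: brief wait for the server to come up
    ollama pull qwen3:14b

- name: Get PR Diff
  run: git diff origin/${{ github.base_ref }}...HEAD > pr_diff.txt
```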

📝 How It Works

  1. On PR or manual trigger, the workflow checks out the code and installs Ollama.
  2. Ollama is started and the qwen3:14b model is pulled.
  3. The workflow generates a git diff between the PR branch and its base.
  4. The diff is sent to the LLM with a detailed review prompt.
  5. The LLM's "thinking process" (if present) is filtered out for clarity.
  6. The actionable review is posted as a comment on the PR (steps 5 and 6 are sketched below).
  7. Ollama is stopped and the diff is shown for debugging.
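
A minimal sketch of steps 5 and 6, assuming the model wraps its reasoning in `<think>...</think>` tags and that the raw output is saved to a file such as `ollama_review.txt` (both assumptions); the cleaned review is then exposed to the comment step through an environment variable, here named `REVIEW`:

```yaml
# Sketch only: filtering the model's reasoning and posting the review.
- name: Remove LLM thinking process from review
  run: |
    # Assumption: qwen3 emits its reasoning between <think> and </think> tags.
    awk '/<think>/{skip=1; next} /<\/think>/{skip=0; next} !skip' \
      ollama_review.txt > ollama_clean_review.txt

- name: Set review output as environment var
  run: |
    {
      echo "REVIEW<<EOF"
      cat ollama_clean_review.txt
      echo "EOF"
    } >> "$GITHUB_ENV"

- name: Create PR Comment
  uses: actions/github-script@v6
  with:
    script: |
      await github.rest.issues.createComment({
        owner: context.repo.owner,
        repo: context.repo.repo,
        issue_number: context.issue.number,
        body: process.env.REVIEW
      });
```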

🛡️ Permissions

| Permission | Why Needed |
|------------|------------|
| `contents: read` | To check out code |
| `pull-requests: write` | To post review comments |
| `issues: write` | For general issue comments |
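
In the workflow file these map to a permissions block like:

```yaml
permissions:
  contents: read
  pull-requests: write
  issues: write
```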

🧩 Requirements

  • GitHub Actions runner: Ubuntu (Linux)
  • Ollama model: sufficient disk space and memory on the runner to download and run qwen3:14b
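
Given the PR/manual triggers and the Ubuntu runner noted above, the top of the workflow presumably looks something like this (the job name is illustrative):

```yaml
on:
  pull_request:
  workflow_dispatch:        # allows the manual trigger mentioned above

jobs:
  ai-review:
    runs-on: ubuntu-latest
```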

🖼️ Mermaid Diagram

```mermaid
flowchart TD
    A[Start: PR Opened] --> B[Checkout Repository]
    B --> C[Install Ollama]
    C --> D[Start Ollama & Pull Model]
    D --> E[Get PR Diff]
    E --> F{Diff Exists?}
    F -- Yes --> G[Review PR with Ollama]
    G --> H[Remove LLM Thinking Process]
    H --> I[Set Review Output]
    I --> J[Create PR Comment]
    J --> K[Stop Ollama Service]
    F -- No --> K
    K --> L[Show Diff File]
    L --> M[End]
```
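One way the `Diff Exists?` branch in the diagram could be implemented is with a step output and `if:` conditions; the `has_diff` output name, the `review_prompt.txt` file, and the exact prompt handling below are illustrative assumptions, not the actual workflow contents.

```yaml
# Sketch only: skipping the review when the diff is empty.
- name: Get PR Diff
  id: diff
  run: |
    git diff origin/${{ github.base_ref }}...HEAD > pr_diff.txt
    if [ -s pr_diff.txt ]; then
      echo "has_diff=true" >> "$GITHUB_OUTPUT"
    else
      echo "has_diff=false" >> "$GITHUB_OUTPUT"
    fi

- name: Review PR with Ollama
  if: steps.diff.outputs.has_diff == 'true'
  run: |
    # Illustrative: prepend the review prompt to the diff and pipe both to the model.
    cat review_prompt.txt pr_diff.txt | ollama run qwen3:14b > ollama_review.txt
```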

Copilot AI review requested due to automatic review settings (June 20, 2025 08:23)

keploy bot commented Jun 20, 2025

To generate Unit Tests for this PR, please click here.


Copilot AI left a comment


Pull Request Overview

This PR adds an automated GitHub Action workflow to perform AI-powered code review using the Ollama LLM and the qwen3:14b model, while also establishing static analysis configurations for Go and JavaScript.

  • Introduces workflows for static analysis and AI-based PR review
  • Configures installation and operation of the Ollama service with the new qwen3:14b model
  • Reorganizes Go code by removing the previous main.go and go.mod in favor of a new structure

Reviewed Changes

Copilot reviewed 9 out of 10 changed files in this pull request and generated no comments.

Summary per file:

| File | Description |
|------|-------------|
| main.go | Removed the legacy main.go for codereviewagent; likely part of a restructuring effort |
| js-code/package.json | New JavaScript project configuration for static analysis implementation |
| js-code/.eslintrc.json | New ESLint configuration added for JS static analysis |
| go.mod | Removed the Go module file, indicating a possible change in module management location |
| code/main.go | New Go source file introduced, possibly to replace the removed main.go |
| README.md | Added project documentation |
| .golangci.yml | New golangci-lint configuration for Go static analysis |
| .github/workflows/static-analysis.yml | Workflow for running static analysis on Go and JavaScript files with reviewdog integration |
| .github/workflows/ai-review.yml | Workflow for setting up Ollama, pulling the qwen3:14b model, reviewing PRs, and posting comments |
Comments suppressed due to low confidence (2)

Signed-off-by: euclidstellar <[email protected]>