A research project studying Mixture of Experts (MoE) router mechanisms using the nnterp library for mechanistic interpretability.
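For orientation, the "router" in an MoE layer is typically a small linear gate that scores each token against the experts and dispatches it to the top-k of them. The sketch below is a generic PyTorch illustration of that mechanism (Mixtral-style top-k gating); it is not code from this repo and not the nnterp API, and all dimensions are toy values:

```python
import torch
import torch.nn.functional as F

def route_tokens(hidden: torch.Tensor, gate: torch.nn.Linear, k: int = 2):
    """Score each token against every expert and dispatch to the top-k."""
    logits = gate(hidden)                     # (n_tokens, n_experts) router logits
    weights, experts = torch.topk(logits, k)  # top-k scores and expert indices per token
    weights = F.softmax(weights, dim=-1)      # normalize weights over the chosen experts
    return weights, experts

# Toy dimensions: 512-d hidden states, 8 experts, 4 tokens (all illustrative)
gate = torch.nn.Linear(512, 8, bias=False)
weights, experts = route_tokens(torch.randn(4, 512), gate)
```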
```
moe-router-study/
├── core/      # Core utilities and models
├── exp/       # Experiments and analysis scripts
├── viz/       # Visualization utilities
├── data/      # Data files (gitignored)
├── output/    # Output files and results (gitignored)
├── test/      # Test files
└── README.md
```
This project uses uv for fast Python package management. You will need:

- Python 3.12
- The uv package manager
To set up the project:

- Clone the repository:

  ```bash
  git clone https://github.com/d0rbu/moe-router-study.git
  cd moe-router-study
  ```

- Install dependencies with uv:

  ```bash
  uv sync --dev
  ```

- Install pre-commit hooks (optional but recommended):

  ```bash
  uv run pre-commit install
  ```

Run the test suite with pytest:

```bash
# Run all tests
uv run pytest

# Run with coverage
uv run pytest --cov=core --cov=exp --cov=viz

# Run only fast tests (skip slow integration tests)
uv run pytest -m "not slow"
```
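The `-m "not slow"` filter assumes slow integration tests are tagged with the standard pytest marker convention; a minimal sketch of what such a test would look like (the test name is hypothetical, and the `slow` marker would need to be registered in pyproject.toml to avoid warnings):

```python
import pytest

@pytest.mark.slow  # deselected by: uv run pytest -m "not slow"
def test_router_logits_end_to_end():
    """Hypothetical integration test that exercises a full model forward pass."""
    ...
```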
Check code quality with ruff and ty:

```bash
# Run linting
uv run ruff check .

# Run formatting
uv run ruff format .

# Run type checking
uv run ty check core/ exp/ viz/ test/
```

The project uses GitHub Actions for continuous integration:
- Linting & Type Checking: Runs ruff and ty on Python 3.12
- Testing: Runs pytest with coverage reporting
To contribute:

- Create a new branch for your feature
- Make your changes
- Run tests and linting
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.