
StringIQ

Real-time AI music coaching. Starting with guitar. Built for every instrument.

Alpha release — scales are live now. Chords, songs, and all instruments are on the way.


Learning an instrument is one of the most rewarding things you can do. But it's hard.

Beginners start excited, then get stuck. Progress feels invisible, lessons are expensive, and many quit within a year. The underlying problem: no real-time feedback.

StringIQ changes that.

StringIQ is a real-time AI music training system for guitar that gamifies learning—listening to every note, responding instantly, and adapting to your playing. Mistakes get caught as they happen, not after. And while it starts with guitar, this extends to any instrument.

Feedback shouldn't come after you play. It should be heard—and felt—in the moment. Research shows auditory feedback drives subconscious correction. You don't think—you adjust.

StringIQ adds multi-sensory feedback. You hear the coach. You see feedback. You feel it—ambient lights responding to your playing in real time. Think concert lights—but for practice. Drift off tempo or hit wrong notes—the lights turn red. Lock in and they stay green. No delay. No analysis. Just correction.

StringIQ Studio — Real-time AI guitar coaching


What is StringIQ?

StringIQ is a real-time AI guitar practice system built around two deliverables:

  • StringIQ Studio — Electron desktop app. Connect your guitar, select a scale, practice with a live AI coach. The core experience.
  • StringIQ Web — Marketing landing page. Live at stringiq.netlify.app.

Under the hood, digital signal processing combined with ElevenLabs Agents and TTS delivers real-time coaching; ElevenLabs Voice Design crafted the coach's unique voice; the ElevenLabs Music API generates backing tracks for your sessions; and Tuya controls the smart lights. Everything is driven by three core metrics computed continuously: pitch accuracy, scale conformity, and timing stability. Session data persists to PostgreSQL for trend analysis and personalized coaching.
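The three metrics can be illustrated with a simplified, stdlib-only sketch. This is not StringIQ's actual DSP code — the function names, formulas, and thresholds below are hypothetical, chosen only to show what each metric measures:

```python
import math
import statistics

def pitch_accuracy_cents(freqs):
    """Mean absolute deviation, in cents, from the nearest equal-tempered note."""
    devs = []
    for f in freqs:
        midi = 69 + 12 * math.log2(f / 440.0)       # fractional MIDI note number
        devs.append(abs(midi - round(midi)) * 100)  # 100 cents per semitone
    return sum(devs) / len(devs)

def scale_conformity(freqs, scale_pitch_classes):
    """Fraction of detected notes whose pitch class lies in the target scale."""
    in_scale = 0
    for f in freqs:
        pc = round(69 + 12 * math.log2(f / 440.0)) % 12
        in_scale += pc in scale_pitch_classes
    return in_scale / len(freqs)

def timing_stability(onsets):
    """1 minus the coefficient of variation of inter-onset intervals (floored at 0)."""
    iois = [b - a for a, b in zip(onsets, onsets[1:])]
    mean = sum(iois) / len(iois)
    return max(0.0, 1.0 - statistics.pstdev(iois) / mean)
```

For a perfectly in-tune A minor pentatonic run played exactly on the beat, conformity and stability both come out at 1.0 and the cent deviation near 0.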


How It Works

1. Connect — Plug in your guitar via a USB audio interface (e.g. Focusrite Solo). StringIQ detects your audio input automatically.

2. Configure — Manual mode or AI mode. Pick your own scale or let the AI recommend one based on your history. Optionally generate a backing track in your key.

3. Play — Hit start. The audio engine captures at 44.1 kHz, runs pitch detection via librosa piptrack, and streams metrics at ~20 Hz over WebSocket.

4. Get Coached — Every 20 seconds, Gemini Flash analyzes your metrics and generates contextual feedback. ElevenLabs TTS speaks it aloud — eyes on the fretboard, not the screen. Per-session conversation history (10 turns) keeps feedback progressive.

5. Review — Dashboard for session history, performance trends, and a hands-free voice agent for querying progress and generating practice plans.
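Step 3's pitch tracking uses librosa's piptrack in the real app. As a rough illustration of the underlying idea only, here is a minimal autocorrelation pitch detector — stdlib-only, hypothetical names, not StringIQ's code:

```python
import math

def detect_pitch(frame, sr, fmin=50.0, fmax=1000.0):
    """Estimate the fundamental frequency of one audio frame via autocorrelation.

    Finds the lag (period in samples) that maximizes the frame's
    self-similarity, restricted to lags corresponding to [fmin, fmax] Hz.
    """
    min_lag = int(sr / fmax)
    max_lag = int(sr / fmin)
    best_lag, best_corr = min_lag, float("-inf")
    for lag in range(min_lag, min(max_lag, len(frame) - 1)):
        corr = sum(frame[i] * frame[i + lag] for i in range(len(frame) - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sr / best_lag

# Synthetic test tone: a 440 Hz sine sampled at 44.1 kHz
sr = 44100
frame = [math.sin(2 * math.pi * 440 * n / sr) for n in range(2048)]
f0 = detect_pitch(frame, sr)
```

Production pitch trackers (like piptrack's spectral-peak approach) add windowing, interpolation, and magnitude thresholding on top of this basic period-finding idea.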


Key Features

| Feature | Description |
| --- | --- |
| Real-Time Audio Analysis | Pitch, scale conformity, and timing stability computed continuously at 44.1 kHz |
| AI Coaching | Gemini Flash delivers spoken feedback every 20 seconds via ElevenLabs TTS |
| Voice Agent | Hands-free dashboard — query progress, view charts, generate practice plans by voice |
| Backing Tracks | ElevenLabs Music API generates tracks matched to your key — preview before you play |
| Ambient Light Feedback | Tuya smart bulb syncs to your playing — green when locked in, red when you drift |
| Natural Language Queries | Ask "How did I do last week?" — Gemini translates to SQL and answers in plain English |
| AI Practice Plans | Personalized plans generated from your session history, executable from the Dashboard |
| 48 Scale Definitions | 12 roots × major/minor × diatonic/pentatonic — every common guitar scale covered |
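The 48-scale catalog follows directly from the arithmetic 12 × 2 × 2. A sketch of how such a catalog could be enumerated — the interval sets are standard music theory, but the data layout is hypothetical, not StringIQ's schema:

```python
ROOTS = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

# Semitone offsets from the root for each quality/kind combination
INTERVALS = {
    ("major", "diatonic"):   [0, 2, 4, 5, 7, 9, 11],
    ("major", "pentatonic"): [0, 2, 4, 7, 9],
    ("minor", "diatonic"):   [0, 2, 3, 5, 7, 8, 10],
    ("minor", "pentatonic"): [0, 3, 5, 7, 10],
}

def scale_notes(root, quality, kind):
    """Return the note names of one scale, e.g. ('A', 'minor', 'pentatonic')."""
    start = ROOTS.index(root)
    return [ROOTS[(start + step) % 12] for step in INTERVALS[(quality, kind)]]

SCALES = {
    (root, quality, kind): scale_notes(root, quality, kind)
    for root in ROOTS
    for (quality, kind) in INTERVALS
}
# 12 roots x 2 qualities x 2 kinds = 48 scales
```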

Roadmap

StringIQ is in alpha. Scales are the foundation. The feedback architecture is instrument-agnostic — guitar is the start.

| Phase | Focus | Status |
| --- | --- | --- |
| Alpha | Scales — pitch accuracy, scale conformity, ambient light feedback, voice coaching, practice plans | ✅ Live |
| Coming Soon | Chords — chord detection, progression analysis, strumming pattern timing, transition coaching | 🔜 Soon |
| Planned | Songs — full song recognition, play-along mode, section accuracy, tablature integration | 📋 Planned |
| Future | All Instruments — piano, violin, cello, bass, flute, brass. Same feedback loop, new frequency profiles | 🔭 Future |

Tech Stack

| Category | Technology |
| --- | --- |
| Backend | Python 3.12+, FastAPI, Uvicorn, Pydantic |
| Audio | librosa (piptrack), sounddevice, numpy |
| AI Coaching | Google Gemini Flash (gemini-2.5-flash) |
| Voice Agent | ElevenLabs Conversational AI (@11labs/react) |
| TTS | ElevenLabs TTS API (eleven_flash_v2_5) |
| Backing Tracks | ElevenLabs Music API (music.compose) |
| Database | PostgreSQL on Supabase (psycopg2, StringIQ schema) |
| Desktop | Electron + React + Tailwind CSS |
| Smart Bulb | tinytuya (local Tuya BulbDevice, no cloud) |
| Package Manager | uv (Python), npm (Studio/Hub) |
| Hub | Vite + React + TypeScript + Tailwind, deployed on Netlify |
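The ambient-light loop from the stack above can be sketched as follows. The color mapping and thresholds are illustrative assumptions; the tinytuya calls follow that library's documented `BulbDevice` interface, but the device credentials are placeholders:

```python
def metrics_to_rgb(conformity, timing, threshold=0.8):
    """Map the two stability metrics to a bulb color: green when locked in,
    red when drifting, amber in between. Thresholds are illustrative."""
    worst = min(conformity, timing)
    if worst >= threshold:
        return (0, 255, 0)      # locked in
    if worst >= threshold / 2:
        return (255, 160, 0)    # getting shaky
    return (255, 0, 0)          # off scale / off tempo

def push_color(rgb, device_id, ip, local_key):
    """Send the color to a local Tuya bulb (no cloud round-trip)."""
    import tinytuya  # imported lazily so the mapping above stays dependency-free
    bulb = tinytuya.BulbDevice(device_id, ip, local_key)
    bulb.set_version(3.3)       # protocol version; depends on the bulb model
    bulb.set_colour(*rgb)
```

Because the bulb is driven over the LAN, the color update adds no cloud latency to the feedback loop.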

Screenshots

Practice Mode — Live Metrics

StringIQ Practice Mode — live pitch, scale, and timing metrics

AI Coach Feedback

StringIQ AI Coach — spoken feedback every 20 seconds

Backing Track Preview

StringIQ Backing Track — generate, preview, accept

Dashboard — Voice Agent

StringIQ Dashboard — session history and voice agent

Hub Landing Page

StringIQ Hub — marketing landing page


Running Locally

Prerequisites: Python 3.12+, Node 18+, uv, audio input device (guitar interface or microphone)

Backend

```bash
# Install dependencies
uv sync

# Configure environment
cp backend/.env.example backend/.env
# Fill in your API keys (see Environment Variables below)

# Start the server
uvicorn backend.api.server:app --reload --host 127.0.0.1 --port 8000
```

Studio (Desktop App)

```bash
cd studio
npm install
npm run dev    # Starts Vite + Electron dev mode
```

Hub (Landing Page)

```bash
cd hub
npm install
npm run dev    # Local dev server
npm run build  # Production build → dist/
```

Environment Variables

Create backend/.env from backend/.env.example:

```bash
# ElevenLabs (required for TTS, voice agent, backing tracks)
ELEVENLABS_API_KEY=...
ELEVENLABS_VOICE_ID=...
ELEVENLABS_AGENT_ID=...

# Google Gemini (required for coaching, queries, plans)
GOOGLE_API_KEY=...
GEMINI_MODEL=gemini-2.5-flash

# PostgreSQL (Supabase)
DATABASE_URL=postgresql://...
DB_SCHEMA=StringIQ

# OpenAI (optional — used by ai_agent_service)
OPENAI_API_KEY=...

# Tuya Smart Bulb (optional)
HAVELLS_DEVICE_ID=...
HAVELLS_IP=...
HAVELLS_LOCAL_KEY=...
```

Security: Never commit .env files. Agent IDs (agent_*) are secrets — treat them like API keys.
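A small startup check along these lines can catch missing keys before the server boots. This is a stdlib-only sketch — the variable grouping mirrors the .env above, but the helper itself is hypothetical, not part of StringIQ:

```python
import os

REQUIRED = ["ELEVENLABS_API_KEY", "ELEVENLABS_VOICE_ID", "ELEVENLABS_AGENT_ID",
            "GOOGLE_API_KEY", "DATABASE_URL"]
OPTIONAL = ["OPENAI_API_KEY", "HAVELLS_DEVICE_ID", "HAVELLS_IP", "HAVELLS_LOCAL_KEY"]

def check_env(environ=os.environ):
    """Return the required variables that are missing or empty."""
    return [name for name in REQUIRED if not environ.get(name)]
```

Call it at startup and fail fast (with the list of missing names) if it returns anything, rather than surfacing a cryptic API error mid-session.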


ElevenLabs Integration

| Product | Usage |
| --- | --- |
| Conversational AI + TTS | Together deliver real-time voice coaching — hands-free voice agent in the Dashboard, spoken feedback every 20 seconds |
| Music API | Backing track generation matched to your key and scale |
| Voice Design | Crafted the coach's unique voice |
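The 20-second coaching cadence these products power can be sketched as a loop with the model and TTS calls injected as callables, so the flow is testable without network access. Everything below is an illustrative skeleton under those assumptions, not StringIQ's code:

```python
# Hypothetical prompt template; the real coach prompt is not shown in this README
COACH_PROMPT = ("You are a guitar coach. In two sentences, give feedback on: "
                "pitch deviation {cents:.0f} cents, scale conformity {conf:.0%}, "
                "timing stability {timing:.0%}.")

def coaching_turn(metrics, history, ask_llm, speak, max_turns=10):
    """One coaching cycle: build a prompt from the latest metrics window,
    get feedback text from the LLM, speak it, and keep a bounded history.

    ask_llm and speak stand in for the Gemini and ElevenLabs TTS calls.
    """
    prompt = COACH_PROMPT.format(**metrics)
    feedback = ask_llm(prompt, history)
    speak(feedback)              # ElevenLabs TTS in the real app
    history.append((prompt, feedback))
    del history[:-max_turns]     # keep only the last 10 turns, as in step 4
    return feedback
```

Bounding the history keeps each Gemini request small while still letting feedback build on earlier turns.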

License

MIT License

This project is licensed under the MIT License.
