A website for reading comics (manhwa, manhua, manga) and streaming anime with Indonesian subtitles.
- **Read Comics** - Manhwa, manhua, and manga with an image viewer
- **Anime Streaming** - Watch anime in multiple resolutions
- **Search** - Find comics and anime by title
- **Bookmarks** - Save favorites (localStorage, synced to the DB when logged in)
- **History** - Reading/watching history with progress tracking
- **Dark/Light Mode** - Theme toggle
- **Authentication** - Sign in via email or Google (Clerk) - optional!
- **Responsive** - Mobile-first design
- **Animations** - Smooth transitions with Framer Motion
- **Pagination** - Page navigation for long lists
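The bookmark feature above (localStorage first, DB sync when logged in) can be sketched as a small store with injectable storage. This is an illustrative sketch only: `Bookmark`, `KVStorage`, and `createBookmarkStore` are hypothetical names, not the app's actual Zustand store.

```typescript
// Hypothetical sketch of a localStorage-backed bookmark store.
interface Bookmark {
  id: string;
  title: string;
  type: "komik" | "anime";
}

// Minimal storage interface so the store works in the browser
// (localStorage) and in tests (a Map-backed stub).
interface KVStorage {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

function createBookmarkStore(storage: KVStorage, key = "bookmarks") {
  const load = (): Bookmark[] => JSON.parse(storage.getItem(key) ?? "[]");
  return {
    list: () => load(),
    toggle(item: Bookmark): Bookmark[] {
      const items = load();
      const next = items.some((b) => b.id === item.id)
        ? items.filter((b) => b.id !== item.id) // already saved: remove
        : [...items, item];                     // not saved yet: add
      storage.setItem(key, JSON.stringify(next));
      return next;
    },
  };
}
```

In the browser, `window.localStorage` satisfies `KVStorage` directly; the DB sync path would layer API calls on top of the same interface.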
| Category | Technology |
|---|---|
| Framework | Next.js 16 (App Router) |
| Language | TypeScript |
| Styling | Tailwind CSS v4 |
| State | Zustand |
| Data Fetching | TanStack Query |
| Forms | React Hook Form + Zod |
| Animation | Framer Motion |
| Auth | Clerk (optional) |
| Database | PostgreSQL + Prisma (optional) |
| API Source | Sansekai API |
- Node.js 20+
- npm or yarn
- (Optional) A Clerk account for authentication
- (Optional) A Supabase account for the database
```bash
# 1. Clone the repository
git clone https://github.com/KanekiCraynet/komikstream.git
cd komikstream

# 2. Install dependencies
npm install

# 3. Run the development server
npm run dev

# 4. Open your browser
open http://localhost:3000
```

Note: The app runs without any Clerk or Supabase configuration; bookmarks and history are stored in localStorage.
```bash
# 1. Clone the repository
git clone https://github.com/KanekiCraynet/komikstream.git
cd komikstream

# 2. Install dependencies
npm install

# 3. Set up environment variables
cp .env.example .env

# 4. Edit .env with your credentials:
#    - DATABASE_URL (from Supabase)
#    - NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY (from Clerk)
#    - CLERK_SECRET_KEY (from Clerk)

# 5. Generate the Prisma client & push the schema
npx prisma generate
npx prisma db push

# 6. Run the development server
npm run dev
```

```bash
# Development with hot reload
docker-compose -f docker-compose.dev.yml up

# Production build
docker-compose up -d

# Build the image only (tags must be lowercase)
docker build -t kuromanga .
```

```
├── .github/
│   ├── workflows/              # GitHub Actions CI/CD
│   │   ├── ci.yml              # Lint, test, build
│   │   ├── e2e.yml             # Playwright E2E tests
│   │   ├── lighthouse.yml      # Performance audits
│   │   ├── release.yml         # Auto-release
│   │   └── deploy-*.yml        # Vercel deployments
│   └── dependabot.yml          # Auto dependency updates
├── e2e/                        # Playwright E2E tests
├── __tests__/                  # Jest unit tests
├── prisma/
│   └── schema.prisma           # Database schema
├── src/
│   ├── app/                    # Next.js App Router
│   │   ├── (auth)/             # Auth routes
│   │   ├── anime/              # Anime pages
│   │   ├── komik/              # Komik pages
│   │   ├── bookmark/           # Bookmark page
│   │   ├── history/            # History page
│   │   └── api/                # API routes
│   ├── components/
│   │   ├── layout/             # Navbar, Footer, Sidebar
│   │   ├── ui/                 # Reusable UI components
│   │   └── providers/          # Context providers
│   ├── lib/                    # Utilities & API client
│   ├── stores/                 # Zustand stores
│   ├── types/                  # TypeScript types
│   └── hooks/                  # Custom React hooks
├── Dockerfile                  # Production Docker image
├── docker-compose.yml          # Production compose
└── docker-compose.dev.yml      # Development compose
```
```bash
# Unit tests
npm test                  # Run all tests
npm run test:watch        # Watch mode
npm run test:coverage     # With coverage report

# E2E tests (Playwright)
npm run test:e2e          # Run E2E tests
npm run test:e2e:ui       # With UI mode
npm run test:e2e:headed   # With browser visible
npm run test:e2e:report   # View the last report
```

An automated pipeline runs on every push and PR:
| Workflow | Trigger | Description |
|---|---|---|
| CI | Push, PR | Lint, TypeScript, Unit Tests, Build, Security |
| E2E | Push, PR | Playwright tests (Chrome, Firefox, Mobile) |
| Lighthouse | Push, PR | Performance, Accessibility, SEO audits |
| Deploy Preview | PR | Deploy a preview to Vercel |
| Deploy Production | Push to main | Deploy to production |
| Deploy Staging | Push to staging | Deploy to the staging environment |
| Release | Push to main | Auto-versioning & changelog |
Set up secrets in the GitHub repository settings:
| Secret | Description |
|---|---|
| `VERCEL_TOKEN` | Vercel API token |
| `VERCEL_ORG_ID` | Vercel organization ID |
| `VERCEL_PROJECT_ID` | Vercel project ID |
| `CODECOV_TOKEN` | (Optional) Codecov token |
- `GET /komik/recommended?type={manhwa|manhua|manga}` - Recommendations
- `GET /komik/latest?type={project|mirror}` - Latest
- `GET /komik/search?query={query}` - Search
- `GET /komik/popular?page={page}` - Popular
- `GET /komik/detail?manga_id={id}` - Details
- `GET /komik/chapterlist?manga_id={id}` - Chapter list
- `GET /komik/getimage?chapter_id={id}` - Chapter images

- `GET /anime/latest` - Latest
- `GET /anime/recommended?page={page}` - Recommendations
- `GET /anime/search?query={query}` - Search
- `GET /anime/detail?urlId={id}` - Details
- `GET /anime/movie` - Movie list
- `GET /anime/getvideo?chapterUrlId={id}&reso={resolution}` - Streaming

- `GET/POST/DELETE /api/bookmarks` - Manage bookmarks
- `GET/POST/DELETE /api/history` - Manage history
- `GET /api/health` - Health check (for Docker/monitoring)
- `POST /api/webhooks/clerk` - Clerk webhooks
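The endpoints above all take their parameters as query strings, so a request URL can be assembled generically. The following is a hedged sketch; `buildApiUrl` is an illustrative helper, not part of the app's actual API client.

```typescript
// Hypothetical helper for building request URLs for the endpoints listed above.
const API_BASE = "https://api.sansekai.my.id/api";

function buildApiUrl(
  path: string,
  params: Record<string, string | number> = {},
): string {
  const url = new URL(`${API_BASE}${path}`);
  // Append each endpoint parameter (query, page, manga_id, ...) as a query string
  for (const [k, v] of Object.entries(params)) {
    url.searchParams.set(k, String(v));
  }
  return url.toString();
}

// Example: the chapter-list endpoint for a (hypothetical) manga_id
const chapterListUrl = buildApiUrl("/komik/chapterlist", { manga_id: "one-piece" });
// -> "https://api.sansekai.my.id/api/komik/chapterlist?manga_id=one-piece"
```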
```bash
# Database (Optional - for user sync)
DATABASE_URL="postgresql://..."

# Clerk Authentication (Optional)
NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY="pk_..."
CLERK_SECRET_KEY="sk_..."
CLERK_WEBHOOK_SECRET="whsec_..."

# Clerk URLs (optional)
NEXT_PUBLIC_CLERK_SIGN_IN_URL="/sign-in"
NEXT_PUBLIC_CLERK_SIGN_UP_URL="/sign-up"

# API
NEXT_PUBLIC_API_URL="https://api.sansekai.my.id/api"

# Build (CI only)
SKIP_DB_CONNECTION="true"
```

- Hero section fade-in animations
- Card hover lift effects
- Staggered grid animations
- Page transition effects
- Scroll-triggered animations
- `useKomikLatest()` - Fetch latest komik
- `useKomikPopular(page)` - Fetch popular with pagination
- `useKomikSearch(query)` - Search komik
- `useAnimeLatest()` - Fetch latest anime
- `useAnimeRecommended(page)` - Fetch recommended anime
- ...and more!
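A sketch of how a hook like `useKomikPopular(page)` might be wired with TanStack Query. The query-key factory below is runnable on its own; the hook body is shown only as an illustrative comment, and `apiGet` is an assumed helper, not a confirmed part of this codebase.

```typescript
// Query-key factory: stable, serializable keys per resource and parameter,
// so TanStack Query can cache each page/search independently.
const komikKeys = {
  all: ["komik"] as const,
  popular: (page: number) => ["komik", "popular", page] as const,
  search: (query: string) => ["komik", "search", query] as const,
};

// Illustrative hook body (assumes @tanstack/react-query and an apiGet helper):
//
// function useKomikPopular(page: number) {
//   return useQuery({
//     queryKey: komikKeys.popular(page),
//     queryFn: () => apiGet(`/komik/popular?page=${page}`),
//     staleTime: 5 * 60 * 1000, // roughly match the 5-15 min edge cache TTL
//   });
// }
```

Keeping the page number inside the key means each page is cached separately, which is what makes the pagination feel instant when navigating back.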
- Push your code to GitHub
- Import the project in Vercel
- Add environment variables
- Deploy!
```bash
# Build the image
docker build -t komikstream .

# Run the container
docker run -p 3000:3000 \
  -e DATABASE_URL="..." \
  -e NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY="..." \
  -e CLERK_SECRET_KEY="..." \
  komikstream
```

```bash
npm run build
npm start
```

KuroManga uses a dual-environment architecture for optimal performance and reliability:
```
┌───────────────────────────────────────────────────────────────┐
│                         User Request                          │
└───────────────────────────────┬───────────────────────────────┘
                                ▼
┌───────────────────────────────────────────────────────────────┐
│                Cloudflare Worker (Edge Proxy)                 │
│  • Global edge locations (low latency)                        │
│  • L1 Cache: CF Cache API (5-15 min TTL)                      │
│  • API Proxy: Bypasses origin IP blocks                       │
│  • Analytics: CF Analytics Engine                             │
└───────────────────────────────┬───────────────────────────────┘
                                ▼
┌───────────────────────────────────────────────────────────────┐
│                  Azure App Service (Origin)                   │
│  • Next.js SSR with full Node.js runtime                      │
│  • L2 Cache: PostgreSQL (Supabase) with TTL                   │
│  • L3: External API (sankavollerei.com)                       │
│  • L4: Stale fallback (expired DB cache)                      │
│  • Observability: Azure Application Insights                  │
└───────────────────────────────────────────────────────────────┘
```
Caching tiers:
| Tier | Location | TTL | Purpose |
|---|---|---|---|
| L1 | CF Cache API | 5-15 min | Edge caching, reduces origin load |
| L2 | PostgreSQL | 30 min | DB cache with structured data |
| L3 | External API | ISR | Fresh data from source |
| L4 | Stale DB | ∞ | Fallback when API is blocked/down |
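The L1-L4 order in the table can be sketched as a fallback chain: try each fresh tier, and only serve stale data when everything else misses or fails. This is an illustrative sketch with injected stub fetchers; `fetchWithFallback` and the parameter names are assumptions, not the app's actual code.

```typescript
// A tier either resolves data, resolves null (miss), or throws (failure).
type Fetcher<T> = () => Promise<T | null>;

async function fetchWithFallback<T>(
  edgeCache: Fetcher<T>,   // L1: CF Cache API
  dbCache: Fetcher<T>,     // L2: fresh PostgreSQL cache
  externalApi: Fetcher<T>, // L3: upstream API
  staleDb: Fetcher<T>,     // L4: expired DB cache
): Promise<T> {
  for (const tier of [edgeCache, dbCache, externalApi]) {
    try {
      const hit = await tier();
      if (hit !== null) return hit; // first fresh tier wins
    } catch {
      // tier failed (e.g. upstream blocked or timed out) - try the next one
    }
  }
  const stale = await staleDb(); // last resort: serve expired data
  if (stale === null) throw new Error("all cache tiers missed");
  return stale;
}
```

The key property is that an upstream outage degrades to stale content rather than an error page, matching the "∞ TTL" of the L4 row.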
The app tracks comprehensive metrics via Azure App Insights:
Events tracked:
- `cache_hit` / `cache_miss` / `cache_stale` - Per-tier cache performance
- `api_success` / `api_error` / `api_retry` / `api_timeout` - External API health
- `rate_limit_hit` - 429 responses from the external API
- `db_error` - Database connection/query failures
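A minimal sketch of how these events might be emitted. The real app sends them to Azure Application Insights; here the sink is injected so the classification logic (e.g. 429 becoming `rate_limit_hit`) is visible without the SDK. `createTracker` and its method names are illustrative assumptions.

```typescript
type TrackedEvent = {
  name: string;
  properties?: Record<string, string>;
};

// The sink stands in for App Insights' trackEvent in this sketch.
function createTracker(sink: (e: TrackedEvent) => void) {
  return {
    cacheHit: (tier: "L1" | "L2") =>
      sink({ name: "cache_hit", properties: { cacheTier: tier } }),
    cacheStale: (contentType: string) =>
      sink({ name: "cache_stale", properties: { contentType } }),
    apiError: (context: string, status: number) =>
      // 429s are tracked separately so rate limiting can be alerted on
      sink(
        status === 429
          ? { name: "rate_limit_hit", properties: { context } }
          : { name: "api_error", properties: { context, status: String(status) } },
      ),
  };
}
```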
KQL Queries for App Insights:
```
// Cache hit rate by tier (last 24h)
customEvents
| where timestamp > ago(24h)
| where name in ("cache_hit", "cache_miss", "cache_stale")
| summarize count() by name, tostring(customDimensions.cacheTier)
| render piechart

// API latency percentiles (last 1h)
customMetrics
| where timestamp > ago(1h)
| where name endswith "_duration_ms"
| summarize
    p50=percentile(value, 50),
    p95=percentile(value, 95),
    p99=percentile(value, 99)
  by name
| order by p95 desc

// External API error rate (last 6h)
customEvents
| where timestamp > ago(6h)
| where name in ("api_success", "api_error", "api_timeout", "rate_limit_hit")
| summarize count() by name, bin(timestamp, 15m)
| render timechart

// Stale fallback usage (indicates API issues)
customEvents
| where timestamp > ago(24h)
| where name == "cache_stale"
| summarize count() by tostring(customDimensions.contentType), bin(timestamp, 1h)
| render timechart

// Slowest endpoints (last 1h)
customEvents
| where timestamp > ago(1h)
| where name == "api_success"
| extend durationMs = toint(customMeasurements.durationMs)
| summarize avg(durationMs), max(durationMs), count() by tostring(customDimensions.context)
| order by avg_durationMs desc
| take 10
```

Recommended Alerts:
| Alert | Condition | Severity |
|---|---|---|
| High API error rate | `api_error` count > 50 in 15 min | Warning |
| Stale fallback spike | `cache_stale` count > 20 in 15 min | Warning |
| Rate limiting | `rate_limit_hit` count > 10 in 5 min | Critical |
| Slow API response | P95 latency > 3000ms | Warning |
- Create a free account at UptimeRobot
- Add a new monitor:
  - Monitor Type: HTTP(s)
  - URL: `https://your-domain.vercel.app/api/health`
  - Monitoring Interval: 5 minutes
- Set up alert contacts (email/Telegram/Slack)
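Besides an external monitor, the health endpoint can be probed programmatically. This is a hedged sketch: the fetch function is injected so the parsing logic is testable without a network, and `isHealthy`/`HealthResponse` are illustrative names based on the response shape documented below, not the app's actual code.

```typescript
// Shape of the relevant fields from /api/health (see the sample response).
interface HealthResponse {
  status: string;
  checks?: { database?: string; api?: string };
}

async function isHealthy(
  fetchJson: (url: string) => Promise<HealthResponse>,
  url = "https://your-domain.vercel.app/api/health",
): Promise<boolean> {
  try {
    const body = await fetchJson(url);
    // "healthy" overall status plus an API check that isn't down counts as up
    return body.status === "healthy" && body.checks?.api !== "down";
  } catch {
    return false; // network errors and non-JSON responses count as down
  }
}
```

In production you would pass a thin wrapper around `fetch` (e.g. `(u) => fetch(u).then((r) => r.json())`) as `fetchJson`.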
```bash
curl https://your-domain.vercel.app/api/health
```

Response:

```json
{
  "status": "healthy",
  "timestamp": "2024-01-01T00:00:00.000Z",
  "uptime": 12345.67,
  "version": "0.1.0",
  "checks": {
    "database": "connected",
    "api": "operational"
  },
  "responseTime": "5ms"
}
```

MIT License - feel free to use it for personal projects.
- Fork the repository
- Create a feature branch (`git checkout -b feat/amazing-feature`)
- Commit your changes (`git commit -m 'feat: add amazing feature'`)
- Push the branch (`git push origin feat/amazing-feature`)
- Open a Pull Request
We use Conventional Commits:
- `feat:` - New feature
- `fix:` - Bug fix
- `docs:` - Documentation
- `style:` - Formatting
- `refactor:` - Code refactoring
- `test:` - Adding tests
- `chore:` - Maintenance
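As an illustration of the convention, a message can be checked against the listed types with a small validator. This is a hypothetical sketch, not part of the repo's tooling (a real setup would typically use commitlint).

```typescript
// The allowed commit types from the list above.
const COMMIT_TYPES = ["feat", "fix", "docs", "style", "refactor", "test", "chore"];

function isConventionalCommit(message: string): boolean {
  // Matches "type: subject" or "type(scope): subject",
  // e.g. "feat: add amazing feature" or "fix(anime): handle empty search"
  const match = /^(\w+)(\([\w-]+\))?!?: .+/.exec(message);
  return match !== null && COMMIT_TYPES.includes(match[1]);
}
```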