A modern RSS feed aggregator built with Go, featuring background feed scraping, user authentication, and a RESTful API. This application allows users to subscribe to RSS feeds, automatically fetches new posts, and provides a clean API for accessing aggregated content.
- User Authentication: Secure user registration and login with API key-based authentication
- RSS Feed Management: Add, view, and manage RSS feeds
- Feed Following: Subscribe to feeds and get personalized post streams
- Automatic Scraping: Background service that periodically fetches new posts from RSS feeds
- RESTful API: Clean, well-structured API endpoints for all operations
- Database Integration: PostgreSQL database with proper schema management using SQLC
- Concurrent Scraping: Multi-threaded RSS feed scraping with configurable concurrency
- Error Handling: Robust error handling and logging throughout the application
- CORS Support: Cross-origin resource sharing enabled for web applications
- Health Checks: Built-in health check endpoint for monitoring
- Docker Support: Containerized deployment with Docker Compose
```
rssagg/
├── main.go              # Application entry point and routing
├── models.go            # Data models and conversion functions
├── scrapper.go          # RSS feed scraping logic
├── handler_*.go         # HTTP request handlers
├── middleware_auth.go   # Authentication middleware
├── json.go              # JSON response utilities
├── docker-compose.yml   # Docker Compose configuration
├── go.mod               # Go module dependencies
├── sqlc.yaml            # SQLC configuration
├── internal/
│   ├── auth/            # Authentication utilities
│   └── database/        # Database models and queries (generated)
└── sql/
    ├── schema/          # Database migration files
    └── queries/         # SQL queries for SQLC
```
- Users: User accounts with email, password, and API key
- Feeds: RSS feed information with URL and ownership
- Feed Follows: User subscriptions to feeds
- Posts: Individual RSS posts with metadata
- Backend: Go 1.24.5
- Database: PostgreSQL 15
- ORM: SQLC (type-safe SQL)
- Web Framework: Chi Router
- Authentication: API Key-based
- Containerization: Docker & Docker Compose
- Environment: Environment variables with godotenv
- Database Migrations: Goose
- Go 1.24.5 or higher
- Docker and Docker Compose
- PostgreSQL (if running locally)
Using Docker Compose:

1. Clone the repository
   ```bash
   git clone <repository-url>
   cd rssagg
   ```
2. Create the environment file
   ```bash
   cp .env.example .env
   # Edit .env with your configuration
   ```
3. Start the application
   ```bash
   docker-compose up -d
   ```
4. Run database migrations
   ```bash
   # If using goose for migrations
   goose -dir sql/schema postgres "host=localhost port=5432 user=rssagg_user password=rssagg_password dbname=rssagg sslmode=disable" up
   ```
Running locally:

1. Install dependencies
   ```bash
   go mod download
   ```
2. Start PostgreSQL
   ```bash
   docker-compose up postgres -d
   ```
3. Set environment variables
   ```bash
   export PORT=8080
   export DB_URL="postgres://rssagg_user:rssagg_password@localhost:5432/rssagg?sslmode=disable"
   ```
4. Run the application
   ```bash
   go run .
   ```
All protected endpoints require an API key in the Authorization header:
```
Authorization: ApiKey <your-api-key>
```
- `POST /auth/register` - Register a new user
- `POST /auth/login` - Login user
- `GET /users/me` - Get current user information
- `POST /feeds/` - Create a new RSS feed
- `GET /feeds/` - Get all feeds
- `GET /feeds/me` - Get feeds created by the current user
- `GET /feeds/{feedId}` - Get a specific feed by ID
- `POST /feed-follows/` - Follow a feed
- `GET /feed-follows/` - Get the user's followed feeds
- `DELETE /feed-follows/{feedFollowID}` - Unfollow a feed
- `GET /posts/` - Get posts for the current user (from followed feeds)
- `GET /` - Welcome message
- `GET /healthz` - Health check endpoint
- `PORT`: Server port (default: 8080)
- `DB_URL`: PostgreSQL connection string
- `SCRAPER_CONCURRENCY`: Number of concurrent scrapers (default: 10)
- `SCRAPER_INTERVAL`: Time between scraping runs (default: 10 minutes)
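A sample `.env` built from these variables might look like the following; the credentials shown are the defaults listed in the database configuration below, not secrets suitable for production:

```env
PORT=8080
DB_URL=postgres://rssagg_user:rssagg_password@localhost:5432/rssagg?sslmode=disable
SCRAPER_CONCURRENCY=10
SCRAPER_INTERVAL=10m
```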
The application uses PostgreSQL with the following default configuration:
- Database: `rssagg`
- User: `rssagg_user`
- Password: `rssagg_password`
- Port: `5432`
The application includes a background scraper that:
- Runs every 10 minutes (configurable)
- Processes up to 10 feeds concurrently (configurable)
- Fetches new posts from RSS feeds
- Stores posts in the database
- Updates feed last fetched timestamp
Replace the current API key authentication with JWT tokens for better security and user experience.
Benefits:
- Stateless authentication
- Token expiration and refresh
- Better security practices
- Easier integration with frontend applications
Add pagination support to endpoints that return lists of data.
Benefits:
- Better performance with large datasets
- Reduced memory usage
- Better user experience
Add a command-line interface for interacting with the application.
Features:
- User management commands
- Feed management
- Post viewing
- Configuration management
Implement WebSocket support for real-time post notifications.
Benefits:
- Instant notifications of new posts
- Better user engagement
- Real-time updates
Add support for organizing feeds into categories and tagging posts.
Benefits:
- Better content organization
- Improved search and filtering
- Personalized content discovery
Implement search functionality for posts and feeds.
Features:
- Full-text search
- Filter by date, feed, category
- Advanced search operators
Add rate limiting to prevent abuse and ensure fair usage.
Implement Redis caching for frequently accessed data.
Benefits:
- Improved performance
- Reduced database load
- Better scalability
Add comprehensive metrics and monitoring.
Features:
- Prometheus metrics
- Health checks
- Performance monitoring
- Error tracking
Implement API versioning for better backward compatibility.
Implement a proper job queue for RSS scraping.
Benefits:
- Better resource management
- Retry mechanisms
- Job prioritization
- Scalability
Add user preference management.
Features:
- Notification preferences
- Default feed settings
- UI preferences
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.