127 changes: 109 additions & 18 deletions README.md
@@ -4,42 +4,133 @@

Mozambique Proenergia backend.

## Prerequisites

- [Docker](https://docs.docker.com/docker-for-mac/install/) (for Docker setup)
- RabbitMQ (for Celery task processing)
- Python 3.8+ and PostgreSQL with PostGIS (for local development)

## Quick Setup

### Local Development

**1. Set up Python environment:**

```bash
# Create databases
createdb proenergia
createdb proenergia_test

# Set environment variables
export DJANGO_DB_URL="postgis://user:password@localhost:5432/proenergia"
export DJANGO_SECRET_KEY="anyTextIsS3cr3t"

# Install dependencies
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```

**2. Install RabbitMQ:**

*macOS:*
```bash
brew install rabbitmq
export PATH=$PATH:/usr/local/sbin
sudo rabbitmq-server -detached
```

*Ubuntu/Debian:*
```bash
sudo apt-get update && sudo apt-get install -y rabbitmq-server
sudo systemctl enable rabbitmq-server
sudo systemctl start rabbitmq-server
```

**3. Run migrations and start server:**

```bash
python manage.py migrate
python manage.py runserver
```

You can create a superuser with `python manage.py createsuperuser`.

GeoDjango may require additional system libraries (e.g., GDAL and GEOS); check the [documentation](https://docs.djangoproject.com/en/5.2/ref/contrib/gis/install/#installation) or the Dockerfile in this repository.

## Celery Task Processing

**Start Celery worker** (separate terminal):
```bash
source venv/bin/activate
DJANGO_DB_URL="postgis://postgres:postgres@localhost:5432/proenergia" \
DJANGO_SECRET_KEY="anyTextIsS3cr3t" \
celery -A proenergia worker --loglevel=info
```

**Optional - Start Celery beat for scheduled tasks:**
```bash
DJANGO_DB_URL="postgis://postgres:postgres@localhost:5432/proenergia" \
DJANGO_SECRET_KEY="anyTextIsS3cr3t" \
celery -A proenergia beat --loglevel=info --scheduler django_celery_beat.schedulers:DatabaseScheduler
```
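
With the `DatabaseScheduler`, schedules live in the database and can be managed in the Django admin (via django-celery-beat). As an illustration only — this PR does not register any schedule — a recurring run of the demo task could be set up like this:

```python
# Illustration only (not part of this PR): register a periodic run of the
# demo task with django-celery-beat's database scheduler.
import json

from django_celery_beat.models import IntervalSchedule, PeriodicTask

schedule, _ = IntervalSchedule.objects.get_or_create(
    every=10, period=IntervalSchedule.MINUTES
)
PeriodicTask.objects.get_or_create(
    name="Say hello every 10 minutes",  # arbitrary unique label
    defaults={
        "interval": schedule,
        "task": "proenergia.celery_tasks.hello_world_task",
        "args": json.dumps(["Scheduler"]),
    },
)
```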

**Optional - Monitor with Flower:**
```bash
pip install flower
DJANGO_DB_URL="postgis://postgres:postgres@localhost:5432/proenergia" \
DJANGO_SECRET_KEY="anyTextIsS3cr3t" \
celery -A proenergia flower --address=127.0.0.1 --port=5555
```
Access at http://localhost:5555

## Testing Celery

**Test endpoints:**
```bash
# Test hello world task
curl -X POST http://localhost:8000/api/v1/tasks/hello/ -H "Content-Type: application/json" -d '{"name": "Alice"}'

# Check task status (use task_id from above)
curl http://localhost:8000/api/v1/tasks/status/<task_id>/

# List recent tasks
curl http://localhost:8000/api/v1/tasks/list/
```
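
The view code behind these endpoints is part of the PR but not shown in this excerpt; purely as an illustration of the pattern (names and URL wiring are assumptions, not the actual implementation), a minimal DRF version might look like:

```python
# Illustrative sketch only — not the PR's actual view code.
from celery.result import AsyncResult
from rest_framework.decorators import api_view
from rest_framework.response import Response

from proenergia.celery_tasks import hello_world_task


@api_view(["POST"])
def hello(request):
    # Queue the task and return its id immediately.
    name = request.data.get("name", "World")
    async_result = hello_world_task.delay(name)
    return Response({"task_id": async_result.id}, status=202)


@api_view(["GET"])
def task_status(request, task_id):
    # Look up the task state (and result, if finished) by id.
    async_result = AsyncResult(task_id)
    payload = {"task_id": task_id, "status": async_result.status}
    if async_result.ready():
        payload["result"] = async_result.result
    return Response(payload)
```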

**Run tests:**
```bash
# All tests
python manage.py test

# Task tests only
python manage.py test proenergia.tasks.tests.test_tasks

# With pytest
pytest proenergia/tasks/tests/test_tasks.py -v
```

## Configuration

**Key environment variables:**
- `DJANGO_DB_URL` - Database connection string
- `DJANGO_SECRET_KEY` - Django secret key
- `CELERY_BROKER_URL` - RabbitMQ connection (default: `amqp://guest:guest@localhost:5672//`)

## Available Tasks

**Hello World Task** (`hello_world_task`, defined in `proenergia/celery_tasks.py`): a demo task with a 2-second delay for exercising the asynchronous processing setup.
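
With the worker running, the task can also be queued directly from a Django shell (`python manage.py shell`); this snippet only uses the standard Celery API:

```python
from proenergia.celery_tasks import hello_world_task

async_result = hello_world_task.delay("Alice")  # returns immediately with an AsyncResult
print(async_result.id)                          # task id, also usable with the status endpoint
print(async_result.get(timeout=10))             # {'message': 'Hello, Alice!', 'status': 'success', ...}
```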

## Troubleshooting

- **Connection refused to RabbitMQ**: Check `sudo rabbitmqctl status`
- **Tasks not executing**: Ensure Celery worker is running with proper environment variables
- **Database connection errors**: Set `DJANGO_DB_URL` when starting Celery worker
- **Settings import errors**: Ensure `DJANGO_SECRET_KEY` is set for Celery commands

**Logs:**
- Django: console output from `runserver`
- Celery: console output from the worker
- Task results: Django admin → "Django Celery Results"
- Flower: http://localhost:5555 (if running)

## API Documentation

- Swagger UI: http://localhost:8000/api/v1/docs/
- API Schema: http://localhost:8000/api/v1/schema/
5 changes: 5 additions & 0 deletions proenergia/__init__.py
@@ -0,0 +1,5 @@
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ("celery_app",)
29 changes: 29 additions & 0 deletions proenergia/celery.py
@@ -0,0 +1,29 @@
import os
from celery import Celery

# Set the default Django settings module for the 'celery' program.
# This project uses django-configurations, so we need to set both variables.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proenergia.config")
os.environ.setdefault("DJANGO_CONFIGURATION", "Local")

# This import must come after setting the environment variables
import configurations

configurations.setup()

app = Celery("proenergia")

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
app.config_from_object("django.conf:settings", namespace="CELERY")

# Load task modules from all registered Django apps.
app.autodiscover_tasks()

# Configuration for connection retry
app.conf.broker_connection_retry_on_startup = True


@app.task(bind=True, ignore_result=True)
def debug_task(self):
print(f"Request: {self.request!r}")
38 changes: 38 additions & 0 deletions proenergia/celery_tasks.py
@@ -0,0 +1,38 @@
import time
import logging
from celery import shared_task

logger = logging.getLogger(__name__)


@shared_task(bind=True)
def hello_world_task(self, name="World"):
"""
A simple hello world task for testing Celery integration.

Args:
name (str): Name to greet, defaults to "World"

Returns:
dict: Task result with greeting message and metadata
"""
task_id = self.request.id
logger.info(f"Starting hello_world_task with ID: {task_id}, greeting: {name}")

try:
# Simulate some work
time.sleep(2)

result = {
"message": f"Hello, {name}!",
"task_id": task_id,
"status": "success",
"timestamp": time.time(),
}

logger.info(f"Completed hello_world_task {task_id} successfully")
return result

except Exception as exc:
logger.error(f"hello_world_task {task_id} failed: {str(exc)}")
raise
19 changes: 19 additions & 0 deletions proenergia/config/common.py
@@ -31,6 +31,9 @@ class Common(Configuration):
"rest_framework.authtoken", # token authentication
"django_filters", # for filtering rest endpoints
"drf_spectacular", # api-docs
# Celery apps
"django_celery_results",
"django_celery_beat",
# Your apps
"proenergia.users",
"proenergia.datasets",
@@ -220,3 +223,19 @@ class Common(Configuration):
"SITE_HEADER": "Mozambique PROENERGIA+",
"SITE_SUBHEADER": "Administration Interface",
}

# Celery Configuration
CELERY_BROKER_URL = os.getenv(
"CELERY_BROKER_URL", "amqp://guest:guest@localhost:5672//"
)
CELERY_RESULT_BACKEND = "django-db"
CELERY_CACHE_BACKEND = "django-cache"

# Celery serialization settings
CELERY_ACCEPT_CONTENT = ["json"]
CELERY_TASK_SERIALIZER = "json"
CELERY_RESULT_SERIALIZER = "json"
CELERY_TIMEZONE = TIME_ZONE

# Celery results retention settings
CELERY_RESULT_EXPIRES = 86400 # 1 day in seconds
3 changes: 3 additions & 0 deletions proenergia/config/local.py
@@ -14,3 +14,6 @@ class Local(Common):
EMAIL_HOST = "localhost"
EMAIL_PORT = 1025
EMAIL_BACKEND = "django.core.mail.backends.console.EmailBackend"

# GDAL_LIBRARY_PATH = "/opt/homebrew/opt/gdal/lib/libgdal.dylib"
# GEOS_LIBRARY_PATH = "/opt/homebrew/opt/geos/lib/libgeos_c.dylib"
Empty file added proenergia/tasks/__init__.py
Empty file.
Empty file added proenergia/tasks/tests/__init__.py
Empty file.
34 changes: 34 additions & 0 deletions proenergia/tasks/tests/conftest.py
@@ -0,0 +1,34 @@
import pytest
from celery import current_app
from django.contrib.auth import get_user_model


@pytest.fixture(autouse=True)
def enable_db_access_for_all_tests(db):
"""
Enable database access for all tests.
"""
pass


@pytest.fixture
def celery_eager():
"""
Configure Celery to execute tasks synchronously for testing.
"""
current_app.conf.task_always_eager = True
current_app.conf.task_eager_propagates = True
yield
current_app.conf.task_always_eager = False
current_app.conf.task_eager_propagates = False


@pytest.fixture
def test_user(db):
"""
Create a test user.
"""
User = get_user_model()
return User.objects.create_user(
username="testuser", email="test@example.com", password="testpass123"
)
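
For reference, a pytest-style test that leans on these fixtures might look like the sketch below (illustration only, not part of the diff; assumes pytest-django is installed):

```python
# Illustration only: a pytest-style test using the fixtures above.
from proenergia.celery_tasks import hello_world_task


def test_hello_world_eager(celery_eager, test_user):
    # apply() runs the task locally; celery_eager keeps any .delay() calls eager too.
    result = hello_world_task.apply(args=[test_user.username])
    assert result.successful()
    assert result.result["message"] == f"Hello, {test_user.username}!"
```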
74 changes: 74 additions & 0 deletions proenergia/tasks/tests/test_tasks.py
@@ -0,0 +1,74 @@
from unittest.mock import patch
from django.test import TestCase
from celery import current_app

import proenergia.celery_tasks as celery_tasks


class TestHelloWorldTask(TestCase):
"""Test cases for the hello world task."""

def setUp(self):
"""Configure Celery to execute tasks synchronously for testing."""
current_app.conf.task_always_eager = True
current_app.conf.task_eager_propagates = True
current_app.conf.task_store_eager_result = True

def test_hello_world_task_success(self):
"""Test hello world task executes successfully."""
result = celery_tasks.hello_world_task.apply(["Alice"])

self.assertTrue(result.successful())
self.assertIn("message", result.result)
self.assertEqual(result.result["message"], "Hello, Alice!")
self.assertEqual(result.result["status"], "success")
self.assertIn("task_id", result.result)
self.assertIn("timestamp", result.result)

def test_hello_world_task_default_name(self):
"""Test hello world task with default name."""
result = celery_tasks.hello_world_task.apply()

self.assertTrue(result.successful())
self.assertEqual(result.result["message"], "Hello, World!")

@patch("time.sleep")
def test_hello_world_task_with_mock_sleep(self, mock_sleep):
"""Test hello world task with mocked sleep to speed up test."""
result = celery_tasks.hello_world_task.apply(["Bob"])

mock_sleep.assert_called_once_with(2)
self.assertTrue(result.successful())
self.assertEqual(result.result["message"], "Hello, Bob!")

def test_task_result_database_storage(self):
"""Test that essential task results are stored in TaskResult for Django Admin."""
from django_celery_results.models import TaskResult
import json

# Clear any existing task results
TaskResult.objects.all().delete()

# Execute a task
name = "Database Test"
result = celery_tasks.hello_world_task.apply([name])
task_id = result.id

# Verify TaskResult was created in database
task_result = TaskResult.objects.get(task_id=task_id)

# Test essential fields that are always populated in eager mode
self.assertEqual(task_result.task_id, task_id)
self.assertEqual(task_result.status, "SUCCESS")

# Test result data is properly stored as JSON
stored_result = json.loads(task_result.result)
self.assertEqual(stored_result["message"], f"Hello, {name}!")
self.assertEqual(stored_result["status"], "success")
self.assertIn("task_id", stored_result)
self.assertIn("timestamp", stored_result)

# Test timestamps are populated
self.assertIsNotNone(task_result.date_created)
self.assertIsNotNone(task_result.date_done)
self.assertIsNone(task_result.traceback) # No error occurred