Install Docker
Note: on Windows 10, Docker requires the Pro, Enterprise, or Education edition. For the Home edition, use Docker Toolbox.
Fork or clone the tutorial repo:
https://github.com/rsolano/clepy-docker-compose
Clone command:
git clone https://github.com/rsolano/clepy-docker-compose
Edit the Dockerfile so that it pulls from a Python base image. You can use python:3.7 here.
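A minimal Dockerfile along these lines should work. It is only a sketch: the /code working directory and the requirements.txt file name are assumptions, so adjust them to match the repo's layout.

FROM python:3.7

# keep Python output unbuffered so print output shows up in docker-compose logs right away
ENV PYTHONUNBUFFERED=1

WORKDIR /code

# install dependencies first so this layer is cached between builds
# (the repo is assumed to ship a requirements.txt listing Django, Celery, redis, and psycopg2)
COPY requirements.txt /code/
RUN pip install -r requirements.txt

# copy the rest of the project into the image
COPY . /code/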
Edit the docker-compose.yml to define our app services (a sketch of the full file follows this list):

- postgres: the database. Use the postgres:alpine image.
- redis: we'll use this as a message broker to communicate with Celery. Use the redis:alpine image.
- web: our Django image, defined in the Dockerfile. Use python manage.py runserver 0.0.0.0:8000 as the command for the container to run.
- worker: Celery, our task worker. Use celery -A tutorial worker -l info as the command for the container to run.
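Put together, docker-compose.yml might look roughly like the sketch below. Treat it as an outline rather than the repo's exact file: the database service is named db so that the docker-compose exec db commands used later in the tutorial work, and the credential values (POSTGRES_* and the matching DB_* variables) are assumptions chosen to line up with the settings.py changes made further down.

version: '3'

services:
  db:
    # named "db" to match the docker-compose exec db steps later in the tutorial
    image: postgres:alpine
    environment:
      # assumed credentials; they must match the DB_* variables used by web and worker
      - POSTGRES_DB=tutorial
      - POSTGRES_USER=tutorial
      - POSTGRES_PASSWORD=secret

  redis:
    image: redis:alpine

  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    environment:
      - DB_NAME=tutorial
      - DB_USER=tutorial
      - DB_PASSWORD=secret
      - DB_HOST=db
    depends_on:
      - db
      - redis

  worker:
    build: .
    # the Redis broker URL is expected to be configured in the provided celery.py
    command: celery -A tutorial worker -l info
    volumes:
      - .:/code
    environment:
      - DB_NAME=tutorial
      - DB_USER=tutorial
      - DB_PASSWORD=secret
      - DB_HOST=db
    depends_on:
      - db
      - redis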
Run:
docker-compose run web django-admin startproject tutorial .
Run:
docker-compose up -d
The -d option runs the services in the background.
If you are using Docker Toolbox, you need to start the VM, note its IP address, and add it to the ALLOWED_HOSTS property in settings.py. Use this IP instead of localhost to access the application.
docker-machine start
docker-machine ip
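For example, if docker-machine ip prints 192.168.99.100 (a common Toolbox default; use whatever your command actually prints), the setting in settings.py would look like this:

ALLOWED_HOSTS = ['localhost', '192.168.99.100']  # replace with the IP reported by docker-machine ip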
To view the application logs:
docker-compose logs -f
The -f option follows the log output.
You can also specify which service(s) to display logs for:
docker-compose logs -f web worker
Edit the tutorial/settings.py file and modify the DATABASES section to point to our PostgreSQL instance.
# settings.py needs `import os` near the top for os.environ to be available
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('DB_NAME'),
        'USER': os.environ.get('DB_USER'),
        'PASSWORD': os.environ.get('DB_PASSWORD'),
        'HOST': os.environ.get('DB_HOST'),
        'PORT': '5432',
    }
}
This will create the initial database schema in our PostgreSQL database:
docker-compose exec web python manage.py migrate
Log into the db container:
docker-compose exec db bash
Switch to the postgres user:
su postgres
Use the psql utility's \dt command to view the tables created by Django:
psql
\dt
To exit the utility, enter \q. To exit the container, enter exit twice.
You can now delete the default db.sqlite3 database file which comes with Django. The Django app should now be available at http://localhost:8000.
First, move the provided celery.py file into the tutorial/ directory to set up Django to use Celery.
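For context, a typical Django celery.py (following the pattern in the Celery documentation) looks roughly like the sketch below. The file provided in the repo may differ, especially in how the Redis broker is configured, so treat this as an outline only.

import os

from celery import Celery

# point Celery at the Django settings module for this project
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'tutorial.settings')

# the broker URL pointing at the redis service is an assumption
app = Celery('tutorial', broker='redis://redis:6379/0')

# read any CELERY_* settings from Django's settings.py
app.config_from_object('django.conf:settings', namespace='CELERY')

# find tasks.py modules in all installed Django apps
app.autodiscover_tasks()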
In tutorial/__init__.py, add the following lines to ensure the Celery app is loaded when Django starts:

# make sure the Celery app is always imported when Django starts
from .celery import app as celery_app

__all__ = ('celery_app',)
Now create a Django app to hold our task logic:
docker-compose exec web python manage.py startapp myapp
Add the newly created app to the INSTALLED_APPS property in settings.py:
INSTALLED_APPS = [
    'myapp.apps.MyappConfig',
    ...
]
Create a myapp/tasks.py file with the following code:
import time
from random import random

from celery import task


@task
def perform_lengthy_task():
    # simulate a slow job: sleep for a random amount of time up to 10 seconds
    print("Beginning time-consuming task...")
    time.sleep(random() * 10)
    print("Done!")
    return True
Now we need to restart the worker container, since changes to the task code won't auto-reload there the way Django's development server does.
docker-compose restart worker
To run the task, open a Django shell:
docker-compose exec web python manage.py shell
Import the task and invoke it using delay() so that Celery runs it:
from myapp.tasks import *
perform_lengthy_task.delay()
The output should be an AsyncResult object, which indicates the task was sent to Celery successfully.
<AsyncResult: 2dfd7bc5-c46c-4d21-ae23-46eebc933186>
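If you want to poke at the result from the same shell, the AsyncResult object exposes ready() and get(). Note that get() only works if the provided celery.py configures a result backend, which this sketch assumes:

result = perform_lengthy_task.delay()
result.ready()          # False while the worker is still running the task
result.get(timeout=20)  # waits for the task to finish and returns True (requires a result backend)

Either way, the task's print output appears in the worker's logs, which you can watch with docker-compose logs -f worker.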