Complete Django + Celery example with RabbitMQ broker for asynchronous task processing.
- Copy `.env.local` to `.env`
- Run `make run` to start the project (the first run builds the images)
- Create a superuser to access the Django admin: `make createsuperuser`
- Backend Core admin app: http://localhost:8000/admin/
- Celery monitoring (Flower): http://localhost:5555
The Makefile contains a few helpers to make your life easier. Run `make help` to see them.

Healthcheck endpoint: port `8000`, path `/healthcheck/`, expected status code `200`.
This project demonstrates the complete Celery workflow for asynchronous task processing:
Django View/Admin → Celery Task (.delay) → RabbitMQ Broker → Celery Worker → Result Backend (DB)
- **Broker (RabbitMQ)**: Message broker between Django and workers
  - Configuration in `.env.local`: `CELERY_BROKER_URL=amqp://admin:admin@broker:5672/vhost`
  - Queues tasks from Django to be processed by workers
- **Celery App** (`core/celery.py`): Main configuration
  - Auto-discovers tasks in `tasks.py` from all Django apps
  - Loaded on Django startup via `core/__init__.py`
- **Tasks** (`apps/client/tasks.py`): Asynchronous task definitions
  - Use the `@shared_task` decorator to define tasks
  - Example: `collect_post_by_client(client_id)` creates 200 posts asynchronously
- **Result Backend**: Stores task results in the database
  - Uses the `django-celery-results` package
  - Configuration: `CELERY_RESULT_BACKEND="django-db"`
  - Allows checking task status and retrieving results
- **Flower**: Real-time monitoring dashboard
  - Access at http://localhost:5555
  - Monitor tasks, workers, and queue status
The project includes a Django admin action in `apps/client/admin.py:11-14`:

```python
@admin.action(description="Run collect post task")
def collect_post(modeladmin, request, queryset):
    for c in queryset.all():
        collect_post_by_client.delay(c.id)  # Sends task to broker
```

Usage:
- Go to http://localhost:8000/admin/client/client/
- Select one or more clients
- Choose "Run collect post task" from the actions dropdown
- The task will be sent to RabbitMQ and processed by a Celery worker
You can trigger tasks from any Django view:

```python
# apps/client/views.py
from celery.result import AsyncResult
from django.http import JsonResponse

from .tasks import collect_post_by_client


def trigger_collect_posts(request, client_id):
    """Trigger an async task to collect posts for a client."""
    task = collect_post_by_client.delay(client_id)
    return JsonResponse({
        'task_id': task.id,
        'status': 'Task sent to broker',
        'client_id': client_id
    })


def check_task_status(request, task_id):
    """Check the status of a running task."""
    task = AsyncResult(task_id)
    return JsonResponse({
        'task_id': task_id,
        'status': task.status,  # PENDING, STARTED, SUCCESS, FAILURE, RETRY
        'result': task.result if task.ready() else None
    })
```

Add these views to your `urls.py`:
```python
from django.urls import path

from apps.client import views

urlpatterns = [
    path('collect-posts/<int:client_id>/', views.trigger_collect_posts),
    path('task-status/<str:task_id>/', views.check_task_status),
]
```

- **User action**: Admin selects a client and runs "Run collect post task"
- **Django**: Calls `collect_post_by_client.delay(client_id)`
- **Celery**: Serializes the task and sends it to the RabbitMQ broker
- **RabbitMQ**: Queues the task message
- **Celery Worker**: Picks up the task from the queue and executes it
- **Task execution**: Creates 200 posts in the database
- **Result**: Stores the task result in the `django_celery_results` table
- **Monitoring**: View progress in the Flower dashboard
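As an illustration only, the broker → worker → result-backend flow above can be simulated with standard-library primitives (no Celery or RabbitMQ involved; all names here are stand-ins):

```python
# Toy simulation of the flow: producer -> queue (broker) -> worker -> result backend.
import queue
import threading

broker = queue.Queue()   # stands in for RabbitMQ
result_backend = {}      # stands in for the django_celery_results table


def collect_post_by_client(client_id):
    # Stand-in for the real task body.
    return {"client_id": client_id, "posts_created": 200}


def worker():
    # Stands in for a Celery worker: pull messages off the queue and execute them.
    while True:
        task_id, func, args = broker.get()
        result_backend[task_id] = {"status": "SUCCESS", "result": func(*args)}
        broker.task_done()


def delay(task_id, func, *args):
    # Stands in for .delay(): enqueue the message and return immediately.
    broker.put((task_id, func, args))
    return task_id


threading.Thread(target=worker, daemon=True).start()

task_id = delay("abc-123", collect_post_by_client, 42)
broker.join()  # real code would poll AsyncResult(task_id) instead of blocking
```

The key property the simulation shows is that `delay()` returns before the work is done; the result only appears in the backend once a worker has processed the message.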