
Distributed Task Queue with Celery

Luca Berton
2 min read · Feb 14, 2025

Celery is a distributed task queue that allows you to run background tasks asynchronously, distribute work across multiple worker nodes, and handle long-running processes efficiently.

1. Core Components of Celery

Celery consists of the following key components:

  • Broker: Handles message queuing (Redis, RabbitMQ, Amazon SQS, etc.).
  • Workers: Execute tasks from the queue.
  • Result Backend: Stores task results (Redis, PostgreSQL, RabbitMQ, etc.).
  • Task Queue: Holds pending tasks before execution.
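The interplay between these components can be sketched with a stdlib-only toy: a queue.Queue stands in for the broker's task queue, a dict for the result backend, and a thread for a worker. This is an illustration of the pattern, not Celery itself; a real deployment replaces each piece with a distributed equivalent (Redis/RabbitMQ, worker processes on many machines, a persistent backend).

```python
import queue
import threading

task_queue = queue.Queue()  # stands in for the broker's task queue
results = {}                # stands in for the result backend

def worker():
    """Consume tasks from the queue and store their results."""
    while True:
        task_id, func, args = task_queue.get()
        if func is None:  # sentinel: shut the worker down
            break
        results[task_id] = func(*args)
        task_queue.task_done()

# Start one worker thread; Celery runs many, across machines
t = threading.Thread(target=worker)
t.start()

# Enqueue a task, just as a Celery client would
task_queue.put(("task-1", lambda x, y: x + y, (4, 6)))

task_queue.join()                   # wait for pending tasks to finish
task_queue.put((None, None, None))  # stop the worker
t.join()

print(results["task-1"])  # → 10
```

The client never blocks on the computation itself; it only enqueues work and later reads the result, which is exactly the decoupling Celery provides at scale.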

2. Installing Celery

To install Celery with Redis as a message broker:

pip install celery redis

If using RabbitMQ, no extra client package is needed, because the AMQP transport ships with Celery itself:

pip install celery

3. Setting Up a Basic Celery Application

Create a file celery_app.py:

from celery import Celery

app = Celery(
    'tasks',
    broker='redis://localhost:6379/0',   # Message queue (Redis)
    backend='redis://localhost:6379/0',  # Result storage
)

@app.task
def add(x, y):
    return x + y
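Once a worker process is running and the Redis instance above is reachable, the task can be enqueued from any Python process. A minimal sketch, assuming both services are up:

```python
from celery_app import add

# .delay() enqueues the task and returns an AsyncResult immediately,
# without waiting for a worker to pick it up
result = add.delay(4, 6)

# .get() blocks until a worker has executed the task and the
# result backend holds the value (here, 10)
print(result.get(timeout=10))
```

Note that add.delay(4, 6) is shorthand for add.apply_async(args=(4, 6)); the latter additionally accepts options such as a countdown or a target queue.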

4. Running Celery Workers
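A worker that consumes tasks from the broker is started with the Celery CLI, pointing -A at the module that defines the app:

```shell
# Start a worker for the app defined in celery_app.py
celery -A celery_app worker --loglevel=info
```

The worker connects to the configured broker and begins executing queued tasks; add --concurrency=N to control how many child processes handle tasks in parallel.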
