Working with Celery for Background Tasks in Django


Celery is an asynchronous task queue/job queue based on distributed message passing. It is used to handle background tasks in Django applications, such as sending emails, processing files, or performing long-running computations without blocking the main application. Celery allows you to offload time-consuming tasks and keep your web application responsive.

1. Introduction to Celery

Celery works by defining tasks that are executed outside of the request-response cycle. These tasks run asynchronously in the background, either immediately or after a delay. Celery is typically used together with a message broker such as Redis or RabbitMQ, which queues tasks and delivers them to worker processes.

In Django, Celery can be integrated to run background tasks, such as:

  • Sending email notifications
  • Processing images or videos
  • Performing long-running calculations
  • Integrating with external APIs

2. Installing Celery in Django

To start using Celery with Django, you need to install the Celery package and the Python client for your message broker (Redis in this article; the Redis server itself must also be installed and running). You can install both Python packages using pip:

    pip install celery redis

Once installed, you need to configure Celery in your Django project.

3. Setting Up Celery in Django

First, configure Celery in your Django project by creating a celery.py file inside your project package (the directory that contains settings.py).
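
Assuming the project is named your_project and has a single app named your_app (placeholder names used throughout this article), the relevant files are laid out like this:

    your_project/
        manage.py
        your_project/
            __init__.py
            settings.py
            urls.py
            celery.py      # the new file described below
        your_app/
            __init__.py
            tasks.py       # Celery tasks for this app (see section 4)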

Example: Configuring Celery

    from __future__ import absolute_import, unicode_literals
    import os
    from celery import Celery

    # Set the default Django settings module for the 'celery' program.
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'your_project.settings')

    app = Celery('your_project')

    # Using a string here means the worker doesn't have to serialize
    # the configuration object to child processes.
    # - namespace='CELERY' means all celery-related config keys should have a `CELERY_` prefix.
    app.config_from_object('django.conf:settings', namespace='CELERY')

    # Load task modules from all registered Django app configs.
    app.autodiscover_tasks()

    @app.task(bind=True)
    def debug_task(self):
        print('Request: {0!r}'.format(self.request))

In the code above:

  • os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'your_project.settings') ensures that the settings module is set for the Celery instance.
  • app.config_from_object('django.conf:settings', namespace='CELERY') loads the Celery configuration from Django settings, using keys prefixed with CELERY_ (see the settings sketch after this list).
  • app.autodiscover_tasks() tells Celery to look for tasks in all Django apps that have a tasks.py file.
  • debug_task() is a simple example task, bound to the app, that prints its own request; it is useful for checking that the worker is wired up correctly.
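
Because config_from_object reads its configuration from Django settings with the CELERY_ prefix, the broker connection is defined in settings.py. A minimal sketch, assuming Redis is running locally on its default port (adjust the URL for your environment):

    # settings.py
    CELERY_BROKER_URL = 'redis://localhost:6379/0'

    # Optional: store task results in Redis as well.
    CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'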

Next, update the __init__.py file in the same project package (alongside settings.py and celery.py) to make sure that the Celery app is loaded when Django starts:

Example: Update __init__.py

    from __future__ import absolute_import, unicode_literals

    # This will make sure the app is always imported when
    # Django starts so that shared_task will use this app.
    from .celery import app as celery_app

    __all__ = ('celery_app',)

4. Creating Tasks in Django

Once Celery is set up, you can start creating tasks in your Django apps. A task is a function that you want to run in the background.

Example: Defining a Celery Task (your_app/tasks.py)

    from celery import shared_task

    @shared_task
    def send_email_task(email, subject, message):
        # Simulating sending an email (you can integrate an email sending service here)
        print(f"Sending email to {email} with subject '{subject}' and message: {message}")
        return f"Email sent to {email}"

In this example:

  • @shared_task is a decorator that marks the function as a Celery task.
  • The send_email_task function simulates sending an email; you can replace the print statement with actual email sending logic (e.g., using Django's send_mail function). How to enqueue the task from your application code is shown in the sketch below.
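
Defining a task does not execute it; your application code has to enqueue it. Below is a minimal sketch of calling the task from a Django view; the view and the arguments are hypothetical, only send_email_task comes from the example above:

    # your_app/views.py
    from django.http import HttpResponse

    from .tasks import send_email_task

    def signup_confirmation(request):
        # .delay() enqueues the task and returns immediately;
        # a Celery worker picks it up and runs it in the background.
        send_email_task.delay('user@example.com', 'Welcome', 'Thanks for signing up!')

        # .apply_async() offers more control, e.g. running the task after a delay.
        send_email_task.apply_async(
            args=('user@example.com', 'Reminder', 'Your trial ends soon.'),
            countdown=60,  # run roughly one minute from now
        )

        return HttpResponse('Emails queued')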

5. Running Celery Worker

To process the tasks asynchronously, you need to start a Celery worker. The worker is responsible for executing tasks from the task queue. Run the following command to start the Celery worker:

    celery -A your_project worker --loglevel=info

In this command, your_project is the name of your Django project, and the worker command starts the worker process that listens for and executes tasks.

6. Running Periodic Tasks with Celery Beat

Celery also supports periodic tasks, which are tasks that need to run on a schedule, such as sending daily reports or cleaning up temporary files. To manage periodic tasks, you can use Celery Beat; the schedule is configured on the same Celery app object you created in celery.py.

Example: Setting Up Celery Beat

    # In your_project/celery.py, after the app has been configured:
    from celery.schedules import crontab

    app.conf.beat_schedule = {
        'send-daily-emails': {
            'task': 'your_app.tasks.send_email_task',
            'schedule': crontab(hour=7, minute=0),  # Runs every day at 7:00 in the configured timezone
            'args': ('example@example.com', 'Daily Report', 'Your daily report is ready.'),
        },
    }

    app.conf.timezone = 'UTC'

In this example:

  • crontab(hour=7, minute=0) is used to schedule the task to run daily at 7 AM.
  • The beat_schedule dictionary defines periodic tasks and their schedules.

To run Celery Beat, use the following command in addition to the Celery worker:

    celery -A your_project beat --loglevel=info
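
In production, the worker and Beat should run as separate processes, as shown above. For local development, Celery's worker command also accepts a -B/--beat flag that embeds the scheduler in the worker process (convenient, but not recommended for production):

    celery -A your_project worker -B --loglevel=info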

7. Monitoring Celery Tasks

Celery exposes events that external tools can use to monitor task execution and manage workers and beat schedules. A popular choice is Flower, a web-based tool for real-time monitoring of Celery tasks and workers.

Example: Installing Flower

    pip install flower

To start Flower, run:

    celery -A your_project flower

This will start Flower at http://localhost:5555 (the default port), where you can monitor your tasks, workers, and schedules in real time.

8. Conclusion

Celery is a powerful tool for handling background tasks in Django, allowing you to offload time-consuming tasks to a separate process. With Celery, you can execute tasks asynchronously, run periodic tasks, and manage task queues with a message broker. Whether you are sending emails, processing data, or interacting with external APIs, Celery can improve the performance and scalability of your Django application.




