Python test development - Django (160): Celery scheduled tasks (beat)


Celery can execute tasks asynchronously or trigger them on a schedule.

Environmental preparation

Redis is used as the message broker; the Django version used here is v2.1.2.
Install the third-party packages Django needs, paying attention to the version numbers:

pip install celery==3.1.26.post2
pip install django-celery==3.3.1
pip install redis==2.10.6

Refer to the previous chapters for the detailed basic tutorials.
This chapter mainly covers how to implement scheduled tasks with Celery beat.

Five roles of Celery

  • Task: a unit of work, covering both asynchronous tasks and Celery beat scheduled tasks.
  • Broker: receives messages (Tasks) from the producer and stores them in a queue; the consumer of a Task is a Worker. Celery itself does not provide a queue service; Redis or RabbitMQ is recommended.
  • Worker: the unit that executes tasks. It monitors the message queue in real time; when a task arrives, it obtains and executes it.
  • Beat: the scheduled-task scheduler, which sends tasks to the Broker at the configured times.
  • Backend: stores the execution results of tasks.
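The five roles above can be sketched in plain Python, using stdlib queues in place of Redis/RabbitMQ. This is an illustrative in-process model only; `submit_task` and `run_worker` are made-up names, not Celery APIs:

```python
import queue

broker = queue.Queue()   # Broker: holds queued tasks
backend = {}             # Backend: stores task results

def add(x, y):           # Task: the unit of work
    return x + y

def submit_task(task_id, func, args):
    """Producer (or Beat, when called on a schedule): push a task to the broker."""
    broker.put((task_id, func, args))

def run_worker():
    """Worker: pull tasks off the broker, execute them, store the results."""
    while not broker.empty():
        task_id, func, args = broker.get()
        backend[task_id] = func(*args)

submit_task('t1', add, (11, 12))
run_worker()
print(backend['t1'])  # 23
```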

Using Celery in Django

To use Celery in a Django project, you must first define an instance of the Celery library (called an "application").

If you have a modern Django project layout, for example:

- proj/
  - proj/

The recommended method is to create a new proj/proj/celery.py module to define the Celery instance:

import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django apps.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')

Here debug_task is a test task; if you don't need it, it can be commented out:

# @app.task(bind=True)
# def debug_task(self):
#     print('Request: {0!r}'.format(self.request))

In the snippet above you only need to change this line, where 'proj' is the name of your own Django project:

app = Celery('proj')

Then you need to import this application in your proj/proj/__init__.py module. This ensures the application is loaded when Django starts, so that the @shared_task decorator (mentioned later) will use it:


# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)

The above snippet is boilerplate and doesn't need to be changed.


Create a new tasks.py under the app. The file name must be tasks.py, because Django automatically finds tasks.py files under each app:

from __future__ import absolute_import
from celery import shared_task

@shared_task
def add(x, y):
    return x + y

@shared_task
def mul(x, y):
    return x * y

This tasks.py defines the task functions add and mul. The most direct way to make them effective is to apply the app.task or shared_task decorator.
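Functions decorated with shared_task remain callable as plain Python functions, so you can check their logic synchronously before wiring up a worker. A minimal sketch with plain functions standing in for the decorated tasks (with Celery running, add.delay(11, 12) would enqueue the call instead):

```python
# Plain-function stand-ins for the decorated tasks in tasks.py;
# calling them directly is what a worker ultimately does.
def add(x, y):
    return x + y

def mul(x, y):
    return x * y

print(add(11, 12))  # 23
print(mul(11, 2))   # 22
```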

Add the following configuration to settings.py:

CELERY_TIMEZONE = 'Asia/Shanghai'

# Configure the Celery connection to redis
BROKER_URL = 'redis://'

# Configure scheduled tasks
from celery.schedules import crontab
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'add': {
        'task': 'yoyo(django app name).tasks.add',  # task
        'schedule': timedelta(seconds=5),  # run the add function every 5 seconds
        'args': (11, 12)  # task arguments
    },
    'mul': {
        'task': 'yoyo(django app name).tasks.mul',  # task
        'schedule': timedelta(seconds=10),  # run the mul function every 10 seconds
        'args': (11, 2)  # task arguments
    },
}

CELERYBEAT_SCHEDULE configures the scheduled tasks; you can add multiple tasks. The task name can match the function name in tasks.py, or you can define your own name.

  • task: the dotted path of the task function in the tasks.py file of the corresponding app directory
  • schedule: the run cycle; crontab expressions are supported
  • args: the arguments passed when running the task
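What timedelta(seconds=5) means in practice: after each run, Beat waits that interval before queuing the task again. A stdlib-only sketch computing the first few times the 'add' task would be queued (the start time here is arbitrary):

```python
from datetime import datetime, timedelta

# Same interval as 'schedule': timedelta(seconds=5) in CELERYBEAT_SCHEDULE
interval = timedelta(seconds=5)
start = datetime(2021, 10, 21, 12, 0, 0)

# First three times the 'add' task would be queued after the start time
runs = [start + i * interval for i in range(1, 4)]
print([r.second for r in runs])  # [5, 10, 15]
```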

Start the worker and beat services

Start the worker and execute the task

celery -A MyDjango(django entry name) worker -l info

Run log

D:\202107django\MyDjango>celery -A MyDjango worker -l info

 -------------- celery@DESKTOP-HJ487C8 v3.1.26.post2 (Cipater)
---- **** -----
--- * ***  * -- Windows-10-10.0.17134-SP0
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         yoyo:0x219342ff978
- ** ---------- .> transport:   redis://
- ** ---------- .> results:     redis://
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ----
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery

  . yoyo.tasks.add
  . yoyo.tasks.mul

[2021-10-21 12:20:32,079: INFO/MainProcess] Connected to redis://
[2021-10-21 12:20:32,167: INFO/MainProcess] mingle: searching for neighbors
[2021-10-21 12:20:33,700: INFO/MainProcess] mingle: all alone

Start beat to monitor the scheduled tasks

celery -A MyDjango(django entry name) beat -l info

start log

MyDjango>celery -A MyDjango beat -l info
celery beat v3.1.26.post2 (Cipater) is starting.
__    -    ... __   -        _
Configuration ->
    . broker -> redis://
    . loader ->
    . scheduler -> celery.beat.PersistentScheduler
    . db -> celerybeat-schedule
    . logfile -> [stderr]@%INFO
    . maxinterval -> now (0s)
[2021-10-21 12:22:56,867: INFO/MainProcess] beat: Starting...

After startup you will see the beat running log; the scheduled tasks have been pushed to the queue.

The worker run log then shows the tasks being executed.

crontab periodic tasks

The previous step set tasks to run every few seconds, which is only for testing the mechanism; the task itself is very simple. In practice we usually use crontab to implement periodic tasks. For example, running a task once every morning between 1 and 5 a.m. is easy to express with crontab.

# crontab task
# Call tasks.add at 8:30 every Monday
from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    # Executes every Monday morning at 8:30 A.M.
    'add': {
        'task': 'yoyo(django app name).tasks.add',  # task
        'schedule': crontab(hour=8, minute=30, day_of_week=1),
        'args': (11, 12)  # task arguments
    },
}
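To sanity-check what crontab(hour=8, minute=30, day_of_week=1) means, here is a stdlib-only sketch (not Celery's scheduler) that finds the next Monday 08:30 after a given moment. Note that Python's datetime.weekday() numbers Monday as 0, while Celery's crontab numbers Sunday as 0 and Monday as 1:

```python
from datetime import datetime, timedelta

def next_monday_830(now):
    # Start from 08:30 on the current day, then step forward one day
    # at a time until we land on a Monday strictly after `now`.
    candidate = now.replace(hour=8, minute=30, second=0, microsecond=0)
    while candidate.weekday() != 0 or candidate <= now:
        candidate += timedelta(days=1)
    return candidate

# 2021-10-21 was a Thursday, so the next firing is Monday 2021-10-25
print(next_monday_830(datetime(2021, 10, 21, 12, 0)))  # 2021-10-25 08:30:00
```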

crontab scheduled task field rules:

  • minute: the minute, any integer from 0 to 59.
  • hour: the hour, any integer from 0 to 23.
  • day: the day of the month, any integer from 1 to 31.
  • month: the month, any integer from 1 to 12.
  • week: the day of the week, any integer from 0 to 7, where 0 or 7 represents Sunday.
  • command: the command to execute, either a system command or a script you have written.
  • path: the file to execute, given as an absolute path.

Special characters commonly used by crontab commands

  • * represents any value
  • , separates values in a list
  • - indicates a range; for example, 1-5 in the hour field means from 1 to 5 o'clock
  • /n means execute once every n units; for example, */1 in the hour field means execute every hour, which can also be written as 1-23/1
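The special characters above can be demonstrated with a tiny matcher for a single crontab field. This is an illustrative sketch, not Celery's or cron's actual parser (it does not handle combined forms like 1-23/1):

```python
def field_matches(field, value):
    """Return True if `value` satisfies one crontab field expression."""
    for part in field.split(','):        # ',' separates alternatives
        if part == '*':                  # '*' matches any value
            return True
        if part.startswith('*/'):        # '*/n': every n units
            if value % int(part[2:]) == 0:
                return True
        elif '-' in part:                # 'a-b': an inclusive range
            lo, hi = map(int, part.split('-'))
            if lo <= value <= hi:
                return True
        elif int(part) == value:         # a literal value
            return True
    return False

print(field_matches('*/15', 30))  # True: 30 is a multiple of 15
print(field_matches('1-5', 3))    # True: 3 falls within 1-5
print(field_matches('0,30', 15))  # False: 15 is neither 0 nor 30
```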

If the scheduled tasks need to be configurable and stored in the database, this can be implemented with djcelery.

Tags: Python Django Back-end

Posted on Sat, 23 Oct 2021 00:41:28 -0400 by cody44