
Django+Django-Celery+Celery Integration in Practice

This post came out of a plan to use Django to build a scheduled-task platform, with features such as rotating the name of the duty officer on a schedule or running scripts at fixed times. After stepping into countless pits found via Baidu, I finally managed to piece this stack together and deploy it. My English is not good; if your English is strong, or you want to study or use this in depth, I recommend the official documentation. This record is not necessarily correct either; it only aims to reproduce crontab-like functionality.

For those who wish to study in depth, see the official documentation: /docs/celery/

To start with a brief introduction: Celery is a powerful distributed task queue that allows tasks to be executed completely independently of the main program, and even to be dispatched to run on other hosts. We usually use it to implement asynchronous tasks (async tasks) and timed tasks (crontab). Its architecture is shown in the following diagram.

[Figure: Celery architecture diagram]

As you can see, Celery contains the following modules:

Task

Contains asynchronous tasks and timed tasks. Asynchronous tasks are usually triggered in business logic and sent to the task queue, while timed tasks are periodically sent to the task queue by the Celery Beat process.

Message Middleware Broker

The broker is the task scheduling queue: it receives messages (i.e., tasks) from task producers and stores them in the queue. Celery itself does not provide queuing services; the officially recommended options include RabbitMQ and Redis.

Task execution unit Worker

The worker is the processing unit that executes tasks: it monitors the message queue in real time, fetches the tasks scheduled into the queue, and executes them.

Task result storage Backend

The backend stores the results of task execution so that they can be queried later. Like the message middleware, it can be backed by RabbitMQ, Redis, or MongoDB.
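For illustration, a Redis result backend would be a single line in the Django settings. Note that this is my own example; the configuration used later in this post does not set a result backend:

CELERY_RESULT_BACKEND = 'redis://192.168.217.77:16379/9'  # illustrative only; not part of this post's setup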

Asynchronous tasks
Implementing asynchronous tasks with Celery involves three main steps (a minimal sketch follows the list):

  • Creating a Celery Instance
  • Starting the Celery Worker
  • Application calls asynchronous tasks
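As a minimal sketch of these three steps, here is a standalone Celery app outside Django; the module name and broker URL are my own illustration:

# demo.py -- a minimal sketch; the module name and broker URL are illustrative
from celery import Celery

# step 1: create a Celery instance bound to a broker
app = Celery('demo', broker='redis://127.0.0.1:6379/0')

@app.task
def add(x, y):
    return x + y

# step 2: start a worker in a shell:   celery -A demo worker -l info
# step 3: the application calls the task asynchronously:   add.delay(2, 3)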

I. Quick Start

Local Environment:

  • OS: CentOS 6.5
  • django-1.9
  • python-2.7.11
  • celery==3.1.20
  • django-celery

Installation of Python, pip, and Django itself is not covered in detail here; a quick Baidu search will turn it up.

pip install django==1.9    # install django
pip install celery==3.1.20 # install celery
pip install django-celery  # install django-celery

If an installation fails, resolve the required dependencies yourself (for example: mysql-python, etc.).
We use Redis as the message middleware; to install Redis, see: https:///article/
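With Redis running, a quick connectivity check against the broker address used below can save debugging time later; a sketch assuming the redis-py package is installed (pip install redis):

# quick broker connectivity check -- assumes the redis-py package is installed
import redis

r = redis.StrictRedis(host='192.168.217.77', port=16379, db=8)
print(r.ping())  # True means the broker is reachable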

II. Create a Django project and start testing

1. Create a Django project named djtest1

django-admin.py startproject djtest1

2. Create an app named apps1

cd djtest1
python manage.py startapp apps1

3. After creation, the Django directory structure is as follows:

djtest1
├── apps1
│   ├── __init__.py
│   ├── admin.py
│   ├── apps.py
│   ├── migrations
│   │   └── __init__.py
│   ├── models.py
│   ├── tests.py
│   └── views.py
├── djtest1
│   ├── __init__.py
│   ├── __init__.pyc
│   ├── settings.py
│   ├── settings.pyc
│   ├── urls.py
│   └── wsgi.py
└── manage.py

4. Modify the Django configuration file djtest1/djtest1/settings.py and add the following:

import djcelery ###
djcelery.setup_loader() ###
CELERY_TIMEZONE='Asia/Shanghai' # there is no Beijing time zone; keep this consistent with TIME_ZONE below
BROKER_URL='redis://192.168.217.77:16379/8' # any reachable redis will do; it does not have to be on the host running the django server
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler' ### store the schedule in the database so it can be managed from django-admin

INSTALLED_APPS = (
  'django.contrib.admin',
  'django.contrib.auth',
  'django.contrib.contenttypes',
  'django.contrib.sessions',
  'django.contrib.messages',
  'django.contrib.staticfiles',
  'djcelery',  ### add the djcelery app
  'apps1',   ### add the newly created apps1
)
TIME_ZONE='Asia/Shanghai' ###

Add the configuration above at the beginning of the settings file, adjust the Redis address and port as appropriate, and be sure to set the time zone to Asia/Shanghai; otherwise inaccurate times will affect the running of the scheduled tasks.

The code above first imports the djcelery module and calls its setup_loader method to load the relevant configuration. Pay attention to the time zone settings; otherwise the default UTC time will run 8 hours behind UTC+8. The two entries added at the end of INSTALLED_APPS register the celery service (djcelery) and our own apps1 service respectively.

5. Create the celery file: djtest1/djtest1/celery.py

#!/bin/python
from __future__ import absolute_import

import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'djtest1.settings')
# Specifying the settings here means the celery command line program will know where your Django project is.
# This statement must always appear before the app instance is created, which is what we do next:
from django.conf import settings

app = Celery('djtest1')

app.config_from_object('django.conf:settings')
# This means that you don't have to use multiple configuration files, and instead configure Celery directly from the Django settings.
# You can pass the object directly here, but using a string is better since then the worker doesn't have to serialize the object.

app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
# With the line above Celery will automatically discover tasks in reusable apps if you define all tasks in a separate tasks.py module.
# Each tasks.py should live in an app directory that is listed in INSTALLED_APPS in settings.py,
# so you do not have to manually add the individual modules to CELERY_IMPORTS in settings.py.

@app.task(bind=True)
def debug_task(self):
  print('Request: {0!r}'.format(self.request))  # dumps its own request information
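Once a worker is running (started in step 10 below), the debug task can be triggered from a Django shell as a quick sanity check; this is my own example, not part of the original post:

python manage.py shell
>>> from djtest1.celery import debug_task
>>> debug_task.delay()  # the worker console should print the request info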

6. Modify djtest1/djtest1/__init__.py

#!/bin/python
from __future__ import absolute_import
 
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

7. Next, write the app you want Django to run; in this post that is apps1, which is registered in INSTALLED_APPS so that autodiscover_tasks can find it. I originally hoped to organize the apps like this: put all apps under a top-level apps directory, each app in its own subdirectory (like app1 and app2), and each app with its own __init__.py and tasks.py (note that every app needs an __init__.py, which may be blank). But with that structure, startup reported an error saying that the module apps could not be found. After I added an __init__.py under apps, the error went away, but the task functions in each app under apps were still not found by Django and the celery worker.

So instead I placed a blank __init__.py directly under apps1 and wrote all the task functions in apps1/tasks.py, as follows.
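The original listing is not reproduced here; below is a minimal sketch of what apps1/tasks.py might contain (the task names and bodies are my own illustration):

# apps1/tasks.py -- a minimal sketch; task names and bodies are illustrative
from __future__ import absolute_import

from celery import shared_task

@shared_task
def add(x, y):
    # a trivial task that can be scheduled from django-admin
    return x + y

@shared_task
def rotate_duty_officer():
    # placeholder for the duty-officer rotation mentioned in the introduction
    print('rotating the duty officer...')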

8. Synchronize the database

python manage.py makemigrations
python manage.py migrate

9. Create a superuser

python manage.py createsuperuser
 
Username (leave blank to use 'work'): admin
Email address: yyc@
Password: 
Password (again): 
Superuser created successfully.

10. Start the Django web server, the celery beat process, and the celery worker process

python manage.py runserver 0.0.0.0:8001  # start the django application; tasks can then be managed dynamically via django-admin
python manage.py celery beat             # monitors the task schedule and dispatches tasks that are due
python manage.py celery worker -c 6 -l debug  # the task execution process (worker), with 6 concurrent processes

11. Add the registered tasks via django-admin and see if the output is normal.

http://192.168.217.77:8001/admin/ — enter the username and password to log in

(1) Add tasks after logging in:

[Screenshot: django-admin home page with the periodic-task list marked by a red line]

Click the entry marked with the red line, then create a task via Add;

(2)

[Screenshot: the list of existing tasks with the Add button]

Once you click on it, you can see the tasks that already exist; just click Add;

(3)

[Screenshot: the Add periodic task form]

Follow the prompts: enter a Name, and select the registered function via Task (registered).
Choose the run mode: Interval for running every so often, or Crontab for cron-style scheduling.
Click Arguments (show) to add the arguments that need to be passed into the registered function.
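The same form can also be filled in programmatically through djcelery's models, which is handy for scripting; a sketch assuming a registered task named apps1.tasks.add exists:

# programmatic equivalent of the admin form -- assumes apps1.tasks.add is registered
from djcelery.models import IntervalSchedule, PeriodicTask

# run every 30 seconds
schedule, _ = IntervalSchedule.objects.get_or_create(every=30, period='seconds')
PeriodicTask.objects.create(
    name='add-every-30s',    # the Name field
    task='apps1.tasks.add',  # the Task (registered) field
    interval=schedule,       # the Interval run mode
    args='[2, 3]',           # Arguments, as a JSON list
)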

(4)

[Screenshot: the saved task instance with its name, schedule, and incoming parameters]

This shows the task instance in detail: its specific name as well as its run schedule, incoming parameters, and so on.

(5)

[Screenshot: the periodic task list after saving]

After saving, you can view the list.

(6) The startup window of python manage.py celery worker -c 6 -l debug shows output like the following at runtime, proving that the task has taken effect.

[Screenshot: celery worker console output showing the scheduled task running]

In the first line, marked in red, you can see the registered function being called, and in the second line, marked in red, you can see the return value of the function.

With that, the setup is basically complete. In practice we only need to modify or add functions in tasks.py so that they get registered; tasks created from the django-admin frontend are then loaded dynamically, and with the correct parameters they execute normally. On top of this django-celery toolchain we can implement the things we wanted: scheduled backups, a unified crontab management platform, and more.

Reference Article:

/vintage_1/article/details/47664297
/docs/celery/getting-started/
/p/f78ed01969b3
/p/b7f843f21c46

This concludes this article on Django + Django-Celery + Celery integration in practice. For more related Django and Celery content, please search my previous articles or continue browsing the related articles below. I hope you will support me in the future!