
Using Celery and Docker to Handle Periodic Tasks in Django


This article explains how to use Celery and Docker to handle periodic tasks in Django.

When building and extending Django applications, it is inevitable that certain tasks need to be run automatically in the background on a regular basis.

Some examples:

Generate periodic reports

Clear the cache

Send bulk email notifications

Perform nightly maintenance work

This is one of the few pieces of functionality required for building and scaling a web application that is not part of the Django core. Fortunately, Celery provides a powerful solution that is fairly easy to implement, called Celery Beat.

In this article, we will show you how to set up Django, Celery, and Redis with Docker so that custom Django Admin commands can be run periodically via Celery Beat.

Dependencies:

Django v3.0.5

Docker v19.03.8

Python v3.8.2

Celery v4.4.1

Redis v5.0.8

Django + Celery series:

Asynchronous tasks for Django and Celery

Use Celery and Docker to handle periodic tasks in Django (this article!)

Objectives

At the end of this tutorial, you should be able to:

Containerize Django, Celery, and Redis with Docker

Integrate Celery into Django applications and create tasks

Write custom Django Admin commands

Schedule custom Django Admin commands to run periodically through Celery Beat

Project setup

Clone the base project from the django-celery-beat repository and check out the base branch:

```shell
$ git clone https://github.com/testdrivenio/django-celery-beat --branch base --single-branch
$ cd django-celery-beat
```

Since we need to manage four processes in total (Django, Redis, a Celery worker, and the Celery Beat scheduler), we will use Docker to simplify the workflow by wiring them together so that they can all be run from one terminal window with a single command.

Create an image from the project root and start the Docker container:

```shell
$ docker-compose up -d --build
$ docker-compose exec web python manage.py migrate
```

Once the build is complete, navigate to http://localhost:1337 to ensure that the application works as expected. You should see the following text:

Orders

No orders found!

Project structure:

```
├── .gitignore
├── docker-compose.yml
└── project
    ├── Dockerfile
    ├── core
    │   ├── __init__.py
    │   ├── asgi.py
    │   ├── settings.py
    │   ├── urls.py
    │   └── wsgi.py
    ├── entrypoint.sh
    ├── manage.py
    ├── orders
    │   ├── __init__.py
    │   ├── admin.py
    │   ├── apps.py
    │   ├── migrations
    │   │   ├── 0001_initial.py
    │   │   └── __init__.py
    │   ├── models.py
    │   ├── tests.py
    │   ├── urls.py
    │   └── views.py
    ├── requirements.txt
    └── templates
        └── orders
            └── order_list.html
```

Celery and Redis

Now we need to add containers for Celery, Celery Beat, and Redis.

First, add the dependencies to the requirements.txt file:

```
Django==3.0.5
celery==4.4.1
redis==3.4.1
```

Then, add the following services to the docker-compose.yml file:

```yaml
redis:
  image: redis:alpine
celery:
  build: ./project
  command: celery -A core worker -l info
  volumes:
    - ./project/:/usr/src/app/
  environment:
    - DEBUG=1
    - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
    - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
  depends_on:
    - redis
celery-beat:
  build: ./project
  command: celery -A core beat -l info
  volumes:
    - ./project/:/usr/src/app/
  environment:
    - DEBUG=1
    - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
    - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
  depends_on:
    - redis
```

We also need to update the depends_on part of the Web service:

```yaml
web:
  build: ./project
  command: python manage.py runserver 0.0.0.0:8000
  volumes:
    - ./project/:/usr/src/app/
  ports:
    - 1337:8000
  environment:
    - DEBUG=1
    - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
    - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
  depends_on:
    - redis  # NEW
```

The complete docker-compose file is as follows:

```yaml
version: '3.7'

services:
  web:
    build: ./project
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./project/:/usr/src/app/
    ports:
      - 1337:8000
    environment:
      - DEBUG=1
      - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
      - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
    depends_on:
      - redis
  redis:
    image: redis:alpine
  celery:
    build: ./project
    command: celery -A core worker -l info
    volumes:
      - ./project/:/usr/src/app/
    environment:
      - DEBUG=1
      - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
      - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
    depends_on:
      - redis
  celery-beat:
    build: ./project
    command: celery -A core beat -l info
    volumes:
      - ./project/:/usr/src/app/
    environment:
      - DEBUG=1
      - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
      - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
    depends_on:
      - redis
```

Before building the new container, we need to configure Celery in the Django application.

Celery configuration

Setup

In the core directory, create a celery.py file and add the following code:

```python
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "core.settings")

app = Celery("core")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
```

What's going on here?

First, we set a default value for the DJANGO_SETTINGS_MODULE environment variable so that Celery knows how to find the Django project.

Next, we create a new Celery instance named core and assign this value to a variable named app.

Then, we load the celery configuration value from the settings object of django.conf. We use namespace = "CELERY" to prevent conflicts with other Django settings. In other words, all configuration settings for Celery must be prefixed with CELERY_.

Finally, app.autodiscover_tasks () tells Celery to look for Celery tasks from the application defined in settings.INSTALLED_APPS.

Add the following code to core/__init__.py:

```python
from .celery import app as celery_app

__all__ = ("celery_app",)
```

Finally, update the core/settings.py file with the following Celery settings so that it can connect to Redis:

```python
CELERY_BROKER_URL = "redis://redis:6379"
CELERY_RESULT_BACKEND = "redis://redis:6379"
```

Build:

```shell
$ docker-compose up -d --build
```

View the logs:

```shell
$ docker-compose logs 'web'
$ docker-compose logs 'celery'
$ docker-compose logs 'celery-beat'
$ docker-compose logs 'redis'
```

If all goes well, we now have four containers, each providing a different service.

Now we are ready to create a sample task to see if it works properly.

Create a task

Create a new file, core/tasks.py, and add a sample task that just prints a message to the console:

```python
from celery import shared_task


@shared_task
def sample_task():
    print("The sample task just ran.")
```

Schedule tasks

At the end of the settings.py file, add the following code to schedule sample_task to run every minute using Celery Beat:

```python
CELERY_BEAT_SCHEDULE = {
    "sample_task": {
        "task": "core.tasks.sample_task",
        "schedule": crontab(minute="*/1"),
    },
}
```

Here, we defined a periodic task using the CELERY_BEAT_SCHEDULE setting. We gave the task the name sample_task and declared two settings:

task declares which task to run.

schedule sets the interval on which the task should run. This can be an integer, a timedelta, or a crontab. We used a crontab pattern to tell the task to run every minute. You can find more information about Celery scheduling in the Celery docs.

Be sure to add the import:

```python
from celery.schedules import crontab

import core.tasks
```
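For reference, the schedule value is not limited to crontab: as noted above, it also accepts a plain number of seconds or a timedelta. A hedged sketch of alternative entries (the task is the one defined in this project, but the interval choices are purely illustrative):

```python
from datetime import timedelta

from celery.schedules import crontab

# Each form below is a valid "schedule" value (intervals are illustrative):
CELERY_BEAT_SCHEDULE = {
    "run_every_30_seconds": {
        "task": "core.tasks.sample_task",
        "schedule": 30.0,  # plain number of seconds
    },
    "run_every_five_minutes": {
        "task": "core.tasks.sample_task",
        "schedule": timedelta(minutes=5),  # timedelta
    },
    "run_monday_mornings": {
        "task": "core.tasks.sample_task",
        "schedule": crontab(hour=7, minute=30, day_of_week=1),  # crontab
    },
}
```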

Restart the container and apply the changes:

```shell
$ docker-compose up -d --build
```

View the logs:

```
$ docker-compose logs -f 'celery'

celery_1  | [queues]
celery_1  | .> celery exchange=celery(direct) key=celery
celery_1  |
celery_1  | [tasks]
celery_1  |   . core.tasks.sample_task
```

We can see that Celery picked up the sample task, core.tasks.sample_task.

Every minute, you should see a row in the log that ends with "The sample task just ran.":

```
celery_1  | [2020-04-15 22:…: INFO/MainProcess]
              Received task: core.tasks.sample_task[8ee5a84f-c54b-4e41-945b-645765e7b20a]
celery_1  | [2020-04-15 22:…: WARNING/ForkPoolWorker-1] The sample task just ran.
```

Custom Django Admin command

Django provides many built-in django-admin commands, such as:

migrate

startproject

startapp

dumpdata

makemigrations

In addition to the built-in commands, Django gives us the option to create our own custom commands:

Custom management commands are especially useful for running standalone scripts or scripts that are periodically executed from the UNIX crontab or from the Windows scheduled tasks control panel.

Therefore, we will first configure a new command and then run it automatically using Celery Beat.

First, create a new file called orders/management/commands/my_custom_command.py. Then, add the minimum code required to run it:

```python
from django.core.management.base import BaseCommand, CommandError


class Command(BaseCommand):
    help = "A description of the command"

    def handle(self, *args, **options):
        pass
```

BaseCommand has a few methods that can be overridden, but the only one that is required is handle. handle is the entry point for custom commands; in other words, this method is called when we run the command.

For testing, we would normally just add a quick print statement. However, the Django documentation recommends using self.stdout.write instead:

When you are using management commands and wish to provide console output, you should write to self.stdout and self.stderr, instead of printing to stdout and stderr directly. By using these proxies, it becomes much easier to test your custom command. Note also that you don't need to end messages with a newline character; it will be added automatically, unless you specify the ending parameter.

Therefore, add a self.stdout.write command:

```python
from django.core.management.base import BaseCommand, CommandError


class Command(BaseCommand):
    help = "A description of the command"

    def handle(self, *args, **options):
        self.stdout.write("My sample command just ran.")  # NEW
```

Test:

```
$ docker-compose exec web python manage.py my_custom_command
My sample command just ran.
```
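The testability point from the documentation quote can be illustrated without Django at all. In this minimal sketch (a stand-in class, not Django's actual BaseCommand), output goes through an injectable self.stdout, so a test can pass in a StringIO and capture exactly what was written:

```python
import io
import sys


class Command:
    """Stand-in for a management command that writes to self.stdout."""

    def __init__(self, stdout=None):
        # Default to the real stdout, but allow any replacement stream.
        self.stdout = stdout or sys.stdout

    def handle(self):
        self.stdout.write("My sample command just ran.\n")


# A test can capture the output instead of scraping the console:
buf = io.StringIO()
Command(stdout=buf).handle()
assert buf.getvalue() == "My sample command just ran.\n"
```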

So, let's tie everything together!

Use Celery Beat to schedule custom commands

Now that we have the containers up and running, have tested that we can schedule a task to run periodically, and have written a custom Django Admin sample command, it is time to configure a custom command to run on a regular schedule.

Setup

In the project, we have a very basic application called orders. It contains two models, Product and Order. Let's create a custom command that sends an email report of the orders confirmed that day.

First, we will add some products and orders to the database through the fixtures included in this project:

```shell
$ docker-compose exec web python manage.py loaddata products.json
```

Create a superuser:

```shell
$ docker-compose exec web python manage.py createsuperuser
```

When prompted, fill in a username, email, and password. Then navigate to http://127.0.0.1:1337/admin in your web browser. Log in with the superuser you just created and create a couple of orders. Make sure at least one has today's date.

Let's create a new custom command for our email report.

Create a file called orders/management/commands/email_report.py:

```python
from datetime import timedelta, time, datetime

from django.core.mail import mail_admins
from django.core.management import BaseCommand
from django.utils import timezone
from django.utils.timezone import make_aware

from orders.models import Order

today = timezone.now()
tomorrow = today + timedelta(1)
today_start = make_aware(datetime.combine(today, time()))
today_end = make_aware(datetime.combine(tomorrow, time()))


class Command(BaseCommand):
    help = "Send Today's Orders Report to Admins"

    def handle(self, *args, **options):
        orders = Order.objects.filter(confirmed_date__range=(today_start, today_end))

        if orders:
            message = ""

            for order in orders:
                message += f"{order}\n"

            subject = (
                f"Order Report for {today_start.strftime('%Y-%m-%d')} "
                f"to {today_end.strftime('%Y-%m-%d')}"
            )

            mail_admins(subject=subject, message=message, html_message=None)

            self.stdout.write("E-mail Report was sent.")
        else:
            self.stdout.write("No orders confirmed today.")
```

In the code, we queried the database for orders whose confirmed_date falls within today's range, combined the orders into a single message for the email body, and used Django's built-in mail_admins command to send the email to the admins.
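The date arithmetic at the top of the command is worth unpacking. A minimal sketch using only the standard library (a fixed naive datetime stands in for timezone.now(), and make_aware is omitted for brevity):

```python
from datetime import datetime, time, timedelta

# Stand-in for timezone.now(); any moment during the day works.
now = datetime(2020, 4, 15, 23, 10, 45)
tomorrow = now + timedelta(1)

# Combining a date with time() (i.e. midnight) gives the day's boundaries.
today_start = datetime.combine(now, time())      # 2020-04-15 00:00:00
today_end = datetime.combine(tomorrow, time())   # 2020-04-16 00:00:00

# Any order confirmed "today" falls inside [today_start, today_end).
assert today_start <= now < today_end
```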

In the settings file, add a dummy admin email and set EMAIL_BACKEND to use the console backend so that the email is sent to stdout:

```python
EMAIL_BACKEND = "django.core.mail.backends.console.EmailBackend"
DEFAULT_FROM_EMAIL = "noreply@email.com"
ADMINS = [("testuser", "test.user@email.com"), ]
```

Run:

```shell
$ docker-compose exec web python manage.py email_report
```

```
Content-Type: text/plain; charset="utf-8"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Subject: [Django] Order Report for 2020-04-15 to 2020-04-16
From: root@localhost
To: test.user@email.com
Date: Wed, 15 Apr 2020 23:10:45 -0000
Message-ID:

Order: 337ef21c-5f53-4761-9f81-07945de385ae - product: Rice

E-mail Report was sent.
```

Celery Beat

Now we need to create a recurring task to run this command every day.

Add a new task to core/tasks.py:

```python
from celery import shared_task
from django.core.management import call_command  # NEW


@shared_task
def sample_task():
    print("The sample task just ran.")


# NEW
@shared_task
def send_email_report():
    call_command("email_report")
```

First, we added a call_command import, which is used for programmatically calling django-admin commands. In the new task, we then invoked call_command with the name of our custom command as an argument.
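Under the hood, call_command looks the command up by name and runs it. The dispatch idea can be sketched in plain Python (a hypothetical registry, not Django's actual implementation, which discovers commands under each app's management/commands/ directory):

```python
# Hypothetical name-to-function registry standing in for Django's
# command discovery.
COMMANDS = {}


def register(name):
    def decorator(fn):
        COMMANDS[name] = fn
        return fn
    return decorator


@register("email_report")
def email_report():
    return "E-mail Report was sent."


def call_command(name, *args, **kwargs):
    # Mimics django.core.management.call_command: resolve by name, invoke.
    return COMMANDS[name](*args, **kwargs)


result = call_command("email_report")
print(result)  # -> E-mail Report was sent.
```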

To schedule this task, open the core/settings.py file and update the CELERY_BEAT_SCHEDULE setting to include the new task:

```python
CELERY_BEAT_SCHEDULE = {
    "sample_task": {
        "task": "core.tasks.sample_task",
        "schedule": crontab(minute="*/1"),
    },
    "send_email_report": {
        "task": "core.tasks.send_email_report",
        "schedule": crontab(hour="*/1"),
    },
}
```

Here, we added a new entry to CELERY_BEAT_SCHEDULE called send_email_report. As we did for the previous task, we declared which task to run (core.tasks.send_email_report) and used a crontab pattern to set the recurrence.
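One caveat worth noting: crontab(hour="*/1") leaves the default minute="*" in place, so Beat actually fires this task every minute of every hour. To run it once per hour, on the hour, you would pin the minute as well. A hedged sketch (not this project's setting):

```python
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    "send_email_report": {
        "task": "core.tasks.send_email_report",
        "schedule": crontab(minute="0", hour="*/1"),  # on the hour, every hour
    },
}
```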

Restart the container to ensure that the new settings are active:

```shell
$ docker-compose up -d --build
```

View the logs:

```
$ docker-compose logs -f 'celery'

celery_1  | [queues]
celery_1  | .> celery exchange=celery(direct) key=celery
celery_1  |
celery_1  | [tasks]
celery_1  |   . core.tasks.sample_task
celery_1  |   . core.tasks.send_email_report
```

A minute later, the email was sent:

```
celery_1  | [2020-04-15 23:20:00: WARNING/ForkPoolWorker-1] Content-Type: text/plain; charset="utf-8"
celery_1  | MIME-Version: 1.0
celery_1  | Content-Transfer-Encoding: 7bit
celery_1  | Subject: [Django] Order Report for 2020-04-15 to 2020-04-16
celery_1  | From: root@localhost
celery_1  | To: test.user@email.com
celery_1  | Date: Wed, 15 Apr 2020 23:20:00 -0000
celery_1  | Message-ID:
celery_1  |
celery_1  | Order: 337ef21c-5f53-4761-9f81-07945de385ae - product: Rice
celery_1  |
celery_1  | [2020-04-15 23:20:00: WARNING/ForkPoolWorker-1]
celery_1  | [2020-04-15 23:20:00: WARNING/ForkPoolWorker-1] E-mail Report was sent.
```

Conclusion

In this article, we guided you through setting up Docker containers for Celery, Celery Beat, and Redis. We then showed how to create a custom Django Admin command and a periodic task with Celery Beat to run that command automatically.

After reading the above, you should have a better understanding of how to use Celery and Docker to handle periodic tasks in Django.
