This article walks through how to use django-celery to run asynchronous and scheduled tasks in a Django project.
How to use django-celery for asynchronous tasks in Django
Install Celery
We can install it with pip inside a virtualenv:
pip install django-celery celery
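django-celery targets the older Celery 3.x series, so it can help to pin compatible versions. The exact pins below are only an illustration, not taken from the original article:
django-celery==3.2.2
celery==3.1.25
redis==2.10.6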
Django settings
import djcelery
djcelery.setup_loader()
BROKER_URL = 'redis://127.0.0.1:6379/2'  # use Redis as the message broker
Register djcelery in INSTALLED_APPS:
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'devops',
    'apps',
    'common',
    'djcelery',
]
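Optionally, a result backend lets you read task return values later with .get(). The Redis database number below is only an assumed example; djcelery can also store results in the Django database:
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/3'  # example only; any supported backend works
# or keep results in the Django ORM via djcelery:
# CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'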
Create a task.py file under the apps directory:
from celery import task

@task
def add(x, y):
    return x + y

@task
def pp():
    return 'ffffffffffffffffffffff'
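Once defined, tasks can be queued from anywhere in the project with delay(), or with apply_async() when extra options are needed; the countdown below is just one example of those options, not something used later in this article:
from apps.task import add

tt = add.delay(2, 2)                             # queue the task right away
tt = add.apply_async(args=(2, 2), countdown=10)  # or queue it to run 10 seconds from now
print tt.id                                      # the task id; tt.get() needs a result backend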
Go back to settings and tell Celery where to import the tasks from:
import djcelery
djcelery.setup_loader()
BROKER_URL = 'redis://127.0.0.1:6379/2'
CELERY_IMPORTS = ('apps.task',)  # note the trailing comma so this is a tuple
Pick a view method and call the task from it to test:
# project list
def project_list(request):
    admin = Admin.objects.get(id=get_current_admin_id(request))
    pt = admin.projects.all().order_by('-id')
    from apps import task
    tt = task.add.delay(2, 2)
    print 'vv', tt  # the add task is queued whenever this view is accessed
    page_objects = pages(pt, request, 5)  # pagination
    return render_to_response('project/project_list.html', locals())
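delay() returns immediately with an AsyncResult, so the page does not wait for the worker. If a result backend is configured, the result can be looked up later by task id; the view below is a hypothetical sketch, not part of the original project:
from celery.result import AsyncResult
from django.http import HttpResponse

def add_result(request, task_id):
    result = AsyncResult(task_id)  # look the task up by its id
    if result.ready():
        return HttpResponse(str(result.get()))
    return HttpResponse('task is still running')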
Install Redis: omitted here.
Start runserver and celery
python manage.py runserver
python manage.py celery worker --loglevel=info
Visit the corresponding page and check the worker log:
[2017-11-10 14:55:17,016: INFO/MainProcess] Task apps.task.add[9bafe6d2-8411-4f5f-8eed-10444da0ae3a] succeeded in 0.00280212797225s: 4
The worker log shows the task was executed and returned 4.
Testing the task manually
Open a new terminal, activate virtualenv, and switch to the django project directory:
$ python manage.py shell
>>> from apps.task import add
>>> add.delay(2, 2)
At this point, you can see the worker executing the task in the worker window:
[2014-10-07 08:47:08,076: INFO/MainProcess] Got task from broker: myapp.tasks.add[e080e047-b2a2-43a7-af74-d7d9d98b02fc]
[2014-10-07 08:47:08,299: INFO/MainProcess] Task myapp.tasks.add[e080e047-b2a2-43a7-af74-d7d9d98b02fc] succeeded in 0.183349132538s: 4
Eager mode
If you set in settings.py:
CELERY_ALWAYS_EAGER = True
Then Celery runs in eager mode: tasks execute locally and synchronously instead of being sent to a worker.
# with eager mode enabled, the following two calls are equivalent
add.delay(2, 2)
add(2, 2)
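Eager mode is mostly useful in tests, where it also helps to re-raise task exceptions instead of hiding them. A minimal sketch of a dedicated test settings module follows; the file name is just an assumption:
# test_settings.py (hypothetical)
from settings import *  # reuse the normal settings

CELERY_ALWAYS_EAGER = True                 # run tasks synchronously, in-process
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True  # let task exceptions propagate to the caller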
django-celery periodic (scheduled) tasks
Add the schedule configuration to settings. If the task needs arguments, you can write it like this (timedelta comes from the datetime module):
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'add-every-3-seconds': {
        'task': 'apps.task.add',
        'schedule': timedelta(seconds=3),
        'args': (16, 16)
    },
}
To run the pp task every 3 seconds instead, and keep the schedule in the Django database with djcelery's scheduler:
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'  # store the schedule in the Django database
CELERYBEAT_SCHEDULE = {
    'add-every-3-seconds': {
        'task': 'apps.task.pp',
        'schedule': timedelta(seconds=3),  # run the pp task every 3 seconds
    },
}
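Besides timedelta intervals, Celery's crontab schedules can express calendar-style rules. A small sketch follows; the Monday 07:30 schedule is only an illustration:
from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    'add-every-monday-morning': {
        'task': 'apps.task.add',
        'schedule': crontab(hour=7, minute=30, day_of_week=1),  # every Monday at 07:30
        'args': (16, 16),
    },
}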
Start Celery Beat
In fact, there is a simple way to start the worker and beat together:
python manage.py celery worker --loglevel=info --beat
Start Beat
Celery runs periodic tasks through the celerybeat process. celerybeat runs continuously and puts a task on the queue whenever it is due. Unlike workers, only one celerybeat process should be running.
Start:
python manage.py celery beat --loglevel=info
View worker logs
[2017-11-10 16:17:25,853: INFO/MainProcess] Received task: apps.task.pp[8a3af6fb-5189-4647-91f2-8aa07489dd1e]
[2017-11-10 16:17:25,858: INFO/MainProcess] Task apps.task.pp[8a3af6fb-5189-4647-91f2-8aa07489dd1e] succeeded in 0.00379144400358s: 'ffffffffffffffffffffff'
[2017-11-10 16:17:28,858: INFO/MainProcess] Received task: apps.task.pp[d87e4ea0-8881-449a-b993-e7657f50ef25]
[2017-11-10 16:17:28,864: INFO/MainProcess] Task apps.task.pp[d87e4ea0-8881-449a-b993-e7657f50ef25] succeeded in 0.0049942266196s: 'ffffffffffffffffffffff'
[2017-11-10 16:17:31,859: INFO/MainProcess] Received task: apps.task.pp[4d05b4f3-92ff-4922-a8f4-7e047749239a]
[2017-11-10 16:17:31,865: INFO/MainProcess] Task apps.task.pp[4d05b4f3-92ff-4922-a8f4-7e047749239a] succeeded in 0.00537821277976s: 'ffffffffffffffffffffff'
[2017-11-10 16:17:34,859: INFO/MainProcess] Received task: apps.task.pp[5b21afc1-ebf1-4858-be68-20b9bf318452]
[2017-11-10 16:17:34,865: INFO/MainProcess] Task apps.task.pp[5b21afc1-ebf1-4858-be68-20b9bf318452] succeeded in 0.00530335493386s: 'ffffffffffffffffffffff'
Another approach worth mentioning is using plain Celery directly, without the django-celery package. That covers how to use django-celery to run asynchronous and scheduled tasks in Django.