Django async:
https://docs.djangoproject.com/en/4.1/topics/async/
Python threading library:
https://docs.python.org/3/library/threading.html
The downside of plain Python threading for a web app is that when one user sends a request and multiple threads run, the server still has to wait for the desired thread to finish before it can send the response.
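A minimal stdlib sketch of this blocking pattern (plain `threading`, no web framework — the handler and job names are illustrative, not from any real app): the handler spawns a thread but still has to join it before it can respond.

```python
import threading
import time

def handle_request():
    """Simulate a request handler that spawns a worker thread
    but must join it before it can respond."""
    result = {}

    def slow_job():
        time.sleep(0.2)          # stands in for a slow task (email, report, ...)
        result["report"] = "done"

    t = threading.Thread(target=slow_job)
    t.start()
    t.join()                     # the response is blocked until the thread finishes
    return f"response (report={result['report']})"

start = time.perf_counter()
reply = handle_request()
elapsed = time.perf_counter() - start
print(reply)                     # the user waited for the whole job
print(f"waited {elapsed:.2f}s")
```

The join() is the problem: however many threads you spawn, the user's response time includes the slowest of them.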
Task-based threading:
The HTTP request is sent and the server responds immediately; the multi-threaded jobs then run in the background.
Faster response, more complex setup; achieved through Celery:
https://medium.com/@ravisarath64/are-you-working-on-django-is-celery-confuse-you-629fedf8287b
Working with technologies like Celery can be confusing at first, but it is not as hard as you think. In this article, I will briefly explain Celery and its uses when working with Django.
Celery is a nice tool to use whenever you don’t want to wait for some other process to finish. Let us see this with an example.
In the normal way:
User sends request
Django receives => spawns a thread to do something else
wait for the main thread and the other thread to finish
only after both tasks complete is the response sent to the user as a package 😭 --> it takes too much time
By using Celery:
User sends request
Django receives => lets Celery know "hey! do this!"
main thread finishes
response is sent to the user
The user receives the balance of the transaction 😄 --> it's so fast
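The hand-off model can be sketched with the standard library: a `queue.Queue` stands in for the message broker and a daemon thread stands in for the Celery worker. This is a rough analogy, not real Celery — the point is that the handler returns before the job runs.

```python
import queue
import threading
import time

jobs = queue.Queue()
completed = []

def worker():
    """Background worker: pulls jobs off the queue and runs them."""
    while True:
        job = jobs.get()
        time.sleep(0.1)          # the slow part runs in the background
        completed.append(job)
        jobs.task_done()

threading.Thread(target=worker, daemon=True).start()

def handle_request():
    jobs.put("send_email")       # hand the job off, like task.delay()
    return "response"            # respond immediately

start = time.perf_counter()
reply = handle_request()
response_time = time.perf_counter() - start

jobs.join()                      # (only for the demo) wait so we can inspect the result
print(reply, f"answered in {response_time:.3f}s; background jobs: {completed}")
```

The response time is just the cost of enqueuing the job; the 0.1-second "email" happens after the user already has their answer.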
Can Django do multi-threaded work? Yes, it can, but one generally uses Celery to achieve the equivalent.
Wait a minute what is single-threaded & multi-threaded?
Single-threaded processes contain the execution of instructions in a single sequence whereas, multithreaded processes allow the execution of multiple parts of a program at the same time.
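The difference can be timed with two 0.2-second jobs, run first in a single sequence and then on separate threads (a stdlib sketch; `time.sleep` releases the GIL, so the threads genuinely overlap):

```python
import threading
import time

def task(results, name):
    time.sleep(0.2)              # stands in for real work
    results.append(name)

# single-threaded: instructions execute in one sequence
results, start = [], time.perf_counter()
task(results, "A")
task(results, "B")
sequential = time.perf_counter() - start

# multi-threaded: both parts of the program run at the same time
results, start = [], time.perf_counter()
threads = [threading.Thread(target=task, args=(results, n)) for n in ("A", "B")]
for t in threads:
    t.start()
for t in threads:
    t.join()
concurrent = time.perf_counter() - start

print(f"sequential: {sequential:.2f}s, threaded: {concurrent:.2f}s")
```

Sequentially the times add up (~0.4s); threaded, the two sleeps overlap (~0.2s).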
Let's look a little deeper.
Celery is an asynchronous task queue/job queue based on distributed message passing. To manage the queue, Celery requires a way to send and receive messages; usually, this comes in the form of a separate service called a message broker.
Redis, RabbitMQ, and Amazon SQS are the available choices for brokers.
With asynchronous execution, you begin a task, and while it runs in the background you can start your next task; then at some point you say "wait for this to finish". Let's look at an example:
Django says => Celery starts task A.
Django says => Celery starts task B.
Django says => Celery starts task C.
At some point Django says => wait for A to finish.
The advantage is that you can execute both tasks B & C while A is still running in the background, on a separate thread. This allows you to take better advantage of your resources.
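This pattern can be sketched with `concurrent.futures` from the standard library — an analogy for Celery's delay()/result handling, not Celery itself. Submit A, B, and C, then block only on A; B and C keep running on their own threads in the meantime.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def task(name, seconds):
    time.sleep(seconds)           # stands in for real work
    return f"{name} finished"

with ThreadPoolExecutor(max_workers=3) as pool:
    start = time.perf_counter()
    a = pool.submit(task, "A", 0.3)   # start task A...
    b = pool.submit(task, "B", 0.1)   # ...and task B...
    c = pool.submit(task, "C", 0.1)   # ...and task C, all in the background
    result_a = a.result()             # "wait for A to finish"
    elapsed = time.perf_counter() - start

print(result_a, f"after {elapsed:.2f}s")
print(b.result(), c.result())         # B and C already ran while we waited on A
```

Run sequentially the three tasks would take 0.5s; because B and C execute while A is still running, the total is roughly the length of A alone.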
Is celery used only for this?
No, it supports scheduling as well. Let us see this with an example.
User sends request
Django receives => tells Celery to send the user an email at midnight.
main thread finishes
the response is sent to the user
Celery sends the email at midnight. This is how task scheduling is done in Celery.
Let us see another example.
User sends request
Django receives => tells Celery to alert the user every 10 minutes.
main thread finishes
the response is sent to the user
Celery will send an alert to the user every 10 minutes, so Celery can be used for periodic as well as scheduled tasks.
Let's see how to install Celery.
You can install Celery using standard Python tools like pip:
pip install celery
Now connect Celery to your application.
First, add the broker URL to your project's settings.py. You can use either Redis or RabbitMQ. Here I am using Redis:
BROKER_URL = 'redis://localhost:6379'
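Note: BROKER_URL is the old-style (pre-4.0) setting name, which matches the plain `app.config_from_object('django.conf:settings')` call used below. In Celery 4+ it is more common to load settings with a `namespace='CELERY'` argument, in which case the equivalent settings.py entry would be:

```python
# settings.py — Celery 4+ style, when using
# app.config_from_object('django.conf:settings', namespace='CELERY')
CELERY_BROKER_URL = 'redis://localhost:6379'
```

Either style works, as long as the setting name matches how config_from_object is called.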
Add a new file called celery.py inside your Django project:
from __future__ import absolute_import
import os

from celery import Celery
from celery.schedules import crontab  # needed for the cron-style schedule below
from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')

app = Celery('picha')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

# If you need to schedule a task or run a task periodically,
# you can add this code:
app.conf.beat_schedule = {
    # to run a task daily at 1.30 AM
    'generate_daily_settlement_report': {
        'task': 'app.tasks.function_A',
        'schedule': crontab(hour=1, minute=30),
    },
    # to run a task every 30 seconds
    'add-every-30-seconds': {
        'task': 'app.tasks.function_B',
        'schedule': 30.0,
        'args': (16, 16),
    },
}

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
Next, create a file tasks.py in your app and define the functions function_A, function_B, etc. there:
# celery.decorators was removed in Celery 5; shared_task is the current API
from celery import shared_task

@shared_task(name="function A")
def function_A():
    '''do something'''

@shared_task(name="function B")
def function_B():
    '''do something'''

@shared_task(name="function C")
def function_C():
    '''do something'''
Let function_C be another Celery task that does something else. It can be called from a view using the delay() method, so the work runs on another thread without affecting the main thread:
function_C.delay()
What’s next?
We have to run the Celery worker server. You can run your worker by executing the program with the worker argument:
celery -A tasks worker --loglevel=INFO
So the worker will execute the tasks. There is another argument, beat, which kicks off the periodic as well as the scheduled tasks:
celery -A tasks beat --loglevel=INFO
Celery beat is a scheduler. It kicks off tasks at regular intervals, which are then executed by available worker nodes in the cluster. By default, the entries are taken from the beat_schedule setting, but custom stores can also be used, like storing the entries in an SQL database.
A short version of the steps you need to do:
- Run your Django server and Redis.
- Open two new terminal windows/tabs.
- In each new window, navigate to your project directory.
- Activate your virtualenv.
- Run the celery beat and worker commands.
Conclusion
This is only a beginner's introduction; for more details, use the links above.