Friday, 19 January 2024

django, celery, celery worker, celery beat 101, broker

Detailed guide to Django, Celery (worker), and Redis:

https://testdriven.io/blog/django-and-celery/


Detailed guide to Django, Celery (worker), and Celery beat (periodic tasks):

https://testdriven.io/blog/django-celery-periodic-tasks/


Django communicates with Celery through a broker (RabbitMQ or a Redis database).

Django sends tasks to the broker, and the Celery worker picks them up from there.

For Django to send tasks to the broker, the celery package must be installed so Django can call its task API.
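
A minimal sketch of that flow, assuming a local Redis broker; the app name, task, and arguments below are illustrative only:

# tasks.py (hypothetical module; assumes Redis running on localhost:6379)
from celery import Celery

app = Celery('myapp', broker='redis://localhost:6379/0')

@app.task
def send_welcome_email(user_id):
    # placeholder body; a real project would send mail here
    print(f'sending welcome email to user {user_id}')

# In a Django view you would call .delay(), which serializes the call
# and pushes it onto the broker; a worker picks it up from there.
send_welcome_email.delay(42)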


For Celery to pick up tasks, Celery needs to spawn a worker. Celery recommends only one worker per machine; that one worker can fork many child processes, and by default the number of child processes matches the number of CPU cores.


https://medium.com/@iamlal/scale-up-messaging-queue-with-python-celery-processes-vs-threads-402533be269e#:~:text=Celery%20recommends%201%20worker%20per,that%20number%20of%20CPU%20cores.
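
If you do need to override the default, worker_concurrency is the setting to use; a one-line sketch, assuming an existing Celery app instance named app (the value 4 is illustrative):

app.conf.worker_concurrency = 4  # default is the number of CPU cores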

You could have the Django container above spawn the worker as well, but if the worker fails you don't want the Django application to fail with it, so running the worker in a separate container that shares the same volume is better.


!!!

https://stackoverflow.com/questions/75245127/why-would-you-separate-a-celery-worker-and-django-container


https://stackoverflow.com/questions/36439024/can-you-run-celery-in-a-different-container-from-django

...
python manage.py migrate                  # apply database migrations first
celery -A api worker -l INFO --detach     # start the worker in the background of the same container
python manage.py runserver 0.0.0.0:8000   # then start Django

!!!!


worker sample (docker-compose):

https://www.revsys.com/tidbits/celery-and-django-and-docker-oh-my/

celery:
    build: .
    command: celery -A proj worker -l info
    volumes:
      - .:/code
    depends_on:
      - db
      - redis

This code adds a Celery worker to the list of services defined in docker-compose. Now our app can recognize and execute tasks automatically from inside the Docker container once we start Docker using docker-compose up.


The celery worker command starts an instance of the Celery worker, which executes your tasks. -A proj passes in the name of your project, proj, as the app that Celery will run. -l info sets the logging level to info.
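
The compose command above assumes a proj package that exposes a Celery app. A minimal sketch of the conventional proj/celery.py for a Django project (the module and settings paths are assumptions):

# proj/celery.py
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
# read CELERY_* options from Django's settings.py
app.config_from_object('django.conf:settings', namespace='CELERY')
# discover tasks.py modules in all installed Django apps
app.autodiscover_tasks()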





Celery beat is the scheduler: it sends jobs to the Celery worker at the configured times.

https://docs.celeryq.dev/en/stable/userguide/periodic-tasks.html

You need to specify the timezone (the schedule uses crontab-style entries).
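
A hedged sketch of a crontab-based schedule; the task path, schedule name, and times are placeholders:

from celery.schedules import crontab

app.conf.timezone = 'Europe/London'  # crontab entries are interpreted in this timezone
app.conf.beat_schedule = {
    'nightly-cleanup': {
        'task': 'proj.tasks.cleanup',            # hypothetical task
        'schedule': crontab(hour=2, minute=30),  # every day at 02:30
    },
}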

As with the worker, you could have the Django container above spawn beat as well, but if beat fails you don't want the Django application to fail with it, so a separate container sharing the same volume is better.




The message broker for Celery and Django can be Redis or RabbitMQ.

https://docs.celeryq.dev/en/latest/getting-started/first-steps-with-celery.html#keeping-results


Keeping Results

If you want to keep track of the tasks’ states, Celery needs to store or send the states somewhere. There are several built-in result backends to choose from: SQLAlchemy/Django ORM, MongoDB, Memcached, Redis, RPC (RabbitMQ/AMQP), and more – or you can define your own.


RabbitMQ sample:

from celery import Celery

app = Celery('test_celery',
             broker='amqp://jimmy:jimmy123@localhost/jimmy_vhost',
             backend='rpc://',
             include=['test_celery.tasks'])


Here, we initialize an instance of Celery called app, which is used later for creating a task.


The first argument of Celery is just the name of the project package, which is “test_celery”.


The broker argument specifies the broker URL, which should be the RabbitMQ we started earlier. Note that the format of broker URL should be:

transport://userid:password@hostname:port/virtual_host

https://tests4geeks.com/blog/python-celery-rabbitmq-tutorial/#:~:text=For%20RabbitMQ%2C%20the%20transport%20is,set%20a%20backend%20for%20Celery.

For RabbitMQ, the transport is amqp.
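
For Redis the transport is redis. A sketch of the equivalent Redis-backed configuration (the host, port, and db number are assumptions):

app = Celery('test_celery',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/0',
             include=['test_celery.tasks'])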

The backend argument specifies a backend URL. A backend in Celery is used for storing the task results. So if you need to access the results of your task when it is finished, you should set a backend for Celery.


rpc means sending the results back as AMQP messages, which is an acceptable format for our demo. More backend choices can be found in the Celery documentation.
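
To see the backend in action, a minimal sketch that assumes the app above lives in test_celery/celery.py and that add is a hypothetical task:

# test_celery/tasks.py
from .celery import app

@app.task
def add(x, y):
    return x + y

# caller side: .delay() sends the task to RabbitMQ; .get() blocks
# until the worker's result arrives through the rpc backend
result = add.delay(2, 3)
print(result.get(timeout=10))  # prints 5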


RabbitMQ gist sample:

https://gist.github.com/mmautner/b0821fa054cf584db6275f6253e740ca




