Task queues in K8s environment

There are several task queues for Python: Celery, RQ, …

I have never deployed my code to a Kubernetes environment.

But AFAIK, K8s has a concept of jobs: Jobs | Kubernetes

If I plan to run my new code in K8s, does it make sense to use K8s Jobs, or should I do it as before with Celery/RQ?
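For context, a minimal K8s Job manifest looks roughly like this (the name, image, and command are illustrative assumptions, not from the original post):

```yaml
# Hypothetical Job that runs one Python task to completion.
apiVersion: batch/v1
kind: Job
metadata:
  name: example-task                      # illustrative name
spec:
  backoffLimit: 3                         # retry a failed pod up to 3 times
  template:
    spec:
      restartPolicy: Never                # Jobs must not restart in place
      containers:
      - name: worker
        image: registry.example.com/task:latest   # assumed image
        command: ["python", "run_task.py"]        # assumed entrypoint
```

Unlike a long-running Celery/RQ worker, the Job runs the container until it exits successfully, then is done.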

Hi Thomas,
I don’t have experience with k8s myself, but I have seen a teammate use k8s to deploy RQ and Celery workers.
Why did my teammate use Celery? Because Celery supplies some nice features, such as easily defining tasks (via the `celery_app.task` decorator), periodic tasks (via `add_periodic_task`), and so on :slight_smile:


I have a gut feeling that sooner or later this job handling will be done by K8s itself. Things change, and things that were cool in the past stop being done in the future: double-fork magic to create a daemon (superseded by systemd), config management to set up new machines, …
