celery: daemonic processes are not allowed to have children

走了就别回头了 2021-02-19 03:32

In Python 2.7 I am trying to create processes (with multiprocessing) inside a Celery task (Celery 3.1.17), but it gives the error:

daemonic processes are not allowed to have children
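
For context, a minimal reproduction of the failure might look like this (the app name, broker URL, and task are placeholders, not the asker's actual code):

    from multiprocessing import Process
    from celery import Celery

    app = Celery('demo', broker='redis://localhost:6379/0')

    def child():
        print("hello from the child process")

    @app.task
    def spawn_children():
        # Under the default prefork pool, Celery runs tasks in daemon
        # worker processes, and Python's multiprocessing forbids daemon
        # processes from spawning children, so p.start() raises
        # "daemonic processes are not allowed to have children"
        p = Process(target=child)
        p.start()
        p.join()
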
4 Answers
  • 2021-02-19 03:38

    I got this when using multiprocessing with Celery 4.2.0 and Python 3.6. I solved it by using billiard instead.

    I changed my source code from

    from multiprocessing import Process

    to

    from billiard.context import Process

    and the error went away.

    Note that the import comes from billiard.context, not billiard.process.
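
    A minimal sketch of what the working version might look like (the app name, broker URL, and child function are illustrative, not from the original post):

    from billiard.context import Process  # Celery's fork of multiprocessing
    from celery import Celery

    app = Celery('demo', broker='redis://localhost:6379/0')

    def child(n):
        print("hello from child %d" % n)

    @app.task
    def fan_out(n):
        # billiard's Process can be started from inside a Celery task,
        # where the stdlib multiprocessing.Process would raise
        # "daemonic processes are not allowed to have children"
        procs = [Process(target=child, args=(i,)) for i in range(n)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
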

  • 2021-02-19 03:42

    I got a similar error trying to call a multiprocessing method from a Celery task in Django. I solved it by using billiard instead of multiprocessing:

    import billiard as multiprocessing
    

    Hope it helps.

  • 2021-02-19 03:49

    billiard and multiprocessing are different libraries: billiard is the Celery project's own fork of multiprocessing. The error occurs because Celery's default prefork pool runs tasks inside daemon processes, and Python's multiprocessing refuses to let daemon processes spawn children. You will need to import billiard and use it instead of multiprocessing.

    However, the better answer is probably that you should refactor your code to spawn more Celery tasks instead of using two different ways of distributing your work.

    You can do this using Celery canvas:

    from celery import group
    from random import randint
    import time

    @app.task
    def sleepawhile(t):
        print("Sleeping %i seconds..." % t)
        time.sleep(t)
        return t

    def work(num_procs):
        # one task per process you would otherwise have spawned
        return group(sleepawhile.s(randint(1, 5)) for x in range(num_procs))

    def test():
        my_group = group(work(randint(1, 5)) for x in range(5))
        result = my_group.apply_async()
        result.get()


    I've attempted to make a working version of your code that uses canvas primitives instead of multiprocessing. However, since your example was quite artificial, it's not easy to come up with something that makes sense.

    Update:

    Here is a translation of your real code that uses Celery canvas:

    tasks.py:

    from celery import shared_task

    @shared_task
    def run_training_method(saveindex, embedder_id):
        # imported here to avoid a circular import with models.py
        from models import Embedder

        embedder = Embedder.objects.get(pk=embedder_id)
        embedder.training_method(saveindex)


    models.py:

    from celery import group
    from django.db.models import Model
    from tasks import run_training_method

    class Embedder(Model):

        def embedder_update_task(self):
            my_group = []

            for saveindex in range(self.start_index, self.start_index + self.nsaves):
                self.create_storage(saveindex)
                # queue one training task per process we would have spawned
                my_group.extend([run_training_method.subtask((saveindex, self.id))
                                 for i in range(self.nproc)])

            result = group(my_group).apply_async()
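
    If something should run only after every training task has finished, the bare group can be wrapped in a chord. A hedged sketch, where on_all_done is a hypothetical callback, not part of the original code:

    from celery import chord, shared_task

    @shared_task
    def on_all_done(results):
        # hypothetical callback: receives the return values of every
        # run_training_method task in the header group
        print("completed %d training runs" % len(results))

    # inside embedder_update_task, instead of group(my_group).apply_async():
    result = chord(my_group)(on_all_done.s())
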
    
  • 2021-02-19 03:58

    If you are using a submodule/library with multiprocessing already baked in, it may make more sense to set the worker's -P threads argument instead:

    celery worker -P threads
    

    https://github.com/celery/celery/issues/4525#issuecomment-566503932
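
    For reference, a fuller invocation might look like this (the proj app module is a placeholder, and the threads pool needs a Celery release that ships it, 4.4 or later):

    celery -A proj worker -P threads --concurrency=8
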
