Writing to a file with multiprocessing

Asked by 孤街浪徒 on 2020-12-09 04:34

I'm having the following problem in Python.

I need to do some calculations in parallel whose results need to be written sequentially to a file. So I created a fu

3 Answers
  •  北海茫月, 2020-12-09 05:23

    If anyone is looking for a simple way to do the same, this may help. I'm not aware of any disadvantages to this approach; if there are, please let me know.

    import multiprocessing
    import re
    
    def mp_worker(item):
        # Placeholder computation: count the words in the line.
        # Replace this with the real per-item work.
        count = len(re.findall(r'\w+', item))
        return item, count
    
    def mp_handler():
        cpus = multiprocessing.cpu_count()
        # The next two lines populate listX, which is later processed in
        # parallel. Any other source works, as long as listX is passed on.
        with open('ExampleFile.txt') as f:
            listX = [line for line in (l.strip() for l in f) if line]
        with multiprocessing.Pool(cpus) as p:
            with open('results.txt', 'w') as f:
                # imap yields (item, count) tuples in input order
                for result in p.imap(mp_worker, listX):
                    f.write('%s: %d\n' % result)
    
    if __name__ == '__main__':
        mp_handler()
    

    Source: Python: Writing to a single file with queue while using multiprocessing Pool
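
    The key property the answer above relies on is that `Pool.imap` yields results in the order of the inputs, even though the workers finish in an arbitrary order, so the single writer in the parent process produces a sequentially ordered file. A minimal sketch demonstrating this (with a made-up `square` worker, not from the original post):

    ```python
    import multiprocessing

    def square(n):
        # Trivial stand-in for a real computation
        return n, n * n

    if __name__ == '__main__':
        with multiprocessing.Pool(4) as p:
            # imap preserves input order, regardless of which worker finished first
            results = list(p.imap(square, range(5)))
        print(results)  # [(0, 0), (1, 1), (2, 4), (3, 9), (4, 16)]
    ```

    If ordering doesn't matter, `Pool.imap_unordered` yields results as they complete and can be faster when task durations vary.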
