Many threads writing to a log file at the same time in Python

独厮守ぢ 2020-12-10 04:57

I am writing a script to retrieve WMI info from many computers at the same time and then write this info to a text file:

f = open("results.txt", 'w+')  ## to
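
For context, a minimal sketch of the kind of threaded setup described above, assuming a hypothetical get_wmi_info() helper and host list; a threading.Lock serializes the writes so threads do not interleave lines in the shared file:

    import threading

    hosts = ["pc1", "pc2", "pc3"]            # hypothetical list of target computers
    write_lock = threading.Lock()

    def get_wmi_info(host):
        # placeholder for the real WMI query against one computer
        return f"{host}: ok"

    def worker(host, outfile):
        info = get_wmi_info(host)
        with write_lock:                     # only one thread writes at a time
            outfile.write(info + "\n")

    with open("results.txt", "w+") as f:
        threads = [threading.Thread(target=worker, args=(h, f)) for h in hosts]
        for t in threads:
            t.start()
        for t in threads:
            t.join()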


        
3 Answers
  •  春和景丽
    2020-12-10 05:44

    For another solution, use a multiprocessing Pool to compute the data in worker processes and return it to the parent process. The parent then writes all the data to the file. Since only one process writes to the file at a time, no additional locking is needed.

    Note the following uses a pool of processes, not threads. This makes the code much simpler and easier than putting something together using the threading module. (There is a ThreadPool object, but it's not documented.)

    source

    import glob, os, time
    from multiprocessing import Pool

    def filesize(path):
        time.sleep(0.1)                       # simulate slow per-item work
        return (path, os.path.getsize(path))

    if __name__ == '__main__':                # guard needed for multiprocessing on Windows
        paths = glob.glob('*.py')
        pool = Pool()                         # default: one worker process per CPU

        with open("results.txt", 'w+') as dataf:
            # only the parent process touches the file, so no locking is needed
            for apath, asize in pool.imap_unordered(filesize, paths):
                print(apath, asize, file=dataf)
    

    Output in results.txt:

    zwrap.py 122
    usercustomize.py 38
    tpending.py 2345
    msimple4.py 385
    parse2.py 499
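
    The ThreadPool mentioned above lives in multiprocessing.pool and mirrors the Pool interface; as a rough sketch (not a tested drop-in), the same pattern works with threads by changing only the import and constructor, which can suit I/O-bound work such as remote WMI queries:

    import glob, os, time
    from multiprocessing.pool import ThreadPool

    def filesize(path):
        time.sleep(0.1)                       # simulate slow, I/O-bound work
        return (path, os.path.getsize(path))

    paths = glob.glob('*.py')
    with ThreadPool() as pool:                # default: one thread per CPU
        with open("results.txt", 'w+') as dataf:
            # only the main thread writes, so no locking is needed
            for apath, asize in pool.imap_unordered(filesize, paths):
                print(apath, asize, file=dataf)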
    
