How to process a list in parallel in Python? [duplicate]


You may try a basic example like:

from threading import Thread

def process(data):
    print "processing %s" % data

all = ["data1", "data2", "data3"]

for task in all:
    t = Thread(target=process, args=(task,))
    t.start()

Here's a repl and a brief tutorial which show how to let your caller pause for the threads to join if desired.
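
For example, one minimal sketch of that joining step (variable names here are just illustrative): keep a reference to each thread and call join() after starting them all:

from threading import Thread

def process(data):
    print("processing %s" % data)

tasks = ["data1", "data2", "data3"]

# Start one worker thread per list item, keeping the Thread objects around.
threads = [Thread(target=process, args=(task,)) for task in tasks]
for t in threads:
    t.start()

# Block the caller until every worker thread has finished.
for t in threads:
    t.join()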

Regarding using all your cores, I don't have any information on that, but here are some resources that might be helpful: [1], [2], [3]
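
That said, one thing worth knowing: in CPython the global interpreter lock usually prevents threads from running Python bytecode on several cores at once, so CPU-bound work is typically spread across cores with a process pool instead. A minimal sketch using the standard-library multiprocessing.Pool (function and data names are placeholders):

from multiprocessing import Pool

def process(data):
    print("processing %s" % data)
    return data.upper()  # placeholder result so map() has something to collect

if __name__ == "__main__":  # guard needed when worker processes are spawned
    with Pool() as pool:  # defaults to one worker per CPU core
        results = pool.map(process, ["data1", "data2", "data3"])
    print(results)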

Or:

from threading import Thread

def process(data):
    print("processing {}".format(data))

l= ["data1", "data2", "data3"]

for task in l:
    t = Thread(target=process, args=(task,))
    t.start()

Or (Python 3.6+ only, since it uses an f-string):

from threading import Thread

def process(data):
    print(f"processing {data}")

l= ["data1", "data2", "data3"]

for task in l:
    t = Thread(target=process, args=(task,))
    t.start()

Here is a template using multiprocessing.dummy (a thread-backed Pool with the multiprocessing API); hope it's helpful.

from multiprocessing.dummy import Pool as ThreadPool

def process(data):
    print("processing {}".format(data))

alldata = ["data1", "data2", "data3"]

pool = ThreadPool()

results = pool.map(process, alldata)

pool.close()
pool.join()
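
Swapping the import to `from multiprocessing import Pool` turns this into a real process pool with the same map/close/join calls, which is what you'd want for CPU-bound work. For reference, a roughly equivalent sketch with the newer standard-library concurrent.futures interface (names reused from the template above):

from concurrent.futures import ThreadPoolExecutor

def process(data):
    print("processing {}".format(data))
    return data

alldata = ["data1", "data2", "data3"]

# The executor manages the worker threads; leaving the "with" block waits for them.
with ThreadPoolExecutor(max_workers=4) as executor:
    results = list(executor.map(process, alldata))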