Python multiprocessing: dealing with 2000 processes
Question: Following is my multiprocessing code. regressTuple has around 2000 items, so the code creates around 2000 parallel processes, and my Dell XPS 15 laptop crashes when it runs. Can't Python's multiprocessing library schedule the work according to hardware availability and run the program without crashing, in minimal time? Am I not doing this correctly? Is there an API call in Python to get the available hardware process count? How can I refactor the code to use an input variable to set the parallel process count?
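The usual fix is to stop creating one `Process` per item and instead hand the whole list to a fixed-size worker pool. A sketch of that pattern is below; `regress` stands in for whatever per-item work the original code does (the real `regressTuple` items and worker function are not shown in the question), and `os.cpu_count()` is the API that reports the hardware's logical CPU count:

```python
import os
from multiprocessing import Pool

def regress(item):
    # Placeholder for the per-item regression work from the
    # original code (hypothetical: squares the input).
    return item * item

def run_all(items, workers=None):
    # Cap the number of worker processes at the CPU count instead
    # of spawning one process per item; Pool queues the remaining
    # items and feeds them to workers as they become free.
    workers = workers or os.cpu_count()
    with Pool(processes=workers) as pool:
        return pool.map(regress, items)

if __name__ == "__main__":
    # 2000 items, but only os.cpu_count() processes at any time.
    results = run_all(range(2000))
    print(len(results))
```

With this structure the 2000 items never exist as 2000 live processes; the pool keeps at most `workers` of them running, which is what keeps memory and scheduler load within what the laptop can handle.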