Python multiprocessing Pool.apply_async with shared variables (Value)

情歌与酒 · 2021-01-01 07:29

For my college project I am trying to develop a Python-based traffic generator. I have created 2 CentOS machines on VMware and I am using 1 as my client and 1 as my server machine.

2 Answers
夕颜 (OP) · 2021-01-01 07:37

    Possibly, because Python's multiprocessing module behaves differently on Windows and Linux (and I honestly don't know how multiprocessing behaves inside VMs, as is the case here).
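
    One way to see that platform difference for yourself (Python 3.4+ only, so this is just an illustration of the gap, not a drop-in for the Python 2 code below): Linux pools fork the parent process by default, while Windows always spawns fresh interpreters that re-import your module.

    import multiprocessing

    if __name__ == '__main__':
        # Typically 'fork' on Linux and 'spawn' on Windows; this difference is
        # the usual reason the same multiprocessing code behaves differently
        # on the two systems.
        print(multiprocessing.get_start_method())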

    This might work:

    import multiprocessing
    import random
    import time
    import urllib2
    import socbindtry    #user module providing the BindableHTTPHandler classes used below
    import myurllist    #list of all destination urls for all 10 servers
    
    def send_request3(response_time, error_count):    #function to send requests from alias client ip 1
        opener=urllib2.build_opener(socbindtry.BindableHTTPHandler3)    #bind to alias client ip1
        try:
            tstart=time.time()
            for i in range(len(myurllist.url)):    #iterate over the per-server url lists
                x=random.choice(myurllist.url[i])
                opener.open(x).read()
                print "file downloaded:",x
                response_time.append(time.time()-tstart)
        except urllib2.URLError, e:
            error_count.value=error_count.value+1
    def send_request4(response_time, error_count):    #function to send requests from alias client ip 2
        opener=urllib2.build_opener(socbindtry.BindableHTTPHandler4)    #bind to alias client ip2
        try:
            tstart=time.time()
            for i in range(len(myurllist.url)):    #iterate over the per-server url lists
                x=random.choice(myurllist.url[i])
                opener.open(x).read()
                print "file downloaded:",x
                response_time.append(time.time()-tstart)
        except urllib2.URLError, e:
            error_count.value=error_count.value+1
    #50 such functions are defined here for 50 clients
    def func():
        m=multiprocessing.Manager()
        response_time=m.list()    #shared list created by the Manager, so it can be passed to pool workers
        error_count=m.Value('i',0)    #shared counter via the Manager too; a plain multiprocessing.Value cannot be passed to Pool.apply_async
    
        pool=multiprocessing.Pool(processes=750)
        for i in range(5):
            pool.apply_async(send_request3, [response_time, error_count])
            pool.apply_async(send_request4, [response_time, error_count])
            # pool.apply_async(send_request5)
            #append the apply_async calls for all 50 client functions here
        pool.close()
        pool.join()
        print"All work Done..!!"
        return
    
    
    start=float(time.time())
    func()
    end=float(time.time())-start
    print end
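
    For reference, here is a stripped-down sketch of the same sharing pattern with a hypothetical worker (Python 3 syntax; the name worker and the sleep are made up for illustration). The point is that a Manager-backed list and Value are picklable proxy objects, so they can be passed straight to Pool.apply_async, unlike a plain multiprocessing.Value, which typically raises a RuntimeError when handed to a pool worker.

    import multiprocessing
    import time

    def worker(response_time, error_count):
        #hypothetical stand-in for one of the send_requestN functions
        tstart = time.time()
        try:
            time.sleep(0.1)    #stand-in for the real download
            response_time.append(time.time() - tstart)
        except Exception:
            error_count.value = error_count.value + 1

    if __name__ == '__main__':
        m = multiprocessing.Manager()
        response_time = m.list()    #proxy list, safe to pass to pool workers
        error_count = m.Value('i', 0)    #proxy integer, safe to pass to pool workers

        pool = multiprocessing.Pool(processes=4)
        for _ in range(10):
            pool.apply_async(worker, [response_time, error_count])
        pool.close()
        pool.join()
        print(list(response_time), error_count.value)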
    
