For my college project I am building a traffic generation tool in Python. I have set up my own Linux server and client on VMware. I am using urllib2 to generate the traffic. The problem I am facing is that when I run my script on the client machine (it continuously sends requests to the Linux server using multiprocessing), it works fine for the first few minutes, say for around 2000 requests, but after that it raises a "connection reset by peer" error and my script crashes. What could be the problem? I tried doing this, but it was not helpful.
How can I prevent this error and keep my script running continuously for hours?
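To be concrete about what I mean by preventing the error: if retrying failed requests is the right approach, a wrapper along these lines is what I would try (a sketch only; fetch_with_retry, retries, and backoff are names I made up, not part of my current script):

import socket
import time
import urllib2

def fetch_with_retry(opener, url, retries=3, backoff=1.0):
    #retry transient failures such as "connection reset by peer",
    #sleeping a little longer before each new attempt
    for attempt in range(retries):
        try:
            return opener.open(url).read()
        except (urllib2.URLError, socket.error):
            if attempt==retries-1:
                raise #give up after the final attempt
            time.sleep(backoff*(attempt+1))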
'''
Traffic Generator Script:

Here I have used IP aliasing to create multiple clients on a single VM,
and have done the same on the server side to create multiple servers.
I have around 50 clients and 10 servers.
'''
import multiprocessing
import urllib2
import random
import myurllist #list of all destination urls for all 10 servers
import time
import socbindtry #script that binds the various virtual/aliased client ips to the script (sketched at the end of this post)
response_time=[] #note: appends made in child processes are not visible in the parent process
error_count=multiprocessing.Value('i',0)

def send_request3(): #function to send requests from alias client ip 1
    opener=urllib2.build_opener(socbindtry.BindableHTTPHandler3) #bind to alias client ip 1
    try:
        tstart=time.time()
        for i in range(len(myurllist.url)):
            x=random.choice(myurllist.url[i])
            opener.open(x).read()
            print "file downloaded:",x
        response_time.append(time.time()-tstart)
    except urllib2.URLError, e:
        with error_count.get_lock(): #incrementing a shared Value is not atomic
            error_count.value+=1
def send_request4(): #function to send requests from alias client ip 2
    opener=urllib2.build_opener(socbindtry.BindableHTTPHandler4) #bind to alias client ip 2
    try:
        tstart=time.time()
        for i in range(len(myurllist.url)):
            x=random.choice(myurllist.url[i])
            opener.open(x).read()
            print "file downloaded:",x
        response_time.append(time.time()-tstart)
    except urllib2.URLError, e:
        with error_count.get_lock():
            error_count.value+=1
#50 such functions are defined here for 50 clients
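#For illustration only: the 50 near-identical functions above could be
#collapsed into one parameterized worker along these lines (a sketch;
#send_request and its handler argument are made-up names, not something
#my current script defines):
def send_request(handler):
    opener=urllib2.build_opener(handler) #bind to whichever alias ip the handler uses
    try:
        tstart=time.time()
        for i in range(len(myurllist.url)):
            x=random.choice(myurllist.url[i])
            opener.open(x).read()
            print "file downloaded:",x
        response_time.append(time.time()-tstart)
    except urllib2.URLError, e:
        with error_count.get_lock():
            error_count.value+=1
#usage would then be e.g.:
#multiprocessing.Process(target=send_request, args=(socbindtry.BindableHTTPHandler3,))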
process=[]

def func():
    global process
    process.append(multiprocessing.Process(target=send_request3))
    process.append(multiprocessing.Process(target=send_request4))
    process.append(multiprocessing.Process(target=send_request5))
    process.append(multiprocessing.Process(target=send_request6))
    #append 50 functions here
    for i in range(len(process)):
        process[i].start()
    for i in range(len(process)):
        process[i].join()
    print "All work done..!!"
    return
start=time.time()
func()
end=time.time()-start #total elapsed time; time.time() already returns a float
print end
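For context, socbindtry is based on the usual bindable-handler recipe: it forces each outgoing connection to bind to one of the aliased client IPs described in the docstring above. It boils down to roughly this (a simplified sketch; the IP shown is just an example, and it relies on Python 2.7's source_address support, so my real script differs in details):

import httplib
import urllib2

def make_bound_handler(source_ip):
    #build an HTTPHandler whose outgoing connections bind to the given alias ip
    class BoundHTTPConnection(httplib.HTTPConnection):
        def __init__(self, *args, **kwargs):
            kwargs['source_address']=(source_ip, 0) #0 = any free local port
            httplib.HTTPConnection.__init__(self, *args, **kwargs)
    class BoundHTTPHandler(urllib2.HTTPHandler):
        def http_open(self, req):
            return self.do_open(BoundHTTPConnection, req)
    return BoundHTTPHandler

#e.g. BindableHTTPHandler3=make_bound_handler('192.168.1.103')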