I have two servers (call them A and B).
Facts:
- They have the same CPU, memory, motherboard, hard drive, and uplink speed.
- They both run Ubuntu 12.04 with Python 2.7.3 and the latest Django revision.
- They are located in the same data center with the same name server setup.
- They have similar ping and traceroute results to the name servers.
Server A works fine. My problem is that Server B is very slow whenever Python connects to the internet.
Below are the tests I ran on both servers (domain_list_1 and domain_list_2 are two lists, each containing 100 unique domains):
Test One:
import socket
import time

starttime = time.time()
for domain in domain_list_1:
    ip = socket.gethostbyname(domain)
print '%.1f items per second' % (100 / (time.time() - starttime))
>> Server A Results: 3.3 items per second
>> Server B Results: 0.7 items per second
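A first diagnostic step on Server B might be to time each lookup individually rather than the whole loop: if a handful of domains dominate, the slowness is a timeout pattern on specific lookups rather than uniform overhead. A minimal sketch (the single-entry domain list is a placeholder for your own):

```python
import socket
import time

def time_lookups(domains):
    """Resolve each domain, recording how long every lookup takes."""
    timings = []
    for domain in domains:
        start = time.time()
        try:
            socket.gethostbyname(domain)
        except socket.error:
            pass  # failures count too; timed-out lookups may be the slow cases
        timings.append((domain, time.time() - start))
    return timings

# Sorting by elapsed time makes slow outliers obvious.
for domain, elapsed in sorted(time_lookups(['localhost']),
                              key=lambda t: t[1], reverse=True):
    print('%s: %.3fs' % (domain, elapsed))
```

Run against domain_list_1 on both servers, a uniform per-lookup slowdown on B would point at the resolver path Python uses, while a few huge outliers would point at particular domains.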
Test Two:
import os
import time

starttime = time.time()
for domain in domain_list_2:
    os.system('nslookup %s > /dev/null' % domain)
print '%.1f items per second' % (100 / (time.time() - starttime))
>> Server A Results: 3.3 items per second
>> Server B Results: 3.3 items per second
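As an aside, Test Two can be written without os.system and shell string interpolation; subprocess takes the command as a list, which avoids quoting issues and makes the measurement reusable. A sketch (the command argument is parameterized only so a different resolver tool can be swapped in):

```python
import os
import subprocess
import time

def lookup_rate(domains, command='nslookup'):
    """Run an external resolver command per domain; return items per second."""
    start = time.time()
    devnull = open(os.devnull, 'w')
    try:
        for domain in domains:
            # Passing a list bypasses the shell entirely (no interpolation).
            subprocess.call([command, domain], stdout=devnull, stderr=devnull)
    finally:
        devnull.close()
    return len(domains) / (time.time() - start)
```

Calling `lookup_rate(domain_list_2)` reproduces Test Two's measurement.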
As you can see from Test Two, networking on Server B is not the problem.
I ran similar tests with urllib2 and the results were the same (Server A is fine, but Server B is slower with urllib2 than with wget or curl doing the same job). So I believe it is a Python problem; I just don't know what is wrong with the Python setup on Server B.
Is there a way to profile the internal process and find out which part of the code slows everything down?
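One standard-library option for exactly this is cProfile: wrap the Test One loop in a profiler and see where the time actually goes. A minimal sketch, with a placeholder domain list standing in for domain_list_1:

```python
import cProfile
import pstats
import socket

def resolve_all(domains):
    """The loop from Test One, isolated so it can be profiled."""
    for domain in domains:
        try:
            socket.gethostbyname(domain)
        except socket.error:
            pass  # failed lookups still cost time; keep them in the profile

profiler = cProfile.Profile()
profiler.enable()
resolve_all(['localhost'])  # substitute domain_list_1 on the real servers
profiler.disable()

# Show the ten most expensive calls by cumulative time.
pstats.Stats(profiler).sort_stats('cumulative').print_stats(10)
```

If the profile shows essentially all the time inside the C-level gethostbyname call itself, the delay is below Python, and a system-level tool such as strace on the Python process would be the next step.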
Thank you in advance!