
urllib for Python seems to be incredibly slow at uploading a file (using multipart/form-data).

The browser (Chrome) does it in under 20 seconds, while the script takes almost a minute for the same file.

I'm using urllib2 for the connection and poster to create the HTTP headers and data; the version of Python in question is 2.7.

import math
import random
import urllib2
from poster.encode import multipart_encode

def upSong(fileName):
    # Encode the file as multipart/form-data using poster.
    datagen, headers = multipart_encode({"mumuregularfile_0": open(fileName, "rb")})

    # Random ID so the server can track this particular upload.
    uploadID = math.floor(random.random() * 1000000)
    request = urllib2.Request("http://upload0.mumuplayer.com:443/?browserID=" + browserID + "&browserUploadID=" + str(uploadID), datagen, headers)

    urllib2.urlopen(request).read()

Is there a way to speed up Python's/urllib2's connection, or is this just a limitation of the language?

EDIT: it should be noted that I already timed each part, and the slow step is without a doubt the urllib2.urlopen(request).read() call.
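Since the slowness is pinned on the single .read() call, one thing worth trying is reading the response in fixed-size chunks instead; the read_all helper and the chunk size below are illustrative, not part of urllib2:

```python
import io

def read_all(resp, chunk_size=64 * 1024):
    """Read a file-like HTTP response in fixed-size chunks
    instead of one large .read() call."""
    parts = []
    while True:
        chunk = resp.read(chunk_size)
        if not chunk:
            break
        parts.append(chunk)
    return b"".join(parts)

# Stand-in for the real urllib2 response object:
body = read_all(io.BytesIO(b"x" * 200000), chunk_size=8192)
```

In the question's code this would be `read_all(urllib2.urlopen(request))`; whether chunking actually helps depends on the server and network, so measure it.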

Delusional Logic
  • 808
  • 10
  • 32
  • Have you instrumented your function to see where it's actually running slowly? i.e. is it `multipart_encode`? `urlopen`? – Mattie Jun 09 '12 at 17:39

2 Answers

1

Chromium probably used compression (if supported by the website), while urllib2 does not appear to use it (grepping its source for "gz" or "bz" gives no result).
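If the server supports it, you can request a compressed response yourself and decompress the body; a minimal sketch, assuming the server honors Accept-Encoding (the header line and gunzip helper here are illustrative):

```python
import gzip
import io

def gunzip(data):
    """Decompress a gzip-encoded HTTP response body."""
    return gzip.GzipFile(fileobj=io.BytesIO(data)).read()

# urllib2 side (sketch): ask the server for gzip before opening:
# request.add_header("Accept-Encoding", "gzip")
# body = gunzip(urllib2.urlopen(request).read())

# Round-trip check with locally compressed bytes:
buf = io.BytesIO()
writer = gzip.GzipFile(fileobj=buf, mode="wb")
writer.write(b"some response body")
writer.close()
assert gunzip(buf.getvalue()) == b"some response body"
```

Note this only compresses the response, not the upload itself; browsers do not gzip multipart uploads either, so this mainly matters if the reply is large.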

I am not sure about it, but Chromium may also be more optimized than a plain urllib2 connection, using socket-level tricks or something similar.

Valentin Lorentz
  • 9,556
  • 6
  • 47
  • 69
0

You could add timing calls between the steps; that would let you pinpoint which step is slowing things down or taking the longest. It may also vary with network speed...
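A minimal sketch of such a time-reporter (the timed helper and the commented usage lines are illustrative, not part of the question's code):

```python
import time

def timed(label, fn, *args, **kwargs):
    """Run fn, print how long it took, and return its result."""
    start = time.time()
    result = fn(*args, **kwargs)
    print("%s: %.2f s" % (label, time.time() - start))
    return result

# Hypothetical usage against the question's function:
# datagen, headers = timed("encode", multipart_encode, {"mumuregularfile_0": open(fileName, "rb")})
# body = timed("upload+read", lambda: urllib2.urlopen(request).read())
```

Wrapping each step like this shows directly whether the encode, the connect, or the read dominates.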

Ovisek
  • 143
  • 2
  • 10
  • I have already done this; the slow part is the urllib2.urlopen(request).read() part. No question about it. – Delusional Logic Jun 09 '12 at 18:00
  • As stated in the original question, Chrome uploads around 3 times faster. This may be due to compression, as Valentin stated; I might have to look into that. – Delusional Logic Jun 09 '12 at 18:09