
We are trying to upload very large (1 GB+) files to Box via the upload API using Python's httplib.

So that we don't have to keep the whole file in memory, we are using code like:

CHUNK_SIZE = 1024 * 1024  # read and send the file 1 MB at a time

data = from_file.read(CHUNK_SIZE)
while data:
    http_connection.send(data)
    data = from_file.read(CHUNK_SIZE)

This works fine if the file is small enough, but after 30 seconds Box times out and closes the socket, even while data is still being uploaded. Is there any way to either tell Box that the upload is coming in multiple chunks, like the Dropbox chunked_upload/ endpoint, or to stop Box from timing out after 30 seconds?
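For reference, the surrounding request setup looks roughly like this. The host, path, and headers are simplified placeholders rather than the real Box endpoint, and the multipart/form-data framing that Box's upload API expects is omitted:

import os
import httplib

CHUNK_SIZE = 1024 * 1024

def stream_upload(host, path, headers, file_path):
    # Send the request line and headers ourselves so the body can be
    # streamed with send() instead of being held in memory all at once.
    conn = httplib.HTTPSConnection(host)
    conn.putrequest('POST', path)
    for name, value in headers.items():
        conn.putheader(name, value)
    conn.putheader('Content-Length', str(os.path.getsize(file_path)))
    conn.endheaders()

    # Stream the body in fixed-size chunks.
    with open(file_path, 'rb') as from_file:
        data = from_file.read(CHUNK_SIZE)
        while data:
            conn.send(data)
            data = from_file.read(CHUNK_SIZE)

    return conn.getresponse()

The socket is closed partway through the send() loop, so we never get a response back at all.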

brettcvz
  • I think the limit for free or personal accounts is under 100 MB or 1 GB, depending on certain things – pyCthon Oct 02 '12 at 04:33
  • It would be helpful if you could provide the response you are receiving from Box. In particular the headers, body (if any), and the status code. – Ben Zittlau Oct 03 '12 at 16:35
  • There is no response at all received from Box; the server terminates the connection while data is still being uploaded – brettcvz Oct 03 '12 at 18:45
  • Using libcurl and C++, I'm not having any trouble uploading large files to Box. I've had uploads take as long as 10 minutes. – Collin Dauphinee Oct 10 '12 at 21:36

0 Answers