
I need a way to send large files (up to 5 GB) to my webserver. For this I use a plugin that can send chunks of 100 MB. I configured the request/response limits, and if I send files of up to approximately 800 MB, everything works fine. If I send larger files, the 10th chunk just stops working: no error or anything, it simply stays in a loading state.
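
By "request/response limits" I mean settings of this sort in web.config; the values below are illustrative, not my exact configuration:

    <!-- Illustrative only: typical IIS/ASP.NET request size limits -->
    <configuration>
      <system.web>
        <!-- maxRequestLength is in kilobytes; roughly 100 MB here -->
        <httpRuntime maxRequestLength="102400" />
      </system.web>
      <system.webServer>
        <security>
          <requestFiltering>
            <!-- maxAllowedContentLength is in bytes; roughly 100 MB here -->
            <requestLimits maxAllowedContentLength="104857600" />
          </requestFiltering>
        </security>
      </system.webServer>
    </configuration>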

I then tried sending smaller chunks (10 MB), but then it stalls after 98 requests. With 6 MB chunks it also failed, and when I finally tried 1 MB chunks it seems to work to the end. The same thing happens when I don't send chunks but regular files of the same size in serial.

Obviously I'm happy that it worked, but it feels like more luck than wisdom, and I'm hesitant to use this in production if I don't understand why the smaller chunks work and the bigger ones don't.

Does anybody have any idea what could be causing this behaviour?

I'd rather set the chunk size to about 100 MB, so smaller files are sent as one file instead of chunks I need to combine again. So I'd like to know how I can enable the larger chunks.

  • Question: When the file is chunked, does your sending server send the chunks in parallel (opening a new FTP instance for each chunk) or in serial (one section at a time)? I ask because FTP is a connection-oriented protocol, meaning it needs enough bandwidth in the upload direction not just to upload but also to acknowledge receipt confirmations and avoid retransmit requests. If you're sending a GB worth of file in 10 parallel chunks, you need a GigE connection; if serial, a 100 Mbit upload is enough. – George Erhard Mar 14 '17 at 20:43

1 Answer


Upload timeout is a possibility: larger chunks are more likely to time out. You can try increasing the upload timeout in the IIS settings and/or in the settings of the plugin you are using.
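
For example, in an ASP.NET/IIS setup the request timeout can usually be raised in web.config along these lines (values illustrative; the plugin's own timeout setting, if it has one, lives in its configuration and is separate):

    <!-- Illustrative only: raise the ASP.NET request timeout for large uploads -->
    <configuration>
      <system.web>
        <!-- executionTimeout is in seconds and only applies when debug="false" -->
        <httpRuntime executionTimeout="3600" maxRequestLength="102400" />
      </system.web>
    </configuration>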

There may also be conflicting configuration between IIS and the plugin itself.

grizzthedj