1

I want to upload large files (multiple concurrent users can upload at the same time), allow only sequential file uploads per user, and control the upload through throttling techniques (using Java).

Also, I want to be able to kill an upload process mid-way if it is taking too long.

I have been going through various articles and material online while researching this.

Can you suggest some thoughts, insights, or best approaches if you have already implemented something like this?

ItamarG3
pjason

2 Answers

0

I have implemented throttled SFTP in the past. The key is to create a control loop on the server's receiving end with a periodic micro delay, in which the next delay duration is determined by the number of bytes that have been received per fixed unit of time. If you are reading more bytes per unit time than desired, incrementally increase the delay. If you are reading fewer, incrementally decrease the delay.
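Not the code from that project, but a minimal Java sketch of the same control-loop idea; the class name, buffer size, 100 ms measurement window, and 1 ms delay step are all placeholders you would tune for your own setup:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Copies an upload stream while adapting a per-read delay so the measured
// throughput converges on targetBytesPerSec.
public final class ThrottledCopy {

    public static void copy(InputStream in, OutputStream out, long targetBytesPerSec)
            throws IOException, InterruptedException {
        byte[] buf = new byte[8192];
        long windowStart = System.nanoTime();
        long bytesInWindow = 0;
        long delayMillis = 0;                        // current micro delay between reads

        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            bytesInWindow += n;

            long elapsedNanos = System.nanoTime() - windowStart;
            if (elapsedNanos >= 100_000_000L) {      // evaluate roughly every 100 ms
                double actualRate = bytesInWindow * 1_000_000_000.0 / elapsedNanos;
                if (actualRate > targetBytesPerSec) {
                    delayMillis += 1;                // reading too fast: back off a little
                } else if (delayMillis > 0) {
                    delayMillis -= 1;                // reading too slowly: speed up a little
                }
                windowStart = System.nanoTime();
                bytesInWindow = 0;
            }
            if (delayMillis > 0) {
                Thread.sleep(delayMillis);           // the periodic micro delay
            }
        }
        out.flush();
    }
}
```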

You will also want to take into account the number of parallel transfers in order to determine the maximum rate for a single one. Unfortunately, that code was written for an employer, so I don't have it handy or I would provide some key snippets.
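For the parallel-transfer accounting, one simple approach (again only a sketch, with made-up names and an assumed global bandwidth budget) is a shared counter of active uploads that each throttling loop queries to derive its per-stream target rate:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Splits a global bandwidth budget evenly across the uploads currently in progress.
public final class UploadRateAllocator {
    private final long globalBytesPerSec;            // assumed configuration value
    private final AtomicInteger activeUploads = new AtomicInteger();

    public UploadRateAllocator(long globalBytesPerSec) {
        this.globalBytesPerSec = globalBytesPerSec;
    }

    // Call when an upload starts; the result feeds the throttling loop.
    public long register() {
        int active = activeUploads.incrementAndGet();
        return globalBytesPerSec / active;
    }

    // Re-query periodically inside the control loop so running uploads
    // adjust as other uploads start and finish.
    public long currentPerUploadRate() {
        int active = Math.max(1, activeUploads.get());
        return globalBytesPerSec / active;
    }

    // Call when an upload finishes or is killed.
    public void unregister() {
        activeUploads.decrementAndGet();
    }
}
```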

I seem to recall that we strategically modified openssh for this purpose. It was only a handful of lines to modify.

Jason K.
  • As per our requirement, we upload files in the browser, and this upload process must be throttled to control the bandwidth. Please suggest. – pjason Apr 19 '17 at 01:00
  • The browser still needs to send the files to a destination on your server, so the key is to have the receiving end throttle. Do you know what software on the receiving end is getting the files? Besides, only the server knows how many parallel uploads are taking place, so it is the server that can reduce the expected upload rate from each browser. – Jason K. Apr 19 '17 at 01:02
  • Agreed that the server should keep track of parallel uploads. Let's say we are uploading files from the browser: the progress should be tracked (whether the upload is in progress or completed) and the upload should also be stopped if necessary. For throttling on the server side, did you use Apache CXF MTOM or custom Java code? – pjason Apr 19 '17 at 01:07
  • As mentioned above, we modified openssh; we added about 10 lines. You could do something similar with an nginx or Apache plugin. As for the client side, you would probably need custom code to indicate upload status, as this is normally a function of the browser. Alternatively, you could have a parallel communication channel with the server side, alongside the upload, to track status. – Jason K. Apr 19 '17 at 01:11
  • We customized and fixed this issue using the FineUploader JavaScript library; I also recommend using it. – pjason Jun 27 '17 at 08:53
0

We customized large file uploads (in our case, files from 30 MB up to 1 GB) using the FineUploader JavaScript library. FineUploader is a 100% open-source library with which you can configure the chunk size and enable pause and resume of uploads (close the browser and continue from where it left off), a progress bar, etc.

On the server side we used a custom servlet (in Java, as our solution is Java-based) to receive the chunks and merge them once all chunks have been uploaded.

The customized solution works as follows (with the chunk size, e.g. 30 MB, configured in the FineUploader script); a server-side sketch follows the list:

1. When a file is uploaded, the FineUploader script splits it into chunks of the configured size (e.g. 30 MB).
2. Each chunk is uploaded by calling the custom servlet; all chunks are uploaded the same way, either sequentially or concurrently, as per your requirement.
3. When all chunks have reached the server, they are merged back into the original file.
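This is not the exact servlet from our project, only a minimal sketch of the receive-and-merge idea using the javax.servlet API. The request parameter names (fileId, chunkIndex, totalChunks), the "qqfile" part name, and the upload directory are placeholders; map them to whatever your FineUploader configuration actually sends:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.nio.file.StandardOpenOption;

import javax.servlet.ServletException;
import javax.servlet.annotation.MultipartConfig;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.Part;

// Receives one chunk per request and merges all chunks once the last one arrives.
@WebServlet("/upload")
@MultipartConfig
public class ChunkUploadServlet extends HttpServlet {

    private static final Path UPLOAD_DIR = Paths.get("/tmp/uploads"); // assumed location

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String fileId = req.getParameter("fileId");
        int chunkIndex = Integer.parseInt(req.getParameter("chunkIndex"));
        int totalChunks = Integer.parseInt(req.getParameter("totalChunks"));

        // 1. Store this chunk under a per-upload directory.
        Path chunkDir = UPLOAD_DIR.resolve(fileId);
        Files.createDirectories(chunkDir);
        Part filePart = req.getPart("qqfile");           // part name depends on client config
        try (InputStream in = filePart.getInputStream()) {
            Files.copy(in, chunkDir.resolve("chunk-" + chunkIndex),
                    StandardCopyOption.REPLACE_EXISTING);
        }

        // 2. Merge once the last chunk arrives (assumes chunks arrive in order;
        //    for concurrent chunk uploads, count the received chunks instead).
        if (chunkIndex == totalChunks - 1) {
            Path merged = UPLOAD_DIR.resolve(fileId + ".merged");
            try (OutputStream out = Files.newOutputStream(merged,
                    StandardOpenOption.CREATE, StandardOpenOption.TRUNCATE_EXISTING)) {
                for (int i = 0; i < totalChunks; i++) {
                    Files.copy(chunkDir.resolve("chunk-" + i), out);
                }
            }
        }
        resp.setStatus(HttpServletResponse.SC_OK);
    }
}
```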

I recommend customizing and making use of the FineUploader script library if you have a similar requirement.

pjason