
I am trying to work around Cloudflare's 100 MB upload limit by sending chunked uploads. However, I cannot figure out how to actually start sending the chunked data. Say I have 100 blobs I'd like to send to the server; do I stream them? How does the server tell how these consecutive requests relate to each other?

Here's my code so far:

getChunks(file) {
  const divideBy = 1 * 1024 * 1024
  const availableDivisions = Math.ceil(file.size / divideBy)
  let currentSlice = 0

  const chunks = Array(availableDivisions)
    .fill()
    .map((_, index) => {
      const nextDivision = divideBy * (index + 1)
      const chunk = file.slice(currentSlice, nextDivision, file.type)

      currentSlice = nextDivision

      return chunk
    })

  return chunks
}

sendChunk(blob) {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest()

    xhr.open('POST', 'http://localhost:4080/test', true)
    xhr.setRequestHeader('Content-type', blob.type)

    xhr.onreadystatechange = () => {
      if (xhr.readyState !== XMLHttpRequest.DONE) return
      if (xhr.status === 200) {
        resolve()
      } else {
        // Reject on non-200 so a failed chunk doesn't hang the chain
        reject(new Error(`Upload failed with status ${xhr.status}`))
      }
    }

    xhr.send(blob)
  })
}

uploadChunked(file) {
  const chunks = this.getChunks(file)

  let iteration = 0

  const upload = chunk => {
    let nextIteration = iteration + 1
    let nextChunk = chunks[nextIteration]

    this.sendChunk(chunk).then(() => {
      if (nextChunk) {
        iteration = nextIteration
        upload(nextChunk)
      }
    })
  }

  upload(chunks[0])
}
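For what it's worth, the recursive promise chain above can also be written with async/await; `sendFn` here stands in for `sendChunk` so the flow can be exercised without a network (a sketch, not the original code):

```javascript
// Sequential upload sketch: awaits each chunk before sending the next,
// mirroring the recursive upload() above. sendFn is injected so this can
// run without an actual server.
async function uploadChunksSequentially(chunks, sendFn) {
  const results = []
  for (const chunk of chunks) {
    results.push(await sendFn(chunk)) // strictly one request in flight
  }
  return results
}
```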

So this works fine; the upload requests go out correctly. My problem is figuring out how the server can tell that all these consecutive requests refer to one file. I've looked online and I am extremely confused about this part.
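One common convention (illustrative only, not from the question) is to generate an upload ID per file on the client, send it with every chunk along with the chunk's index and the total count, and have the server group and reassemble by that ID:

```javascript
// In-memory server-side sketch: requests are correlated by a shared upload
// ID; once all `total` chunks for that ID have arrived, they are joined in
// index order. All names here are illustrative, not a standard.
const uploads = new Map()

function receiveChunk(uploadId, index, total, data) {
  if (!uploads.has(uploadId)) uploads.set(uploadId, new Map())
  const parts = uploads.get(uploadId)
  parts.set(index, data)

  if (parts.size < total) return null // still waiting for more chunks

  // All chunks present: reassemble in index order, even if they
  // arrived out of order
  const ordered = [...parts.keys()].sort((a, b) => a - b)
    .map(i => parts.get(i))
  uploads.delete(uploadId)
  return Buffer.concat(ordered) // the reassembled file
}
```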

Sebastian Olsen
  • So I am not sure I understand this. Are you going to somehow concatenate many 100 MB files on the server side? – trk Sep 10 '17 at 19:15
  • The idea is that instead of sending the whole 100 MB in one request, you send smaller chunks, to increase performance and avoid upload limits (such as Cloudflare's). – Sebastian Olsen Sep 10 '17 at 19:16

2 Answers


I've solved my issue by using the Tus protocol. Now my server can accept chunked (and resumable) uploads, and Cloudflare doesn't complain.
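For anyone landing here: under Tus 1.0.0 the requests are tied together by the upload URL the server returns when the file is created with an initial POST, and each chunk then goes in a PATCH request carrying the current byte offset. A sketch of the headers involved (the offset value is illustrative):

```javascript
// Headers a Tus 1.0.0 client sends with each PATCH request; the server
// matches them against the upload resource created by the initial POST.
function tusPatchHeaders(offset) {
  return {
    'Tus-Resumable': '1.0.0', // protocol version, sent on every request
    'Upload-Offset': String(offset), // bytes the server already has
    'Content-Type': 'application/offset+octet-stream'
  }
}
```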

Sebastian Olsen
  • What are you using on the server side? Does Cloudflare honor it out of the box? – trk Sep 12 '17 at 05:30
  • I'm not sure I understand what you're asking. Cloudflare doesn't honor anything; it has an upload limit of 100 MB per request. By sending the file across several requests in smaller chunks, Cloudflare won't complain about the request being too big. – Sebastian Olsen Sep 12 '17 at 09:47
  • You said you use the Tus protocol, right? How does Cloudflare honor it? – trk Sep 12 '17 at 09:53
  • Cloudflare doesn't honor anything. I don't understand what you're asking. Cloudflare simply doesn't care because each request is smaller than 100 MB, which is their limit. – Sebastian Olsen Sep 14 '17 at 05:59
  • You seem to have neither the patience nor the courtesy to understand what I am asking. If you are using Tus, then the question is: are you using a server module that understands Tus protocol packets from the client? If yes, then how does it directly answer your posted question (since you are using `ajax`)? If you don't want to answer I am cool, but just don't downvote answers while putting null effort into understanding them in the first place. – trk Sep 14 '17 at 06:10
  • I am not using a server module, I rolled my own implementation. The documentation for the protocol was easy to understand. – Sebastian Olsen Sep 14 '17 at 06:12

You cannot. 100 MB (or X MB) is not a per-request limit.

It is a limit per file. In other words, if you chunk them up, then each chunk would end up becoming a file on the server.

You could upload them in several chunks as you are doing now, and also provide an additional script to help your users later stitch them up on the client side.

trk
  • You are misunderstanding my question. Sending small chunks of a bigger file to the server, which the server then combines when done, is a common practice. – Sebastian Olsen Sep 10 '17 at 19:27
  • @SebastianOlsen see, at an ajax request level you could do multi-part upload and so on. But that again will coalesce `xhr.send`'s payload into a single file. So a chunk (as you are referring to here) will end up being a file at the server's end. Isn't that what you see now already? – trk Sep 10 '17 at 19:30
  • Again, you're misunderstanding what I am trying to do. Since I am using Cloudflare, I cannot upload files bigger than 100 MB through one request; Cloudflare has an upload limit. So instead I need to do chunked uploading, and have a way to keep track of the upload on the server. This way, I can send small chunks per request to the server and, when done, the server will assemble them and do whatever step is next. – Sebastian Olsen Sep 10 '17 at 19:42