
I am trying to upload a local mp4/movie file to a server with XMLHttpRequest(). This is written with React Native, and in the final version the user would be able to upload arbitrarily big files from his/her iOS/Android device to the server.

My test file is about 1 GB (in practice it could be anywhere from 100 MB to 10 GB).

With files below 70 MB I could easily use the code below to load the file as a blob, then slice it and PUT it on the server using https://github.com/inorganik/digest-auth-request:

let oReq = new XMLHttpRequest();
oReq.open("GET", obj['path'], true);
oReq.responseType = "blob";
oReq.onload = function (oEvent) {
    let blob = oReq.response;
    console.log(blob);
    sendBloblPart();
};
oReq.send();

Then this is the sendBloblPart function, stripped of all callback handling (not important here):

function sendBloblPart() {
    sendRequest = new digestAuthRequest('PUT', uri, usernameVal, passwordVal);
    sendRequest.request(function (data) {
    }, mainThis.fileUploadErrorHandler, blob.slice(startB, endB), true);
}
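The (startB, endB) pairs passed to blob.slice can be computed up front before looping over the uploads. A minimal plain-JS sketch (chunkRanges is a hypothetical helper, not part of the code above):

```javascript
// Hypothetical helper: compute the (start, end) byte ranges for blob.slice.
// end is exclusive, matching Blob.slice semantics.
function chunkRanges(totalSize, chunkSize) {
  const ranges = [];
  for (let start = 0; start < totalSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, totalSize)]);
  }
  return ranges;
}

// A 1.2 MB file in 512 KiB chunks: two full slices plus a short final one.
console.log(chunkRanges(1200000, 524288));
```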

I am reconstructing the file on the server from all the parts, and this works just fine with the blob created by XMLHttpRequest(). The problem is that XMLHttpRequest loads the whole file into memory, so when I try to load a 1 GB file as a blob I get an out-of-memory error. I looked for several solutions, but nothing has worked so far. Then I found a promising feature of https://github.com/joltup/react-native-fetch-blob:

With it, creating a Blob from a 1 GB file now takes about 100 ms and doesn't consume memory:

const Blob = RNFetchBlob.polyfill.Blob
let newBlob = new Blob(RNFetchBlob.wrap(obj['path']), {type: obj['mime']});

This, however, poses a challenge: the new blob has a different structure than the previous one (the one from XMLHttpRequest), where I could simply pass the sliced blob as the request data and it worked just fine. Now, no matter what I do with the new blob, I am not able to send any data to the server: all requests appear to succeed, but 0 bytes are received on the server end.

Are there any hints/solutions or ideas on how to better approach this problem?

Below I will add the structure of both blobs; you can see they are quite different. (I used a 1.2 MB file for this example so the XMLHttpRequest version works fine and doesn't crash the way it does with my 1 GB target file.)

Structure of both blobs

While I thought about using readStream and similar solutions, I couldn't find a way to make the stream wait while a chunk is being uploaded, or to start from, say, 50% of the file.

Mark Schultheiss
Dirindon

3 Answers


I have a couple of ideas. What happens if you wait for the blob slice to be created before making the upload call?

const firstBlob = await new Promise(resolve => {
  newBlob.slice(0, 262144).onCreated(resolve)
});

// send request with blob

firstBlob.safeClose();

const secondBlob = await new Promise(resolve => {
  newBlob.slice(262144, 262144 * 2).onCreated(resolve)
});

// etc

It looks like the rn-fetch-blob package has an XHR polyfill that honors the onCreated callback. If you use that XHR instead, it could also work.

const XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
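The awaiting pattern above can be factored into one small helper; a sketch with a stand-in object (whenCreated and fakeSlice are my names, only the onCreated callback comes from the rn-fetch-blob polyfill):

```javascript
// Hypothetical helper: promisify the polyfill Blob's onCreated callback
// so slices can be awaited one after another.
function whenCreated(blobLike) {
  return new Promise(resolve => blobLike.onCreated(() => resolve(blobLike)));
}

// Stand-in for a polyfill Blob slice that reports readiness asynchronously.
const fakeSlice = {
  onCreated(cb) { setTimeout(cb, 0); }
};

whenCreated(fakeSlice).then(b => console.log(b === fakeSlice)); // logs true
```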
AJcodez

I'm a little confused by your code.

As you said, 'User would be able to upload any big files from his/her iOS/Android device to the server'. In that situation, you are supposed to get the File instance from a component like <input type="file"> rather than fetching the file from a server, storing it in a blob, and then uploading it. A blob loads all of its data into memory at once when the data comes from a server, which is what causes the memory error.

If the big file the user wants to upload must first be downloaded from a server, you should download it chunk by chunk (this needs some changes to the backend download code) and upload it the same way.

If you get the file from <input type="file"> and use File.slice to get file chunks, it won't cause memory errors, so do it the way you already did.
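The memory-safety claim above holds because File.slice (inherited from Blob) returns a view over the same bytes rather than a copy; the bytes are only materialized when the slice is actually read. A small sketch using the built-in Blob (global in Node 18+, same API as the browser) to mimic a user-selected File:

```javascript
// Blob.slice returns a lightweight view; nothing is copied until the
// slice is read (sent, streamed, or converted to text).
const blob = new Blob(['0123456789']);
const chunk = blob.slice(3, 7);

console.log(chunk.size); // 4
chunk.text().then(t => console.log(t)); // logs "3456"
```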

Taurz

Handling large file uploads in React Native can be challenging due to memory limitations and potential performance issues. Since you have already identified that XMLHttpRequest loads the entire file into memory, causing out-of-memory errors, and the react-native-fetch-blob solution is not working as expected, let's explore an alternative approach using the react-native-fs package, which provides file system access and might help with your file upload requirements.

Here's a step-by-step guide to achieving the file upload using react-native-fs:

  1. Install the required packages:
npm install react-native-fs
  2. Link the native module (React Native below 0.60; newer versions autolink):
react-native link react-native-fs
  3. Import the necessary modules in your upload component:
import RNFetchBlob from 'rn-fetch-blob';
import RNFS from 'react-native-fs';
  4. Create a function to upload the file in chunks:
const CHUNK_SIZE = 10 * 1024 * 1024; // 10 MB (adjust based on your server's capabilities)

const uploadFileInChunks = async (fileUri, serverUrl, username, password) => {
  const { size: fileSize } = await RNFS.stat(fileUri);

  let start = 0;

  while (start < fileSize) {
    const length = Math.min(CHUNK_SIZE, fileSize - start);

    // react-native-fs has no slice(); read one chunk by position/length instead.
    // RNFS.read(path, length, position, encoding) resolves to the chunk contents.
    const chunkData = await RNFS.read(fileUri, length, start, 'base64');

    // Send the chunk to the server using your digestAuthRequest or other methods
    await sendChunkToServer(serverUrl, username, password, chunkData);

    start += length;
  }
};
  5. Use this function to upload the file:
const uploadLargeFile = async () => {
  const fileUri = 'file:///path/to/your/file.mp4';
  const serverUrl = 'https://your-server-url.com/upload';
  const username = 'your-username';
  const password = 'your-password';

  try {
    await uploadFileInChunks(fileUri, serverUrl, username, password);
    console.log('File upload successful!');
  } catch (error) {
    console.error('File upload failed:', error);
  }
};
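sendChunkToServer is left to your transport (digestAuthRequest, fetch, or anything else); whatever you use, a Content-Range header is one common way to tell the server where each chunk belongs. A hedged sketch of just the header arithmetic (the function name and the header contract with your backend are assumptions):

```javascript
// Hypothetical: build the Content-Range header value for one chunk.
// HTTP uses an inclusive end byte, hence the "end - 1".
function contentRange(start, end, total) {
  return `bytes ${start}-${end - 1}/${total}`;
}

console.log(contentRange(0, 10 * 1024 * 1024, 1073741824));
// → "bytes 0-10485759/1073741824"
```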

Remember to adjust CHUNK_SIZE according to your server's capabilities and network conditions. Smaller chunks mean more requests, but they are less prone to memory issues.

By using this approach, you'll be able to upload large files in smaller chunks without consuming excessive memory, and it should be more reliable than attempting to load the entire file into memory at once.

sarath