
I need to upload large files (more than 300 MB) through my Node server, which is hosted on Google App Engine. I'm able to upload files using xhr.upload and it works fine locally, but on GAE I get 413 - Request Entity Too Large; it seems GAE restricts the request size to 32 MB.

While exploring, I came across the Google Blobstore API, but it doesn't seem to support Node.js.

Can you please suggest the best approach for this?

Thanks in advance.

MuthuD

1 Answer


You can upload files of up to 5 GB directly to Google Cloud Storage. There is no need to use (and pay for) your App Engine instance for this task.

You can do it by setting an upload URL in your file upload form. Look at the Cloud Storage API.
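As a rough sketch of that approach (assuming the @google-cloud/storage Node.js client and placeholder bucket/file names, not anything from your existing code), the server can hand out a short-lived signed URL and the browser can PUT the file straight to Cloud Storage, bypassing the 32 MB App Engine limit:

```js
// Server side (Node.js): generate a short-lived signed URL for a direct upload.
// Sketch only -- 'my-uploads-bucket' is a placeholder bucket name.
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function getUploadUrl(fileName) {
  const [url] = await storage
    .bucket('my-uploads-bucket')
    .file(fileName)
    .getSignedUrl({
      action: 'write',                      // allow a PUT to this URL
      expires: Date.now() + 15 * 60 * 1000, // valid for 15 minutes
      contentType: 'application/octet-stream',
    });
  return url;
}

// Client side: upload the file straight to Cloud Storage, not through App Engine.
function uploadFile(file, signedUrl) {
  const xhr = new XMLHttpRequest();
  xhr.open('PUT', signedUrl);
  xhr.setRequestHeader('Content-Type', 'application/octet-stream');
  xhr.upload.onprogress = (e) => console.log(`${e.loaded}/${e.total} bytes`);
  xhr.send(file); // the browser streams the file; the 32 MB request limit does not apply
}
```

Note that the Content-Type used when signing the URL has to match the Content-Type header the client sends on the PUT.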

Andrei Volgin
  • Thanks for your response. I will try the Cloud Storage API. Does this API support Node.js? ... Let's say I need to read the stream to do some server-side validation before uploading it to storage. Any other suggestions for uploading through GAE? – MuthuD Aug 23 '16 at 13:29
  • GAE does not support streaming, so you cannot use it to work with large files. You can use Compute Engine instances to do this work. This is what I use on FileMambo.com to create archives when a user wants to download many files at once - a Compute Engine instance pulls files from Cloud Storage, adds them to an archive in chunks, and streams the archive to the user. On Compute Engine you can use Node.js or any other runtime. – Andrei Volgin Aug 23 '16 at 14:58
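A minimal sketch of the streaming pattern described in that last comment, assuming an Express server running on Compute Engine with the @google-cloud/storage client and the archiver package (the bucket and object names are placeholders, not FileMambo's actual code):

```js
// Stream a zip of Cloud Storage objects to the user without buffering it in memory.
const express = require('express');
const archiver = require('archiver');
const { Storage } = require('@google-cloud/storage');

const app = express();
const storage = new Storage();

app.get('/download', (req, res) => {
  res.attachment('files.zip');
  const archive = archiver('zip');
  archive.pipe(res); // the zip is streamed to the client as it is built

  // Pull each object from Cloud Storage and append it to the archive in chunks.
  const bucket = storage.bucket('my-uploads-bucket');     // placeholder bucket
  const objectNames = ['report.pdf', 'photo.jpg'];        // placeholder object names
  for (const name of objectNames) {
    archive.append(bucket.file(name).createReadStream(), { name });
  }
  archive.finalize();
});

app.listen(8080);
```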