My goal is to download a large zip file (15 GB) and extract it to Google Cloud Storage using Laravel Storage (https://laravel.com/docs/8.x/filesystem) and https://github.com/spatie/laravel-google-cloud-storage.

My "wish" is to sort of stream the file to Cloud Storage, so I do not need to store the file locally on my server (because it is running in multiple instances, and I want to have the disk size as small as possible).

Currently, there does not seem to be a way to do this without first saving the zip file on the server, which is not ideal in my situation.
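To illustrate, this is roughly what I have to do today (a sketch with a hypothetical source URL and paths), and the local copy is exactly what I want to avoid:

```php
use Illuminate\Support\Facades\Storage;

// Hypothetical URL and paths, just to illustrate the problem: the full
// 15 GB archive is written to the instance's local disk first.
$local = storage_path('app/big-archive.zip');
file_put_contents($local, fopen('https://example.com/big-archive.zip', 'rb'));

// Only after the download completes can it be pushed to Cloud Storage.
Storage::disk('gcs')->put('archives/big-archive.zip', fopen($local, 'rb'));
unlink($local);
```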

Another idea is to use a Google Cloud Function (e.g. in Python) to download, extract, and store the file. However, Google Cloud Functions are limited to a maximum timeout of 9 minutes (540 seconds), and I don't think that will be enough time to download and extract 15 GB...

Any ideas on how to approach this?

Wouter Doornbos
  • On the GCS side, there is a [streaming solution](https://cloud.google.com/storage/docs/streaming#stream_an_upload) – Gourav B Feb 10 '22 at 05:47

1 Answer

You should be able to use streams for uploading big files. Here is example code to achieve it:

```php
use Illuminate\Support\Facades\Storage;

$disk = Storage::disk('gcs');

// Passing a stream resource instead of a string lets Flysystem upload in
// chunks, so the 15 GB file is never loaded into memory at once.
$disk->put($destFile, fopen($sourceZipFile, 'rb'));
```
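Since you want to avoid the local disk entirely, note that fopen() can also read straight from an HTTP(S) URL (when allow_url_fopen is enabled in php.ini), so the download can be piped directly into the upload. A minimal sketch, assuming a hypothetical source URL:

```php
use Illuminate\Support\Facades\Storage;

// Hypothetical download location; replace with your real source URL.
$stream = fopen('https://example.com/big-archive.zip', 'rb');

// The bytes flow from the remote server through PHP into GCS in chunks,
// without ever being written to the instance's local disk.
Storage::disk('gcs')->put('archives/big-archive.zip', $stream);

// Flysystem may leave the stream open; close it if it is still a resource.
if (is_resource($stream)) {
    fclose($stream);
}
```

This uploads the archive as-is; extracting its contents still requires reading the zip somewhere, which streaming alone does not solve.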

Gourav B