
I'm trying to create a Cloud Function which saves some data (documents from Firestore) to Cloud Storage.

I've written some Cloud Functions before, but I'm kind of new to Cloud Storage, buckets, etc. From what I've read, I have to "stream" this data to a bucket.

I'd love to see a short snippet that does just that :)

user6097845

2 Answers


Achieving this should not be too complicated, so I hope I can help you.

To do this, I will follow the example explained in the article Backup Firestore data to storage bucket on a schedule in GCP - which you can follow completely, in case you are interested - focusing on the upload from Firestore to Cloud Storage. I will explain which parts to use and how to use them to achieve your goal.

Once you have created your Cloud Storage bucket - it should have Multi-regional and Nearline configured in its settings - use the code below as indicated after it.

index.js file:

const firestore = require('@google-cloud/firestore');
const client = new firestore.v1.FirestoreAdminClient();
// Replace <bucket-id> with your bucket name
const bucket = 'gs://<bucket-id>';
exports.scheduledFirestoreBackup = (event, context) => {
  const databaseName = client.databasePath(
    process.env.GCLOUD_PROJECT,
    '(default)'
  );
  return client
    .exportDocuments({
      name: databaseName,
      outputUriPrefix: bucket,
      // Leave collectionIds empty to export all collections
      // or define a list of collection IDs:
      // collectionIds: ['users', 'posts']
      collectionIds: [],
    })
    .then(responses => {
      const response = responses[0];
      console.log(`Operation Name: ${response['name']}`);
      return response;
    })
    .catch(err => {
      console.error(err);
    });
};

package.json file:

{
  "dependencies": {
    "@google-cloud/firestore": "^1.3.0"
  }
}

Create these files and configure the Cloud Function as follows: a unique name; Cloud Pub/Sub as the trigger; a topic name similar or equal to initiateFirestoreBackup; Node.js as the runtime, with the files above as the source and scheduledFirestoreBackup as the function to execute.

The above code should be enough for you to export from your Firestore to your Cloud Storage, since it will fetch all of your collections - or only the ones you specify - and send them to the bucket you already created.
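Alternatively, the same export can be triggered by a scheduled function defined directly in code, which is essentially the approach in Firebase's scheduled-export documentation. The following is only a minimal sketch, assuming you deploy with the Firebase CLI and firebase-functions v3+; the 'every 24 hours' schedule and the <bucket-id> placeholder are examples to adjust:

const functions = require('firebase-functions');
const firestore = require('@google-cloud/firestore');
const client = new firestore.v1.FirestoreAdminClient();

// Replace <bucket-id> with your bucket name
const bucket = 'gs://<bucket-id>';

exports.scheduledFirestoreBackup = functions.pubsub
  .schedule('every 24 hours') // example schedule
  .onRun(async () => {
    const databaseName = client.databasePath(
      process.env.GCLOUD_PROJECT,
      '(default)'
    );
    // Starts a long-running export operation that writes to the bucket
    const responses = await client.exportDocuments({
      name: databaseName,
      outputUriPrefix: bucket,
      collectionIds: [], // empty = export all collections
    });
    console.log(`Operation Name: ${responses[0]['name']}`);
    return responses[0];
  });

Whether process.env.GCLOUD_PROJECT is populated depends on your runtime, so treat this as a starting point rather than a drop-in replacement.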

Besides that, in case you want more information on uploading files to Cloud Storage using Cloud Functions, you can check here as well: Uploading files from Firebase Cloud Functions to Cloud Storage

Let me know if the information helped you!

gso_gabriel
  • Yes, this solution can be found here as well: https://firebase.google.com/docs/firestore/solutions/schedule-export#gcp-console Unfortunately, it exports more data than I want (sub-collections etc.), which means high costs. Using your second link I was able to implement a partial backup - which is great for my needs. Awesome! Thanks a lot :) – user6097845 May 20 '20 at 13:19
  • Sure, I understand your point, @user6097845, makes sense. I'm glad that it helped you! – gso_gabriel May 20 '20 at 13:26

Thanks to @gso_gabriel I was able to create a partial backup for my documents. To anyone who's interested, here's a simplified version of my code:

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

const bucketName = 'myproject.appspot.com';

exports.backup = functions.https.onCall(async (data, context) => {
   // Fetch a single Firestore document and store it as a JSON file
   const userDoc = await admin.firestore().collection('users').doc('abc123').get();
   await writeFile('temp', 'user.json', JSON.stringify(userDoc.data()));
});

async function writeFile(dirName, fileName, content) {
   const bucket = admin.storage().bucket(bucketName);
   const destFilename = dirName + '/' + fileName;
   const file = bucket.file(destFilename);
   const options = {
      metadata: { contentType: 'application/json' }
   };
   // save() uploads the string as the contents of the object
   await file.save(content, options);
}
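
Since the question mentions "streaming" the data to a bucket, here is also a minimal sketch of the same write done through a write stream instead of save(), assuming the same admin setup and bucketName as above; streamCollectionToBucket and the 'users' collection are just illustrative names:

// Stream a whole collection to a single JSON file in the bucket
async function streamCollectionToBucket(collectionName, destFilename) {
   const bucket = admin.storage().bucket(bucketName);
   const file = bucket.file(destFilename);
   const snapshot = await admin.firestore().collection(collectionName).get();
   const docs = snapshot.docs.map(doc => ({ id: doc.id, ...doc.data() }));

   return new Promise((resolve, reject) => {
      const stream = file.createWriteStream({
         metadata: { contentType: 'application/json' }
      });
      stream.on('error', reject);
      stream.on('finish', resolve);
      // For very large exports you would write in chunks rather than one end() call
      stream.end(JSON.stringify(docs));
   });
}

// Example usage inside a callable or scheduled function:
// await streamCollectionToBucket('users', 'backups/users.json');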
user6097845