
I am using Container Builder to process and transform huge JSON files. It's a nice non-standard use of the service, as described here.

Is it possible to trigger a Container Builder build and pass a parameter to it from Cloud Functions? This would make it possible to act on newly uploaded files in GCS and process them via Container Builder automatically.
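For context, the kind of function I have in mind would look roughly like this (a minimal sketch of a Node.js background function triggered by a GCS bucket; the function name is a placeholder and the build-triggering part is exactly what I am asking about):

exports.onFileUpload = function (event, callback) {
    // event.data describes the object that changed in the bucket
    var file = event.data;
    console.log('New file: gs://' + file.bucket + '/' + file.name);

    // ...here I would like to start a Container Builder build for this file...

    callback();
};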

Currently I am trying to trigger it via the REST API (I am new to Node.js), but I get a 404 on my URL. I am developing on a Cloud Shell instance with full API access.

The URL that I am calling with a PUT request and a JSON body (the JSON equivalent of a cloudbuild.yaml that has run successfully) is: https://cloudbuild.googleapis.com/v1/projects/[PROJECT_ID]/builds

I am using the request library for Node.js:

var request = require('request');

request({
    url: "https://cloudbuild.googleapis.com/v1/projects/[PROJECT_ID]/builds",
    method: 'PUT',
    json: {"steps": [{"name": "gcr.io/cloud-builders/gsutil", (...)}]}
}, function (error, response, body) {
    console.log(error);
    console.log(response);
    console.log(body);
});
– Tobi

2 Answers


Apparently, someone has already done this. The library is on GitHub: https://github.com/mhr3/gcp-container-builder-node and available via npm: https://www.npmjs.com/package/gcp-container-builder
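(Incidentally, the 404 from my raw REST call was presumably because builds are created with a POST rather than a PUT request.) The package installs the usual way:

npm install gcp-container-builder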

The usage was not clear to me at first, but this is how I am using it now:

// builder is an instance from the gcp-container-builder package;
// see the package README for how to initialize it.
var build = Object.create(null);
build.steps = [{
    name: 'gcr.io/cloud-builders/gsutil',
    args: ['cp', 'gs://some_bucket/some_file.json', '/workspace/some_file.json']
}];

// more build steps, converting the file, uploading it, etc.

builder.createBuild(build, function(err, resp) {
    console.log(err);
    console.log(resp);
});
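To answer my original question: wired into a GCS-triggered Cloud Function, this becomes something like the following (a sketch, assuming the same background-function signature as in the question; the initialization of builder is elided, see the package README):

exports.processUpload = function (event, callback) {
    var file = event.data; // the object that was uploaded to GCS

    var build = Object.create(null);
    build.steps = [{
        name: 'gcr.io/cloud-builders/gsutil',
        args: ['cp', 'gs://' + file.bucket + '/' + file.name,
               '/workspace/' + file.name]
    }];
    // more build steps, converting the file, uploading it, etc.

    builder.createBuild(build, function (err, resp) {
        callback(err, resp);
    });
};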
– Tobi

The procedure you propose requires three different steps:

Google Cloud Storage → Cloud Functions → API call.

Given the requirements you described, it could be better to use Container Builder's Build Triggers.

You upload the files to a Google Cloud Source Repository and create a trigger. Every time you push a change to the repository, Container Builder builds the image automatically. This way you avoid Cloud Functions, the API call, and Node.js.

This reduces the procedure to a single step, which lowers complexity and increases reliability.
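For reference, the trigger would run a cloudbuild.yaml checked into the repository, along the lines of the steps in your question (bucket and file names are placeholders):

steps:
- name: 'gcr.io/cloud-builders/gsutil'
  args: ['cp', 'gs://some_bucket/some_file.json', '/workspace/some_file.json']
# further steps: transform the file, upload the result, etc.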

– komarkovich
  • You mean I should misuse Cloud Source Repositories and store huge newline-delimited JSON blobs in it? Isn't that contrary to how git is designed? Also, how do I tag the files that I have already processed? (In GCS it's easy to add tags to a blob's metadata.) – Tobi May 11 '18 at 12:10
  • You can use [git tags](https://git-scm.com/book/en/v2/Git-Basics-Tagging). – komarkovich May 17 '18 at 07:45