I am trying to use a Google Cloud Storage bucket to serve static files from a web server on GCE. I see in the docs that I have to copy files manually, but I am looking for a way to dynamically copy files on demand, just like other CDN services do. Is that possible?

- Can you please clarify what you mean, i.e., what's your use case, what problem are you seeing, etc.? What files do you want to dynamically copy? Are they dynamically generated? Are they not (yet) stored on GCS? Also, can you edit your question to point to the documentation you're referring to? – Misha Brukman Jan 14 '15 at 18:57
- I am trying to copy static files that are generated dynamically on demand, by requesting them from my webserver if they are not already in the bucket. What files do you want to dynamically copy? Dynamically generated JS and CSS files. Are they not (yet) stored on GCS? No. – Gadelkareem Jan 14 '15 at 19:36
1 Answer
If you're asking whether Google Cloud Storage will automatically and transparently cache frequently-accessed content from your web server, then the answer is no; you will have to copy files to your bucket explicitly yourself.
However, if you're asking whether it's possible to copy files dynamically (i.e., programmatically) to your GCS bucket, rather than manually (e.g., via gsutil or the web UI), then yes, that's possible.
I imagine you would use something like the following process:
# pseudocode, not actual code in any language
HandleRequest(request) {
  gcs_uri = computeGcsUrlForRequest(request)
  if exists(gcs_uri) {
    data = read(gcs_uri)
    return data to user
  } else {
    new_data = computeDynamicData(request)
    # important! serve data to user first, to ensure low latency
    return new_data to user
    storeToGcs(new_data)  # asynchronously, don't block the request
  }
}
If this matches what you're planning to do, then there are several ways to accomplish this, e.g.,
- language-specific libraries (recommended; see the sketch below)
- JSON API
- XML API
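For example (this is not part of the original answer), here is a rough Python sketch of the flow above using the google-cloud-storage client library. The bucket name and the two helper functions are assumptions standing in for your own naming scheme and asset-generation logic:
# Rough sketch (not from the original answer) using the google-cloud-storage
# Python client library. The two helper functions are placeholders for your
# own application logic.
import threading

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-static-assets")  # assumed bucket name


def object_name_for_request(request):
    # Placeholder: map the request path to an object name in the bucket.
    return request.path.lstrip("/")


def generate_asset(request):
    # Placeholder: dynamically generate the JS/CSS content for this request.
    raise NotImplementedError


def handle_request(request):
    blob = bucket.blob(object_name_for_request(request))
    if blob.exists():
        # Already stored in GCS: serve it directly.
        return blob.download_as_bytes()
    # Not stored yet: generate it, serve it immediately, and upload it in the
    # background so the GCS write doesn't block the response.
    new_data = generate_asset(request)
    threading.Thread(target=blob.upload_from_string, args=(new_data,)).start()
    return new_data
A production version would also set the object's Content-Type and cache headers on upload, and would likely use a task queue rather than a bare thread for the background write.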
Note that to avoid filling up your Google Cloud Storage bucket indefinitely, you should configure a lifecycle management policy to automatically remove files after some time or set up some other process to regularly clean up your bucket.
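For illustration (also not from the original answer), the current Python client library can set such a policy programmatically; the 30-day age and bucket name below are arbitrary examples:
# Hypothetical sketch: add a lifecycle rule that deletes objects 30 days
# after creation, so dynamically generated copies don't accumulate forever.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("my-static-assets")  # assumed bucket name
bucket.add_lifecycle_delete_rule(age=30)        # 30 days is an arbitrary choice
bucket.patch()                                  # persist the new lifecycle config
The same rule can also be configured with gsutil lifecycle set or in the Cloud Console.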
