I need to copy data files from a Linux server to Google Cloud Storage buckets and schedule this process to run hourly. I think Transfer Service for on-premises data is the most suitable option for this, but I am struggling to find detailed information on how to implement it step by step. Any help would be greatly appreciated. Thanks.
1 Answer
The documentation details this process:
Install Docker and run a small piece of software, called an agent, in your private data center. The agent runs within a Docker container and has access to your locally mounted NFS data.
See Installing and running the on-premises agent for more information.
Start a Transfer Service for on-premises data transfer from the Google Cloud Console. You'll provide the NFS directory and a destination Cloud Storage bucket to transfer data to.
See Creating a transfer job for more information.
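If you would rather create the job programmatically (for example, to set the hourly schedule from a script), the same Storage Transfer API that backs the Console flow can be called from a client library. Below is a rough sketch using the google-api-python-client discovery client; the project ID, source directory, bucket name, and start date are placeholder assumptions you would replace with your own values, and a `repeatInterval` of `"3600s"` is how I would request hourly runs.

```python
# Rough sketch: create an hourly on-premises (POSIX) -> Cloud Storage transfer job
# via the Storage Transfer API. All identifiers below are placeholders.
import googleapiclient.discovery

client = googleapiclient.discovery.build("storagetransfer", "v1")

transfer_job = {
    "description": "Hourly NFS to Cloud Storage transfer",
    "status": "ENABLED",
    "projectId": "my-project-id",  # placeholder project
    "transferSpec": {
        # Directory the on-premises agent can read (placeholder path)
        "posixDataSource": {"rootDirectory": "/mnt/nfs/export"},
        # Destination bucket (placeholder name)
        "gcsDataSink": {"bucketName": "my-destination-bucket"},
    },
    "schedule": {
        "scheduleStartDate": {"year": 2021, "month": 1, "day": 1},  # placeholder date
        # Duration string; "3600s" asks the service to repeat the job hourly
        "repeatInterval": "3600s",
    },
}

result = client.transferJobs().create(body=transfer_job).execute()
print("Created job:", result["name"])
```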
When the transfer starts, it recursively traverses the given NFS directory and moves the data it finds to your Cloud Storage bucket.
Transferred data is checksummed, files with errors are retried, and data is sent over a secure connection. A record of the transfer's progress is written to log objects within your destination Cloud Storage bucket, and you can track the progress of the transfer in the Google Cloud Console.
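If you want to check on a running job outside the Console, the operations belonging to a transfer job can also be listed through the same API. A minimal sketch, assuming the job name returned when the job was created (the project ID and job name below are placeholders):

```python
# Minimal sketch: list the operations for one transfer job and print their status.
import json
import googleapiclient.discovery

client = googleapiclient.discovery.build("storagetransfer", "v1")

# The filter is a JSON string; project ID and job name are placeholders.
op_filter = json.dumps({
    "projectId": "my-project-id",
    "jobNames": ["transferJobs/0123456789"],
})

response = client.transferOperations().list(
    name="transferOperations", filter=op_filter
).execute()

for op in response.get("operations", []):
    meta = op.get("metadata", {})
    print(meta.get("name"), meta.get("status"))
```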
When the transfer completes, you can view error samples within the Google Cloud Console. You can also review the transfer log for a catalog of files transferred and any errors.
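You can also pull those log objects down with the regular Cloud Storage client and inspect them yourself. A minimal sketch, assuming the logs land under a `storage-transfer/logs` prefix in the destination bucket (check your bucket for the exact prefix the service actually used):

```python
# Minimal sketch: list transfer log objects in the destination bucket.
# The bucket name and log prefix are assumptions, not fixed values.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-destination-bucket")  # placeholder bucket

for blob in bucket.list_blobs(prefix="storage-transfer/logs"):
    print(blob.name, blob.size)
```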
In any case, if you open the Transfer Service for on-premises data page in the Google Cloud Console and click Create Transfer Job, the Console guides you through the whole setup.
