
One step in my Cloud Build is to copy files from a VM in another project. After working through a series of problems, I've set up the service account access and can successfully run this scp from my own workstation. However, in Cloud Build itself, I get this error on that step:

    2022-08-03 22:21:32.170 EDT Step #4 - "Copy in static images": Failed to add the host to the list of known hosts (/builder/home/.ssh/google_compute_known_hosts).

The step runs a shell script. The pertinent part does this:

    args:
      - '-c'
      - ./auto-image-xfer.sh
    id: Copy in static images 
    entrypoint: bash

The shell script does this:

    gcloud compute scp --recurse user@vmname:/path/to/images ./destination \
      --zone us-central1-a --ssh-key-file=./google_compute_engine --project "projectname"

Again, I hasten to add that I had already worked out a series of service account issues that originally prevented my SSH key from working, so I think this is just down to not being able to write the known_hosts file.

I looked into the ssh -o options for specifying an alternative known hosts file, but these aren't accepted directly by the gcloud compute scp command, and they don't seem to get passed through with the --scp-flag option.
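
For illustration, this is roughly the kind of pass-through I attempted; the exact --scp-flag syntax below is only a sketch, reusing the placeholder names from the command above, and it did not seem to take effect:

    # Sketch only: trying to hand an alternative known hosts file to the
    # underlying scp via --scp-flag; in the build this did not take effect.
    gcloud compute scp --recurse user@vmname:/path/to/images ./destination \
      --zone us-central1-a --ssh-key-file=./google_compute_engine --project "projectname" \
      --scp-flag="-o UserKnownHostsFile=./my_known_hosts"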

I'm wondering if I need a custom builder for this, or is there an easier solution I'm overlooking?

  • Bypass using **gcloud** and directly use the **scp** tool. The CLI **gcloud** is just a wrapper for **ssh/putty** and **scp**. – John Hanley Aug 04 '22 at 02:48
  • Can you explain: what is the scp tool? Do you mean just use scp? The issue is that the firewall does not permit scp access. So I think gcloud scp must be using some special access. – James Wetterau Aug 04 '22 at 03:05
  • Google search for details on SSH and SCP. – John Hanley Aug 04 '22 at 04:11
  • I understand what ssh and scp are, but this VM does not handle ssh on its publicly reachable IP addresses, and the build is in a different project from the VM, so the network topology does not support using vanilla scp / ssh. – James Wetterau Aug 04 '22 at 15:00
  • Anything the CLI can do, so can you. You just need to know how everything works together. – John Hanley Aug 04 '22 at 19:24
  • How do you scp to a machine in a different project that doesn't have a public IP address and doesn't have VPC peering between the two projects? What I did instead was grant a service account in the second project access to the VM in the remote project, and then I was able to access it via the gcloud command shown above, by specifying the project, vm name and zone. What would be the equivalent scp command if the host has no public IP address and is in a different project's network with no peering between the two? – James Wetterau Aug 04 '22 at 22:35
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/247040/discussion-between-james-wetterau-and-john-hanley). – James Wetterau Aug 04 '22 at 22:36
  • If a question must go to chat, the question needs work or more research. To connect to a VM without a public IP address, you can open an IAP TCP tunnel. Then the scp command can connect using the TCP tunnel. That will solve the command line option issue you have with scp. https://cloud.google.com/iap/docs/using-tcp-forwarding – John Hanley Aug 05 '22 at 01:37
  • The trouble with an IAP tunnel is that the connection was from cloud build in a different project. It was contrary to the goals of this project to create a publicly reachable endpoint for ssh, so a public IP address via IAP was a non-starter. And again the connection was coming from a different project with no shared VPC access to the first one. As I mentioned in my answer, the gcloud scp works well in this case, but there was one issue with known_hosts, which I eventually found the answer for. Moving to vanilla scp was not feasible given the constraints. – James Wetterau Aug 05 '22 at 22:48
  • IAP can connect to instances with only a private IP. – John Hanley Aug 05 '22 at 23:38
  • Yes, IAP can, but how do I connect to the IAP without giving it a public IP address? Anyway, is setting up IAP supposed to be easier than just getting gcloud scp to work? – James Wetterau Aug 06 '22 at 03:45
  • I provided a link in my earlier comment. – John Hanley Aug 06 '22 at 04:57

1 Answer


This Stack Overflow post was very informative: Using SSH keys with Google Container Builder

As was this documentation item about using SSH to access GitHub from within a build: https://cloud.google.com/build/docs/access-github-from-build

It turned out it was necessary to get the known hosts file into the build.

My solution was cribbed from one of the Stack Overflow comments. I added this step:

  - name: gcr.io/google.com/cloudsdktool/cloud-sdk
    args:
      - '-c'
      - ./copy-known-hosts.sh
    id: Copy in known hosts
    entrypoint: sh

The shell script does this:

    #!/bin/sh
    # Remove the local copy of the known-hosts file and restore the
    # default INT handler on the way out.
    exitfn () {
      trap - INT
      rm -f ./google_compute_*
    }

    trap "exitfn" INT

    # Pull the known_hosts contents out of Secret Manager and put them where
    # gcloud compute scp expects to find them inside the build environment.
    gcloud secrets versions access 1 --secret=known-hosts > google_compute_known_hosts
    mkdir -p /builder/home/.ssh
    cp ./google_compute_known_hosts /builder/home/.ssh/google_compute_known_hosts
    chmod 700 /builder/home/.ssh
    chmod 400 /builder/home/.ssh/google_compute_known_hosts

    exitfn
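
For completeness, here is a sketch of how the known-hosts secret itself could be seeded and exposed to the build. The secret name known-hosts matches the script above; the PROJECT_NUMBER placeholder for the default Cloud Build service account is an assumption, and this presumes the scp has already been run once from a workstation so that ~/.ssh/google_compute_known_hosts exists:

    # Run once from a workstation that has already connected successfully,
    # so ~/.ssh/google_compute_known_hosts contains the VM's host key.
    # (Run in whichever project the build reads the secret from.)
    gcloud secrets create known-hosts \
      --data-file="$HOME/.ssh/google_compute_known_hosts"

    # Let the Cloud Build service account read the secret.
    # PROJECT_NUMBER is a placeholder for the build project's number.
    gcloud secrets add-iam-policy-binding known-hosts \
      --member="serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
      --role="roles/secretmanager.secretAccessor"

After that, the gcloud secrets versions access 1 --secret=known-hosts call in the step above returns exactly the known_hosts file that gcloud compute scp wrote locally.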