
I have a monorepo setup for my Go project. I would love to find a way to use go build (or a similar internal tool) to get a list of targets that need to be rebuilt.

Here is an example of what I am looking for:

...
├── pkg // shared code across the monorepo
│   └── math
│       └── common_operations.go
└── cmd // individual packages to be built 
    ├── package1
    │   └── main.go
    └── package2
        └── main.go

The package1 program calls a subtract function from the math shared library. The package2 program calls an add function.

  • If I change the package1 code, only the package1 target is listed
  • If I change the package2 code, only the package2 target is listed
  • If I change the add function in the shared library, only the package2 target is listed
  • If I change the subtract function in the shared library, only the package1 target is listed
  • If I change all the functions in the shared library, both the package1 and package2 targets are listed
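
To make the layout concrete, here is roughly what the shared package and one of the commands could look like (the example.com/monorepo module path and the function bodies are just placeholders):

```go
// pkg/math/common_operations.go (illustrative)
package math

func Add(a, b int) int      { return a + b }
func Subtract(a, b int) int { return a - b }
```

```go
// cmd/package1/main.go (illustrative; package2 looks the same but calls Add)
package main

import (
	"fmt"

	"example.com/monorepo/pkg/math" // hypothetical module path
)

func main() {
	fmt.Println(math.Subtract(5, 3))
}
```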

I would be perfectly happy to use the internal build package and get the list programmatically. I am just unfamiliar with it.

What I have tried: Bazel has an option for this, but I would prefer to avoid using Bazel if at all possible. The Bazel command bazel build cmd/some-target --check_up_to_date returns exit code 0 if the target is up to date and exit code 1 otherwise. That is technically a solution, but my need, as you might have inferred, is CI/CD-based, and I want to avoid integrating Bazel into that process as much as possible.

t3r
  • I'm interested in answers to this as well! I addressed this problem by building each of the target binaries, calculating a checksum, then comparing that checksum to the value from the previous build. If there's a way to avoid the overhead and just not build things that don't need it, that would be great! – jdp Jun 22 '22 at 14:56
  • I don't think the second and third points are technically possible; Go builds at the package level, so if the shared package is changed, all packages consuming it will have to be rebuilt. – Adrian Jun 22 '22 at 20:38
  • @Adrian I didn't think it was possible either, but it is possible with Bazel. I don't know what Bazel is doing differently, but I do know that the Go build process produces efficient binaries. My assumption is that the code that gets compiled into the final binary, no matter the originating package, is only the code that the target package needs to run. So for point two it would be package1 + the subtract function from the shared package. If this is true, jdp's answer would be a solution. – t3r Jun 22 '22 at 20:58

2 Answers


Not really sure of the use case here: are you OK with actually compiling the packages as well? In that case, maybe go build -v can do the job for you. From go help build:

-v
        print the names of packages as they are compiled.
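
In a CI step, that could be wrapped in a small helper along these lines (the ./cmd/... pattern and module layout are taken from the question, not anything go build mandates):

```go
// Rough sketch of collecting `go build -v` output in a CI step.
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

func main() {
	cmd := exec.Command("go", "build", "-v", "./cmd/...")
	var buildLog strings.Builder
	cmd.Stderr = &buildLog // -v writes the package names to stderr
	if err := cmd.Run(); err != nil {
		fmt.Println("build failed:", err)
		return
	}
	// Each non-empty line names a package that was actually compiled;
	// packages served from the build cache are not printed.
	for _, line := range strings.Split(buildLog.String(), "\n") {
		if p := strings.TrimSpace(line); p != "" {
			fmt.Println("recompiled:", p)
		}
	}
}
```
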
zacho314
  • Honestly, I am OK with compiling if that is my only option. An optimal solution would be to not compile it but just flag it for compilation. That way you don't waste time on unnecessary compiles. The ultimate goal is a production environment where you save compile time in the CI process and get the benefits of a monorepo at the same time. You know, have your cake and eat it too. – t3r Jun 22 '22 at 21:02
  • I think the option that looks more interesting to me is `-n` ("print the commands but do not run them"), because the output for a target that has not changed looks a lot different from the output for a target that needs to be rebuilt. I was hoping a tool or option already existed instead of having to build one that parses the output of that command. – t3r Jun 22 '22 at 21:05
  • > That way you don't waste time on unnecessary compiles - I can't really tell if you're aware that the Go compiler caches builds, so if you build the same code twice, the second build will return the result of the first? :) Of course that has little effect if your build server discards the cache after each job. See `go env GOCACHE` for your cache destination. – zacho314 Jun 23 '22 at 09:52
  • Yeah, I believe [this is the line](https://github.com/golang/go/blob/bdab4cf47a47b69caacad6fd7ff6ab27bb22ab1c/src/cmd/go/internal/work/exec.go#L1631) in the source code that does the check you are mentioning. I am learning how the build works while trying to figure out this problem. – t3r Jun 23 '22 at 15:41
  • For sure! Happy hacking – zacho314 Jun 23 '22 at 23:27

My desire was to find something similar to a go build option that would basically spit out true or false depending on whether a target package was up to date. I would be happy if someone out there found a tool (Earthly, I am looking at you right now) that could solve this.

The closest thing to a solution I could find to solve my issue was to run this command:

go build -n cmd/some-target

And if the output is:

touch some-target

Then it must be up to date. If the output is a long list of commands, then it isn't. If it is not up to date, you could then run:

go build -v

to get the name of the package and move it on to the next stage of the CI process (building the target, testing the target, building an image, etc.).

Obviously it is a little hacky, requires rolling your own solution, and the specifics would likely need to change based on your exact needs. It also requires, as @zacho314 mentioned, saving the state of the Go build cache, but most modern CI technologies have a solution for that. I am sure I will do something similar to this for now.
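
A rough sketch of wiring that together (the paths are placeholders, and the exact -n output can vary between Go versions, so treat the "touch" check as a starting point):

```go
// Sketch of the `go build -n` check described above: treat a bare
// "touch <target>" script as "already up to date". Run from the repo root.
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

// needsRebuild reports whether `go build -n` would do more than touch the
// existing binary for the given package path.
func needsRebuild(pkg string) (bool, error) {
	// -n prints the build commands without running them; CombinedOutput
	// captures them regardless of whether they land on stdout or stderr.
	out, err := exec.Command("go", "build", "-n", pkg).CombinedOutput()
	if err != nil {
		return false, fmt.Errorf("go build -n %s: %w", pkg, err)
	}
	lines := strings.Split(strings.TrimSpace(string(out)), "\n")
	upToDate := len(lines) == 1 && strings.HasPrefix(lines[0], "touch")
	return !upToDate, nil
}

func main() {
	for _, pkg := range []string{"./cmd/package1", "./cmd/package2"} {
		stale, err := needsRebuild(pkg)
		if err != nil {
			fmt.Println("error:", err)
			continue
		}
		if stale {
			fmt.Println(pkg, "needs to be rebuilt")
		}
	}
}
```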

t3r