Dropbox ran some benchmarks recently:

Note that in the clean-build-without-a-cache scenario Bazel is significantly slower, whereas in the incremental build scenario it is significantly faster.
Bazel build units tend to have a smaller granularity than Gradle's. It's common to see `java_library` targets with just a single source file. The command-line actions in these build units, also known as targets, are individually cached (locally or remotely) and composed together to build a `java_binary`.
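This fine-grained layout can be sketched in a `BUILD` file. The target and file names here are hypothetical, but the `java_library`/`java_binary` rules are standard Bazel:

```python
# BUILD — each java_library wraps a single source file, so each
# compilation action is cached and invalidated independently.

java_library(
    name = "greeter",
    srcs = ["Greeter.java"],
)

java_library(
    name = "main_lib",
    srcs = ["Main.java"],
    deps = [":greeter"],
)

# The binary composes the individually cached library outputs.
java_binary(
    name = "app",
    main_class = "Main",
    runtime_deps = [":main_lib"],
)
```

Editing `Greeter.java` only invalidates `:greeter` and its dependents; the rest of the cache stays warm.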
With many small build units, there are typically more actions to execute, and therefore more disk I/O and computation, leading to a slower initial, clean build time.
Some executables of these actions may also have a high startup cost (e.g. `javac`), which adds up when these processes are restarted many times. Bazel has a mechanism called persistent workers, where the executable process for individual actions (e.g. a compiler wrapper for `javac`, `tsc`, `scalac`, or `ghc`) can be persisted across action executions, saving on startup time and enabling an even lower level of caching at the process level.
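Workers are opted into per action mnemonic on the command line (recent Bazel versions already use a persistent worker for Java compilation by default). A rough sketch:

```
# Route Javac actions through a persistent worker process and cap
# the number of worker instances kept alive.
bazel build //app:all --strategy=Javac=worker --worker_max_instances=4
```

The worker stays resident between builds, so JVM startup and JIT warm-up costs are paid once rather than per action.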
On the other hand, Bazel's small build units enable highly incremental builds and fast iterative development, as seen in the chart above.
The small build units also enable Bazel to form a dependency graph with a high degree of parallelism. With remote execution, you can run 100s and 1000s of small build actions in parallel.
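With a remote execution backend configured, the same build command fans those actions out to a cluster; the endpoint below is a placeholder, and `--jobs` bounds how many actions Bazel schedules concurrently:

```
bazel build //... \
  --remote_executor=grpc://remote.example.com:8980 \
  --jobs=500
```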
The dependency graph is also highly optimized for the no-op build case. If nothing has changed in your project, Bazel should take as little time as possible to figure out nothing has changed, so nothing needs to be done.
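You can observe this directly: run the same build twice and the second invocation executes no actions, returning almost immediately after confirming nothing changed.

```
bazel build //...   # clean build: executes compilation actions
bazel build //...   # no-op: analysis cache is warm, no actions run
```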
The drawback of a slow clean build can also be mitigated with remote caches, remote execution, or simply not running the relatively rare `bazel clean`, since builds aim to be hermetic, deterministic, and consistent. Builds with 100% remote cache hits are common with Bazel.
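Pointing Bazel at a shared remote cache is a single flag; the URL here is a placeholder for whatever HTTP or gRPC cache service a team runs:

```
# Reuse action outputs produced by CI or by teammates.
bazel build //... --remote_cache=grpc://cache.example.com:9092
```

Because actions are hermetic, an action's cache key is derived from its inputs, so any machine that computes the same key can safely reuse the cached output.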