How efficiently does Puppet handle large files? To give you a concrete example:
Let's assume we're dealing with configuration data (stored in files) on the order of gigabytes. Puppet needs to ensure that these files are up to date on every agent run.
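To make the scenario concrete, here is a minimal sketch of the kind of resource I mean (the module name and paths are hypothetical):

```puppet
# Hypothetical example: a multi-gigabyte file managed by Puppet,
# served from the master via a module's files directory.
file { '/etc/myapp/data.bin':
  ensure => file,
  source => 'puppet:///modules/myapp/data.bin',
  owner  => 'root',
  group  => 'root',
  mode   => '0644',
}
```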
Question: Does Puppet perform some kind of file digest (checksum) operation beforehand, or does it blindly re-copy every config file during agent runs?