The settings files will not change very often. How can I cache the files locally so that they aren't downloaded every time?
Per my understanding, you could use the ETag property of your blob, which serves as an identifier and is updated every time the blob is modified.
Before your Windows client application loads the settings files, compare the ETag stored on your local side with the value returned by the server side; if the ETags do not match, download the newer settings file(s) and then load them.
Since you are building a Windows client application, you could either set the container's public access level to "Public read access for blobs only" or generate a SAS token with limited permissions for your container. You could then use the Azure Storage Client Library for .NET to retrieve the ETag property from the Blob service as follows:
using Microsoft.WindowsAzure.Storage.Blob;

// Access a container whose public access level allows blob reads:
// var container = new CloudBlobContainer(new Uri("https://{storage-name}.blob.core.windows.net/{container-name}"));

// Access a container with a SAS token:
var container = new CloudBlobContainer(new Uri("https://{storage-name}.blob.core.windows.net/{container-name}?{SAS-token}"));
CloudBlob blockBlob = container.GetBlobReference("setting.json");
blockBlob.FetchAttributes(); // populates blockBlob.Properties
string etag = blockBlob.Properties.ETag;
When using WebClient to download the settings files, you could include the ETag value in the local file name (e.g. setting-{etag}.json), so the cached version can be identified without any extra bookkeeping.
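The caching flow above can be sketched as follows. This is a minimal sketch, not part of the storage SDK: the cache directory, blob URL, and helper names are assumptions for illustration. Note that the Blob service returns ETags wrapped in double quotes, which are not valid in Windows file names, so the sketch strips them before building the cached file name:

```csharp
using System;
using System.IO;
using System.Net;

public static class SettingsCache
{
    // Build the local cache path for a given ETag, e.g. cache\setting-0x8D123.json.
    public static string GetCachePath(string cacheDir, string fileName, string etag)
    {
        // ETags from the Blob service are usually quoted ("0x8D..."),
        // and quotes are not valid in Windows file names, so strip them.
        string safeEtag = etag.Replace("\"", string.Empty);
        string baseName = Path.GetFileNameWithoutExtension(fileName);
        string ext = Path.GetExtension(fileName);
        return Path.Combine(cacheDir, baseName + "-" + safeEtag + ext);
    }

    // Return the path to a local copy of the settings file, downloading it
    // only when no file matching the current ETag exists in the cache.
    public static string GetSettingsFile(string blobUrl, string cacheDir, string fileName, string etag)
    {
        string localPath = GetCachePath(cacheDir, fileName, etag);
        if (!File.Exists(localPath))
        {
            // ETag changed (or first run): download the newer settings file.
            Directory.CreateDirectory(cacheDir);
            using (var client = new WebClient())
            {
                client.DownloadFile(blobUrl, localPath);
            }
        }
        return localPath;
    }
}
```

With this approach, stale copies named with older ETags can simply be deleted from the cache directory when a newer version is downloaded.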
Additionally, you could leverage Azure Storage Explorer to manage your storage resources. You could also use the ContentMD5 property for change detection, but it has some limitations.