I realize this is an old question, but I thought I'd leave this response for those who are still looking for ways to back up Azure table data in 2018.
I've seen a lot of suggestions for using AzCopy, which looks like a great way to do it.
However, if using C# works better for you, I wrote a tool (that my workplace allowed me to open source) which is on GitHub:
https://github.com/Watts-Energy/Watts.Azure#azure-data-factory
The main objective of the project is not backups, but it can be used for exactly that, and we run our own backups in Azure Web Jobs using its functionality. We open sourced it because we figured it could prove useful to others as well, since it lets you do 'incremental' backups, which I don't know whether you can accomplish with AzCopy. I'm not saying you can't, only that I don't know whether that's possible.
The idea is that you create a small .NET console application (hosted, for example, as an Azure Web Job) and do something like this:
// Copy the full contents of sourceTable to targetTable.
DataCopyBuilder
    .InDataFactoryEnvironment(environment)
    .UsingDataFactorySetup(environment.DataFactorySetup)
    .UsingDefaultCopySetup()
    .WithTimeoutInMinutes(numberOfMinutes)
    .AuthenticateUsing(authentication)
    .CopyFromTable(sourceTable)
    .WithSourceQuery(null) // null => copy every row
    .ToTable(targetTable)
    .ReportProgressToConsole()
    .StartCopy();
If, when the job runs, you store the time (in UTC) at which you started your last copy operation, you can supply a source query (e.g. "Timestamp gt datetime'{lastBackupStarted.ToIso8601()}'") rather than null as in the example above, and it will only copy data modified since that date.
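A minimal sketch of what an incremental run could look like (LoadLastBackupStartTimeUtc and SaveLastBackupStartTimeUtc are hypothetical helpers here; where you persist the timestamp, e.g. a blob or a small table, is up to you):

// Figure out where the previous run left off.
DateTime lastBackupStarted = LoadLastBackupStartTimeUtc(); // hypothetical helper
DateTime thisBackupStarted = DateTime.UtcNow;

// OData filter: only rows modified since the previous run.
string sourceQuery = $"Timestamp gt datetime'{lastBackupStarted.ToIso8601()}'";

DataCopyBuilder
    .InDataFactoryEnvironment(environment)
    .UsingDataFactorySetup(environment.DataFactorySetup)
    .UsingDefaultCopySetup()
    .WithTimeoutInMinutes(numberOfMinutes)
    .AuthenticateUsing(authentication)
    .CopyFromTable(sourceTable)
    .WithSourceQuery(sourceQuery)
    .ToTable(targetTable)
    .ReportProgressToConsole()
    .StartCopy();

// Remember this run's start time for the next incremental pass.
SaveLastBackupStartTimeUtc(thisBackupStarted); // hypothetical helper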
It's explained in greater detail in the project README.
Again, not sure if it's useful to anyone, but it does solve some challenges we have had.