I have an Azure Table Storage table with over a million entries, and I am trying to run about 300,000 queries programmatically in C# to transfer some data to another system. Currently I am doing the following as I read through a file that contains the partition and row keys:
while (!reader.EndOfStream)
{
    // parse the reader to get partition and row keys
    string currentQuery = TableQuery.CombineFilters(
        TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, partKey),
        TableOperators.And,
        TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.Equal, rowKey));

    TableQuery<MyEntity> query = new TableQuery<MyEntity>().Where(currentQuery);

    foreach (MyEntity entity in table.ExecuteQuery(query))
    {
        Console.WriteLine(entity.PartitionKey + ", " + entity.RowKey + ", " + entity.Timestamp.DateTime);
    }

    Thread.Sleep(25);
}
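Since every lookup specifies both an exact PartitionKey and an exact RowKey, I also wondered whether a point retrieve would be the right tool here instead of a filtered query. Is something like this sketch (assuming the same CloudTable `table` and `MyEntity` type as above) the intended approach?

```csharp
// Sketch: point lookup via TableOperation.Retrieve instead of a
// filtered TableQuery. Assumes the same `table`, `partKey`, and
// `rowKey` variables as in the loop above.
TableOperation retrieve = TableOperation.Retrieve<MyEntity>(partKey, rowKey);
TableResult result = table.Execute(retrieve);

// result.Result is null if no entity matched the keys.
if (result.Result is MyEntity entity)
{
    Console.WriteLine(entity.PartitionKey + ", " + entity.RowKey + ", " + entity.Timestamp.DateTime);
}
```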
This is taking a very long time to complete (5+ hours). From what I can see, the queries take around 200 milliseconds each on average. I am fairly new to Azure, so I figure I am doing something wrong. How can I improve it?