
Is there a way to export data (or whole tables) generated by log query searches in Log Analytics and Application Insights to an identical table in an Azure SQL Database?

I'm thinking of something like an export/import process, or perhaps doing it with a WebJob or something similar.

Is there any way to do this?


1 Answer


There are a couple of options:

  1. Use the REST API to issue queries and store the results in SQL Server. This can be done from a web host or an Azure Function (triggered by a schedule, perhaps); see the first sketch after this list.
  2. Use Continuous Export to export the data to Azure Blob Storage. From there you can read the blobs and store the data in SQL using a web host or an Azure Function (triggered by blob creation, perhaps); see the second sketch after this list. A better fit might be Azure Data Factory to copy the data from blob storage to SQL Server.
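For option 1, here is a minimal Python sketch. It assumes an Application Insights application id and API key (created under "API Access" in the portal), placeholder connection details, and a pre-created target table dbo.Requests; adapt the query and the column mapping to your own telemetry.

```python
import requests
import pyodbc

APP_ID = "<your-app-id>"    # Application Insights > API Access
API_KEY = "<your-api-key>"  # key generated under API Access
# Pull the last 30 minutes of request telemetry.
QUERY = "requests | where timestamp > ago(30m) | project timestamp, name, duration"

resp = requests.get(
    f"https://api.applicationinsights.io/v1/apps/{APP_ID}/query",
    headers={"x-api-key": API_KEY},
    params={"query": QUERY},
)
resp.raise_for_status()
# Results arrive as JSON: a list of tables, each with columns and rows.
table = resp.json()["tables"][0]

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=<db>;UID=<user>;PWD=<pwd>"
)
cur = conn.cursor()
# dbo.Requests is an assumed table whose columns match the projection above;
# timestamps arrive as ISO 8601 strings, which a DATETIMEOFFSET column accepts.
cur.executemany(
    "INSERT INTO dbo.Requests (ts, name, duration) VALUES (?, ?, ?)",
    [tuple(row) for row in table["rows"]],
)
conn.commit()
```

Run this from a WebJob or a timer-triggered Azure Function; for Log Analytics the same pattern works against the api.loganalytics.io query endpoint.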
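For option 2, a rough sketch of the blob-reading side, assuming Continuous Export writes line-delimited JSON documents into a container named appinsights-export; the container, blob path, and field names below are illustrative, so inspect an exported blob for the real document shape of your telemetry types.

```python
import json
import pyodbc
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    conn_str="<storage-connection-string>",
    container_name="appinsights-export",  # assumed container name
    blob_name="<path/to/exported>.blob",  # e.g. from a blob-created trigger
)
lines = blob.download_blob().readall().decode("utf-8").splitlines()

conn = pyodbc.connect("<same Azure SQL connection string as above>")
cur = conn.cursor()
for line in lines:  # one exported telemetry document per line
    doc = json.loads(line)
    req = doc["request"][0]                   # field names are illustrative;
    ts = doc["context"]["data"]["eventTime"]  # verify against a real export
    cur.execute(
        "INSERT INTO dbo.Requests (ts, name, duration) VALUES (?, ?, ?)",
        ts, req["name"], req["durationMetric"]["value"],
    )
conn.commit()
```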

Keep in mind that you will have to define the SQL table schemas yourself to represent the JSON data. Since the results arrive as JSON, there may not be a straightforward conversion; one way to bootstrap the schemas is sketched below.
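Since the query API returns column metadata alongside the rows, you can generate the DDL from it. A small sketch, with an assumed Kusto-to-SQL type mapping you should review for your own data:

```python
# Assumed mapping from Kusto column types to SQL Server types.
KUSTO_TO_SQL = {
    "datetime": "DATETIME2",
    "string": "NVARCHAR(MAX)",
    "long": "BIGINT",
    "int": "INT",
    "real": "FLOAT",
    "bool": "BIT",
    "dynamic": "NVARCHAR(MAX)",  # nested JSON is kept as text
}

def create_table_sql(table_name, columns):
    """Build a CREATE TABLE statement from the API's columns array."""
    cols = ", ".join(
        f"[{c['name']}] {KUSTO_TO_SQL.get(c['type'], 'NVARCHAR(MAX)')}"
        for c in columns
    )
    return f"CREATE TABLE {table_name} ({cols})"

# Example using the column metadata returned with the query result:
print(create_table_sql("dbo.Requests", [
    {"name": "timestamp", "type": "datetime"},
    {"name": "name", "type": "string"},
    {"name": "duration", "type": "real"},
]))
# -> CREATE TABLE dbo.Requests ([timestamp] DATETIME2, [name] NVARCHAR(MAX), [duration] FLOAT)
```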

What is the idea behind having the data in SQL Server anyway? Depending on your use case, there might be better options available.

Peter Bons
  • Hi Peter, and thanks for the answer. The use case I'm trying to achieve by having the data "staged" in a staging database is being able to update my report more often than 8 times a day, at a minimum of every 30 minutes. I tested getting my data from M-language queries versus from a SQL database, and with SQL I gained the option to update the data in my report every 15 minutes. I'm using the report as a front end to a logging service. – H4p7ic Oct 29 '18 at 07:57