
I'm currently facing issues with sending telemetry data from a C++ desktop application to Azure, specifically Azure Data Explorer.

In my project, telemetry is being sent via a gRPC exporter to a collector on the same machine. This collector successfully writes the telemetry data to local files. However, when attempting to send the data to Azure, I find that no data appears in the Azure Data Explorer (ADX) tables that I have set up.

I haven't noticed any error messages from the collector, which makes the issue even harder to troubleshoot. As a reference, I've been following the steps laid out in the Azure documentation for creating an Azure Active Directory application registration in Azure Data Explorer (link).

To help identify whether there is a configuration issue, I have attached my OpenTelemetry (otel) collector YAML file (otel-collector.yml) below.

I'm at a bit of a loss for how to proceed with debugging this issue. If anyone could provide some guidance, or even better, a working example that I could compare to, it would be greatly appreciated.

Thanks in advance for your help!

otel-collector.yml:

receivers:
  otlp: # the OTLP receiver the app is sending traces to
    protocols:
      grpc:

exporters:
  azuredataexplorer:
    cluster_uri: "https://<adx-cluster-name>.<region>.kusto.windows.net"
    application_id: "<app-id>"
    application_key: "<app-key>"
    tenant_id: "<tenant-id>"
    db_name: "MyDatabase"
    metrics_table_name: "OTMetrics"
    logs_table_name: "OTLogs"
    traces_table_name: "OTTraces"
    ingestion_type: "managed"
  logging:
    #logLevel: info
  file: # the File exporter, to write telemetry to a local file
    path: "./app_example.json"
    rotation:

processors:
  batch:

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [file, azuredataexplorer]
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [file, azuredataexplorer]
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [file, azuredataexplorer]
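Since the collector reports no errors at its default log level, one way to surface exporter failures is to raise the `logging` exporter's verbosity and the collector's own telemetry log level. A sketch of the relevant fragments, assuming a collector-contrib version (such as 0.77.0) where the `logging` exporter accepts `verbosity`:

```yaml
exporters:
  logging:
    verbosity: detailed   # print every exported item in full

service:
  telemetry:
    logs:
      level: debug        # show per-component debug output, e.g. ADX ingestion errors
```

With `level: debug`, failed ingestion attempts from the azuredataexplorer exporter are much more likely to show up in the console output.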

Otel Collector output:

C:\otel-collector>otelcol-contrib.exe  --config=C:\otel-collector\otel-collector-config.yaml
2023-06-06T12:00:07.889-0700    info    service/telemetry.go:113        Setting up own telemetry...
2023-06-06T12:00:07.890-0700    info    service/telemetry.go:136        Serving Prometheus metrics      {"address": ":8888", "level": "Basic"}
2023-06-06T12:00:07.891-0700    info    service/service.go:141  Starting otelcol-contrib...     {"Version": "0.77.0", "NumCPU": 32}
2023-06-06T12:00:07.891-0700    info    extensions/extensions.go:41     Starting extensions...
2023-06-06T12:00:07.891-0700    warn    internal/warning.go:51  Using the 0.0.0.0 address exposes this server to every network interface, which may facilitate Denial of Service attacks        {"kind": "receiver", "name": "otlp", "data_type": "traces", "documentation": "https://github.com/open-telemetry/opentelemetry-collector/blob/main/docs/security-best-practices.md#safeguards-against-denial-of-service-attacks"}
2023-06-06T12:00:07.892-0700    info    otlpreceiver@v0.77.0/otlp.go:94 Starting GRPC server    {"kind": "receiver", "name": "otlp", "data_type": "traces", "endpoint": "0.0.0.0:4317"}
2023-06-06T12:00:07.892-0700    info    service/service.go:158  Everything is ready. Begin running and processing data.

shivi
  • Hello shivi, can you attach the full logs for this? The exporter seems to be shut down because the whole setup is getting shut down. If there are any logs you can attach, it will be very helpful to go further – Ramachandran.A.G Jun 04 '23 at 14:31
  • Since you are using managed ingestion_type, please make sure you have streaming ingestion enabled. https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-streaming?tabs=azure-portal%2Ccsharp `.alter table policy streamingingestion enable`
    – Abhishek saharn Jun 05 '23 at 06:52
  • @Ramachandran.A.G hey sorry I killed the process so that's why it shows the shutdown logs. I have attached logs where I left it running. – shivi Jun 06 '23 at 19:13
  • @Abhisheksaharn thanks for the suggestion! I did double check that I had `streamingingestion` enabled on all the tables. – shivi Jun 06 '23 at 19:15
  • 1
    I got it working finally. So previously, I was using an existing cluster I had created in https://dataexplorer.azure.com/freecluster and followed the steps 1-7 described in https://learn.microsoft.com/en-us/azure/data-explorer/provision-azure-ad-app - this did NOT work and I don't know exactly why. This time around I created a new ADX cluster from the Azure portal -> Create a resource, followed the rest of the steps and it worked! – shivi Jun 08 '23 at 06:01
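For anyone hitting the streaming-ingestion point raised in the comments above: with `ingestion_type: "managed"`, the exporter can use streaming ingestion, which must be enabled per table or per database. A sketch of the Kusto commands, assuming the table and database names from the config above (OTTraces/OTMetrics/OTLogs/MyDatabase are placeholders; adjust to yours):

```kusto
// Run in the ADX query editor against MyDatabase
.alter table OTTraces  policy streamingingestion enable
.alter table OTMetrics policy streamingingestion enable
.alter table OTLogs    policy streamingingestion enable

// Or enable it for the whole database
.alter database MyDatabase policy streamingingestion enable
```

Note that streaming ingestion also has to be enabled at the cluster level (Azure portal -> your cluster -> Configurations) before the table/database policy takes effect.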

1 Answer


"However, when attempting to send the data to Azure, I find that no data appears in the Azure Data Explorer (ADX) tables that I have set up."

From the log you attached, the collector process appears to start correctly and reports that it is ready to process data. Then, in your earlier logs, the otelcol collector received an interrupt signal from the OS as soon as it started processing, after which the Azure Data Explorer exporter shut down.

I also see a warning about binding to the 0.0.0.0 address, which the collector flags as unsafe. That could be why the collector refuses to function, and the collector configuration may have other issues as well.
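If you want to rule out the 0.0.0.0 warning, you can bind the OTLP receiver to loopback only, since your app and collector run on the same machine. A sketch of the receiver fragment:

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 127.0.0.1:4317  # listen on loopback only instead of 0.0.0.0
```

This silences the denial-of-service warning and keeps the receiver reachable from local processes only.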

SpaceCadet
    As it’s currently written, your answer is unclear. Please [edit] to add additional details that will help others understand how this addresses the question asked. You can find more information on how to write good answers [in the help center](/help/how-to-answer). – Community Jun 02 '23 at 23:27