
The "Start" command for my Stream Analytics job (IoT Hub to Cosmos DB output) is failing with the following error.

[12:49:30 PM] Source 'cosmosiot' had 1 occurrences of kind 'OutputDataConversionError.RequiredColumnMissing' between processing times '2019-04-17T02:49:30.2736530Z' and '2019-04-17T02:49:30.2736530Z'.

I followed the instructions and am not sure what is causing this error. Any suggestions, please? Here is the Stream Analytics query:

SELECT
[bearings temperature],
[windings temperature],
[tower sway],
[position sensor],
[blade strain gauge],
[main shaft strain gauge],
[shroud accelerometer],
[gearbox fluid levels],
[power generation],
[EventProcessedUtcTime],
[EventEnqueuedUtcTime],
[IoTHub].[CorrelationId],
[IoTHub].[ConnectionDeviceId]
INTO
cosmosiot
FROM
TurbineData 
Chandra
    Could this be a case-sensitivity issue with property names? In ASA v1.0, output property names were lowercased. In v1.1, they are not. This could cause a mismatch with your (case-sensitive) Cosmos DB property names. You can check by looking at your ASA job settings (one of the tabs is for compatibility level) – David Makogon Apr 17 '19 at 03:22
  • No luck, David. I am still getting the same error – Chandra Apr 17 '19 at 08:31
  • It would help if you edited your question to show your ASA query, along with what you defined in Cosmos DB for your collection. – David Makogon Apr 17 '19 at 10:49
  • Additional details about the above error, along with a portion of the output event payload, should be in the diagnostic logs. Can you please enable diagnostic logs (a CLI sketch for doing so follows these comments) and see if the additional details there help? Also, do you still get output events, or does this affect all output events? – Vignesh Chandramohan Apr 18 '19 at 05:30
  • Hi David, I tried again by creating the IoT Device Simulation solution accelerator. The Stream Analytics job (IoT Hub to Cosmos DB) is working perfectly fine with live data. The problem still exists for historical data (with a custom schedule set to a past date/time). – Chandra Apr 19 '19 at 08:51
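
For reference, here is a minimal sketch of enabling those diagnostic logs from the Azure CLI; the setting name, resource path, and workspace ID below are placeholders, not values from this thread:

az monitor diagnostic-settings create \
    --name asa-diagnostics \
    --resource "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.StreamAnalytics/streamingjobs/<job-name>" \
    --logs '[{"category": "Execution", "enabled": true}]' \
    --workspace "<log-analytics-workspace-resource-id>"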

1 Answer

If you're specifying fields in your query (i.e. SELECT Name, ModelNumber ...) rather than just using SELECT *, the field names are converted to lowercase when the job runs at compatibility level 1.0, which throws off case-sensitive property-name matching in Cosmos DB. In the portal, open your Stream Analytics job, go to 'Compatibility level' under the 'Configure' section, and select 1.1 or higher; that should fix the issue. You can read more about the compatibility levels in the Stream Analytics documentation here: https://learn.microsoft.com/en-us/azure/stream-analytics/stream-analytics-compatibility-level
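
For illustration, a minimal sketch of the casing behavior described above (the DeviceId field is hypothetical, not taken from the question's query):

SELECT
    DeviceId    -- compatibility level 1.0 emits this output property as "deviceid";
                -- level 1.1 and above preserve it as "DeviceId"
INTO
    cosmosiot
FROM
    TurbineData

If the Cosmos DB collection expects the exact name (for example, a case-sensitive partition key path of /DeviceId), the lowercased property can surface as an OutputDataConversionError.RequiredColumnMissing error like the one in the question.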