I'm setting up CosmosDb with a partition key as a Stream Analytics Job output and the connection test fails with the following error:

> Error connecting to Cosmos DB Database: Invalid or no matching collections found with collection pattern 'containername/{partition}'. Collections must exist with case-sensitive pattern in increasing numeric order starting with 0.

NOTE: I'm using Cosmos DB with the SQL API, and the configuration is done through portal.azure.com.

I have confirmed I can manually insert documents into the container through the portal's Data Explorer. Those inserts succeed and the partition key value is correctly identified.

I set up the Cosmos DB container like this:

Database Id: testdb
Container id: containername
Partition key: /partitionkey
Throughput: 1000
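
For reference, the equivalent container setup can also be done with the Azure CLI rather than the portal. This is a sketch of that command; the account name and resource group placeholders are hypothetical and need to be replaced with your own values:

```shell
# Create the SQL API container with the same settings as above.
# <cosmos-account> and <resource-group> are placeholders.
az cosmosdb sql container create \
  --account-name <cosmos-account> \
  --resource-group <resource-group> \
  --database-name testdb \
  --name containername \
  --partition-key-path "/partitionkey" \
  --throughput 1000
```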

I set up the Stream Analytics output like this:

Output Alias: test-output-db
Subscription: My-Subscription-Name
Account id: MyAccountId
Database -> Use Existing: testdb
Collection name pattern: containername/{partition}
Partition Key: partitionkey
Document id: 

When testing the output connection I get a failure and the error listed above.


1 Answer

I received a response from Microsoft support: specifying the partition via the "{partition}" token pattern is no longer supported by Azure Stream Analytics, and writing to multiple containers from ASA has been deprecated in general. If an ASA job now outputs to a Cosmos DB container that has a partition key configured, Cosmos DB handles the partitioning automatically on its side.

> After discussion with our ASA developer/product group team, the collection pattern such as MyCollection{partition} or MyCollection/{partition} is no longer supported. Writing to multiple fixed containers is being deprecated and it is not the recommended approach for scaling out the Stream Analytics job [...] In summary, you can define the collection name simply as "apitraffic". You don't need to specify any partition key as we detect it automatically from Cosmos DB.
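
So in my case the fix was to set the collection name to plain `containername` and leave the partition key blank. As a rough sketch, the resulting output definition in an ARM template would look something like the fragment below (property names follow the `Microsoft.StreamAnalytics` output datasource schema as I understand it; the account key is elided and must be supplied):

```json
{
  "name": "test-output-db",
  "properties": {
    "datasource": {
      "type": "Microsoft.Storage/DocumentDB",
      "properties": {
        "accountId": "MyAccountId",
        "accountKey": null,
        "database": "testdb",
        "collectionNamePattern": "containername",
        "documentId": ""
      }
    }
  }
}
```

Note there is no `partitionKey` property and no `{partition}` token in `collectionNamePattern`; Cosmos DB picks up the container's own partition key (`/partitionkey`) automatically.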
