I have a Tabular Model that serves as a standard solution for multiple customers and consumes tables from Data Lake. Previously we used SQL Server databases as data sources, and we would edit the connection to point at a different customer database prior to deploying.
I have not found a way to successfully edit the connection to point at a different customer folder in Data Lake.
- If I add a new data source, the queries do not populate the existing tables, and an import query creates a new table called, for example, Clients1 instead of Clients.
- If I open the data source and change the string representing the URL to a different folder, the data source window raises an error.
We chose Data Lake because it seems to be the least expensive intermediate storage mechanism when using Data Factory. Azure Database and Azure Data Warehouse are off limits due to cost, although I am aware that Azure Data Warehouse can be paused.
Hopefully I am just misunderstanding the interface and this is possible.
Hope to hear from someone! ;)
EDIT 09/02/2018 11:21h
I opened the model file as code and find/replaced the connection reference for the tables. I then manually reconnected to each Data Lake connection and processed the model. This worked.
Note: I had to reconnect each connection manually because, when processing, Visual Studio only presented me with a single connection dialogue and the process would fail.
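
In case it helps anyone, here is a rough sketch of the kind of find/replace I did by hand. It assumes the model file is the usual Model.bim JSON produced by Visual Studio and that the customer folder URL appears in the data source connection strings; the path and URLs below are just placeholders, not my actual values:

```python
# Sketch: swap the customer folder URL inside the Tabular model file.
# Assumes Model.bim is the standard JSON model file and the Data Lake
# folder URL appears in the data source connection strings.
# MODEL_PATH, OLD_URL and NEW_URL are placeholders -- adjust for your project.
import json

MODEL_PATH = r"C:\Projects\CustomerModel\Model.bim"
OLD_URL = "adl://mylake.azuredatalakestore.net/customerA"
NEW_URL = "adl://mylake.azuredatalakestore.net/customerB"

with open(MODEL_PATH, encoding="utf-8") as f:
    text = f.read()

# Plain find/replace of the folder reference, same as doing it in the editor.
updated = text.replace(OLD_URL, NEW_URL)

# Sanity checks: the file should still parse as JSON and the old URL is gone.
json.loads(updated)
assert OLD_URL not in updated

with open(MODEL_PATH, "w", encoding="utf-8") as f:
    f.write(updated)

print("Replaced", text.count(OLD_URL), "occurrence(s) of the customer folder URL")
```

After the replace I still had to reconnect each data source in Visual Studio before processing, as noted above.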