
I'm having some trouble using DirectQuery for Power BI datasets and Analysis Services: https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-directquery-datasets-azure-analysis-services

I would think I meet all the prerequisites. "Allow XMLA Endpoints and Analyze in Excel with on-premises datasets" is enabled for my tenant:

[Screenshot: the "Allow XMLA Endpoints and Analyze in Excel with on-premises datasets" tenant setting, enabled]

I've also enabled DirectQuery for PBI datasets and AS in my source dataset, which has been published to a PPU workspace:

[Screenshot: "DirectQuery for Power BI datasets and Analysis Services" enabled in the source dataset's settings]

I can create a live connection to the dataset no problem. The trouble comes in when I try to create a local model.

[Screenshots: error dialogs shown when attempting to create a local model]

Can anyone help me understand what I'm missing here? I seem to only get this error for this specific dataset.

Update 1: The workspace is now on a premium capacity. Everything else is the same.

Update 2: I ended up saving a copy of the .pbix file and republishing. For some reason, that seemed to do the trick. I'm still curious why this error occurred and whether there was an easier fix. The new dataset and all dependent reports now have new IDs, so I had to replace all my hyperlinks, which was a bit of a pain.

Bryan Rock
  • Have you enabled the XMLA endpoint itself? [Learn Microsoft - Enable XMLA read-write](https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-connect-tools#enable-xmla-read-write). If you are unable to connect to the XMLA endpoint (using a tool like SSMS or Tabular Editor) for a given workspace (instructions also at above link), I would assume the DirectQuery functionality would not work. Per the link you provided: `for Premium capacities and Premium Per User the "XMLA endpoint" setting should be enabled and set to either "Read Only" or "Read/Write"` – trenton-ftw Apr 10 '23 at 21:34
  • I'd have to confer with our tenant admin, but I would think the XMLA endpoint would be set to read-only since that's the default. The weird thing is I can connect to other datasets in the same workspace through SSMS. But when I try to open this particular one, I get the following error: Failed to discover the state of the model from the server. Check that tabular database with name 'USE Finance Data Model' and compatibility level 1200 or higher exists on the server. (Microsoft.AnalysisServices.Tabular) – Bryan Rock Apr 11 '23 at 13:12
  • I know from Tabular Editor that the compatibility level is 1567, so that's not the issue. – Bryan Rock Apr 11 '23 at 13:19
  • You might want to compare some of the model properties against a database that works. Just because the compatibility level is set higher doesn't actually mean all the correct properties for that compatibility level are set. Someone could easily have updated the level using an improper method, which I have seen before. – trenton-ftw Apr 11 '23 at 14:30
  • How are you connecting when it works and when it fails? Please post screenshots and M code of both. Also confused that your screenshot is for THQ and your comment above is for USE. – TheRizza Apr 12 '23 at 13:41
  • The dataset was recently renamed from THQ to USE, but the error occurred before the renaming. I can no longer recreate the error (see update 2), so screenshots are a bit tricky. But basically, I got the workspace connection from the workspace settings. Then I went to SSMS->connect, selected Analysis Services, pasted my clipboard into Server name, and entered my Windows credentials. All 4 datasets on my workspace were visible, including USE Finance Data Model. The error occurred when I tried to expand the Tables folder. – Bryan Rock Apr 12 '23 at 14:10
  • This is almost certainly related to an invalid schema being saved server side. My experience is that the server side schema validation is not very good, leading to client side errors when tools like SSMS (and presumably DirectQuery connections) use client libraries to parse the server side schema into TOM objects. Tabular Editor is a fantastic tool, but since it allows you to save directly to the server side schema, you should be very cautious. Per their own instructions, you should never be connecting directly to a deployed production dataset. Glad you got it worked out. – trenton-ftw Apr 12 '23 at 15:24

1 Answer


As a general rule, if you can't connect to your workspace's XMLA endpoint and browse the dataset from a tool like SSMS, you are going to hit unexpected issues in any other tooling that works with the dataset's model, including the DirectQuery functionality.
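For example, here is a minimal connectivity check using the TOM client library — a sketch only, assuming the Microsoft.AnalysisServices NuGet package; the workspace URL is a placeholder for your own, and interactive Azure AD sign-in is assumed on connect:

```csharp
// Sketch: reproduce what SSMS does when it browses a dataset over the
// XMLA endpoint. Requires the Microsoft.AnalysisServices NuGet package.
using System;
using Microsoft.AnalysisServices.Tabular;

class XmlaConnectivityCheck
{
    static void Main()
    {
        var server = new Server();

        // Same URL you would paste into SSMS's "Server name" box
        // (placeholder workspace name; interactive AAD sign-in assumed).
        server.Connect("powerbi://api.powerbi.com/v1.0/myorg/YourWorkspace");

        foreach (Database db in server.Databases)
        {
            Console.WriteLine(db.Name + " (compatibility level " + db.CompatibilityLevel + ")");

            // Materializing the model is roughly what expanding the Tables
            // folder in SSMS does; an invalid server-side schema tends to
            // fail here rather than on Connect().
            foreach (Table table in db.Model.Tables)
                Console.WriteLine("  " + table.Name);
        }

        server.Disconnect();
    }
}
```

If this enumeration fails for one dataset but works for its siblings in the same workspace, the problem lies in that dataset's schema rather than in the tenant or capacity settings.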

My experience is that the server-side schema (TOM) validation is not very good, which leads to client-side errors that are, in my opinion, often handled poorly when client libraries attempt to parse the server-side schema into TOM objects. At best, this produces very vague error messages; at worst, it simply crashes the client.

As far as what your model's issue is, I am not aware of a single method for isolating it with any existing tooling. I typically just deploy the dataset to a dev environment and begin removing items until the problem is isolated. That said, I would imagine you could fairly easily write a Tabular Editor advanced script that walks the dataset's objects through the TOM libraries it provides and flags the issue quickly; a sketch of that idea follows.
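As a rough illustration, such a script might look like the following. This is a sketch only: it assumes Tabular Editor's TOM wrapper exposes `EnableRefreshPolicy` and `SourceExpression` on tables, and the checks shown are illustrative rather than exhaustive:

```csharp
// Sketch of a Tabular Editor advanced script (C#). Run it against a dev
// copy of the dataset, not production. Property names assume Tabular
// Editor's TOM wrapper; adjust for your version.
var findings = new System.Text.StringBuilder();

foreach (var table in Model.Tables)
{
    // Refresh policies (incremental refresh) are a common source of
    // server-side definitions that fail client-side validation.
    if (table.EnableRefreshPolicy)
        findings.AppendLine("Refresh policy on '" + table.Name + "': " + table.SourceExpression);

    foreach (var partition in table.Partitions)
        if (string.IsNullOrWhiteSpace(partition.Expression))
            findings.AppendLine("Empty partition expression: '" + table.Name + "'[" + partition.Name + "]");
}

// Output() is Tabular Editor's built-in helper for showing script results.
Output(findings.ToString());
```

If nothing obvious turns up, the deploy-and-bisect approach above remains the reliable fallback.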

Just a reminder: Tabular Editor is a fantastic tool, but since it lets you save directly to the server-side schema, you should be very cautious with it. Per Tabular Editor's own documentation, you should never connect directly to a deployed production dataset. Tabular Editor performs some schema validation of its own, but it operates (for the most part) through a TOM wrapper, and there is only so much validation such a tool can do without a great deal of additional code to verify that your schema is correct for the given compatibility level and for other libraries, such as AMO, that might also connect to it.

trenton-ftw
  • I do think it's a side note, but I have noticed that a lot of these 'skipped' server-side validations revolve around the Refresh Policy for tables. Invalid M expressions or value combinations often seem to trigger this behaviour. Given that Refresh Policies are still a Premium-only feature, I assume that is the reason why. Hopefully they will improve this in the future. So if you are having this issue, I might start by reviewing tables with a defined Refresh Policy. – trenton-ftw Apr 12 '23 at 17:49