Tuesday, March 31, 2020

My First Look at Read/Write XMLA Endpoints in Power BI Premium

Exactly one year ago, Microsoft announced the public preview of Power BI open-platform connectivity with XMLA endpoints. That allowed us to connect to Power BI Premium datasets with external tools like SSDT, SSMS, SQL Server Profiler and DAX Studio.



On March 26, they took the next step and announced the public preview of read/write XMLA endpoints in Power BI Premium.
You can follow the webinar link to rewatch the session Christian Wade gave last week.

Theory

If you just want to see read/write XMLA endpoints in action, skip to the In Practice section of this post.
Otherwise, if you've never heard of them or just want to read up on the specifics, keep reading here and we'll get there eventually :-)

Preview Feature

To start with a side-note disclaimer: this feature is still in public preview, so don't use it in a production environment just yet.




XMLA Endpoint

If you're not familiar with XMLA, you can read all about it in the XML for Analysis (XMLA) Reference. Basically, it's the protocol for communication between client applications and the engine that manages the workspaces and datasets.
These communications go through what are commonly referred to as XMLA endpoints. XMLA is the same communication protocol used by the Microsoft Analysis Services engine, which, under the hood, runs Power BI's semantic modeling, governance, lifecycle, and data management.
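To make this concrete, client tools reach a Premium workspace by using its workspace connection URL as the server name. The format looks like this (the workspace name below is just a placeholder):

    powerbi://api.powerbi.com/v1.0/myorg/<Workspace Name>

In SSMS or DAX Studio, you connect to this address as if it were an Analysis Services server, and each dataset in the workspace then shows up as a database.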

Additional Functionality

Through these endpoints, Visual Studio and open-source community tools like Tabular Editor can provide additional data modeling and deployment capabilities that are supported by the Analysis Services engine but not yet available in Power BI Desktop.

Enable Read Write

Follow the steps outlined here to enable XMLA Read Write in the Power BI Admin Portal for your Premium capacity.



In Practice

To test the new read-write functionality, I started with an existing Tabular project in Visual Studio (SSDT) to see if I could deploy it to a Power BI Premium workspace. This project currently has a compatibility level of SQL Server 2014 (1103).





One thing to remember is the SSAS version of my localhost, which supports compatibility level 1400. We'll come back to this later, as it turned out to be rather important…





Upgrade Existing Model

So I started by opening the model (the .bim file) to change its compatibility level to SQL Server 2019 / Azure Analysis Services (1500).
That left me with the following error:





A bit strange at first, because I didn't see why this wouldn't be possible. I tried upgrading one compatibility level at a time: 1200 worked fine, as did 1400, but 1500 kept failing. Then it struck me that I had set the Workspace Server to my localhost, which only goes up to level 1400.
So to fix this, I changed the Integrated Workspace Mode setting to True in the properties of the model. After that, I could change the compatibility level to 1500.
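For reference, the compatibility level is simply a property at the top of the .bim file. A minimal sketch of what it looks like after the upgrade (the name is illustrative, and a real file of course contains the full model definition):

    {
      "name": "SemanticModel",
      "compatibilityLevel": 1500,
      "model": {
        "culture": "en-US",
        "tables": []
      }
    }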




If you haven't disabled the pop-up, you can also set this in the dialog that appears when you open the project:




Deploy to Premium Workspace

If you don't yet know how to connect to a workspace on a dedicated capacity, read up on it here and then come back. In the properties of the project, you can set the Deployment Server to the Power BI connection URL of your workspace (the powerbi:// address shown earlier).




Security

In addition to the XMLA Endpoint property being set to Read Write by the capacity admin, the tenant-level Export data setting in the Power BI Admin Portal (also required for Analyze in Excel) must be enabled.
Furthermore, it's good to know that when accessing workspaces through the XMLA endpoint, group memberships (workspace and app access rights) are still honored.
The highest access rights currently possible are equivalent to those of an Analysis Services database admin, so anything that requires server admin permissions (server-level traces, certain DMVs) is not yet supported through XMLA endpoints.

Model Roles

When I started my deployment, I ran into the following error:





As stated in the docs, the following limitations apply when working with dataset roles through the XMLA endpoint:
  • During the public preview, you cannot specify role membership for a dataset by using the XMLA endpoint. Instead, specify role members on the Row-Level Security page for a dataset in the Power BI service.
  • The only permission for a role that can be set for Power BI datasets is the Read permission. Build permission for a dataset is required for read access through the XMLA endpoint, regardless of the existence of dataset roles. Use the Power BI security model to control permissions beyond RLS.
  • Object-level security (OLS) rules are not currently supported in Power BI.

Obviously, I ran into the first point. It turned out I had three XMLA scripts for role assignments in my project, which aren't supported yet during the preview, as also explained here. I had to delete the roles defined in the model itself, as shown here:




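For completeness, if you do keep roles in the model during the preview, the only permission the endpoint accepts for a Power BI dataset is Read, and the members collection has to stay empty. A minimal sketch of such a role in the model's TMSL (the role name is illustrative):

    "roles": [
      {
        "name": "Reader",
        "modelPermission": "read"
      }
    ]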
Success (sort of)

After fixing the roles, I tried to deploy again and gave myself a little cheer, because the metadata was deployed successfully! However, I was seeing some errors at the table level of the deployment.





Looking at the error details quickly showed me what it was, because I had read about it, once again, in the docs:

When deployed the first time, a dataset is created in the workspace by using metadata from the model.bim. As part of the deployment operation, after the dataset has been created in the workspace from model metadata, processing to load data into the dataset from data sources will fail.





So looking at the Power BI workspace, we can already see the dataset appearing there!





The remaining error concerns the credentials of the dataset, so configuring the dataset settings in the Power BI Service will solve this. Depending on the data source, you can either:
  • Set data source credentials
  • Configure the dataset to use the data gateway
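Once the credentials (or gateway) are in place, the data load can also be triggered through the endpoint itself, for example by running a TMSL refresh command from an XMLA query window in SSMS. A minimal sketch, assuming the dataset is called MyDataset:

    {
      "refresh": {
        "type": "full",
        "objects": [
          {
            "database": "MyDataset"
          }
        ]
      }
    }

Of course, a scheduled or on-demand refresh from the Power BI Service works just as well once the data source settings are configured.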

Downloading the PBIX file

There's one more thing I'd like to mention, which is also noted in the docs but is very important to know:
At this time, a write operation on a dataset authored in Power BI Desktop will prevent it from being downloaded back as a PBIX file. This will result in the following error in the Power BI Service:




Closing

I also did some tests with another project and ran into some other errors. But this post is already getting very long, so I'm first going to investigate those errors and see if I can find an explanation for why they pop up.
Also, remember once again that this feature is still in public preview, so it should not be used in a production environment. It is scheduled to reach general availability in September 2020, according to the current release plan.

All in all, I think this is an awesome feature that will give us many more capabilities, especially around the metadata of a dataset. I only used SSDT and SSMS to connect to and deploy my dataset, but there are plenty of other tools available that you can use.
The error messages I received could have been a bit more descriptive, but I heard (from a very good source) that the Power BI team is already working on that!

So to recap, a few important points to take away:
  • Go read the documentation first 
  • Enable Read Write for XMLA Endpoints at your Premium Capacity
  • Enable the Export data setting in the Power BI Admin Portal
  • Make sure your workspace server supports compatibility level 1500, or use Integrated Workspace Mode
  • Delete any role memberships in your model during the public preview
  • A write operation on a dataset will prevent it from being downloaded back as a PBIX file
  • Did I already mention it is still a preview feature? :-)

Have you already made a start with XMLA endpoints, either the recently released read/write feature or the read-only one? Please let me know what you think and what could be improved even further.


Comments:

  1. Does it support legacy data sources? For me, when I try to use legacy sources, even after entering the credentials in the Service, the refresh fails with the error "Data Source information not found in Power BI Service". But when I use the Power Query data source, the refresh works fine.

    Replies
    1. Hi Akhil,
      Thanks for commenting. What do you mean by legacy data sources exactly?

    2. You can check here:
      https://docs.microsoft.com/en-us/azure/analysis-services/analysis-services-datasource

      "Tabular 1400 and higher models use Power Query connectors to manage connections, data queries, and mashups between the data source and Analysis Services. These are sometimes referred to as structured data source connections in that connection property settings are set for you. You can, however, enable legacy data sources. When enabled, you can use Table Import Wizard to connect to certain data sources traditionally supported in tabular 1200 and lower models as legacy, or provider data sources."

    3. Hmm, I'm afraid I can't help you with that. Have you posted your question at https://community.powerbi.com? Or tried creating a support ticket (free if you have a Pro license) at https://powerbi.microsoft.com/en-us/support/pro/?

  2. Nicky, thank you for documenting! This was a great help to me. I have been pursuing XMLA as the solution to move existing Power BI reports to Analysis Services, Analysis Services to Power BI Premium, and finally back to Power BI. Basically, I dream of a day when all the XMLA can play together happily. I wonder if you have any theories as to why an XMLA-created dataset deployed to Premium cannot be downloaded as a PBIX? Any suggestions for further options to pursue?
    Thank you so much for documenting! Sam

    Replies
    1. Downloading the PBIX is not possible while the feature is in preview; I suspect this restriction will be lifted when it goes to GA.

  3. Nicky, not sure if my previous comment made it to you so trying again.
    1. Thank you for documenting!
    2. What options could be next for being able to download a PBIX from Premium with the AS 1500 dataset?

    THANK YOU AGAIN! Sam

    Replies
    1. Hi Sam, I have moderation for comments turned on, so that's why it might take a while for them to show up.
      Thanks for letting me know it was of help!

  4. Hi Nicky, very nice article. I have the same problem with the legacy data source with my existing tabular cube (in my case, it's an ODBC data source to Impala). I can publish the model to the Power BI Service, but I have no way to see the gateway or the credentials configuration.


Thanks for taking the time to provide feedback!

Cheers,
Nicky
