Tuesday, March 31, 2020

My First Look at Read/Write XMLA Endpoints in Power BI Premium

Exactly 1 year ago Microsoft announced the public preview of Power BI open-platform connectivity with XMLA endpoints. That allowed us to connect to Power BI Premium datasets with external tools like SSDT, SSMS, SQL Server Profiler and DAX Studio.



On March 26, they took the next step and announced the public preview of read/write XMLA endpoints in Power BI Premium.
You can follow the webinar link to rewatch the session Christian Wade gave last week.

Theory

If you just want to see read/write XMLA endpoints in action, skip to the In Practice section of this post.
Otherwise, if you have never heard of it or just want to read up on the specifics, keep reading here and we'll get there eventually :-)

Preview Feature

To start with a side-note disclaimer:



XMLA Endpoint

If you're not familiar with XMLA, you can read all about it in the XML for Analysis (XMLA) Reference. Basically, it's the protocol for communication between client applications and the engine that manages the workspaces and datasets.
These communications go through what are commonly referred to as XMLA endpoints. XMLA is the same communication protocol used by the Microsoft Analysis Services engine, which, under the hood, runs Power BI's semantic modeling, governance, lifecycle, and data management.

Additional Functionality

Through these endpoints, Visual Studio and open-source community tools like Tabular Editor can provide additional data modeling and deployment capabilities (supported by the Analysis Services engine, but not yet in Power BI Desktop), like:

Enable Read Write

Follow the steps outlined here to enable XMLA Read Write in the Power BI Admin Portal for your Premium capacity.


In Practice

To test the new read/write functionality I started with an existing Tabular project in Visual Studio (SSDT) to see whether I could deploy it to a Power BI Premium workspace. This project currently has a compatibility level of SQL Server 2014 (1103).




One thing to remember is the SSAS version of my localhost, which supports compatibility level 1400. We'll come back to this later, as it turned out to be rather important…




Upgrade Existing Model

So I started by opening the model (the .bim file) to change the compatibility level to SQL Server 2019 / Azure Analysis Services (1500).
That left me with the following error:




A bit strange at first, because I didn't see why this wouldn't be possible. I tried upgrading one compatibility level at a time. 1200 worked fine, as did 1400. But 1500 kept failing. Then it struck me that I had set the Workspace Server to my localhost, which only supports level 1400.
So to fix this, in the properties of the model, I changed the setting Integrated Workspace Mode to True. After that I could change the compatibility level to 1500.
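
For reference, the compatibility level is just a property in the model.bim JSON. After the upgrade, the top of the file should look roughly like the sketch below; the database name is a placeholder and the rest of the model definition is omitted:

    {
      "name": "MyTabularModel",
      "compatibilityLevel": 1500,
      "model": {
        "culture": "en-US",
        "dataSources": [],
        "tables": [],
        "roles": []
      }
    }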



If you haven't disabled the pop-up, you can also set this when the project starts up:



Deploy to Premium Workspace

If you don't know yet how to connect to a workspace on a dedicated capacity, read up on it here and then come back. In the properties of the project file you can set the Deployment Server to the workspace connection URL from the Power BI service, which has the form powerbi://api.powerbi.com/v1.0/myorg/<workspace name>.




Security

In addition to the XMLA Endpoint property being set to Read Write by the capacity admin, the tenant-level Export data setting in the Power BI Admin Portal (also required for Analyze in Excel) must be enabled.
Furthermore, it's good to know that when accessing workspaces through the XMLA endpoint, group memberships (workspace and app access rights) are still honored.
The highest access rights that are currently possible are equivalent to Analysis Services database admins. So anything that requires server admin permissions (server-level traces, certain DMVs) is not yet supported through XMLA endpoints.

Model Roles

When I started my deployment I ran into the following error:




As stated in the docs, the following limitations apply when working with dataset roles through the XMLA endpoint (a sketch of what such a role definition looks like follows the list):
  • During the public preview, you cannot specify role membership for a dataset by using the XMLA endpoint. Instead, specify role members on the Row-Level Security page for a dataset in the Power BI service.
  • The only permission for a role that can be set for Power BI datasets is the Read permission. Build permission for a dataset is required for read access through the XMLA endpoint, regardless of the existence of dataset roles. Use the Power BI security model to control permissions beyond RLS.
  • Object-level security (OLS) rules are not currently supported in Power BI.
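
To make the first limitation concrete, here is a rough sketch of how a role with members is defined in the model.bim (the role name and member are made-up examples). It is the members collection that you cannot deploy through the XMLA endpoint during the preview; membership has to be maintained on the Row-Level Security page in the service instead:

    {
      "name": "Readers",
      "modelPermission": "read",
      "members": [
        {
          "memberName": "someone@contoso.com",
          "identityProvider": "AzureAD"
        }
      ]
    }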

Obviously, I ran into the first point. It turned out I had 3 XMLA scripts for role assignments in my project, which are not yet supported during the preview, as also explained here. I had to delete the defined roles in my model itself, as shown here:




Success (sort of)

After fixing the roles I tried to deploy again, and I gave myself a little cheer, because the metadata was successfully deployed! However, I saw some errors at the table level of the deployment.




Looking at the error details quickly showed me what it was, because I had read about it, again, in the docs:

When deployed the first time, a dataset is created in the workspace by using metadata from the model.bim. As part of the deployment operation, after the dataset has been created in the workspace from model metadata, processing to load data into the dataset from data sources will fail.




So looking at the Power BI workspace, we can already see the dataset appearing there!




The error now concerns the credentials of the dataset, so configuring the dataset settings in the Power BI service will solve this; after that, the data still has to be loaded with a refresh (see the sketch after the list below). Depending on the data source you can either:
  • Set data source credentials
  • Configure the dataset to use the data gateway
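
Once the credentials or gateway are configured, the data still needs to be loaded. You can trigger a refresh from the dataset settings in the service, or send a TMSL command over the XMLA endpoint yourself, for example from an XMLA query window in SSMS. A minimal sketch, assuming a dataset called MyDataset:

    {
      "refresh": {
        "type": "full",
        "objects": [
          {
            "database": "MyDataset"
          }
        ]
      }
    }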

Downloading the PBIX file

There's one more thing I'd like to mention, which is also noted in the docs but is very important to know:
At this time, a write operation on a dataset authored in Power BI Desktop will prevent it from being downloaded back as a PBIX file. This will result in the following error in the Power BI Service:



Closing

I also did some tests with another project and ran into some other errors. But this post is already getting very long, so I'm first going to investigate those errors and see if I can find an explanation for them.
Also remember, once again, that this feature is still in public preview, so it should not be used in a production environment. It is scheduled to reach general availability in September 2020 according to the current release plan.

All in all I think this is an awesome feature that will give us many more capabilities, especially regarding the metadata of a dataset. I only used SSDT and SSMS to connect to and deploy my dataset, but there are a lot of other options available that you can use.
The error messages I received could have been a bit more descriptive, but I heard (from a very good source) that the Power BI team is already working on that!

So to recapitulate, a few important points to note in the end:
  • Go read the documentation first 
  • Enable Read Write for XMLA Endpoints at your Premium Capacity
  • Enable the Export data setting in the Power BI Admin Portal
  • Make sure your local workspace server supports compatibility level 1500, or use Integrated Workspace Mode
  • Delete any role memberships in your model during the public preview
  • A write operation on a dataset will prevent it from being downloaded back as a PBIX file
  • Did I already mention it is still a preview feature? :-)

Have you already made a start with XMLA endpoints, either the recently released read/write feature or the read-only one? Please let me know what you think, and also what could be improved even further!


Friday, February 21, 2020

Update: My Favorite (Data Platform) Podcasts

A tweet on the #SQLFamily hashtag prompted me last week to update an old blog post I wrote a few years back on My favorite (SQL Server) podcasts.




After I switched jobs 2.5 years ago I have been doing a lot less commuting, which is actually good, but it also left me with less time to listen to podcasts. Therefore I now choose to skip episodes that are not that interesting to me.

Old podcasts

Here's the list I was already following since my last post:

  • .NET Rocks!
    • This is still a great podcast. I particularly like the Geek-outs.
  • Dear SQL DBA
    • The last episode is from the beginning of 2019; I always liked this one very much because of the way Kendra explains things.
  • Developer On Fire
    • This was still a thing in 2019, but kinda dried up at the end of last year. The list of book recommendations is still there and huge, so do have a look there.
  • Hanselminutes
    • Still very much a thing. Good, short and fun episodes.
  • no dogma podcast
    • Still a regular flow of good episodes.
  • Office Hours Podcast: SQL Server Pain Relief
    • This one had its last episode on Jan 2 2019.
  • RunAs Radio
    • Still a constant, weekly flow of information and knowledge!
  • Simple Programmer Podcast
    • The frequency was turned down to weekly in 2019, but still regularly producing content.
  • SQL Down Under
    • The URL changed, and after 9 months of silence there was a new episode in December of last year.
  • SQL Server Radio
    • After a few months of silence there was a (re)new(ed) episode with a great announcement.
  • voiceofthedba’s podcast
    • Steve is actually on a sabbatical right now, but his short bursts of knowledge will certainly continue as he goes back to work next week.

New podcasts

These are the podcasts that are new:


Dutch podcasts

Next to the Data Platform related ones above, I also have a few Dutch podcasts that I like to listen to:


Closing

I still use the app Podcast Addict for Android, now the paid version without ads.
And one more pro tip before I finish: I use a podcast app that can change the playback speed. I now listen to everything at 2 times the speed :-) You can start at 1.5 and work your way up to what feels comfortable for you.

Are you also listening to podcasts, on your way to work, during work or somewhere else? Let me know in the comments which ones you like and if you have others not in my list!

@NickyvV

Monday, February 10, 2020

My Upcoming Speaking Engagements: Belgium and Iceland!

Before talking about my upcoming speaking engagements I would like to tell you a little story. And it goes like this.

How it all started

Back in 2018 I started my journey of getting more involved in the SQL and Power BI community. It actually all started with a Power BI User Group meeting (the Dutch Data Dude Farewell Tour with Jeroen ter Heerdt (B | @jeroenterheerdt)) organized by PBIG (and me) at the Van Lanschot Kempen office in Den Bosch, which I wrote about earlier last year. Jeroen Schalken (B | @JeroenSchalken) and I gave a little introduction talk (around 15 minutes) about our implementation of Power BI at Van Lanschot Kempen. It was frightening and exciting at the same time. During the break of the evening, Edgar Walther (B | @edgarwalther) of PBIG came over to say it had been a very interesting presentation and to ask if we would like to extend it and come talk at the Power BI User Day in March 2019. It was right there that I (and Jeroen) made the decision to say YES. We still had to turn it into a one-hour presentation, but we also had plenty of time left.

After that presentation I really got the feel of it and started looking for other opportunities to speak. Eventually this resulted in speaking at Power BI Days Belgium with a session and a workshop, followed by SQL Saturday Prague together with Jeroen, dataMinds Connect in Belgium, and even the Power Platform World Tour in Dublin!
It brought me so much more than I could have imagined.

The first thing is that I get the chance to share my knowledge with like-minded people. Next, it was not only amazing to travel to these venues with great speakers and people I look up to; there were also awesome social events during every trip. And that's where the real magic happens. You make contacts, chat a bit, have a beer (or drink of your choice) and make new friends. And at another event you meet even more people to connect with, but sometimes also catch up with the newly made friends from the previous conference. It's been an amazing experience so far.

Upcoming events

But now back to the title of this blog post: it was time again to update my Speaking page, because I already have a few very nice events in my calendar.

This year will start with SQL Saturday Belgium (#956) on March 7, where I will deliver (an updated version of) the presentation that started it all.
Next up is SQL Saturday Iceland (#937) on March 28, where I will deliver my new session A Form, a Flow and a Power BI Streaming Dataset Walk Into a Bar. This session is based on the lightning session I gave at the Dublin Power BI UG during the Power Platform World Tour in Dublin.
Next to those two SQL Saturday events there's also a UG meeting planned, but it hasn't been officially announced yet. So more info on that one later.

Teaser

And last but not least: I will have some very exciting news to share in the coming weeks hopefully. I'm currently finalizing everything, so I'll let you know when everything is signed, sealed and delivered :-)

Friday, January 31, 2020

A Form, a Flow and a Power BI Streaming Dataset Walk Into a Bar - Part 2

In this blog series I will lay out all the steps needed to get this demo up and running on your own. I will use the text in this post, as well as screenshots and videos, to explain it step by step. So if you're already (somewhat) familiar with Forms, Power Automate and/or Power BI streaming datasets, feel free to go directly to the videos and start building your own demo right away. Otherwise, keep reading and you'll come across all the resources eventually.

In Part 1, I already explained the reason behind this demo and what I used it for in my presentations. I made a start with the demo and showed you how to create a Form and set up a streaming dataset in Power BI.
In Part 2 of this blog series I will show you how to glue it all together with Power Automate and create a report from the data. There are some things you need to know about the streaming dataset to properly show the output in your Power BI report. And lastly, I will show you a trick to get your data refreshed live while respondents are answering the questions, without refreshing your browser!

Create an automation flow

In Power Automate you can create flows manually or with the help of a template. I used a template for my first flow, but when I search for this same template in the gallery it doesn't actually show up.


Power Automate template
I do still have the URL, because you can check that in your current flows. I resubmitted it as a template because I think it's quite useful:
Power Automate flow details
You can use that template here, or create the steps yourself manually. It's actually not that hard if you take a look at the template.
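
In essence, the flow takes the answers from the Forms response and adds them as a row to the streaming dataset. Just to give an idea of what ends up in the dataset, a row pushed through the Power BI REST API would look roughly like this; the field names are made-up examples and should match the streaming dataset you defined in Part 1:

    {
      "rows": [
        {
          "SubmittedAt": "2020-01-31T10:15:00Z",
          "FavoriteTool": "Power BI",
          "ExperienceLevel": "Intermediate"
        }
      ]
    }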



Create a Power BI report

When the previous steps have been completed, we have built ourselves a working solution; we only have to visualize the results from the dataset. OK, I know, I could also use the default reporting of Microsoft Forms, but that uses pie charts… And you know: every time you create a pie chart, God kills a kitten! :-)

Microsoft Forms default report
Before creating a report it's good to have some data to build the visuals from. To do that, click Preview in Forms to fill in the form as a user yourself. You can then create a report by connecting live to the streaming dataset from Power BI Desktop.



As you saw in the video, I created a basic report with the questions I used in the form. You can of course create any kind of fancy report with this approach.
If you use, for example, a multiple-choice question with multiple possible answers in your form, you can use the DAX functions CONTAINSSTRING or CONTAINSSTRINGEXACT to get the individual values out of the JSON string with the answers. You write one measure per choice and then plot them in a chart. I'm not sure if there's an easier way of doing this, perhaps one that handles a dynamic number of values. Let me know in the comments if you figure this out :-)

Share the Form

Last but not least, we also want to share our form with our audience, right? You can do this by creating a shorter URL with any of the URL shortener services like bit.ly, tinyurl.com or Google. You then share this link in the first few slides of your presentation, and even have it visible while attendees are walking in prior to the start of your session. That way they have a minute to fill in the form and you don't have to wait for their answers during the presentation.

Create a Live Dashboard

The trick for creating a live-refreshing view of the answers coming in is to create a dashboard from the visuals in the Power BI service. Pin all the visuals you need from the report to a dashboard, maybe also add the URL to the form, and you're done. You could also refresh the browser every few seconds (either manually or with a browser extension in Chrome), but I think the automatic refresh in a dashboard looks cleaner; otherwise you also depend on the speed of the WiFi to load your whole page every time you refresh.

Recap

As you have seen, setting up this Power Platform demo is actually not that difficult; you just have to know the steps and maybe go through them once to remember them. In my opinion it does, however, add a lot of value to a presentation. After the Power Platform World Tour meetup in Dublin I got some good feedback from people who wanted to use this setup in their next presentation. I've already heard from Mary Fealty (B | @Br0adtree) and Marc Lelijveld (B | @MarcLelijveld) that they implemented it, so that's nice to hear!
Mary even added a great feature to the report.

Let me know in the comments or on Twitter if you use this or another method or if you have any questions!