Monday, March 23, 2026

My Take On The Fabric Conference Updates

As I'm sure you have seen, there has been a lot of Fabric and Power BI news lately. Not surprisingly, the Fabric Conference was also last week! 😀

I won't list all the updates here; you can read Arun's blog, or either of the Fabric or Power BI monthly feature summary blogs, to go through the whole list.

Instead, I will highlight the biggest game changers for me, and what you should definitely dive into deeper.

Platform

Database Hub (Early Access)

This is a new control plane for everything database: Azure SQL, Cosmos DB, PostgreSQL, MySQL and Fabric Database, offering estate-wide observability, delegated governance and Copilot-powered insights. I haven't seen or used it yet, but does this really take away the need to use the Azure Portal and/or SSMS?

Planning in Fabric IQ (New)

A new enterprise planning capability that brings financial and business planning directly into Fabric: budgets, forecasts, scenarios and targets. To me it sounds like this is aimed at replacing disconnected planning tools (like Anaplan). It's a co-operation / acquisition with Lumen, but as far as I gathered from the sessions and community notes, you don't pay Lumen directly; it just costs you CUs from the capacity you already have.

OneLake + Databricks Unity Catalog (Preview)

Microsoft has added the capability to natively read from OneLake through Azure Databricks Unity Catalog. As I see more and more companies using Databricks combined with Fabric, either for the bronze/silver layers or as the whole data platform, I reckon this integration will have advantages for people using both platforms. It also looks like Microsoft is keeping its promise of developing an open platform.

Fabric Remote MCP (Preview)

If you haven't looked into MCP servers yet, I highly recommend you do. I haven't dived deep into them either, but I started dabbling with the Power BI modeling MCP server, for minor tweaks and measure creation, and that was already awesome.
Kurt Buhler and Eugene Meidinger have some good posts about the theory, the inner workings and how to get started.

Power BI 

Translytical Taskflows (GA)

Translytical taskflows (TT), a.k.a. writeback for Power BI, although that doesn't paint the whole picture because it can do much more. It uses User Data Functions (Python notebooks) in Fabric to update data in a data store of your choice (e.g. Azure SQL, Fabric SQL Database, Lakehouse) from within certain elements in a Power BI report. Now that TT is GA, I reckon many more companies will be able / allowed to explore this. This will be a game changer: you don't need separate licenses for e.g. Power Apps or other 3rd-party software that can do writeback.
Be aware though, there are still some limitations; among others, the PBIR (Power BI enhanced report) and PBIP (Power BI Project) formats cannot be used together with TT.
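To make the writeback idea concrete, here is a minimal local sketch. It mimics what a User Data Function behind a report button might do, but the table, connection and function are hypothetical: it uses an in-memory SQLite database as a stand-in for the actual Fabric data store and UDF plumbing.

```python
import sqlite3

# Stand-in for a writeback target (in Fabric: Azure SQL, Fabric SQL DB, Lakehouse, ...).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Orders (OrderId INTEGER PRIMARY KEY, Status TEXT)")
conn.execute("INSERT INTO Orders VALUES (1, 'Open')")
conn.commit()

def update_status(order_id: int, new_status: str) -> int:
    """Update one order's status, like a report button could trigger via a UDF."""
    cur = conn.execute(
        "UPDATE Orders SET Status = ? WHERE OrderId = ?", (new_status, order_id)
    )
    conn.commit()
    return cur.rowcount  # number of rows actually written back

update_status(1, "Approved")
```

In a real translytical taskflow the function body would look similar, except the connection is provided by Fabric and the call is wired to a button, slicer or text box in the report.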

Direct Lake on OneLake (GA)

Direct Lake now works better with OneLake security and it has more modeling features. The guidance is now clear: use Direct Lake on OneLake by default, unless you have a specific SQL endpoint security dependency. For me, import is still the default, but if you have specific requirements around data freshness or loading very large datasets into memory, you could benefit from using Direct Lake.

TMDL View in web modeling (Preview)

Finally, we have the TMDL view on the web! That makes web modeling so much more complete: you can copy measures, tables or whole models over, or adjust sources, parameters, anything! They also implemented a side-by-side diff view before applying changes.
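Because TMDL is just text, that diff-before-apply idea is something you can reproduce yourself. A toy sketch with two made-up TMDL-style fragments, using Python's standard difflib:

```python
import difflib

# Two made-up TMDL-style fragments: a measure before and after an edit.
before = [
    "measure 'Total Sales' = SUM(Sales[Amount])",
    "    formatString: #,0",
]
after = [
    "measure 'Total Sales' = SUMX(Sales, Sales[Qty] * Sales[Price])",
    "    formatString: #,0",
]

# unified_diff shows exactly which lines would change before you apply anything.
diff = list(difflib.unified_diff(before, after,
                                 fromfile="current", tofile="edited", lineterm=""))
for line in diff:
    print(line)
```

This is of course a simplification of what the web modeling diff view does, but it shows why text-based model formats make change review so much easier.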

Custom Totals in Tables / Matrices (Preview)

I know a few people who will be very pleased with this... :-) There's been a lot of complaining over the years that "my totals are off!". No, that's just the way DAX works... Well, now you can at least decide for yourself if you want to change it.
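The classic "totals are off" situation comes from ratio-style measures: DAX evaluates the measure once in the total's filter context (a ratio of sums), while users often expect the average or sum of the per-row results. A tiny sketch with toy numbers:

```python
# Toy rows: sales vs. target.
rows = [
    {"sales": 100, "target": 200},   # row result: 50%
    {"sales": 300, "target": 300},   # row result: 100%
]

# What DAX does for the total row: evaluate the measure over all rows at once.
ratio_of_sums = sum(r["sales"] for r in rows) / sum(r["target"] for r in rows)  # 400 / 500

# What many users expect: combine the per-row results instead.
per_row = [r["sales"] / r["target"] for r in rows]
avg_of_ratios = sum(per_row) / len(per_row)  # (0.5 + 1.0) / 2
```

Both numbers are "correct", they just answer different questions; custom totals let you pick which one the total row shows.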

Modern Visual Defaults (Preview)

I wanted to call this one out, because I'm not sure I'm actually a fan of this feature. The new default will be smoothed lines, which can sometimes suggest false data points in a graph. I'll have to check whether this turns out well.

Developer


Git integration now has selective branching: you can branch out for a specific feature and pull only the items you need.

Furthermore, two new open-source projects have been added:
• Agent skills for Fabric: natural-language plugins for the GitHub Copilot terminal. This sounds like plugging into the Matrix and learning a new skill :) It also means you can use and share skills from and for the community.
• Fabric Jumpstart: reference architectures and single-click sample deployments. These look similar to the earlier industry solutions, but now completely set up with data inside your own tenant.

Closing

With everything taken into account, Microsoft is betting hard on Fabric as the AI data operating system. I'm curious what the coming weeks and months will bring.

As I type this blog, I'm actually on the plane to the MVP Summit in Redmond, Washington, at the Microsoft HQ! Hopefully I can dive even deeper into a lot of these topics while I'm there!


Thursday, March 5, 2026

Power BI Gebruikersdagen 2026


Today, it's that time of year again! The Dutch Power BI Gebruikersdagen are here :-)

Last year's keynote, photo credits by PBIG


Today is Masterclass Thursday, with all-day workshops; tomorrow is Deep Dive Friday, with 90-minute deep dive sessions; and Saturday is Community Saturday.

This conference is a must-see for me and my colleagues at Powerdobs, although last year I had to miss it because it coincided with our annual family skiing trip. I was really sad to miss out.


The workshops on Thursday cover a whole-day deep dive on one topic.
The deep dive sessions on Friday are 90-minute (mostly technical) sessions on one topic.
And last, we have Community Saturday, with 60-minute sessions, including a lot of less technical and more business-focused ones. This year there are 11 (!) simultaneous sessions, so it will be very hard to pick one session per slot!

You can find the agenda with the full list of sessions here.


I'll be heading to the conference tomorrow. This year, I will present a 90-minute session on Friday, where I'll deep dive into the Direct Lake (pun intended).


It will be fun to see so many familiar faces again. And it's always good to meet new ones too.

Come say hello or ask a question in, before or after my session on Friday. I'll also be present the whole day on Saturday!

Friday, February 13, 2026

Fabric Workspace Settings Update: "License Type" Renamed to "Workspace Type"

Just a quick post because I noticed a change in the Fabric UI, specifically in the Workspace settings.

I am working on a demo for my Power BI Gebruikersdagen session, and wanted to switch a workspace to Fabric capacity. I noticed that the setting License type has changed, and is now called Workspace type.

You can see the new layout in the screenshot below.

The new setting called Workspace type


As you can see in the screenshot, there are now two groups of workspace types (= licenses):

  • Power BI only workspace types
  • Fabric and Power BI workspace types
The first category is for a Pro or PPU user license, or an embedded SKU.
The second is for capacity licenses only, with an F or FT (trial) SKU, or an old P-SKU.
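As a toy illustration of that grouping, here is my own simplification in code, not an official mapping; in particular, treating A/EM SKUs as the embedded ones is my assumption:

```python
def workspace_group(license_or_sku: str) -> str:
    """Map a user license or capacity SKU to one of the two new workspace type groups."""
    s = license_or_sku.upper()
    # Pro / PPU user licenses and embedded SKUs (A/EM is my assumption here).
    if s in ("PRO", "PPU") or s.startswith(("A", "EM")):
        return "Power BI only workspace types"
    # F-SKUs, FT trial SKUs and old Premium P-SKUs are capacity licenses.
    if s.startswith(("F", "P")):
        return "Fabric and Power BI workspace types"
    return "unknown"
```

Note that the PPU check has to come before the P-SKU check, since both start with "P".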


There's no change to the licensing itself (at least as far as I know); it is just a slight rewording of the already available types of user and capacity licenses. The documentation on Workspace types has already been updated to reflect these changes.
I've checked three different tenants, and in all of them I saw the same new setting being displayed. All these tenants are in the West Europe region, and because the change might not be available in all regions worldwide yet, your tenant may still be showing the old License type.

I like the new categories and the clearer distinction between Power BI and Fabric workspace types.
Let me know: do you like this change?

Friday, February 6, 2026

Transitioning to the New Power BI Enhanced Report Format (PBIR) - What You Need to Know

Let me be clear: I really like and support the updates that Rui Romano has been pushing in recent years!
In short, it brings:

  • Better support for CI/CD and source control
  • Better integration for programmatic report updates, e.g. with LLMs
  • More reliable merge outcomes with the PBIP and PBIR structures

With that being said, I do think some customers do not want preview features in production, so they will be cautious with the recent developments. Since the end of January, the PBIR format is the default if you don't take action.
If you don't want to enable PBIR yet, or just want to know more about the transition, read on.


A little bit of background

Let's start with the basics, what is PBIP, TMDL and PBIR?

PBIP
PBIP turns Power BI into a real project instead of a single magic PBIX file. Files on disk, folders, source control that actually works. If you’ve ever tried to diff a PBIX and felt pain: PBIP is the fix.

TMDL
TMDL is the semantic model as text. Tables, measures, relationships: readable, reviewable, automatable. This is Microsoft finally saying “models are code”, not artifacts you click together and hope for the best.

PBIR
PBIR does the same, but for reports. Visuals, layout and interactions stored in a format that Git can understand. PBIX still exists, but the report inside it stops being a black box. That’s why PBIR becoming the default is a big deal.

So, PBIP is the structure, TMDL is the model, PBIR is the report.
Together, they move Power BI away from file-based BI and closer to proper software development, whether you asked for it or not. 😊
And to be clear, PBIX is still here and not moving away either, I will expand on that later.


Why This Matters Now

PBIR became the default report format in the Power BI Service on January 25th, 2026, with Desktop adopting it in March 2026, but it will still be in preview.

This will affect developers using PBIP workflows, admins managing enterprise tenants, and teams relying on legacy formats.

Because a lot of clients don't want to use preview features, you can opt out during preview: let me explain how and why you might want to...

M365 Message center PBIR announcement


Legacy Format vs. Enhanced Report Format

The legacy format is the older JSON-based metadata structure inside reports: one big, unreadable JSON file for everything report-related in your Power BI Desktop file, whether it's PBIX or PBIP.
This legacy format poses limitations on readability, CI/CD, automation, source control and working with AI / LLMs.

As I mentioned earlier, and Microsoft is also explicit about that, PBIX isn't going away. It will still be the primary developer format.

What will change is the metadata inside a PBIX file: from PBIR-Legacy to the new PBIR structure.

Official Timeline

Changes will hit the service first, and Power BI Desktop later.

Power BI Service, starting January 25th, 2026:
  • New reports in the service will default to PBIR
  • Existing reports will auto-convert to PBIR when edited and saved in the service
Power BI Desktop, starting with the March 2026 release:
  • Desktop switches to PBIR as the default for both PBIX and PBIP files. Until then, PBIR must be explicitly enabled in the preview features
General Availability (GA) & Beyond:
  • PBIR remains in preview during the transition and becomes mandatory at GA (currently expected in Q3 2026; check the roadmap for the latest info)
  • At GA, PBIR-Legacy will be removed and PBIR will be the only supported report format

What You Need To Do Today For Opting Out

In Power BI Desktop:
  • Disable the PBIR preview features (under Options > Preview features).
In Power BI Service (Tenant Setting):
  • Turn off Automatically convert and store reports using Power BI enhanced metadata format (PBIR).
  • Important: this tenant setting has existed for quite a few weeks, but it only took effect at the end of January 2026
PBIR tenant setting to opt out before GA


What about rollback?

If a report has already been converted to the new format, you can still go back, but you have to do it with a backup of your report. Backups are created automatically for you, both in Desktop and in the service, but they are only available for a limited time window.

Desktop

In Desktop, a backup is automatically written to disk and kept for 30 days. Depending on which version you have, the location differs a little bit:
  • Microsoft Store version:
    %USERPROFILE%\Microsoft\Power BI Desktop Store App\TempSaves\Backups
  • Executable installer version:
    %USERPROFILE%\AppData\Local\Microsoft\Power BI Desktop\TempSaves\Backups
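A small helper to find those backups, newest first. The folder is the Store-app path quoted above (swap in the installer path if needed); %USERPROFILE% only expands on Windows, so elsewhere this simply returns an empty list.

```python
import os
from pathlib import Path

# Store-app backup location from this post; use the installer path if applicable.
STORE_BACKUPS = r"%USERPROFILE%\Microsoft\Power BI Desktop Store App\TempSaves\Backups"

def recent_backups(folder: str = STORE_BACKUPS) -> list:
    """Return .pbix backup files sorted newest first; empty list if folder is missing."""
    path = Path(os.path.expandvars(folder))
    if not path.is_dir():
        return []
    return sorted(path.glob("*.pbix"), key=lambda p: p.stat().st_mtime, reverse=True)
```

Handy to run before the 30-day window closes, so you know which legacy-format copies you still have on disk.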

Service

When a report is automatically converted in the service, a backup in legacy format is retained for 28 days. You can restore it from the workspace in the report settings.

For both Desktop and the service, a reverted report won't be prompted again to upgrade.

Wrap-Up & Call-to-Action


While the transition to PBIP and PBIR is in general a good thing, and I'm already using it for most of my projects, some clients would rather not, or simply aren't allowed to, use preview features in production.
If you are in the latter group, I suggest you:
  • Check the Power BI Desktop preview settings
  • Check the Tenant settings in the Fabric Admin portal

Friday, November 28, 2025

Quick Tip: Analyze the Fabric Capacity Metrics App Easily

Just a quick tip for anyone using the Fabric Capacity Metrics App.

If you check the Metrics app and see a spike in usage, you might want to analyze it. How many times did you have to click to get exactly the column you needed? Or before you were able to click any column at all? 😁



You can of course filter on the dates, but sometimes that's not enough detail.
Let's say I want to analyze the spike around November 19th, but I can't click that one red bar with the highest peak.
When you hover over the visual, you can select Focus mode.




This brings up the visual on the entire screen, making it much easier to click on individual bars.



When you select the single bar you are looking for and click Back to report on the top left, you go back to the normal page view, but with that bar still selected.



You can then click Explore and go to the timepoint details.

Making sense in case of huge spikes

Sometimes you may hit a big spike in usage, either from heavy use of the capacity, or because the capacity was paused.


In the above case I had a severe throttling issue, where the capacity would have been unavailable for multiple hours, and the client decided to pause / unpause the capacity. This effectively takes all the future CU usage and makes you pay it off at once, hence the enormous spike in usage on November 12th.

Note

Pausing a capacity will almost always increase your Azure bill, on top of the Fabric capacities you already pay for.


The downside is that this immediately renders the visual useless, because now you can't see anything other than that 65,000% spike anymore... 😁

If you want to make this visual meaningful again, you can filter out the date with the very high percentage, in the filter pane on Date.
Change the selection to Basic filtering and select only the dates without the spike!
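The same idea in a small sketch, with made-up daily CU percentages: drop the outlier day so the remaining values are readable again (the dates, values and threshold are all illustrative).

```python
# Made-up daily CU% values; the huge one is the pause/unpause pay-off day.
daily_cu_pct = {
    "2025-11-10": 42.0,
    "2025-11-11": 55.5,
    "2025-11-12": 65000.0,  # capacity paused: smoothed CU usage billed at once
    "2025-11-13": 48.0,
}

def without_spikes(series: dict, threshold: float = 200.0) -> dict:
    """Keep only the days at or below the threshold, like a Basic filter on Date."""
    return {day: pct for day, pct in series.items() if pct <= threshold}

readable = without_spikes(daily_cu_pct)
```

This is exactly what the Basic filter on Date does visually: once the spike day is excluded, the y-axis rescales and the normal days become distinguishable again.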




Tuesday, September 23, 2025

Power BI - Requirements to Use Analyze in Excel

I think I've gotten this question four times in the last months, so I thought I'd write it down so I can reference it later and point people to it 😄

What are the requirements so (a group of) colleagues can start using Analyze in Excel?

Good question, let me break it down. 
In general, I also think it's better to use Analyze in Excel than Export to Excel!
Reza Rad also wrote earlier about why that's important.


Prerequisites to use Analyze in Excel

  • Power BI license
    You either need a Power BI Pro or PPU license, or the semantic model you connect to must be in a workspace backed by a Power BI Premium / Fabric F-capacity; this needs to be an F64 or higher SKU. Although I have seen some instances where it might have (temporarily) worked with a lower F-SKU, this is not supported/allowed (see Power BI licensing).
  • Tenant Setting
    Your Power BI / Fabric Administrator needs to enable the Tenant setting in the Admin Portal: "Users can work with Power BI semantic models in Excel using a live connection". You can learn more about that setting on the admin portal documentation.
  • Semantic Model permissions
    The user needs at least build permissions on the semantic model, or at least the Contributor role in the workspace. Ideally you put people in an Entra ID group and give that the appropriate permissions.
  • Excel
    You obviously need Excel Desktop installed, or access to Excel for the web, for it to work. You might have to install some updates first.
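The license side of these prerequisites can be summarized as a small decision function. This is my simplification of the rules above, not official licensing logic (e.g. it ignores trial SKUs and the tenant setting and permission checks):

```python
from typing import Optional

def can_analyze_in_excel(user_license: str, capacity_sku: Optional[str]) -> bool:
    """Pro/PPU users always qualify; free users need an F64+ (or old P) capacity."""
    if user_license in ("Pro", "PPU"):
        return True
    if not capacity_sku:
        return False
    sku = capacity_sku.upper()
    if sku.startswith("P"):          # old Premium P-SKUs
        return True
    if sku.startswith("F"):          # Fabric F-SKUs: F64 or higher only
        digits = "".join(ch for ch in sku if ch.isdigit())
        return bool(digits) and int(digits) >= 64
    return False
```

Remember this only covers licensing; the tenant setting and build permissions still have to be in place on top of this.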




A few things to be aware of:

  • Analyze in Excel creates a dynamic, live connection to the Power BI dataset, so any changes to the dataset will be reflected when the Excel report is refreshed
  • Free users can analyze data from Premium workspaces without needing a Pro license, provided they have the appropriate role (at least build permissions or Contributor role) assigned in that Premium workspace
  • You'll need to use measures in the PivotTable's values area, as you can't directly drag numeric columns like 'cost' into that area. It's better anyway to create explicit instead of implicit measures

Wednesday, September 3, 2025

Fabric Quality of Life Update: No More Default Semantic Models!

Another quick post, because today is an important day for everyone working with Fabric and Power BI!

Last month, Microsoft announced they are Sunsetting Default Semantic Models: Yaay! 😀
Today marks that day: No more automatic child semantic models!

The info message also states only a SQL endpoint is created

So now, whenever you create a warehouse, lakehouse, SQL database, or mirrored database, you will only get that item and the SQL analytics endpoint connected to it.

No more default semantic models!


This means from now on you always have to manage the semantic model yourself, whenever you create one of the above items.


What about my existing default semantic models?

To be clear: existing (default) semantic models are not affected (yet!). But by the end of December 2025, they will be decoupled from their connected item, and you will have to manage those models manually.


Conclusion

Now that Fabric is getting more widely used, the demand for stronger governance and greater control over semantic models was growing. This change takes away the auto-generated models and gives you more control over the creation of your semantic models.

To read more about this:

There's more information in those links about the exact changes and menu-items going away, the timeline and future updates/blog posts.
