Monday, March 30, 2026

Quick Tip: Get Ready for Changes in OneLake Operation Reporting

Check Your Capacity Metrics App

Just a quick heads-up for anyone using the Fabric Capacity Metrics app to monitor OneLake usage.

One of the upcoming features from the Fabric Roadmap I'm watching is OneLake Storage Lifecycle Management Policies. The idea is straightforward: rule-based policies that automatically move data between hot, cool, and cold storage tiers based on last access or modification time. So data you haven't touched in months stops costing you the same as data you query every day.

A recent message in the M365 Admin Center got me thinking (you need an admin account to access it, or check the Message Center Archive website instead):

Changes to OneLake operation reporting in Microsoft Fabric
Starting April 1, 2026, Microsoft is rolling out changes to how OneLake compute operations are reported in Fabric. This is tied to the (later) introduction of the aforementioned OneLake storage tiers (hot, cool, and cold), and while billing rates are not changing, the way operations appear in your dashboards and reports is.

So what is changing?

The following changes are being applied:

  • Operation names now include the storage tier. "OneLake Read via Proxy" becomes "OneLake Read (Hot)", for example.
  • Proxy and Redirect operations are consolidated under a single operation name. No billing impact, but the names you're used to will disappear.
  • Operations are now grouped by workspace in the Capacity Metrics app, under a new OneLake item. So plan for workspace-level reporting. If you rely on item-level operation details in the Capacity Metrics app, you'll have to use OneLake diagnostics going forward.

Units of measure and consumption rates stay the same, so no surprises on the bill.
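If you have scripts or report logic that filters on the old operation names, a small translation helper can bridge the gap. A minimal Python sketch, assuming the renaming follows the pattern from the one example in the announcement (drop the "via Proxy" / "via Redirect" suffix and append the tier in parentheses); any names beyond "OneLake Read via Proxy" → "OneLake Read (Hot)" are my assumption:

```python
import re

def new_operation_name(old_name: str, tier: str = "Hot") -> str:
    """Map a pre-April-2026 OneLake operation name to the assumed
    tier-suffixed form, e.g. 'OneLake Read via Proxy' -> 'OneLake Read (Hot)'.
    Proxy and Redirect variants consolidate into the same new name."""
    base = re.sub(r"\s+via\s+(Proxy|Redirect)$", "", old_name)
    return f"{base} ({tier})"

# Both old variants map to one consolidated name (hypothetical tier values).
print(new_operation_name("OneLake Read via Proxy"))
print(new_operation_name("OneLake Read via Redirect"))
print(new_operation_name("OneLake Write via Proxy", tier="Cool"))
```

A lookup like this could sit in a Power Query step or a refresh script until all your custom reports have been migrated to the new names.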

A preview of how the reporting in the Metrics app will look:


 

What should you do before April 1?

This change is rolling out in phases:

  • Operation name changes start April 1
  • Consolidation and Metrics app updates follow in the weeks after. Keep an eye on your tenant.

Steps you should take:

  • Check if you have any custom reports, dashboards, or scripts that reference OneLake operation names, and update those to match the new naming convention (think FUAM or others)
  • If you rely on item-level operation detail in the Capacity Metrics app, set up OneLake diagnostics now so you don't lose visibility
  • Share this with your helpdesk or anyone on your team who monitors capacity or cost analytics

You can find more details in the OneLake compute consumption documentation and track the rollout via the Microsoft 365 Message Center (MC1259829).

Monday, March 23, 2026

My Take On The Fabric Conference Updates

As I'm sure you have seen, there has been a lot of Fabric and Power BI news lately. Not surprisingly, the Fabric Conference was last week! 😀

I won't list all the updates here; you can read Arun's blog, or either of the Fabric or Power BI monthly feature summary blogs, to go through the whole list.

Instead, I will highlight the biggest game changers for me, and what you should definitely dive into more deeply.

Platform

Database Hub (Early Access)

This is a new control plane for everything database: Azure SQL, Cosmos DB, PostgreSQL, MySQL, and Fabric databases, offering estate-wide observability, delegated governance, and Copilot-powered insights. I haven't seen or used it yet, but does this really take away the need to use the Azure Portal and/or SSMS?

Planning in Fabric IQ (New)

A new enterprise planning capability that brings financial and business planning directly into Fabric: budgets, forecasts, scenarios, and targets. To me it sounds like this is aimed at replacing disconnected planning tools (like Anaplan). It stems from a co-operation with / acquisition of Lumel, but as far as I gathered from the sessions and community notes, you don't pay Lumel directly; it simply consumes CUs from your existing capacity.

OneLake + Databricks Unity Catalog (Preview)

They have now added the capability to natively read from OneLake through Azure Databricks Unity Catalog. As I see more and more companies using Databricks combined with Fabric, as either the bronze/silver layer or the whole data platform, I reckon this integration will have advantages for people using both platforms. It also looks like Microsoft is keeping its promise of developing an open platform.

Fabric Remote MCP (Preview)

If you haven't looked into MCP servers yet, I highly recommend you do. I haven't dived deep into it yet either, but I started dabbling with the Power BI modeling MCP server for minor tweaks and measure creation, and that was already awesome.
Kurt Buhler and Eugene Meidinger have some good posts about the theory, the inner workings, and how to get started.

Power BI 

Translytical Taskflows (GA)

Translytical taskflows (TT), a.k.a. writeback for Power BI, although that doesn't paint the whole picture because they can do much more. TT uses User Data Functions (Python notebooks) in Fabric to update data in a data store of your choice (e.g. Azure SQL, Fabric SQL Database, Lakehouse) from within certain elements in a Power BI report. Now that TT is GA, I reckon many more companies will be able (or allowed) to explore this. This will be a game-changer: you don't need separate licenses for e.g. Power Apps or other third-party software that can do writeback.
Be aware, though, that there are still some limitations; among others, the PBIR (Power BI enhanced report) and PBIP (Power BI Project) formats cannot be used together with TT.

Direct Lake on OneLake (GA)

Direct Lake now works better with OneLake security, and it has more modeling features. The guidance is now clear: use Direct Lake on OneLake by default, unless you have a specific SQL endpoint security dependency. For me, import is still the default, but if you have specific requirements around data freshness or loading very large datasets into memory, you could benefit from using Direct Lake.

TMDL View in web modeling (Preview)

Finally, we have TMDL view on the web! That makes web modeling so much more complete: you can copy measures, tables, or whole models over, or adjust sources, parameters, anything! They also implemented a side-by-side diff view before applying changes.

Custom Totals in Tables / Matrices (Preview)

I know a few people who will be very pleased with this... :-) There's been a lot of complaining over the years that "my totals are off!". No, that's just the way DAX works... Well, now you can at least decide for yourself if you want to change it.

Modern Visual Defaults (Preview)

I wanted to call this one out, because I'm not sure if I'm actually a fan of this feature. The new default will be smoothed lines, which can sometimes suggest false data points in a graph. I'd have to check whether this will turn out well.

Developer


Git integration now has selective branching, where you can branch out for a specific feature and only pull items you need.

Furthermore, two new open-source projects have been added:
• Agent skills for Fabric: these are natural language plugins for the GitHub Copilot terminal. This sounds like plugging into the Matrix and learning a new skill :) It also means you can use and share skills from and for the community.
• Fabric Jumpstart: these are reference architectures and single-click sample deployments. They look similar to the earlier industry solutions, but now come completely set up with data inside your own tenant.

Closing

All things considered, Microsoft is betting hard on Fabric as the AI data operating system. I'm curious what the coming weeks and months will bring.

As I am typing this blog post, I'm actually on the plane to the MVP Summit in Redmond, Washington, at the Microsoft HQ! Hopefully I can dive even deeper into a lot of these topics while I'm there!


Thursday, March 5, 2026

Power BI Gebruikersdagen 2026


Today, it's that time of year again! The Dutch Power BI Gebruikersdagen are here :-)

Last year's keynote, photo credits by PBIG


Today is Masterclass Thursday, with all-day workshops; tomorrow is Deep Dive Friday, with 90-minute deep dive sessions; and Saturday is Community Saturday.

This conference is a must-see for me and my colleagues at Powerdobs, although last year I had to miss it because it coincided with our annual family skiing trip. I was really sad to miss out.


There are workshops on Thursday; each covers a full-day deep dive on one topic.
Then there are deep dive sessions on Friday: 90-minute (mostly technical) sessions on one topic.
And last, we have Community Saturday, with 60-minute sessions, including many less technical and more business-focused ones. This year, there are 11 (!) simultaneous sessions, so it will be very hard to pick one session per slot!

You can find the agenda with the full list of sessions here.


I'll be heading to the conference tomorrow. This year, I will present a 90-minute session on Friday, where I'll deep dive into the Direct Lake (pun intended).


It will be fun to see so many familiar faces again. And it's always good to meet new ones too.

Come say hello or ask a question during, before, or after my session on Friday. I'll also be there the whole day on Saturday!
