An article by our Director Ross Waterston

Recently, as part of our internal training, I was reminded that I have been using PowerPivot, and then Power BI, since mid-2015. I backed the crowdfunding campaign for Rob Collie’s book in September 2015, having found the first edition an amazing help when learning PowerPivot. Looking at how far the platform has come since then, it felt like time to reflect on the changes across my nine years of hardcore usage.

A quick history

The journey to deliver Power BI is often missed by those new to the platform. Between 2014 and 2017, Microsoft achieved what should have been unachievable, and that remains the most amazing feature of Power BI: it rapidly pivoted its whole Enterprise Analytics offering from the traditional SQL Server Analysis Services (and SQL Server Reporting Services) to a more modern analysis solution. The market had been disrupted, primarily by Tableau, which gave individuals the ability to analyse data that was previously considered to require a server. I’m being a little vague here because Excel was starting to transition to XLSX, which did have that capability, but many people were still stuck with XLS, which did not.

Tableau disrupted Microsoft’s dominance, challenging its future as a Leader in the Analytics space. Server-based infrastructure didn’t have the agility to compete in the modern world, and Microsoft risked being marginalised at a time when the Azure cloud platform needed to expand to keep the company relevant in a “Cloud First” landscape. The solution was a rapid change of direction: double down on the cloud options while mobilising the traditional Excel user base (yes, there is a very healthy dose of creative licence here, but it’s a true story for a given value of true).

Excel was the key to Power BI’s success. PowerPivot was released in 2010, then augmented first with Power Query and then with Power View (the missing link), before Power BI was officially launched in 2015 with the creation of Power BI Desktop and the Power BI Service. Excel users were the first to benefit, as PowerPivot let them use their existing licences to analyse data in ways they had been unable to before. Power Query made it possible to pull in and process data in ways Excel alone never could; reading every one of those daily dump files sitting in shares on your server suddenly became the work of seconds rather than hours of IT help. Power View introduced the report canvas we still know in Power BI Desktop, but within Excel, and not in a way you could readily share.

By 2016 Power BI was moving forward, and the development pace has been frantic ever since. Power BI Desktop receives monthly updates, with the service following a similar cycle; this frequency of ongoing improvement, sustained for that length of time, is unprecedented. While not every step has been forward for Microsoft in this ecosystem, one cannot deny that progress has been consistently made. Personally, the two biggest standout improvements that I could not live without now are the ability for Power BI to use Power BI as a source (yes, thin reports were not possible when Power BI first came out) and the ability to use variables within measures. Why not let us know which feature you think was the standout one?
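To show why variables in measures matter, here is a minimal DAX sketch (the Sales table and column names are hypothetical): a VAR lets you compute a value once and reuse it by name, instead of repeating the same CALCULATE expression throughout the measure.

```dax
Sales YoY % =
VAR CurrentSales = SUM ( Sales[Amount] )
VAR PriorSales =
    CALCULATE (
        SUM ( Sales[Amount] ),
        SAMEPERIODLASTYEAR ( 'Date'[Date] )
    )
RETURN
    -- DIVIDE returns BLANK rather than an error when PriorSales is zero
    DIVIDE ( CurrentSales - PriorSales, PriorSales )
```

Before variables were available, the PriorSales expression would typically be written out in full each time it was referenced, making measures harder to read and debug.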

Let’s talk about Fabric

Now Microsoft is again looking to make significant platform changes, no doubt partly as a push into Generative AI. The second driver is the part Microsoft will never discuss: the Power BI mistake. Power BI exists as part of the Office 365 platform rather than the more IT-centric Azure platform, and this has caused problems for businesses being steered into significant Azure spend. Incentives provided for Azure could not be used with Power BI, which particularly impacted Power BI Premium Capacity: budgets that had been set aside for analytics were found to be unusable in the way the business required, or, even worse, had to be handed back and requested again against a different budget.

Fabric comes very much from the Azure side of the fence, but it moves from the traditional PaaS (Platform as a Service) solutions of Azure to the SaaS (Software as a Service) model of Fabric. The difference between PaaS and SaaS is key. PaaS is similar to a traditional on-premises solution, except that you do not have to worry about the operating system or the physical hardware; you simply pay for the resultant capability. For example, Azure SQL is a SQL server of a defined size and capability. SaaS, on the other hand, removes the “defined size and capability” element and makes it variable: you are paying for the ability to use SQL for a workload, not for a defined quantity of SQL. This, I believe, highlights the point of friction being experienced over Fabric, and really what is preventing it from proceeding further: it is too unclear what commitment Fabric will place on an organisation. Fixed costs from Power BI licensing and managed spending from Azure are being replaced by a vague “well, you will pay for…”. An Enterprise-only mindset from Microsoft exacerbates this: all too often, the features being demonstrated and showcased sit behind the F64 paywall, a commitment of roughly £5,000 per month.

The ongoing monthly changes to the platform, combined with new releases being announced as either GA or Public Preview, do nothing to make planning easy. Clients and consultants struggle to map requirements to an ever-shifting platform. The Fabric feature set is incredibly compelling to any business; however, can it be relied upon when it can pivot on an announcement in a blog post? Consultancies are also trying to support clients with the data challenges of Generative AI tools like Copilot, which require policies and processes to support and manage them. Data Governance has always been important, but the consequences of getting it wrong can now be greater than ever.

In 2025, Microsoft will increase the cost of a Power BI licence for the first time since its full launch in 2015. List prices are rising by roughly 40%, which, as a single increase, is significant. A 40% increase in budget will require many clients to take their Power BI spending back to their Finance team at the very least, if not back to the board for full approval. My personal view is that the increase is not too bad, considering it is the first since launch, but doing it at a point when clients are struggling to reach business agreements on Fabric or Copilot may prove to be an own goal. One is forced to suppose that leaving the Fabric prices unchanged is intended to reduce the price difference and so steer more clients towards Fabric.

So, while there has been considerable growth and maturation in Power BI over the years, several factors are making the future challenging for customers. The Office 365 estate has traditionally been a “fixed cost” model: you pay for your Office 365 subscriptions and get certain apps; if you want more apps, you pay more. Azure uses a “Compute” and “Storage” pricing model, with compute as the primary driver of the “unit” price for services. The more you use, the more you pay, so clients understandably look to minimise their spend, restricting who can run jobs and monitoring usage. This is why the traditional separation between Data Engineering and Data Analysis has persisted in the cloud era. Fabric, by contrast, promotes the idea that “all can use”, potentially allowing anyone to initiate complex engineering tasks.

A blurring of lines is happening, one that I believe could have dire consequences for businesses, and not just because of flexible licensing; Data Governance is also at risk. To be clear, I feel the platform capabilities shown in Fabric are good, if not excellent, but the latest advice and platform updates push an intended model of a single workspace for Data Engineering, Data Science, and Data Analysis (plus Business Intelligence), and I believe that will lead to significant challenges.

Geordie Consulting recommends splitting your workspaces by workload, so that input and output points are understood and become your points of governance. A core benefit of Apps in Power BI has always been that they are a “shadow snapshot” of a defined report and dashboard set: you can upgrade the app by redeveloping the reports, then promote the updates into the App at an agreed time, after approval. Dedicated App workspaces also encourage the mentality that the App is dedicated to a specific team or function, meaning that ALL the report content that team or function needs, regardless of source, can be brought into the same app. People then have only a single link to Power BI content to be aware of; rather than searching through multiple links to find what they need, they go to their app (preferably embedded in their Teams site), and everything is there.

Using workspaces per platform for Data Engineering allows for ease of management. Remember, the point of the Power Platform is that it can get data from “anywhere”, so why would you try to put the workflows needed to pull it all in under one roof? Data Scientists need to be able to explore data to identify new insights that can then be formalised into a strategy, so they will draw together data sources, add new ones, and generally eviscerate your line-of-business reporting (but in a good way). This is why we must separate the Semantic Models that we use for our Corporate Business Intelligence and the majority of our Data Analysis.

In short, the state of Power BI (and the Power Platform) is good, but significant challenges lie ahead.

Geordie Consulting uses the Microsoft Power Platform to provide businesses with agile, cost-effective data solutions. We specialise in integrating data from across business applications into a single, manageable platform. Our services include enabling business users to develop custom reports using Power BI and Excel, leveraging Semantic Models for consistent data analysis, and enhancing user self-sufficiency through Copilot. We aim to ensure your data agility to support a data-driven business. We provide comprehensive documentation and support to maximise the business value provided by your data. Let our insight unlock your insights.

Contact Us: