
Assign Power BI workspaces to a capacity automatically

In recent discussions with customers, I was asked if there is an automatic way to assign workspaces to dedicated capacities like Power BI Premium or Embedded. Obviously, you can do it manually through the Power BI Admin Portal, but how can you automate it in a scenario where you have to assign hundreds of workspaces based on different conditions? I’m sure you know my answer to this question: through the Power BI REST API! Let me walk you through how to achieve it.

Prerequisites

You’ll need a few things to automate this.

  • Service Principal
  • A dedicated capacity (Power BI Premium, Embedded, or Premium per User license)
  • Python skills
  • An understanding of REST APIs

Setting the scene

For demo purposes, I’m going to use a Power BI Embedded capacity – a so-called A-SKU – created in the Azure Portal. If you’re interested in how to create an Embedded capacity, follow this link.

Further, I’m going to reuse my already created Service Principal (SP). I blogged about how to create a SP, which settings you need to enable in your Power BI Service, and which permissions are needed here.

Lastly, I have to make sure my SP can assign workspaces to the capacity. According to the documentation (see here), the SP needs admin rights on the workspace as well as capacity assignment permissions on that capacity.

Unfortunately, the Admin REST API does not (yet?) support Service Principals assigning workspaces to a capacity. Therefore, we have to make sure the SP is an admin of each workspace we wish to move to a capacity. Luckily, there is an Admin API to add yourself or a SP as Admin to a workspace (see here). If you’re interested in a blog post about assigning a SP to different workspaces, let me know and I’ll be happy to write about it.

So, let’s make sure the SP has sufficient permissions.

Add SP as Admin to a workspace

This step is pretty easy and straightforward. Just head over to powerbi.com, select your workspace, click on Access, and make sure your SP is added as Admin.


As you can see in the screenshot above, the workspace is not assigned to a capacity yet; otherwise, it would have a diamond icon next to the workspace name (PBI Guy).

Add SP as Capacity Admin

In the case of Power BI Embedded, you can’t differentiate between admins and contributors as you can with Premium. Therefore, I have to add the SP as an admin in the Azure Portal. To do that, I log in to the Azure Portal, select my Power BI Embedded capacity, and click on Power BI capacity administrators. Once there, click + Add, search for your SP, and add it. That’s it. Just make sure your Embedded capacity is running; otherwise, you can’t add a new admin.


Further, we have to make sure the Service Principal is allowed to start and pause the Embedded capacity. This is done through Access control (IAM) in the left-hand pane. Once selected, click + Add and select Add role assignment.


Next, select the needed role. In my case I give the SP Owner rights, but Contributor would be sufficient as well. Once selected, hit Next.


On the next screen, click + Select members, search for your SP, and add it. Click Select to proceed.


Lastly, hit Review + assign to check your configuration.


If everything looks as expected, hit Review + assign again. We’re now good to go and create our Python script.

It’s time for magic!

As usual, in my first block of code I’m going to import the needed libraries.
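A sketch of what that import block might look like (the original code isn’t shown in this extract, so the exact libraries are an assumption):

```python
# Libraries used throughout the script: requests for the REST calls,
# pandas for tabular results, time for the status-polling loop.
import time

import pandas as pd
import requests
```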

In my second block, I specify all required variables which we will use later on.

The client ID, secret, and tenant ID can be found on the Overview page of your SP.


The domain is everything after the @ of your email address; e.g., [email protected] would mean “pbiguy.com”.

Authority URL and the scope shouldn’t be touched, as they are needed to authenticate against the Power BI Service. Lastly, the subscription and resource group names can be found in the Azure Portal on the Power BI Embedded service’s overview page.


Just keep in mind to use the Subscription ID, not the name!
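The variable block could look roughly like this; all values are placeholders, and the exact variable names in the original script are assumptions:

```python
# Service Principal details (from the app registration's Overview page).
client_id = "<application-client-id>"
client_secret = "<client-secret-value>"
tenant_id = "<directory-tenant-id>"
domain = "pbiguy.com"  # everything after the @ of your email address

# Authentication endpoints -- don't touch these.
authority_url = "https://login.microsoftonline.com/" + domain
scope = ["https://analysis.windows.net/powerbi/api/.default"]

# Azure details for the Embedded capacity (use the subscription ID, not the name!).
subscription_id = "<azure-subscription-id>"
resource_group = "<resource-group-name>"
capacity_name = "<embedded-capacity-name>"
```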

The next piece of code grabs a token on behalf of the SP.
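A minimal sketch of that step, using the OAuth2 client-credentials flow directly via requests (the original post may use an auth library such as MSAL instead, and the function name get_pbi_token is my own):

```python
import requests

def get_pbi_token(tenant_id, client_id, client_secret):
    """Acquire a Power BI REST API access token on behalf of the SP."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    payload = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://analysis.windows.net/powerbi/api/.default",
    }
    response = requests.post(url, data=payload)
    response.raise_for_status()
    return response.json()["access_token"]

# token = get_pbi_token(tenant_id, client_id, client_secret)
```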

In my next step, I want to list all capacities I have access to. Because I’ll need to do that twice within the code (an explanation will follow), I create a function so I don’t have to duplicate my code. The function returns a Pandas DataFrame with all capacities, including further details.
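The function could be sketched like this against the Get Capacities REST endpoint (get_all_capacities is the name the post uses later; the body is my reconstruction):

```python
import pandas as pd
import requests

def get_all_capacities(token):
    """Return a DataFrame of all capacities the caller has access to."""
    url = "https://api.powerbi.com/v1.0/myorg/capacities"
    headers = {"Authorization": f"Bearer {token}"}
    response = requests.get(url, headers=headers)
    response.raise_for_status()
    # The API wraps the capacity list in a "value" key.
    return pd.DataFrame(response.json()["value"])
```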

Next, I want to select my desired capacity, in my case the Power BI Embedded one. So, I call the function to get all capacities and filter the result, based on the capacity ID, down to my Embedded one. To make sure the right one is selected, I print out the capacity name as well as the status (running or paused).
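Illustrated with a small stand-in DataFrame (in the real script this comes from get_all_capacities(); the IDs and names below are made up):

```python
import pandas as pd

# Stand-in for get_all_capacities(token).
df_capacities = pd.DataFrame([
    {"id": "aaaa-1111", "displayName": "PBI Guy Embedded", "sku": "A1", "state": "Suspended"},
    {"id": "bbbb-2222", "displayName": "Some Premium P1",  "sku": "P1", "state": "Active"},
])

capacity_id = "aaaa-1111"  # the Embedded capacity we want
df_capacity = df_capacities[df_capacities["id"] == capacity_id]

capacity_name = df_capacity["displayName"].iloc[0]
capacity_status = df_capacity["state"].iloc[0]
print(capacity_name, "-", capacity_status)  # PBI Guy Embedded - Suspended
```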

The result is as desired, and I only selected my Embedded capacity.


Now that I have the needed capacity, it’s time to get all workspaces I wish to assign to it. Therefore, my next step is to call the REST API to list all workspaces the SP has access to. To get an overview, I display the DataFrame at the end.
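A sketch of that call against the Groups endpoint (the helper name is mine):

```python
import pandas as pd
import requests

def get_all_workspaces(token):
    """Return a DataFrame of all workspaces the SP has access to."""
    url = "https://api.powerbi.com/v1.0/myorg/groups"
    headers = {"Authorization": f"Bearer {token}"}
    response = requests.get(url, headers=headers)
    response.raise_for_status()
    return pd.DataFrame(response.json()["value"])

# df_workspaces = get_all_workspaces(token)
```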


For my purpose, I filter the workspaces to only include those with “BI” in the name. Of course, you can create further conditions and filter options based on your needs.

Again, I display the DataFrame at the end to check my filter and selection. Looks good so far.
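The filter itself is essentially a one-liner on the workspaces DataFrame; shown here with made-up sample rows:

```python
import pandas as pd

# Stand-in for the DataFrame returned by the Groups API.
df_workspaces = pd.DataFrame([
    {"id": "w1", "name": "BI Sales"},
    {"id": "w2", "name": "Finance"},
    {"id": "w3", "name": "PBI Guy Demo"},
])

# Keep only workspaces with "BI" in the name (contains() is case-sensitive).
df_selected = df_workspaces[df_workspaces["name"].str.contains("BI")]
print(df_selected["name"].tolist())  # ['BI Sales', 'PBI Guy Demo']
```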


Let’s do a quick recap of what we’ve achieved so far. We have selected the capacity to which we want to assign our workspaces, and we have selected all the workspaces we wish to assign to it. As a next step, we have to assign them. But before doing so, especially in the case of Power BI Embedded, we have to make sure that the capacity is running and not paused. Thus, my next block of code will check the status and, if it’s paused (suspended), start (activate) it. This step is not necessary for Premium capacities as they are always active.

I’ll first create a function to get an Azure token. This one differs from the Power BI one, as we have to log in to Azure, not Power BI.
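get_az_token could look like the Power BI variant but with the Azure Resource Manager scope (again a sketch using requests rather than an auth library):

```python
import requests

def get_az_token(tenant_id, client_id, client_secret):
    """Acquire a token for Azure Resource Manager (not Power BI)."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    payload = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://management.azure.com/.default",
    }
    response = requests.post(url, data=payload)
    response.raise_for_status()
    return response.json()["access_token"]
```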

Next, I define a function to create the URL to start or pause the capacity. As the REST API URL is very similar in both cases and only the last piece (the status variable) differs, it’s much more efficient, for reusability, to work with a function here.
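A sketch of create_url; the api-version value is an assumption and may need updating:

```python
def create_url(subscription_id, resource_group, capacity_name, status):
    """Build the ARM URL to 'resume' or 'suspend' an Embedded capacity."""
    return (
        f"https://management.azure.com/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.PowerBIDedicated/capacities/{capacity_name}"
        f"/{status}?api-version=2021-01-01"
    )
```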

Lastly, I use the capacity status from my previous code to check if it’s suspended. If so, I call the previously created function to create an Azure Token and call the REST API to resume the capacity. At the end of the code, I print out a message based on the status code received.
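That resume step might be sketched as follows (202 Accepted is the typical ARM response for this asynchronous operation; apart from get_az_token and create_url, the helper name is mine):

```python
import requests

def change_capacity_state(url, az_token):
    """Call the resume/suspend URL built by create_url()."""
    response = requests.post(url, headers={"Authorization": f"Bearer {az_token}"})
    if response.status_code == 202:
        print("Request accepted -- the capacity state is changing.")
    else:
        print("Something went wrong:", response.status_code, response.text)
    return response.status_code

# if capacity_status == "Suspended":
#     az_token = get_az_token(tenant_id, client_id, client_secret)
#     change_capacity_state(
#         create_url(subscription_id, resource_group, capacity_name, "resume"),
#         az_token,
#     )
```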

As it takes some time to activate the capacity, my next code block checks whether the capacity is really active. Otherwise, I would get an error trying to assign a workspace to a suspended capacity. I call the get_all_capacities function again to get all capacities, filter down to my desired one, and save the status in a separate variable called capacity_status. Then I run a while loop that, as long as the status is suspended, checks every 5 seconds whether the status has changed to active. This way I make sure the capacity is really active.
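The polling loop can be factored like this (wait_until_active is my own framing of the post’s while loop; get_status stands in for re-calling get_all_capacities and filtering to the one capacity):

```python
import time

def wait_until_active(get_status, poll_seconds=5, max_polls=120):
    """Poll the capacity state until it is no longer 'Suspended'."""
    status = get_status()
    polls = 0
    while status == "Suspended" and polls < max_polls:
        time.sleep(poll_seconds)
        status = get_status()
        polls += 1
    return status
```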

Let’s check in the Azure Portal, if the capacity is really running. I select the general overview of the Power BI Embedded service and see that my embedded capacity has an active status – great!


Finally, I can now assign my workspaces to the capacity. I create a for-each loop over my selected workspaces DataFrame to assign each workspace individually to the capacity (bulk update is not supported through the API). In the loop, I extract the workspace ID and name, update the URL for the REST API call (including the workspace ID), and specify the required body. In there you’ll find the capacity_id variable, specifying which capacity we wish to assign the workspace to. At the end, I call the REST API and provide a message based on the status code received. If it’s successful, I print out a message with the workspace and capacity names confirming it worked.
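The loop might look like this, using the Groups AssignToCapacity endpoint (wrapping it in a function is my own framing):

```python
import requests

def assign_workspaces(df_selected, capacity_id, capacity_name, token):
    """Assign each selected workspace to the capacity, one call at a time."""
    headers = {"Authorization": f"Bearer {token}"}
    for _, row in df_selected.iterrows():
        workspace_id, workspace_name = row["id"], row["name"]
        url = f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/AssignToCapacity"
        body = {"capacityId": capacity_id}
        response = requests.post(url, headers=headers, json=body)
        if response.status_code == 200:
            print(f"Workspace '{workspace_name}' assigned to capacity '{capacity_name}'.")
        else:
            print(f"Failed for '{workspace_name}':", response.status_code, response.text)
```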

If you wish to unassign a workspace from a capacity and put it back to Power BI Service (Shared Capacity), just use the 00000000-0000-0000-0000-000000000000 GUID for the capacity_id variable.


Let’s check in the Power BI Service if it really worked.


Great! The first sign that it worked is the diamond icon next to the workspace name. To make sure the workspace is really assigned to the right capacity, I also check the workspace settings. Looks perfect!

My last step in the Python code is to pause the capacity, making sure no additional or unnecessary costs occur, as I’m using the Power BI Embedded one. Depending on the number of workspaces, the Azure token could have expired. Therefore, I want to make sure I still have an active one and call the get_az_token function again to get a fresh token. Afterwards, I call the create_url function, this time with the suspend status, and save the result to the url variable. Then I call the REST API to pause the capacity. Lastly, I again print out a message based on the response from the REST API.
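Put together, the pause step could look like this self-contained sketch (in the post it reuses get_az_token and create_url; the api-version is an assumption):

```python
import requests

def pause_capacity(subscription_id, resource_group, capacity_name, az_token):
    """Suspend the Embedded capacity so no unnecessary costs accrue."""
    url = (
        f"https://management.azure.com/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.PowerBIDedicated/capacities/{capacity_name}"
        f"/suspend?api-version=2021-01-01"
    )
    response = requests.post(url, headers={"Authorization": f"Bearer {az_token}"})
    print("Suspend request status:", response.status_code)
    return response.status_code
```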

Once the code is executed, it looks like the status of the capacity has changed.


Let’s again check in the Azure Portal.


After hitting the refresh button, I see the capacity is really paused – excellent!

With this piece of code, I can automatically assign workspaces based on my needs to a specific capacity! As I worked with an Embedded one, I even automatically started and paused the capacity through the REST API – awesome!

Please let me know if this post was helpful and give me some feedback. Also feel free to contact me if you have any questions.

If you’re interested in the files used in this blog, check out my GitHub repo: https://github.com/PBI-Guy/blog


7 thoughts on “Assign Power BI workspaces to a capacity automatically”

Fantastic post

Do you know how to add a SP as contributor (add/remove workspaces) to a Premium capacity (not Embedded)? Is it via PowerShell? It seems impossible to add it manually (the email address is not valid).

Thanks very much!

Can you add the Service Principal to a Security Group and add this group to the Contributors of your Premium capacity? ( https://learn.microsoft.com/en-us/power-bi/enterprise/service-admin-premium-manage#manage-user-permissions )

Thanks a lot, Guy. I guess we can use your nice and very useful Python scripts in Azure Data Factory via Azure Batch tasks. If you are close to Microsoft, do you know, by any chance, whether Python support will be improved, simplified, modernized, etc. in ADF?

Apache Airflow DAGs use Python itself to manage tasks together, organized with dependencies and relationships that say how they should run.

Not that, of course, on ADF ;>), but improved Python script handling on ADF.

I guess so, yes, but I haven’t tested it 🙂 I’m not very close to the ADF team and can’t find anything on the roadmap, so perhaps you can raise the idea for better Python script handling through the community?

Thanks, Guy, for the detailed explanation! I need your help with the “filter to needed workspaces” step. In this case, all workspaces with “BI” in the name will be used. How can I achieve the same via PowerShell? It would be helpful if you could share a PS script to filter to the needed workspaces.

I’m unfortunately not really good with PowerShell (that’s the reason why I use Python :D), but something like this should work:

Get-PowerBIWorkspace -Scope Organization -Filter "tolower(name) eq 'my workspace name'"

I’m using tolower at the beginning to lowercase everything, as the filter is case-sensitive. As the filter option is passed through OData, you can use other operators than eq (which means equals). Check all filter options here: https://www.odata.org/documentation/odata-version-2-0/uri-conventions/ under 4.5 Filter System Query Option ($filter).

Hope that helps.

[…] In one of my previous blog posts I showed how to create a Service Principal ( https://pbi-guy.com/2022/03/10/power-bi-and-activity-logs-with-python/ ) and also did a walk-through how to give the sufficient permission to the SP to start and pause the Embedded Capacity ( https://pbi-guy.com/2022/09/22/assign-power-bi-workspaces-to-a-capacity-automatically/ ). […]


How To Use The Power BI Premium Capacity Metrics App

Organizations spend lots of money ensuring they have the right resources to support their Power BI platform and provide a good developer and end-user experience. This is especially true if your organization has purchased its own Premium Capacity. Because of this large investment, you’ll want to make sure that you have what you need to appropriately monitor your organization’s premium capacities.

Enter the Power BI Premium Capacity Metrics App, a canned app designed to give admins the necessary monitoring capabilities to make informed decisions about their Premium Capacity.

In this blog, we’ll explain the PBI Premium Capacity Metrics App, describe how to install it, and provide some tips on how to use it. 

What Is The PBI Premium Capacity Metrics App?

The PBI Premium Capacity Metrics App is a template app, meaning it’s publicly available to use within your organization’s tenant. This app equips Power BI admins to be able to monitor their capacity’s resources and identify issues and areas for improvement. 

Let’s start with an example of how you, as a Power BI admin, might use this app. 

When a dedicated capacity in Power BI reaches its maximum capacity, meaning its compute resources are fully utilized, the admin team will receive an email notifying them and explaining that their capacity will be throttled. This will impact the dataset and report performance, thus negatively impacting user experience. 

As an admin, you can use the app to identify where the issue came from, the exact point in time that it occurred, and even the user that caused it so that you can make them aware and provide mitigation tips.

As previously stated, the PBI Premium Capacity Metrics App is a template app and is available in the AppSource to be installed (instructions for installation are in the next section).

How To Install And Set Up The PBI Premium Capacity Metrics App

Installing the PBI Premium Capacity Metrics App is easy and can be done directly from Power BI Service.

Step 1:  Open the “Apps” page using the left panel and click the green “Get apps” button on the top right.

Step 2: Navigate to the “Template apps” section, search for Premium Capacity Utilization, and click on the app icon (note there is still a previous version available; make sure you have the one in the screenshot below).

Template apps

Step 3: In the next two pop-up windows, click the blue “Get It Now” button.

Step 4: Click the “Install” button in the next window (this step may take a few minutes). 

Installing this app will automatically create a new workspace with the app already installed in it; there is no alternative to this and you will not be able to install it into an existing workspace.

Step 5: Open the newly created workspace, which will follow the naming convention shown in the screenshot below. Click the “Connect your data” button on the yellow banner towards the top.

Connect your data

Step 6: Fill out the information in the pop-up window accordingly. If you have multiple capacities, you will NOT need multiple apps; just use one of the IDs, and it will automatically connect to the others in the same tenant.


Step 7: Follow the prompts on the next screen and click the “Sign in and connect” button.

Step 8: Add the app to your list of favorites by going back to the “Apps” page using the left panel, then clicking on the “Get apps” button and searching for the Premium Capacity Utilization and Metrics App. 

How To Use The PBI Premium Capacity Metrics App For Performance Monitoring

Using the PBI Premium Capacity Metrics App can be daunting because of the amount of information it can provide. For that reason, we will use this section to introduce the different components of the app and provide some advice on where a user’s time is best spent. 

The app has five different pages that admins can use to help monitor their capacity and triage any issues that may come up, which are described in more detail below.

At the top of each page is a slicer for the user to select which capacity they would like to view.

Overview Page

The Overview page provides high-level details of the overall performance of the capacity and has three sections. 


Artifacts

This section is the left portion of the page and consists of two separate visuals stacked on top of each other.

Direct your focus to the matrix on the bottom left, which displays metrics (described in the table below) for each Power BI item in the selected capacity. Use this matrix to highlight problem artifacts that could be draining your capacity’s resources, and expand an individual artifact to view what kind of operation could be causing the problem.


Performance

The performance section is to the right of the Artifacts section and has four visuals. This section is perhaps the most important and will help the user pinpoint exactly when, where, and who caused a capacity to reach its max threshold.

CPU over time: the visual on the top displays CPU usage for the capacity, broken out by background and interactive activities. Background operations are activities not directly triggered by users, such as data refreshes, whereas interactive operations are caused by users, such as loading pages or filtering.

Right-click to drill through any part of the visual to open the Timepoint Details page to view exactly what background and interactive operations caused a spike at that given moment.


Overloaded minutes per hour: displays a value showing how severe an overload was on the performance of an artifact.

Item size: shows the memory recorded for the Power BI items in the capacity over time.

Performance profile: displays report performance broken out into three different categories (Fast/Moderate/Slow). The performance aggregation is taken from the number of operations conducted on an artifact.

Weekly Trendlines

This section has four sparklines that provide quick, high-level insights into the trends for each of the included data points.

Evidence Page

The Evidence page of the app helps to identify which Power BI artifacts are leading to overload and which items are impacted by that overload.


Overloaded Items: the most important visual here is the table on the bottom left, which displays Overloaders and Overloaded Items. The overloaders highlight items that cause highly impactful overload events. Toggle this visual to Overloaded Artifacts, which displays items most impacted by overloading in the past two weeks. Users can drill through the individual artifacts to get more information specific to that artifact.

Overloading windows: helps users understand whether an overload or autoscale event occurred because of a single artifact or several.

Items overloaded (seconds): use this visual to understand whether items that are overloading impact their own performance or the performance of other items.

Number of users overloaded: use this visual to understand whether a single user or several users were impacted by an overload event. This helps the user understand how widespread the issue is.

Refresh Page

This entire page is dedicated to helping users identify refresh performance issues and can be accessed for an entire capacity or as a drill-through for specific items. The top of the page allows users to make selections that impact all of the visuals on the page.


Refresh by item: shows the breakdown of items by the metric filtered to (using the slicer described above). This visual is helpful when trying to understand which optimizations for an item can help reduce the capacity footprint.

CPU: the columns in this visual represent the number of CPU seconds used to finish a single operation per hour across a fourteen-day period.

Duration and Operations visuals: the columns in these visuals show the number of seconds a single operation took to finish per hour across a fourteen-day period.

Refresh Detail: this matrix shows the metadata for all of the individual refresh operations that occurred. The Ratio column should be used to understand the ratio between CPU time and processing time. A low ratio means there are inefficiencies causing Power BI to spend more time waiting for the data source and less time executing the refresh.

Timepoint Details Page

This page is perhaps the most important page when determining the causal factors that lead to overload or your dedicated capacity reaching its maximum threshold. Access it by drilling through various points in other pages to view the operations that occurred at that time.

The page is broken into two primary sections. There is a table for interactive operations and a table for background operations. As described before, interactive operations are activities performed by users at the front end, such as loading pages, making visualization selections, and filtering. 

Background operations are not caused directly by users and include operations such as dataset refreshes. Each of the tables displays the exact artifact, the operation, the start and end time, whether or not the operation was a success, as well as helpful metrics that describe its impact on the capacity at that point in time.

I recommend using this view when you want to know the specific operation that caused an overload and would like to contact the user. 


Item Details Page

The Item Details page is another drill-through page that offers very low-level information specific to a single item that the user drills through. This page displays the information provided for the past two weeks regarding CPU usage, Users, Duration, Overloading, and more.


Organizations spend lots of time, money, and resources ensuring their teams have the necessary items to report on their business metrics efficiently. If your organization has licensed Power BI for a dedicated Premium Capacity , you should take the necessary steps to monitor and act proactively to find efficiencies that will lead to better and faster insights, and thus a better user experience leading to greater adoption. 

The PBI Premium Capacity Metrics App is an excellent tool that helps organizations do that, and since it’s a template app, it requires no effort to develop and minimal effort to install.

If you have more questions about the PBI Premium Capacity Metrics App or Power BI in general, our experienced Power Platform experts will gladly guide you. 


A Complete Guide to Power BI Pricing and Capacity Management

  • February 2, 2023

As business intelligence programs are becoming more sophisticated and nuanced, Power BI has found a unique position in the market as a go-to BI tool. As an increasing number of organizations are adopting Power BI, they’ve got lots of questions about the various aspects of Power BI like capacity and pricing.

As a Microsoft Gold Certified Partner with more than a decade of experience implementing digital solutions, CloudMoyo’s experts have shared some insights that can help your organization identify which solution might fit your enterprise BI needs. Keep reading for an overview of dedicated capacities, supported workloads, content sharing, and other Power BI features that might be helpful in your decision-making!

Power BI Capacity Management

Power BI offers three tiers of service – Free, Pro, and Premium based on usage:

  • Power BI Free for Content Creation: Use the free Power BI Desktop tool to author reports and prepare, model, and create data visualizations.
  • Power BI Pro for Content Publication: Collaborate with colleagues, model data, author content, share dashboards, publish reports, and perform ad hoc analysis.
  • Power BI Premium for Content Consumption: Read and interact with pre-published dashboards and reports with either a per-user Power BI Pro license or a Power BI Premium license for large-scale databases.


While Power BI Pro and Premium both have their own set of extensive features, we’ve highlighted some of the unique features of Power BI Premium in Figure 1. These features are in addition to those common with Power BI Pro.


Power BI Pricing Management

The total number of users, and the users by category, who interact with Power BI reports or dashboards are the biggest factors in estimating the cost of Power BI for your organization. There are three categories of users:

  • Pro Users: These users require collaboration, authorization of content, data modeling, ad hoc analysis, dashboard sharing, and report publishing.
  • Frequent Users: Users who frequently interact with reports or dashboards.
  • Occasional Users: Users who occasionally consume reports and dashboards.

Based on these identified user types, the Power BI pricing calculator can be used to estimate the cost. Let’s work with an example here:


Suppose an organization contains a total of 4500 users who’ll have access to Power BI. Let’s divide these users into 3 categories – 20% pro users, 35% frequent users, and 45% occasional users. Based on the pricing calculator, the total cost for 4500 users will be $23,976/month.
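The user split in that example works out as follows (only the head-counts are computed here; the per-user prices behind the $23,976/month figure come from the pricing calculator):

```python
total_users = 4500

# Split by category, per the example percentages.
pro_users = round(total_users * 0.20)
frequent_users = round(total_users * 0.35)
occasional_users = round(total_users * 0.45)

print(pro_users, frequent_users, occasional_users)  # 900 1575 2025
```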

Power BI Premium

Power BI Premium provides dedicated and enhanced resources to run the Power BI service for your organization with features like:

  • Greater scale and performance
  • Flexibility to license by capacity
  • Unify self-service and enterprise BI
  • Extend on-premises BI with Power BI Report Server
  • Support for data residency by region (Multi-Geo)
  • Share data with anyone without purchasing a per-user license

The Office 365 subscription of Power BI Premium is available in two SKU (Stock-Keeping Unit) families:

  • P SKUs (P1-P3) for embedding and enterprise features. The commitment is monthly or yearly, and it’s billed monthly. This includes a license to install Power BI Report Server on-premises.
  • EM SKUs (EM1-EM3) for organizational embedding. The commitment is yearly and is billed monthly. EM1 and EM2 SKUs are available only through volume licensing plans. You can’t purchase them directly.

Capacity Nodes

As described earlier, there are two Power BI Premium SKU families – EM and P. Each SKU represents a capacity node with a set amount of storage, memory, and processing resources. Each SKU also carries operational limits on the number of DirectQuery and Live Connections per second and the number of parallel model refreshes.

Processing is achieved by a set number of v-cores, divided equally between the back end and the front end. Back-end v-cores host the active datasets and are assigned a fixed amount of memory that's primarily used for models. They are responsible for core Power BI functionality, covering the following activities: query processing, cache management, running R services, model refresh, natural language processing, and server-side rendering of reports and images.

Front-end v-cores are responsible for the following activities: web services, dashboard and report document management, access rights management, scheduling, APIs, uploads and downloads, and everything related to the user experience. Storage is set to 100 TB per capacity node. The resources and limitations of each Premium SKU (and equivalently sized A SKU) are described in Table 2:

(Table 2: resources and limits of each Premium SKU and equivalently sized A SKU)
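The equal back-end/front-end split can be sketched numerically. The v-core counts below are taken from Microsoft's published SKU documentation at the time of writing and are assumptions of this sketch:

```python
# v-core counts per Premium SKU (assumed from Microsoft's published
# documentation); processing is split equally between back end and
# front end, as described above.
sku_vcores = {"EM1": 1, "EM2": 2, "EM3": 4, "P1": 8, "P2": 16, "P3": 32}

def split_vcores(sku: str) -> tuple[float, float]:
    """Return (back-end, front-end) v-cores for a given SKU."""
    total = sku_vcores[sku]
    return total / 2, total / 2

backend, frontend = split_vcores("P1")   # 4.0 back-end, 4.0 front-end v-cores
```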

Workload Configuration in Premium Capacity Using Power BI Admin Portal

To fulfill the capacity resource requirements, you will need to change memory and other settings if the defaults do not meet them. The steps to configure workloads in the Power BI admin portal are:

  • In Capacity settings > PREMIUM CAPACITIES, select a capacity.
  • Under MORE OPTIONS, expand Workloads.
  • Enable one or more workloads and set a value for Max Memory and other settings.
  • Select Apply.

Different parameters contributing to workloads in a premium capacity are AI workload, datasets, max intermediate row set count, max offline dataset size, max result row set count, query memory limit, query timeout, automatic page refresh (in preview), data flows, container size, and paginated reports.
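The same workload settings can also be changed programmatically. Here is a hedged sketch of the Power BI REST API's "Capacities - Patch Workload" call; the capacity ID and bearer token are placeholders, and the actual request is left commented out:

```python
# Sketch: enable a workload and cap its memory via the Power BI REST API
# (Capacities - Patch Workload). Capacity ID and token are placeholders.
API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def build_patch_workload(capacity_id: str, workload: str,
                         max_memory_pct: int) -> tuple[str, dict]:
    """Build the URL and JSON body to enable a workload with a memory cap."""
    url = f"{API_ROOT}/capacities/{capacity_id}/Workloads/{workload}"
    body = {"state": "Enabled", "maxMemoryPercentageSetByUser": max_memory_pct}
    return url, body

url, body = build_patch_workload("<capacity-id>", "Dataflows", 40)
# import requests
# requests.patch(url, json=body, headers={"Authorization": f"Bearer {token}"})
```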

Power BI Embedded

The total cost of Power BI Embedded depends on the type of node chosen and the number of nodes deployed. Node types differ in the number of v-cores and the amount of RAM. Microsoft's Power BI Embedded pricing is available on a monthly/hourly basis across different regions. Table 3 covers hourly pricing for the Central US region.

(Table 3: Power BI Embedded hourly pricing, Central US region)

Frequently Asked Questions (by CloudMoyo Customers)

When should I choose Power BI Pro for deployment?

For small and large deployments, Power BI Pro works great to deliver full Power BI capabilities to all users. Employees across roles and departments can engage in ad-hoc analysis, dashboard sharing and report publishing, collaboration, and other related activities.

Not all my users require the full capabilities of Power BI Pro – do I still need a Power BI Pro license?

Even though you have Power BI Premium, you will need Power BI Pro to publish reports, share dashboards, collaborate with colleagues in workspaces, and engage in other related activities. To use a Power BI Premium capacity, you need to assign a workspace to a capacity. The following use cases will help you to understand the scenarios in which you can go for Power BI Pro/Premium or both.


Do you need self-service BI?

Self-service BI isn’t just a trend anymore – it’s become a necessity for efficient information sharing within an organization. Various professionals can collaborate, publish, share, and perform ad-hoc analysis easily with advanced self-service BI tools.

Can Power BI support big data analytics and on-premises, as well as cloud reporting?

Power BI Premium provides enterprise BI, big data analytics, cloud, and on-premises reporting along with advanced administration and deployment controls. It also provides dedicated cloud computing and storage resources that allow any user to consume Power BI content.

So, Do I Need Power BI?

Depending on your enterprise business intelligence needs, it's important to choose the right Power BI offering for you! From the number of users to pricing to the varying features of each Power BI option, we hope this guide helps you make your decision.

Every organization is unique in its needs and goals. And the right technology partner can help you identify the best solution based on your enterprise needs! CloudMoyo experts have decades of experience working with various technologies, including Power BI and other Microsoft tools to transform your organization with resilience. Our goal is to support organizations on their digital transformation journeys, future-proofing their business with agility and scalability as they grow.

Have more questions about Power BI? Get in touch here!

Not quite ready to connect? Check out some of our customer success stories:

  • Optimizing Power BI Report Performance During Peak Hours Using Visual Studio Testing Services
  • Indivior Improves Reporting and Contract Management Process by Implementing Power BI and ODS

Originally posted 06/18/2020, Updated 02/02/2023.

By Abhay Jadhav, Analytics team CloudMoyo


13 May 2024


Power BI Premium capacity administration: dataset monitoring and tuning

Over the years Power BI has evolved into a complex and varied ecosystem of tools and solutions, which in turn demands several supporting roles: there are, of course, developers, data engineers, and data scientists, but there is a need for one more, the capacity administrator. Some of these roles may be covered by one person, but they remain distinct in nature.

Capacity Administration

The role of the capacity admin is to oversee the environment and make sure that all tools and users play well together, and sometimes hard proof is needed that something is wrong before it can be fixed. This article describes how the capacity admin gathers enough data from the Fabric capacity to help the developers fine-tune their solutions and get the most value from the software, as well as from the insights the developers provide.

As part of capacity administration, a Power BI admin needs to monitor and fine-tune the performance of the environment while providing best-practice guidance to the developers who publish reports to the shared environment. I have seen both sides: I was a Power BI developer for many years, optimizing performance and fighting for shared resources, while for the past year I have been in the Power BI admin role. These are two different points of view, especially because developers have tools (Power BI Desktop, DAX Studio, the Power BI Desktop profiler, and so on) that give them granular performance data from the local environment, while admins have tools that show the bigger picture of actual resource utilization in the shared environment where the published reports live.

Wouldn't it be best to have both views combined, creating a more reliable and better-performing environment?

For an admin, one way to get a good overview of the Premium capacity and its resource usage is the Fabric Capacity Metrics app provided by Microsoft. (For more info, read here: https://learn.microsoft.com/en-us/fabric/enterprise/metrics-app)

In my previous article “What is Microsoft Fabric all about?” I explained how the resources of the Power BI capacities are expected to work and what triggers throttling. In short, when we use the Power BI service, we get the processing capacity we have subscribed to, and once we exceed it, the users of the entire capacity get “punished” by being throttled or even rejected. Recently a team of developers managed to bring down an entire Power BI Premium capacity by publishing a single report without testing it, and without knowing how to monitor its performance on the Power BI portal after publishing.

One Problem Is The Variability

At times a capacity may be completely adequate, but at other times there might be significant resource pressure, which, if it persists for longer periods, may mean throttling of resources for all of the Premium capacity's constituents. When this occurs, all users and all workspaces will likely suffer the delays of a slowed-down environment. A real-life example: a single report brought down an entire capacity because its measures contained more than 10 nested levels of DAX, and by using the techniques described in this article, the admin and the developers were able to restore the service quickly. There were also some learnings on how to proactively investigate and prevent future outages using the same techniques.

Of course, one way to deal with throttling caused by resource overutilization is to enable the AutoScale function. This would work well in cases when the Capacity is used for critical production reports and the stakeholders have endlessly deep pockets.

Sooner or later, however, it would be a great idea to do some performance tuning.

Viewing the Metrics

If we turn to the previously mentioned Fabric Capacity Metrics app, we might find something similar to this in the metrics report:


Looking further in the details, we might find which workspaces and which objects within them are causing the most pressure:


Finally, in the Fabric Capacity Metrics report we can also view a short list of statistics about the operations which take the most resources:


In this case we may think that this particular dataset is a great candidate for a performance tuning session, since we see that the query is used by 91 users and consumes an enormous amount of CU seconds.

So far, this is the viewpoint of the admin, and given that the admin is actively monitoring the system, they will send a clear message to the developers that there is a great opportunity for optimization.

What would the developers do, though?

As mentioned before, Power BI developers use the Performance Analyzer (reached via the Optimize tab in Power BI Desktop), which is a great tool:


It gives insight into the query durations; we can even copy the DAX queries and fine-tune them in DAX Studio. We could even attach DAX Studio to the local Power BI Desktop process and replay and fine-tune the DAX queries in the context of the developer's local machine.

Performance Analyzer, however, is not enough when it comes to the performance of reports and datasets in the Power BI service portal. There are other factors at play in the service, and the environments may be drastically different. For example, there might be a VPN in the way, or the data transfer between cloud services may differ based on where the developer's machine is; all of this makes debugging much harder on local development machines. Furthermore, the datasets and reports in the service usually see varied usage patterns: everything may work predictably during development, but when the users touch the system, we get unexpected results.

This is the same challenge we have had in database development and query tuning for decades, and this is why there are so many posters and T-Shirts with the words “But it worked on my machine…”

This is the main reason why we would try and get performance analytics data from the portal and would not solely trust the reality of the developers’ machines.

In this case we know which reports and datasets use the most resources, but we need the tools to gather their performance metrics in context, not only on our development machines. In database development and query tuning, the next logical step would be to turn to our extended events and see what happens in the environment where the objects are.

The next tool that comes to mind is SQL Server Management Studio; after all, we can connect to SSAS objects and use extended events to trace what is hurting our performance.

We can easily do this for on-prem SSAS instances:


But if we try to connect to the PowerBI service, we simply do not see the Management section:


We can use extended events with an on-premises Analysis Services instance, but they are not available when we connect to the service portal, and there are none available for Azure Analysis Services. Luckily, in this case we can turn to the good old (already deprecated) SQL Server Profiler.

There are a few considerations, however. First, the Profiler will only work with datasets published in a Premium workspace (Premium, Premium per user, or Power BI Embedded). Second, we can only run a trace for one dataset at a time.

To trace a dataset, we need to be able to connect to the Workspace XMLA endpoint.

To use the XMLA endpoint feature, there are a couple of admin settings that need to be enabled by the Power BI admin first:

  • The XMLA endpoint for the capacity needs to be enabled.
  • The Allow XMLA endpoints and Analyze in Excel with on-premises datasets setting must be allowed at the tenant level, and the user who will perform the tracing needs to be on the allowed user list.

To run a trace for a dataset, the first thing we need is the XMLA endpoint connection string. It is in the format powerbi://api.powerbi.com/v1.0/myorg/<workspace name>.

The easiest way to get it is to go to the Premium workspace, click Settings > Premium > Workspace Connection, and copy the string.

Starting the trace

As promised, now we will start working with the Profiler. We select Server Type to be Analysis Services and we connect to the XMLA endpoint with whatever authentication method is needed.


It is very important, before clicking Connect here, to go to the Connection Properties tab and specify the dataset name:


If the dataset name is not specified correctly, an error like this will appear:


Now we can click Connect and choose which events we want to collect. For this guide we can just select the Standard template.


When we click on Run , we will start getting some real-time events captured:


Note that there is an XML chunk at the end of the Query Begin event which contains some valuable session information. First of all, we have captured the actual DAX query, and we also have some context information. Formatted, it looks like this:


This is a great set of information, since it gives the developers real insight into the performance of their dataset in context. The developer can see who the user was, what the application context was, and what the performance metrics were for each interaction. This information can be used to take the DAX query and fine-tune it, but we also learn the exact user who submitted the request, and we may talk to them about their needs; it may turn out that there is an easier way to get the work done. We also know which report and which visual are consuming the resources, and thus we can revisit those items.

From here on, the developers may take the best candidates for query optimization and work their way through optimizing the resources of the shared capacity.

A broader approach to performance tuning

So far, I have described an ad-hoc way to use the SQL Server Profiler trace to capture valuable information about dataset and query performance in a Power BI Premium workspace.

There is, however, another way to capture performance information from a Power BI workspace assigned to a Premium capacity: each workspace can individually be connected to an Azure Log Analytics workspace.

This is fairly simple to do: create an Azure Log Analytics workspace, go to the Power BI workspace settings, open the Azure connections tab, and point to the Log Analytics environment in which we wish the data to be stored.


There are some important differences between Trace and Log Analytics, however.

For example, several developers can run a trace at any time when they are testing the environment, and this trace would not cost anything. The trace can be started and stopped at any time.

Log analytics collects data continuously and it incurs costs. Furthermore, specific permissions need to be granted and managed for the Log Analytics workspace, which creates a bit of administration work.

Another aspect, which is most visible to me as a Power BI global admin, is that with the trace I am able to audit different datasets without worrying whether the workspaces are already added to some instance of Log Analytics or not, and whether I have access to it. This really helps me have quick conversations with the Power BI developers and run ad-hoc pair-programming and tuning sessions.

How Log Analytics works

Here is how it works: when we connect a workspace to Log Analytics, the data starts flowing to a predefined log called PowerBIDatasetsWorkspace (here is the official documentation for it: https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/PowerBIDatasetsWorkspace )

It takes some time for the data to start flowing into the logs; in my experience, it takes about five minutes.

When the data is in the log, we can use KQL to query it; for example, a query that gives us the average query duration by day for the last 30 days.
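A minimal sketch of such a query, assuming the documented columns of the PowerBIDatasetsWorkspace table (TimeGenerated, OperationName, DurationMs), might look like this:

```kusto
PowerBIDatasetsWorkspace
| where TimeGenerated > ago(30d)
| where OperationName == "QueryEnd"
| summarize AvgDurationMs = avg(DurationMs) by bin(TimeGenerated, 1d)
| order by TimeGenerated asc
```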

The result of this query is trivial and won't tell us much.

We could perhaps spend some time and write a more advanced KQL query that gives us query count, distinct users, and average duration per workspace for the last 30 days.
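A sketch of such a query, again assuming the documented PowerBIDatasetsWorkspace columns (ExecutingUser, CpuTimeMs, PowerBIWorkspaceName), could be:

```kusto
PowerBIDatasetsWorkspace
| where TimeGenerated > ago(30d)
| where OperationName == "QueryEnd"
| summarize QueryCount = count(),
            DistinctUsers = dcount(ExecutingUser),
            AvgDurationMs = avg(DurationMs),
            TotalCpuTimeMs = sum(CpuTimeMs)
  by PowerBIWorkspaceName
| order by QueryCount desc
```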

The result may look like this:


What we see here is a simple aggregation of resource usage over time for our Premium Capacity – how many queries vs. how many users and the aggregated CPU and Duration. This particular query can be used as an overview, but it is also worth noting that the underlying data is quite granular and we can filter by a specific user or even look for specific patterns within the submitted queries.

For a deeper dive into KQL, I would recommend the official documentation as a starting point: https://learn.microsoft.com/en-us/azure/data-explorer/kusto/query/

As mentioned, there are options: we can either go for on-the-spot profiling or opt for collecting log data for our workspaces and querying it later.

After all, it is an iterative process: after the fine-tuning sessions (whether with the help of the trace or Log Analytics), the developers may ask the Power BI admin for another view of their performance metrics report as guidance on how their work has improved the environment as a whole, and the admin may help them find out what the overall health of the environment is.

In this article we saw the different views of a Power BI admin and a Power BI Developer: their tools, options, and workloads.

Happy debugging!


Fabriano Richard

Fabriano has a background of many years working with the Microsoft stack, starting with SQL Server 2000, going all the way to Azure and PowerBI administration. Over more than 20 years Fabriano has worked on assignments involving database architecture, Microsoft SQL Server data platform, data model design, database design, integration solutions, business intelligence, reporting, as well as performance optimization and systems scalability. For the past 2 years he has actively developed PowerBI systems and administered Premium Capacities / Microsoft Fabric workloads.



Four Questions to Ask When Planning a Power BI Premium Deployment

As adoption of Power BI grows, more and more organizations are finding value in purchasing and deploying Power BI Premium. Power BI Premium offers a different licensing structure in which an organization purchases dedicated capacity, and users who only consume reports and dashboards aren't required to have a Power BI Pro license. The move from shared to dedicated capacity brings with it:

  • larger storage limits
  • support for larger data sets
  • higher refresh rates
  • support for incremental refresh
  • support for Power BI Embedded
  • use of Power BI Report Server

Power BI Premium requires some additional deployment planning over simply using shared capacity in PowerBI.com. Listed below are important questions to answer as you get started with Power BI Premium.

Planning PBI Premium Rollout

1. How will we allocate Power BI Premium capacity? 

When you purchase Power BI Premium, you are purchasing an amount of capacity (virtual cores) rather than a number of nodes. If you purchased the P2 plan, you would have 16 v-cores. You could have one node with all of the v-cores, or you could have 2 nodes with 8 v-cores each. You’ll want to take into consideration memory and CPU usage based upon dataset sizes and refresh rates, as well as the need for autonomy (dedicated capacity) of departments or groups within your organization.
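The allocation decision above can be sketched as simple arithmetic: a P2 purchase provides 16 v-cores that can be carved into one or more nodes. The node sizes below are illustrative, not prescriptive:

```python
# Sketch of the capacity-allocation decision for a P2 plan (16 v-cores).
# The layouts are illustrative examples, not the only valid options.
P2_VCORES = 16

layouts = [
    [16],        # one node holding all v-cores
    [8, 8],      # two equal nodes, e.g. one per department
    [8, 4, 4],   # one larger node plus two smaller autonomous ones
]

for layout in layouts:
    # every layout must spend exactly the purchased v-cores
    assert sum(layout) == P2_VCORES
```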

2. Who will manage each capacity? 

Power BI Premium brings new management roles in PowerBI.com. In addition to the Office 365 admin and Power BI admin roles, there are capacity admins and capacity assignment permissions. Capacity admins can assign workspaces to their capacity and update the capacity assignment list.  Users with capacity assignment permissions can add any workspace in which they are an admin to dedicated capacity. It can be helpful to allow some groups to manage that for themselves rather than to have to submit a help ticket to IT or the BI team when they want to move workspaces in or out of dedicated capacity. If you have multiple groups on one capacity, it might be best to have a capacity admin from each of those groups. But those capacity admins will need to communicate so they don’t overcommit the capacity and cause poor performance for the other groups.

3. How will we assign workspaces to each capacity? 

Not every workspace needs to be in dedicated capacity. As the number of workspaces grows, all of your workspaces may not fit in dedicated capacity. You could start out putting every workspace in dedicated capacity until it fills up and then re-evaluate, or you could come up with guidelines as to which workspaces should be assigned to a capacity. Some organizations leave all personal workspaces in shared capacity and put all app workspaces in dedicated capacity. Some organizations assign workspaces to dedicated capacity to maximize their use of Power BI Free licenses for users who only consume content. Other organizations identify important or frequently used workspaces and assign only those to dedicated capacity.
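Whatever guidelines you settle on, the assignment itself can be automated. Here is a hedged sketch of the Power BI REST API's "Groups - AssignToCapacity" call; the GUIDs and token are placeholders, and per the API documentation an all-zeros capacityId moves a workspace back to shared capacity:

```python
# Sketch: move a workspace into (or out of) dedicated capacity via the
# Power BI REST API (Groups - AssignToCapacity). GUIDs are placeholders.
API_ROOT = "https://api.powerbi.com/v1.0/myorg"
SHARED_CAPACITY = "00000000-0000-0000-0000-000000000000"  # unassigns a workspace

def build_assign_to_capacity(workspace_id: str, capacity_id: str) -> tuple[str, dict]:
    """Build the URL and JSON body for the AssignToCapacity call."""
    url = f"{API_ROOT}/groups/{workspace_id}/AssignToCapacity"
    return url, {"capacityId": capacity_id}

url, body = build_assign_to_capacity("<workspace-guid>", "<capacity-guid>")
# import requests
# requests.post(url, json=body, headers={"Authorization": f"Bearer {token}"})
```

Looping this builder over a filtered list of workspaces is how the bulk-assignment scenario from the introduction could be scripted.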

4. Should we allow users to sign up for Power BI Free licenses on their own, or block them from doing so? 

Often, the deployment of Power BI Premium brings new users to your Power BI tenant. By default, any user can sign up with their Azure Active Directory credentials. Some organizations turn off that ability and require the O365 admin to assign a license to users before they can use PowerBI.com. This might be a good fit if your organization requires training on Power BI or data security before users may begin to use Power BI.

Power BI Premium can bring interactive analytics to a larger audience within your organization, but it takes some additional planning and management. If you would like help planning your Power BI Premium deployment, please contact us. We would be happy to share our knowledge and experience.


Microsoft Power BI Cookbook by Brett Powell


Power BI premium capacity admins

  • Office 365 Global Admins and Power BI Admins are Capacity Admins of Power BI Premium capacities by default.
  • These admins can assign users as Capacity Admins per capacity during initial setup of the capacity and later via User Permissions within the Premium settings of a capacity in the Power BI Admin Portal.
  • Capacity Admins have administrative control over the given capacity but must also be granted assignment permissions in the Users with assignment permissions setting to assign workspaces to premium capacities if the capacity admin will be responsible for associating an app workspace to premium capacity.
  • Power BI Admins are expected to have the ability to assign individual workspaces to premium capacity ...


Power BI pricing

Compare business intelligence plans and pricing.

Free account

  • Included in Microsoft Fabric free account
  • No credit card required
  • Upgrade to Pro or Premium to share reports

Power BI Pro

  • Publish and share Power BI reports
  • Included in Microsoft 365 E5 and Office 365 E5
  • Available to buy now with a credit card 1                                                     

Power BI Premium Per User

  • Includes all the features available with Power BI Pro
  • Access larger model sizes
  • More frequent refreshes
  • Available to buy now with a credit card 1                                                                    

Power BI Embedded

  • Brand Power BI reports as your own
  • Automate monitoring, management, and deployment
  • Reduce developer overhead

Fabric Capacity Reservation

  • Annual purchase, saving 40.5% over pay-as-you-go prices
  • Microsoft Azure Consumption Commitment eligible
  • Per-user licenses required for certain activities 5

Fabric Capacity Pay-As-You-Go

  • Dynamically scale up or down and pause capacity
  • Microsoft Azure Consumption Commitment (MACC) eligible

Power BI Premium in Microsoft Fabric

  • Includes all the features available in Power BI Premium per user
  • Use autoscale to respond to occasional, unplanned overage spikes in capacity 4
  • Power BI Premium capacity is being retired soon — learn more


Prices shown are for marketing purposes only and may not be reflective of actual list price due to currency, country, and regional variant factors. Your actual price will be reflected at checkout. All offers are subject to product service limits. Contact a Microsoft representative to learn more about enterprise pricing and offers.

Academic, government, and nonprofit pricing are available.

Power BI is available in China, operated by 21Vianet.

[1]  Power BI Pro and Power BI Premium per user subscriptions are available for self-service purchase, as well as through the Microsoft 365 admin center. Purchasing Microsoft Fabric requires access to the Microsoft 365 admin center. Learn more about available Power BI purchasing and licensing options. 

[2]  A  $10.00  per user/month add-on is available for users with Power BI Pro, Microsoft 365 E5, and Office 365 E5 licenses to step up to Power BI Premium per user. Learn more about purchasing Power BI Premium per user. 

[3]  Power BI Pro license is required for all Power BI Premium (“P”) and Fabric Capacity (“F”) SKUs to publish Power BI content to Microsoft Fabric. Enabling content consumers to review and interact with Power BI reports without additional paid per-user licenses is available at P1 and above (and F64 and above). 

[4]   Autoscale is an optional add-on that requires Power BI Premium per capacity (Gen2) and Azure subscriptions. Limits can be preset with autoscale to control costs, such as thresholds on vCore scaling or total subscription charges. 

[5]  A capacity-based license (P1 or F64 and above only) allows Pro or PPU users to create and save content in Fabric or Premium capacity workspaces. They can then share that workspace with colleagues who have any license type including a free account. Only users with Pro or PPU licenses can create and save content in Premium capacities, and only if their organization purchased Premium capacity. 

[6]  Representative of features available in general availability (GA) status.

[7]   Power BI Desktop is the data exploration and report authoring experience for Power BI, and it is available as a free download.

[8]  Power BI report consumption without paid per-user licenses applies to Fabric SKUs F64 and above, and Power BI Premium per capacity SKUs P1 and above.

[9]  See the Power BI model memory size limit capacity and SKUs .

[10]  For storing Power BI data sets only.



Organizational licenses and subscriptions for Fabric and Power BI


There are two ways for users to get a license, from an administrator or by using self-service. This article is for the administrators, who purchase and manage services and licenses.

Global and Billing administrators can sign up for the Power BI service and Fabric. After signing up and purchasing a subscription or a free trial, Billing, License, User, and Global administrators assign licenses to users in their organization.

If you give them permissions, users in your organization can get one of the organization's licenses simply by signing in to Power BI (app.powerbi.com) or Fabric (app.fabric.com). When a user signs in, they automatically receive a license. Administrators can set these permissions by following Enable and disable self-service .

If you're an individual, you can sign up for your own free, Pro, Premium Per User, and trial licenses. This process is called self-service. To learn more, see the self-service process .

Fabric and the Power BI service

Power BI is available as a standalone SaaS service and as one of the services that's integrated with Microsoft Fabric . Administration and licensing of the Power BI service is now integrated with Microsoft Fabric.

Licenses and subscriptions

Fabric and the Power BI service are both SaaS platforms and require licenses and at least one capacity subscription. There are some differences in how these licenses and capacities are acquired.

Power BI licenses

Use of the Power BI service requires a license and a capacity.

Per user license - Per user licenses allow users to work in the Power BI service. The options are Fabric free, Pro, and Premium Per User (PPU).

Capacity subscription - There are two types of Power BI subscriptions for organizations: standard and Premium. Premium provides enhancements to Power BI, and a comprehensive portfolio of Premium features . Power BI Premium is a tenant-level Microsoft 365 subscription, available in a choice of SKUs (Stock Keeping Units).

Each Microsoft 365 E5 subscription includes standard capacity and a Pro license.

Fabric licenses

To access the Fabric SaaS platform, you need a license and a capacity.

  • Per user license - Per user licenses allow users to work in Fabric. The options are Fabric free, Pro, and Premium Per User (PPU).
  • Capacity license - An organizational license that provides a pool of resources for Fabric operations. Capacity licenses are divided into stock keeping units (SKUs). Each SKU provides a different number of capacity units (CUs) which are used to calculate the capacity's compute power.
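As a small illustration of the SKU/CU relationship described above: for F SKUs, the number in the SKU name is its capacity unit count (for example, F64 provides 64 CUs). A minimal sketch, assuming that published naming convention:

```python
def capacity_units(sku: str) -> int:
    """Capacity units (CUs) for a Fabric F SKU.

    For F SKUs the numeric suffix is the CU count (F2 -> 2, F64 -> 64),
    per the published SKU tables. Other SKU families are rejected.
    """
    sku = sku.strip().upper()
    if not (sku.startswith("F") and sku[1:].isdigit()):
        raise ValueError(f"expected an F SKU like 'F64', got {sku!r}")
    return int(sku[1:])
```

Compute power then scales with the CU count, which is why two F32 capacities and one F64 represent the same nominal pool of compute.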

Which admins can purchase licenses and subscriptions


You must belong to the Global or Billing admin role to purchase or assign licenses for your organization. Admin roles are assigned from the Microsoft Entra admin center or the Microsoft 365 admin center. For more information about admin roles in Microsoft Entra ID, see View and assign administrator roles in Microsoft Entra ID . To learn more about admin roles in Microsoft 365, including best practices, see About admin roles .

Admin roles for managing licenses and subscriptions

After the licenses and subscriptions are purchased, Global, Billing, License, or User admins can manage licenses for an organization. For details about these four admin roles, see Admin roles.

Get Fabric or Power BI for your organization

Fabric and the Power BI service are available for organizations to purchase and to try for free. For detailed instructions, select the option you prefer.

For up-to-date pricing information, see Pricing & product comparison .

Purchase a Fabric subscription. If you're ready to purchase, your options include bulk purchasing for your organization or enabling your users to upgrade their own licenses.

After you purchase a Fabric subscription, enable Fabric for your organization . This includes setting up tenants and capacities.

Purchase a Power BI subscription and licenses. Power BI Pro is included in Microsoft 365 E5. Otherwise, you can purchase Pro or PPU licenses from the Microsoft pricing site , through Microsoft 365, or through a Microsoft partner. After your purchase, you can assign licenses to individual users or allow self-service. For more information, see enabling your users to upgrade their own licenses .

Start a free 60-day Fabric trial for your organization. The trial includes a trial capacity, 1 TB of storage, and access to all of the Fabric experiences and features.

Get free licenses for all of your users. The Fabric free license doesn't include a Fabric capacity, and it doesn't include access to the paid features of the Power BI service.

Start a free 30-day trial of Power BI Pro. The trial includes up to 25 licenses. After the 30 days expire, you're charged for the licenses.

View existing licenses

To see which users in your organization already have a license, see View and manage user licenses .

Enable and disable self-service sign-up and purchase

You purchased licenses or signed up for an organizational trial. Now, decide whether your users can assign themselves a license or give themselves an automatic license upgrade. Here are links to help you decide which settings to use.

Self-service

  • When to use self-service sign-up and purchase
  • Manage self-service purchases and trials
  • Manage self-service license requests in the Microsoft 365 admin center
  • Manage self-service sign-up subscriptions in the Microsoft 365 admin center
  • How to combine self-service settings

Guest user licenses

You might want to distribute content to users who are outside of your organization. You can share content with external users by inviting them to view content as a guest. Microsoft Entra Business-to-business (Microsoft Entra B2B) enables sharing with external guest users. Prerequisites:

The ability to share content with external users must be enabled.

The guest user must have the proper licensing in place to view the shared content.

For more information about guest user access, see Distribute Power BI content to external guest users with Microsoft Entra B2B .

Other articles for admins

There are many other admin tasks related to licenses and subscriptions. Some of them are listed and linked here.

  • Assign and manage licenses with Microsoft 365 and with the Azure portal
  • Remove, reassign, enable, or disable a license
  • Cancel a trial
  • Take over a license or tenant
  • Handle expiring trials
  • Handle expiring licenses for Power BI

Power BI license expiration

There's a grace period after a Power BI license expires. For licenses that are part of a volume license purchase, the grace period is 90 days. If you bought the license directly, the grace period is 30 days. For license trials, there is no grace period.
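The grace-period rules above are simple enough to capture in a lookup. This is an illustrative sketch; the "volume"/"direct"/"trial" labels are my own shorthand for the purchase channels, not official terms:

```python
def grace_period_days(purchase_channel: str) -> int:
    """Grace period after a Power BI license expires, per the rules above.

    'purchase_channel' is a hypothetical label: "volume" for a volume
    license purchase (90 days), "direct" for a direct purchase (30 days),
    and "trial" for a license trial (no grace period).
    """
    periods = {"volume": 90, "direct": 30, "trial": 0}
    try:
        return periods[purchase_channel]
    except KeyError:
        raise ValueError(f"unknown purchase channel: {purchase_channel!r}") from None
```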

Power BI has the same license lifecycle as Microsoft 365. For more information, see What happens to my data and access when my Microsoft 365 for business subscription ends .

Manage subscriptions included with Microsoft 365 E5 SKUs

A Microsoft 365 E5 subscription includes Power BI Pro licenses. To learn how to manage licenses, see Administration overview .

Related content

  • Purchase and assign Power BI Pro licenses
  • About self-service sign-up
  • Business subscriptions and billing documentation
  • Find Power BI users that are signed in
  • Self service FAQ


Chris Webb's BI Blog

Microsoft Fabric, Power BI, Analysis Services, DAX, M, MDX, Power Query, Power Pivot and Excel

Migrating From Power BI P-SKU Premium Capacities To F-SKU Capacities Is Not The Same Thing As Enabling Fabric

Since the announcement in March that Power BI Premium P-SKUs are being retired and that customers will need to migrate to F-SKU capacities instead, I have been asked the same question several times:

Why are you forcing me to migrate to Fabric???

This thread on Reddit is a great example. What I want to make clear in this post is the following:

Moving from P-SKU capacities to F-SKU capacities is not the same thing as enabling Fabric in your tenant

No-one is being forced to migrate from Power BI to Fabric and using F-SKU capacities does not mean you are using Fabric. Access to Fabric for your users is governed by the tenant-level settings documented here and these settings work the same way regardless of whether you’re using a P-SKU capacity or an F-SKU capacity. If you do not enable Fabric you can carry on using Power BI in exactly the same way as you did before, with exactly the same functionality, when you move to using an F-SKU capacity. Your users will not have the ability to create Fabric items like notebooks, warehouses, lakehouses and so on just because you’re using an F-SKU.

As the announcement blog post explains, moving to F-SKUs will involve changes about how and where you purchase your capacities and there will be some features that are only available in F-SKU capacities. Migrating workspaces to a new F-SKU capacity is fairly straightforward (and no different from moving a workspace from one P-SKU capacity to another) but if you have questions about how to perform the migration or how this affects how much you’re paying for Power BI you should contact your Microsoft account team.
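For readers who want to script those workspace moves rather than click through the admin portal, the Power BI REST API exposes a Groups "AssignToCapacity" endpoint. The sketch below is a minimal Python illustration using only the standard library; it assumes you have already acquired an Azure AD access token for a principal that is an admin of the workspace and has assignment permissions on the target capacity:

```python
import json
import urllib.request

POWER_BI_API = "https://api.powerbi.com/v1.0/myorg"

def build_assignment_request(workspace_id: str, capacity_id: str):
    """Return the URL and JSON body for the AssignToCapacity call.

    Passing the all-zeros GUID as capacity_id moves the workspace
    back to shared capacity.
    """
    url = f"{POWER_BI_API}/groups/{workspace_id}/AssignToCapacity"
    body = {"capacityId": capacity_id}
    return url, body

def assign_workspace_to_capacity(workspace_id: str, capacity_id: str,
                                 access_token: str) -> None:
    """POST the assignment; raises on any non-200 response."""
    url, body = build_assignment_request(workspace_id, capacity_id)
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {access_token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        if resp.status != 200:
            raise RuntimeError(f"assignment failed: HTTP {resp.status}")
```

Looping this call over a list of workspace IDs is one way to migrate workspaces in bulk from a P-SKU capacity to a new F-SKU capacity.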


Published by Chris Webb

My name is Chris Webb, and I work on the Fabric CAT team at Microsoft. I blog about Power BI, Power Query, SQL Server Analysis Services, Azure Analysis Services and Excel.

15 thoughts on “ Migrating From Power BI P-SKU Premium Capacities To F-SKU Capacities Is Not The Same Thing As Enabling Fabric ”

In the post, you mention some Fabric features are only available on F-SKUs. Can you provide more information?

There are some examples in the blog post that I linked to: trusted workspace access for OneLake shortcuts is one.

Aren’t F-SKUs a bit more pricey than their P-SKU counterparts?

Not really. Depends on how you buy them. There is a PAYG option that is more expensive, but very flexible. And there is a Reserved Capacity option that is the same price as the P SKUs.

Reserved Capacity requires an annual commitment, but for most customers, that’s normal.

Personally, I recommend Reserved Capacity for stable, big workloads and suggest looking at PAYG for Dev/Test environments.

With PAYG, you can scale as needed (up to look like Production and then back down). And… if you have a Dev environment that shuts down at night and weekends, it might be less expensive than Reserved Capacity. I ran the numbers and if you can “pause” your environment for more than 10 hours per day (like over night) then PAYG pricing is less than Reserved Capacity. BUT… you have to pause it. 🙂
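The break-even arithmetic behind that "10 hours" figure can be sketched in a few lines. The prices below are hypothetical placeholders chosen to produce a round answer, not real list prices:

```python
def breakeven_pause_hours(payg_hourly: float, reserved_monthly: float,
                          days_per_month: int = 30) -> float:
    """Hours per day a capacity must be paused before PAYG beats reserved.

    PAYG bills only while the capacity runs; a reservation bills a flat
    monthly fee. At break-even: payg_hourly * run_hours * days == reserved_monthly.
    """
    run_hours = reserved_monthly / (payg_hourly * days_per_month)
    return 24.0 - run_hours

# With a hypothetical $1.00/hour PAYG rate and a $432/month reservation
# (a 40% discount on 720 always-on hours), break-even lands at 9.6 paused
# hours per day -- in the same ballpark as the "10 hours" mentioned above.
pause = breakeven_pause_hours(1.00, 432.0)
```

The exact break-even shifts with the real discount Microsoft offers on reservations, but the shape of the calculation is the same.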

Hope that helps. Andy http://www.directionsonmicrosoft.com

Chris, thank you… Your comments are true and I can understand the frustration. There is no requirement to use the other Fabric experiences, yet.

But there are three items that are relevant to this discussion. 1. The Fabric licensing for Premium EM and A is bad news. All EM options (which include read-only rights for all users) are going away. The Fabric equivalent SKUs (in performance) are the lower F SKUs, which require Power BI Pro licensing for all read-only users. That means EM customers will see a big cost increase. We have a number of customers who are very, very upset about this change.

2. Power BI Report Server dual-use rights are going away. They are available with Premium P, but are not included with F SKUs. There was a bogus argument that they are not compatible with Fabric, but it was always paperwork, so that makes no sense. Again, a number of our customers are frustrated that they bought P SKUs in good faith to also run PBI Report Server on-prem and now… they have to purchase SQL Server Enterprise edition licenses and carry SA to do the same thing. That is a cost increase.

3. Finally, I’m seeing Power BI features go on the chopping block, only to be replaced by Fabric features. AutoML in Dataflows (being replaced by Data Science) is one, and the fact that Dataflows Gen1 is called Gen1 now leads me to believe it’s only a matter of time. So, while customers don’t have to use Fabric, the reality is that for some Power BI features that are migrating it won’t be an option in the future.

Now, that sounded kind of harsh, and items 1 & 2 are because I don’t like cost increases. But I happen to be a big fan of Fabric and I think it’s a fabulous tool and the direction customers should go. But I would love to see a roadmap of Power BI features that are moving to Fabric, so customers can see the future, jump on board, and put together project plans to get there. That would really help…

Thanks… Andy http://www.directionsonmicrosoft.com

Licensing Power BI is non-trivial. Getting a reasonable bang-for-the-buck on reserved capacity is an almost impossible goal nowadays. I think reserved capacity is discouraged by Microsoft in favor of PAYG, and they certainly don’t give us the tools to get the most value out of reserved capacity.

Even before the F SKUs, Microsoft was continually changing the meaning of premium capacity. For example, we used to have isolated background vs foreground CPU. Now it is all fungible. CPU also seems to be fungible across TIME, and you can “borrow” from future CPU in a way that will “catch up with you” and create outages that last two hours or more into the future!

The biggest problem with the fungibility of background and foreground CPU is the way the background CPU is averaged across the entire day. There is NO WAY to get full value out of the “overnight” CUs. Because even if you try to concentrate the background operations to happen at night, it will still incur a debt that will be paid in the future rather than using the capacity that is going to waste overnight.

If you use the reservation recommender in Azure (it’s turned on by default), you will get recommendations on which RI to buy based on your usage pattern. RIs are super granular so you can optimize your commitment and maximize savings.

Something that confuses me is people say you have to have an F64 to have all the same features as a P1. However I am not clear what “features” are missing if you got an F32 vs F64 other than “speed” (generic term). I have not seen that documented.

It’s because free Power BI viewing across your tenant is only available if you use an F64. With a P1, you only need a Pro account to do development; in Fabric that’s only possible using an F64. If you want to use Power BI with free accounts on smaller SKUs, you need to develop a portal, like you do with Embedded.

So here’s how it works.

An F64 and higher actually includes a Power BI Premium P environment. Granted, it shares compute with the other Fabric experiences, but it provides a full Premium P environment with all the features of Premium P. That means it has multi-geo, higher data model sizes, dataflow pipelines, bring your own key for data encryption, etc… All the high-end stuff big orgs need, including 100TB of Power BI storage.

An F32 and lower actually “works with Power BI”. None of those lower F SKUs include a Power BI environment directly. Instead, you have to fire up a Power BI Pro subscription which comes with its own data storage and is limited compared to a Premium P environment. (Lower model sizes, etc…) Now, what’s cool is the Fabric workspaces can include Power BI content and share security controls (works with). So… with a lower F SKU capacity working with Power BI feels just like the higher F SKUs, but the Power BI feature set is different.

Summary:
F64 and higher – Power BI included and brings Premium P features.
F32 and lower – Power BI not included but works with Power BI Pro workspaces.

Hope that helps. Andy Snodgrass http://www.directionsonmicrosoft.com

I know what you’re trying to explain here but it’s not correct to say that “Power BI is not included” with F32 and lower. There’s a licensing difference: if you have a workspace on an F64 or above then you can use a Power BI Free licence to consume content in that workspace; if you are using a F32 the rules are the same as if you’re using Shared today, namely that you need a Power BI Pro licence to consume content as well as create it.
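The licensing distinction Chris describes (which matches footnote [8] earlier on this page) can be condensed into a few lines. This helper is my own illustration of the rule, not an official API, and it ignores the EM and A SKU families:

```python
def min_viewer_license(sku: str) -> str:
    """Smallest per-user license a pure report consumer needs, given the
    capacity SKU hosting the workspace.

    Rule of thumb: P SKUs (P1 and above) and F64 and above allow viewing
    with a Free license; smaller F SKUs behave like shared capacity and
    require a Pro license to consume content.
    """
    kind, size = sku[0].upper(), int(sku[1:])
    if kind == "P":
        return "Free"
    if kind == "F":
        return "Free" if size >= 64 else "Pro"
    raise ValueError(f"unrecognised SKU family: {sku!r}")
```

Note this only covers consumption; creating and publishing content requires a Pro or PPU license regardless of the capacity SKU.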

Chris, I think we’re into semantics here. My point is that purchasing an F32 or lower SKU does not give me (creator or consumer) the right to use Power BI. I have to purchase Power BI separately.

So, in my mind, that means Power BI is not included with F32 or lower. I can still use Power BI in an F32 deployment, but I have to pay for it separately.

Does that sound right? Andy

I agree we’re just discussing terminology here. I still don’t like the phrase “Power BI is not included with F32 or lower” though. If you’re using Shared/Pro you still need licences to publish and consume, and you can’t say that Power BI is not included there.

In general when I look at reserved capacity F64 vs a P1 the basic price of an F64 seems to be much higher.

However, as a nonprofit the P SKU nonprofit pricing is heavily discounted compared to regular pricing. Microsoft does not seem to offer discounts on the Azure side for F SKUs (only small generic Azure Grant for nonprofits that we already use).

Is there any word on whether F SKU discounts will be added for nonprofits? F SKUs like F64, especially without nonprofit discounts, seem to be much more expensive.

Love to see MS continue to support Nonprofits/NGOs with Power BI and Fabric.





Microsoft Fabric May 2024 Update


Welcome to the May 2024 update.  

Here are a few, select highlights of the many we have for Fabric. You can now ask Copilot questions about data in your model, Model Explorer and authoring calculation groups in Power BI desktop is now generally available, and Real-Time Intelligence provides a complete end-to-end solution for ingesting, processing, analyzing, visualizing, monitoring, and acting on events.

There is much more to explore, please continue to read on. 

Microsoft Build Announcements

At Microsoft Build 2024, we are thrilled to announce a huge array of innovations coming to the Microsoft Fabric platform that will make Microsoft Fabric’s capabilities even more robust and even customizable to meet the unique needs of each organization. To learn more about these changes, read the “ Unlock real-time insights with AI-powered analytics in Microsoft Fabric ” announcement blog by Arun Ulag.

Fabric Roadmap Update

Last October at the Microsoft Power Platform Community Conference we  announced the release of the Microsoft Fabric Roadmap . Today we have updated that roadmap to include the next semester of Fabric innovations. As promised, we have merged Power BI into this roadmap to give you a single, unified road map for all of Microsoft Fabric. You can find the Fabric Roadmap at  https://aka.ms/FabricRoadmap .

We will be updating our Roadmap over the coming year and would love to hear your recommendations on ways we can make this experience better for you. Please submit suggestions at https://aka.ms/FabricIdeas .

Earn a discount on your Microsoft Fabric certification exam!  

We’d like to thank the thousands of you who completed the Fabric AI Skills Challenge and earned a free voucher for Exam DP-600 which leads to the Fabric Analytics Engineer Associate certification.   

If you earned a free voucher, you can find redemption instructions in your email. We recommend that you schedule your exam now, before your discount voucher expires on June 24th. All exams must be scheduled and completed by this date.

If you need a little more help with exam prep, visit the Fabric Career Hub which has expert-led training, exam crams, practice tests and more.  

Missed the Fabric AI Skills Challenge? We have you covered. For a limited time, you can earn a 50% exam discount by taking the Fabric 30 Days to Learn It Challenge .

Modern Tooltip now on by Default

  • Matrix layouts
  • Line updates
  • On-object interaction updates
  • Publish to folders in public preview
  • You can now ask Copilot questions about data in your model (preview)
  • Announcing general availability of DAX query view
  • Copilot to write and explain DAX queries in DAX query view public preview updates
  • New Manage relationships dialog
  • Refreshing calculated columns and calculated tables referencing DirectQuery sources with single sign-on
  • Announcing general availability of Model explorer and authoring calculation groups in Power BI Desktop
  • Microsoft Entra ID SSO support for Oracle database
  • Certified connector updates
  • View reports in OneDrive and SharePoint with live connected semantic models
  • Storytelling in PowerPoint – image mode in the Power BI add-in for PowerPoint
  • Storytelling in PowerPoint – data updated notification
  • Git integration support for Direct Lake semantic models

  • Editor’s pick of the quarter
  • New visuals in AppSource
  • Financial Reporting Matrix by Profitbase
  • Horizon Chart by Powerviz

Milestone Trend Analysis Chart by Nova Silva

  • Sunburst Chart by Powerviz
  • Stacked Bar Chart with Line by JTA

Fabric Automation

  • Streamlining Fabric Admin APIs
  • Microsoft Fabric Workload Development Kit
  • External data sharing
  • APIs for OneLake data access roles
  • Shortcuts to on-premises and network-restricted data
  • Copilot for Data Warehouse

  • Unlocking Insights through Time: Time travel in Data warehouse

Copy Into enhancements

  • Faster workspace resource assignment powered by just-in-time database attachment
  • Runtime 1.3 (Apache Spark 3.5, Delta Lake 3.1, R 4.3.3, Python 3.11) – public preview
  • Native execution engine for Fabric Runtime 1.2 (Apache Spark 3.4) – public preview
  • Spark run series analysis
  • Comment @tagging in notebook
  • Notebook ribbon upgrade
  • Notebook metadata update notification
  • Environment is GA now
  • REST API support for workspace data engineering/science settings
  • Fabric user data functions (private preview)
  • Introducing API for GraphQL in Microsoft Fabric (preview)
  • Copilot will be enabled by default
  • The AI and Copilot setting will be automatically delegated to capacity admins
  • Abuse monitoring no longer stores your data
  • Real-Time Hub
  • Source from Real-Time Hub in enhanced Eventstream
  • Use Real-Time Hub to get data in KQL Database in Eventhouse
  • Get data from Real-Time Hub within Reflexes
  • Eventstream edit and live modes
  • Default and derived streams
  • Route streams based on content in enhanced Eventstream
  • Eventhouse is now generally available
  • Eventhouse OneLake availability is now generally available
  • Create a database shortcut to another KQL database
  • Support for AI Anomaly Detector
  • Copilot for Real-Time Intelligence
  • Eventhouse tenant-level private endpoint support
  • Visualize data with Real-Time Dashboards
  • New experience for data exploration
  • Create triggers from Real-Time Hub
  • Set alerts on Real-Time Dashboards
  • Taking action through Fabric items
  • General availability of the Power Query SDK for VS Code
  • Refresh the refresh history dialog
  • Introducing data workflows in Data Factory
  • Introducing trusted workspace access in Fabric data pipelines

  • Introducing Blob Storage Event Triggers for Data Pipelines
  • Parent/child pipeline pattern monitoring improvements

  • Fabric Spark job definition activity now available
  • HDInsight activity now available
  • Modern Get Data experience in Data pipeline

Power BI tooltips are embarking on an evolution to enhance their functionality. To lay the groundwork, we are introducing the modern tooltip as the new default , a feature that many users may already recognize from its previous preview status. This change is more than just an upgrade; it’s the first step in a series of remarkable improvements. These future developments promise to revolutionize tooltip management and customization, offering possibilities that were previously only imaginable. As we prepare for the general availability of the modern tooltip, this is an excellent opportunity for users to become familiar with its features and capabilities. 


Discover the full potential of the new tooltip feature by visiting our dedicated blog . Dive into the details and explore the comprehensive vision we’ve crafted for tooltips, designed to enhance your Power BI experience. 

We’ve listened to our community’s feedback on improving our tabular visuals (Table and Matrix), and we’re excited to initiate their transformation. Drawing inspiration from the familiar PivotTable in Excel , we aim to build new features and capabilities upon a stronger foundation. In our May update, we’re introducing ‘ Layouts for Matrix .’ Now, you can select from compact , outline , or tabular layouts to alter the arrangement of components in a manner akin to Excel. 


As an extension of the new layout options, report creators can now craft custom layout patterns by repeating row headers. This powerful control, inspired by Excel’s PivotTable layout, enables the creation of a matrix that closely resembles the look and feel of a table. This enhancement not only provides greater flexibility but also brings a touch of Excel’s intuitive design to Power BI’s matrix visuals. Only available for Outline and Tabular layouts.


To further align with Excel’s functionality, report creators now have the option to insert blank rows within the matrix. This feature allows for the separation of higher-level row header categories, significantly enhancing the readability of the report. It’s a thoughtful addition that brings a new level of clarity and organization to Power BI’s matrix visuals and opens a path for future enhancements for totals/subtotals and rows/column headers. 


We understand your eagerness to delve deeper into the matrix layouts and grasp how these enhancements fulfill the highly requested features by our community. Find out more and join the conversation in our dedicated blog , where we unravel the details and share the community-driven vision behind these improvements. 

Following last month’s introduction of the initial line enhancements, May brings a groundbreaking set of line capabilities that are set to transform your Power BI experience: 

  • Hide/Show lines : Gain control over the visibility of your lines for a cleaner, more focused report. 
  • Customized line pattern : Tailor the pattern of your lines to match the style and context of your data. 
  • Auto-scaled line pattern : Ensure your line patterns scale perfectly with your data, maintaining consistency and clarity. 
  • Line dash cap : Customize the end caps of your customized dashed lines for a polished, professional look. 
  • Line upgrades across other line types : Experience improvements in reference lines, forecast lines, leader lines, small multiple gridlines, and the new card’s divider line. 

These enhancements are not to be missed. We recommend visiting our dedicated blog for an in-depth exploration of all the new capabilities added to lines, keeping you informed and up to date. 

This May release, we’re excited to introduce on-object formatting support for Small multiples , Waterfall , and Matrix visuals. This new feature allows users to interact directly with these visuals for a more intuitive and efficient formatting experience. By double-clicking on any of these visuals, users can now right-click on the specific visual component they wish to format, bringing up a convenient mini-toolbar. This streamlined approach not only saves time but also enhances the user’s ability to customize and refine their reports with ease. 


We’re also thrilled to announce a significant enhancement to the mobile reporting experience with the introduction of the pane manager for the mobile layout view. This innovative feature empowers users to effortlessly open and close panels via a dedicated menu, streamlining the design process of mobile reports. 


We recently announced a public preview for folders in workspaces, allowing you to create a hierarchical structure for organizing and managing your items. In the latest Desktop release, you can now publish your reports to specific folders in your workspace.  

When you publish a report, you can choose the specific workspace and folder for your report. The interface is simple and easy to understand, making organizing your Power BI content from Desktop better than ever.


To publish reports to specific folders in the service, make sure the “Publish dialogs support folder selection” setting is enabled in the Preview features tab in the Options menu. 


Learn more about folders in workspaces.   

We’re excited to preview a new capability for Power BI Copilot allowing you to ask questions about the data in your model! You could already ask questions about the data present in the visuals on your report pages – and now you can go deeper by getting answers directly from the underlying model. Just ask questions about your data, and if the answer isn’t already on your report, Copilot will then query your model for the data instead and return the answer to your question in the form of a visual! 


We’re starting this capability off in both Edit and View modes in the Power BI Service. Because this is a preview feature, you’ll need to enable it via the preview toggle in the Copilot pane. You can learn more about all the details of the feature in our announcement post.

We are excited to announce the general availability of DAX query view. DAX query view is the fourth view in Power BI Desktop to run DAX queries on your semantic model.  

DAX query view comes with several ways to help you be as productive as possible with DAX queries. 

  • Quick queries. Have the DAX query written for you from the context menu of tables, columns, or measures in the Data pane of DAX query view. Get the top 100 rows of a table, statistics of a column, or DAX formula of a measure to edit and validate in just a couple clicks! 
  • DirectQuery model authors can also use DAX query view. View the data in your tables whenever you want! 
  • Create and edit measures. Edit one or multiple measures at once. Make changes and see the change in action in a DA query. Then update the model when you are ready. All in DAX query view! 
  • See the DAX query of visuals. Investigate the visual’s DAX query in DAX query view. Go to the Performance Analyzer pane and choose “Run in DAX query view”. 
  • Write DAX queries. You can create DAX queries with Intellisense, formatting, commenting/uncommenting, and syntax highlighting. And additional professional code editing experiences such as “Change all occurrences” and block folding to expand and collapse sections. Even expanded find and replace options with regex. 

Learn more about DAX query view with these resources: 

  • Deep dive blog: https://powerbi.microsoft.com/blog/deep-dive-into-dax-query-view-and-writing-dax-queries/  
  • Learn more: https://learn.microsoft.com/power-bi/transform-model/dax-query-view  
  • Video: https://youtu.be/oPGGYLKhTOA?si=YKUp1j8GoHHsqdZo  
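
Outside Desktop, the same kind of DAX query can be run against a published semantic model through the Power BI REST API. Below is a minimal Python sketch of the executeQueries endpoint; the dataset id, access token, and Sales table are illustrative placeholders, not values from this post.

```python
import json
import urllib.request

PBI_API = "https://api.powerbi.com/v1.0/myorg"

def build_execute_queries_request(dataset_id: str, dax: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for POST .../datasets/{id}/executeQueries."""
    url = f"{PBI_API}/datasets/{dataset_id}/executeQueries"
    body = json.dumps({
        "queries": [{"query": dax}],
        "serializerSettings": {"includeNulls": True},
    }).encode("utf-8")
    return url, body

def run_dax(dataset_id: str, dax: str, token: str) -> dict:
    """Execute the DAX query; requires a valid Azure AD access token."""
    url, body = build_execute_queries_request(dataset_id, dax)
    req = urllib.request.Request(
        url, data=body, method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example (placeholder values):
# run_dax("<dataset-id>", "EVALUATE TOPN(100, Sales)", "<token>")
```

The “quick query” that fetches the top 100 rows of a table in DAX query view is the same `EVALUATE TOPN(...)` pattern shown in the placeholder call above.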

DAX query view includes an inline Fabric Copilot to write and explain DAX queries, which remains in public preview. This month we have made the following updates. 

  1. Run the DAX query before you keep it. Previously the Run button was disabled until the generated DAX query was accepted or Copilot was closed. Now you can run the DAX query and then decide to keep or discard it. 

2. Conversationally build the DAX query. Previously the generated DAX query was not taken into account when you typed additional prompts; you had to keep the DAX query, select it again, then use Copilot again to adjust it. Now you can simply refine it by typing additional prompts. 

3. Syntax checks on the generated DAX query. Previously there was no syntax check before the generated DAX query was returned. Now the syntax is checked, and the prompt automatically retried once. If the retry is also invalid, the generated DAX query is returned with a note that there is an issue, giving you the option to rephrase your request or fix the generated DAX query. 

4. Inspire buttons to get you started with Copilot. Previously nothing happened until a prompt was entered. Now click any of these buttons to quickly see what you can do with Copilot! 

Learn more about DAX queries with Copilot with these resources: 

  • Deep dive blog: https://powerbi.microsoft.com/en-us/blog/deep-dive-into-dax-query-view-with-copilot/  
  • Learn more: https://learn.microsoft.com/en-us/dax/dax-copilot  
  • Video: https://www.youtube.com/watch?v=0kE3TE34oLM  

We are excited to introduce you to the redesigned ‘Manage relationships’ dialog in Power BI Desktop! To open this dialog simply select the ‘Manage relationships’ button in the modeling ribbon.

Once opened, you’ll find a comprehensive view of all your relationships, along with their key properties, all in one convenient location. From here you can create new relationships or edit an existing one.

Additionally, you have the option to filter and focus on specific relationships in your model based on cardinality and cross filter direction. 

Learn more about creating and managing relationships in Power BI Desktop in our documentation . 

Ever since we released composite models on Power BI semantic models and Analysis Services , you have been asking us to support the refresh of calculated columns and tables in the Service. This month, we have enabled the refresh of calculated columns and tables in Service for any DirectQuery source that uses single sign-on authentication. This includes the sources you use when working with composite models on Power BI semantic models and Analysis Services.  

Previously, the refresh of a semantic model that uses a DirectQuery source with single-sign-on authentication failed with one of the following error messages: “Refresh is not supported for datasets with a calculated table or calculated column that depends on a table which references Analysis Services using DirectQuery.” or “Refresh over a dataset with a calculated table or a calculated column which references a Direct Query data source is not supported.” 

Starting today, you can successfully refresh the calculated tables and calculated columns in a semantic model in the Service using specific credentials, as long as you have: 

  • Used a shareable cloud connection and assigned it, and/or
  • Enabled granular access control for all data connection types.

Here’s how to do this: 

  1. Create and publish your semantic model that uses a single sign-on DirectQuery source. This can be a composite model but doesn’t have to be. 
  2. In the semantic model settings, under Gateway and cloud connections, map each single sign-on DirectQuery connection to a specific connection. If you don’t have a specific connection yet, select ‘Create a connection’ to create it: 

  • If you are creating a new connection, fill out the connection details and click Create, making sure to select ‘Use SSO via Azure AD for DirectQuery queries’: 

  • Finally, select the connection for each single sign-on DirectQuery source and select Apply: 

3. Either refresh the semantic model manually or plan a scheduled refresh to confirm the refresh now works successfully. Congratulations, you have successfully set up refresh for semantic models with a single sign-on DirectQuery connection that uses calculated columns or calculated tables!
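
Once the connections are mapped, the refresh itself can also be triggered programmatically. Below is a hedged Python sketch of the Power BI refreshes endpoint; the workspace (group) id, dataset id, and token are placeholders, not values from this post.

```python
import urllib.request

def build_refresh_url(group_id: str, dataset_id: str) -> str:
    """URL for POST .../groups/{groupId}/datasets/{datasetId}/refreshes."""
    return (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
            f"/datasets/{dataset_id}/refreshes")

def trigger_refresh(group_id: str, dataset_id: str, token: str) -> int:
    """Start an on-demand refresh; the API returns 202 Accepted on success."""
    req = urllib.request.Request(
        build_refresh_url(group_id, dataset_id), data=b"{}", method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example (placeholder ids): trigger_refresh("<group-id>", "<dataset-id>", "<token>")
```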

We are excited to announce the general availability of Model Explorer in the Model view of Power BI, including the authoring of calculation groups. Semantic modeling is even easier with an at-a-glance tree view with item counts, search, and in-context paths to edit semantic model items in Model Explorer. Top-level semantic model properties are also available, as well as the option to quickly create relationships in the properties pane. Additionally, the styling of the Data pane has been updated to the Fluent UI used in Office and Teams.  

A popular community request from the Ideas forum, authoring calculation groups is also included in Model Explorer. Calculation groups significantly reduce the number of redundant measures by allowing you to define DAX formulas as calculation items that can be applied to existing measures. For example, define a year over year, prior month, conversion, or whatever your report needs in DAX formula once as a calculation item and reuse it with existing measures. This can reduce the number of measures you need to create and make the maintenance of the business logic simpler.  

Available in both Power BI Desktop and when editing a semantic model in the workspace, take your semantic model authoring to the next level today!  

Learn more about Model Explorer and authoring calculation groups with these resources: 

  • Use Model explorer in Power BI (preview) – Power BI | Microsoft Learn  
  • Create calculation groups in Power BI (preview) – Power BI | Microsoft Learn  

Data connectivity  

We’re happy to announce that the Oracle database connector has been enhanced this month with the addition of Single Sign-On support in the Power BI service with Microsoft Entra ID authentication.  

Microsoft Entra ID SSO enables single sign-on to access data sources that rely on Microsoft Entra ID based authentication. When you configure Microsoft Entra SSO for an applicable data source, queries run under the Microsoft Entra identity of the user that interacts with the Power BI report. 

We’re pleased to announce the new and updated connectors in this release:   

  • [New] OneStream : The OneStream Power BI Connector enables you to seamlessly connect Power BI to your OneStream applications by simply logging in with your OneStream credentials. The connector uses your OneStream security, allowing you to access only the data you have based on your permissions within the OneStream application. Use the connector to pull cube and relational data along with metadata members, including all their properties. Visit OneStream Power BI Connector to learn more. Find this connector in the other category. 
  • [New] Zendesk Data : A new connector developed by the Zendesk team that aims to go beyond the functionality of the existing Zendesk legacy connector created by Microsoft. Learn more about what this new connector brings. 
  • [New] CCH Tagetik 
  • [Update] Azure Databricks  

Are you interested in creating your own connector and publishing it for your customers? Learn more about the Power Query SDK and the Connector Certification program .   

Last May, we announced the integration between Power BI and OneDrive and SharePoint. Previously, this capability was limited to only reports with data in import mode. We’re excited to announce that you can now seamlessly view Power BI reports with live connected data directly in OneDrive and SharePoint! 

When working on Power BI Desktop with a report live connected to a semantic model in the service, you can easily share a link to collaborate with others on your team and allow them to quickly view the report in their browser. We’ve made it easier than ever to access the latest data updates without ever leaving your familiar OneDrive and SharePoint environments. This integration streamlines your workflows and allows you to access reports within the platforms you already use. With collaboration at the heart of this improvement, teams can work together more effectively to make informed decisions by leveraging live connected semantic models without being limited to data only in import mode.  

Utilizing OneDrive and SharePoint allows you to take advantage of built-in version control, always have your files available in the cloud, and use familiar and simple sharing.  

While you told us that you appreciate the ability to limit the image view to only those who have permission to view the report, you asked for changes to the “Public snapshot” mode.   

To address some of the feedback we got from you, we have made a few more changes in this area.  

  • Add-ins that were saved as “Public snapshot” can be printed and will not require that you go over all the slides and load the add-ins for permission check before the public image is made visible. 
  • You can use the “Show as saved image” on add-ins that were saved as “Public snapshot”. This will replace the entire add-in with an image representation of it, so the load time might be faster when you are presenting your presentation. 

Many of us keep presentations open for a long time, which might cause the data in the presentation to become outdated.  

To make sure your slides contain the data you need, we added a new notification that tells you when more up-to-date data exists in Power BI and offers you the option to refresh and get the latest data from Power BI. 

Developers 

Direct Lake semantic models are now supported in Fabric Git Integration , enabling streamlined version control, enhanced collaboration among developers, and the establishment of CI/CD pipelines for your semantic models using Direct Lake. 

Learn more about version control, testing, and deployment of Power BI content in our Power BI implementation planning documentation: https://learn.microsoft.com/power-bi/guidance/powerbi-implementation-planning-content-lifecycle-management-overview  

Visualizations 

Editor’s pick of the quarter:

  • Animator for Power BI
  • Innofalls Charts
  • SuperTables
  • Sankey Diagram for Power BI by ChartExpo
  • Dynamic KPI Card by Sereviso
  • Shielded HTML Viewer
  • Text search slicer

New visuals in AppSource 

  • Mapa Polski – Województwa, Powiaty, Gminy
  • Workstream
  • Income Statement Table
  • Gas Detection Chart
  • Seasonality Chart
  • PlanIn BI – Data Refresh Service
  • Chart Flare
  • PictoBar
  • ProgBar
  • Counter Calendar
  • Donut Chart image

Financial Reporting Matrix by Profitbase 

Making financial statements with a proper layout has just become easier with the latest version of the Financial Reporting Matrix. 

Users are now able to specify which rows should be classified as cost-rows, which will make it easier to get the conditional formatting of variances correctly: 

Selecting a row and ticking “is cost” will tag the row as a cost. This can be used in conditional formatting to make sure that positive variances on expenses are bad for the result, while a positive variance on an income row is good for the result. 

The new version also includes more flexibility in measure placement and column subtotals. 

Measures can be placed either: 

  • Default (below column headers) 
  • Above column headers 

  • Conditionally hide columns 
  • And much more 

Highlighted new features:  

  • Measure placement – In rows  
  • Select Column Subtotals  
  • New Format Pane design 
  • Row Options  

Get the visual from AppSource and find more videos here ! 

Horizon Chart by Powerviz  

A Horizon Chart is an advanced visual for time-series data, revealing trends and anomalies. It displays stacked data layers, allowing users to compare multiple categories while maintaining data clarity. Horizon Charts are particularly useful for monitoring and analyzing complex data over time, making this a valuable visual for data analysis and decision-making. 

Key Features:  

  • Horizon Styles: Choose Natural, Linear, or Step with adjustable scaling. 
  • Layer: Layer data by range or custom criteria. Display positive and negative values together or separately on top. 
  • Reference Line : Highlight patterns with X-axis lines and labels. 
  • Colors: Apply 30+ color palettes and use FX rules for dynamic coloring. 
  • Ranking: Filter Top/Bottom N values, with “Others”. 
  • Gridline: Add gridlines to the X and Y axis.  
  • Custom Tooltip: Add highest, lowest, mean, and median points without additional DAX. 
  • Themes: Save designs and share seamlessly with JSON files. 

Other features included are ranking, annotation, grid view, show condition, and accessibility support.  

Business Use Cases: Time-Series Data Comparison, Environmental Monitoring, Anomaly Detection 

🔗 Try Horizon Chart for FREE from AppSource  

📊 Check out all features of the visual: Demo file  

📃 Step-by-step instructions: Documentation  

💡 YouTube Video: Video Link  

📍 Learn more about visuals: https://powerviz.ai/  

✅ Follow Powerviz : https://lnkd.in/gN_9Sa6U  

Exciting news! Thanks to your valuable feedback, we’ve enhanced our Milestone Trend Analysis Chart even further. We’re thrilled to announce that you can now switch between horizontal and vertical orientations, catering to your preferred visualization style.

The Milestone Trend Analysis (MTA) Chart remains your go-to tool for swiftly identifying deadline trends, empowering you to take timely corrective actions. With this update, we aim to enhance deadline awareness among project participants and stakeholders alike. 

In our latest version, you can seamlessly navigate between horizontal and vertical views within the familiar Power BI interface. No need to adapt to a new user interface – enjoy the same ease of use with added flexibility. Plus, it benefits from supported features like themes, interactive selection, and tooltips. 

What’s more, ours is the only Microsoft Certified Milestone Trend Analysis Chart for Power BI, ensuring reliability and compatibility with the platform. 

Ready to experience the enhanced Milestone Trend Analysis Chart? Download it from AppSource today and explore its capabilities with your own data – try for free!  

We welcome any questions or feedback at our website: https://visuals.novasilva.com/ . Try it out and elevate your project management insights now! 

Sunburst Chart by Powerviz  

Powerviz’s Sunburst Chart is an interactive tool for hierarchical data visualization. With this chart, you can easily visualize multiple columns in a hierarchy and uncover valuable insights. The concentric circle design helps in displaying part-to-whole relationships. 

  • Arc Customization: Customize shapes and patterns. 
  • Color Scheme: Accessible palettes with 30+ options. 
  • Centre Circle: Design an inner circle with layers. Add text, measure, icons, and images. 
  • Conditional Formatting: Easily identify outliers based on measure or category rules. 
  • Labels: Smart data labels for readability. 
  • Image Labels: Add an image as an outer label. 
  • Interactivity: Zoom, drill down, cross-filtering, and tooltip features. 

Other features included are annotation, grid view, show condition, and accessibility support.  

Business Use Cases:   

  • Sales and Marketing: Market share analysis and customer segmentation. 
  • Finance : Department budgets and expenditures distribution. 
  • Operations : Supply chain management. 
  • Education : Course structure, curriculum creation. 
  • Human Resources : Organization structure, employee demographics.

🔗 Try Sunburst Chart for FREE from AppSource  

Stacked Bar Chart with Line by JTA  

Clustered bar chart with the possibility to stack one of the bars  

Stacked Bar Chart with Line by JTA seamlessly merges the simplicity of a traditional bar chart with the versatility of a stacked bar, revolutionizing the way you showcase multiple datasets in a single, cohesive display. 

Unlocking a new dimension of insight, our visual features a dynamic line that provides a snapshot of data trends at a glance. Navigate through your data effortlessly with multiple configurations, gaining a swift and comprehensive understanding of your information. 

Tailor your visual experience with an array of functionalities and customization options, enabling you to effortlessly compare a primary metric with the performance of an entire set. The flexibility to customize the visual according to your unique preferences empowers you to harness the full potential of your data. 

Features of Stacked Bar Chart with Line:  

  • Stack the second bar 
  • Format the Axis and Gridlines 
  • Add a legend 
  • Format the colors and text 
  • Add a line chart 
  • Format the line 
  • Add marks to the line 
  • Format the labels for bars and line 

If you liked what you saw, you can try it for yourself and find more information here . Also, if you want to download it, you can find the visual package on the AppSource . 

We have added an exciting new feature to our Combo PRO, Combo Bar PRO, and Timeline PRO visuals – Legend field support . The Legend field makes it easy to visually split series values into smaller segments, without the need to use measures or create separate series. Simply add a column with category names that are adjacent to the series values, and the visual will do the following:  

  • Display separate segments as a stack or cluster, showing how each segment contributed to the total Series value. 
  • Create legend items for each segment to quickly show/hide them without filtering.  
  • Apply custom fill colors to each segment.  
  • Show each segment value in the tooltip 

Read more about the Legend field in our blog article. 

Drill Down Combo PRO is made for creators who want to build visually stunning and user-friendly reports. Cross-chart filtering and intuitive drill down interactions make data exploration easy and fun for any user. Furthermore, you can choose between three chart types – columns, lines, or areas; and feature up to 25 different series in the same visual and configure each series independently.  

📊 Get Drill Down Combo PRO on AppSource  

🌐 Visit Drill Down Combo PRO product page  

Documentation | ZoomCharts Website | Follow ZoomCharts on LinkedIn  

We are thrilled to announce that Fabric Core REST APIs are now generally available! This marks a significant milestone in the evolution of Microsoft Fabric, a platform that has been meticulously designed to empower developers and businesses alike with a comprehensive suite of tools and services. 

The Core REST APIs are the backbone of Microsoft Fabric, providing the essential building blocks for a myriad of functionalities within the platform. They are designed to improve efficiency, reduce manual effort, increase accuracy, and lead to faster processing times. These APIs help you scale operations more easily as the volume of work grows, automate repeatable processes with consistency, and integrate with other systems and applications, providing a streamlined and efficient data pipeline. 

The Microsoft Fabric Core APIs encompass a range of functionalities, including: 

  • Workspace management: APIs to manage workspaces, including permissions.  
  • Item management: APIs for creating, reading, updating, and deleting items, with partial support for data source discovery and granular permissions management planned for the near future. 
  • Job and tenant management: APIs to manage jobs, tenants, and users within the platform. 

These APIs adhere to industry standards and best practices, ensuring a unified developer experience that is both coherent and easy to use. 

For developers looking to dive into the details of the Microsoft Fabric Core APIs, comprehensive documentation is available. This includes guidelines on API usage, examples, and articles managed in a centralized repository for ease of access and discoverability. The documentation is continuously updated to reflect the latest features and improvements, ensuring that developers have the most current information at their fingertips. See Microsoft Fabric REST API documentation  

We’re excited to share an important update we made to the Fabric Admin APIs. This enhancement is designed to simplify your automation experience. Now, you can manage both Power BI and the new Fabric items (previously referred to as artifacts) using the same set of APIs. Before this enhancement, you had to navigate using two different APIs—one for Power BI items and another for new Fabric items. That’s no longer the case. 

The APIs we’ve updated include GetItem , ListItems , GetItemAccessDetails , and GetAccessEntities . These enhancements mean you can now query and manage all your items through a single API call, regardless of whether they’re Fabric types or Power BI types. We hope this update makes your work more straightforward and helps you accomplish your tasks more efficiently. 

We’re thrilled to announce the public preview of the Microsoft Fabric workload development kit. This feature now extends to additional workloads and offers a robust developer toolkit for designing, developing, and interoperating with Microsoft Fabric using frontend SDKs and backend REST APIs. Introducing the Microsoft Fabric Workload Development Kit . 

The Microsoft Fabric platform now provides a mechanism for ISVs and developers to integrate their new and existing applications natively into Fabric’s workload hub. This integration provides the ability to add net new capabilities to Fabric in a consistent experience without leaving their Fabric workspace, thereby accelerating data driven outcomes from Microsoft Fabric. 

By downloading and leveraging the development kit , ISVs and software developers can build and scale existing and new applications on Microsoft Fabric and offer them via the Azure Marketplace without the need to ever leave the Fabric environment. 

The development kit provides a comprehensive guide and sample code for creating custom item types that can be added to the Fabric workspace. These item types can leverage the Fabric frontend SDKs and backend REST APIs to interact with other Fabric capabilities, such as data ingestion, transformation, orchestration, visualization, and collaboration. You can also embed your own data application into the Fabric item editor using the Fabric native experience components, such as the header, toolbar, navigation pane, and status bar. This way, you can offer consistent and seamless user experience across different Fabric workloads. 

This is a call to action for ISVs, software developers, and system integrators. Let’s leverage this opportunity to create more integrated and seamless experiences for our users. 

We’re excited about this journey and look forward to seeing the innovative workloads from our developer community. 

We are proud to announce the public preview of external data sharing. Sharing data across organizations has become a standard part of day-to-day business for many of our customers. External data sharing, built on top of OneLake shortcuts, enables seamless, in-place sharing of data, allowing you to maintain a single copy of data even when sharing data across tenant boundaries. Whether you’re sharing data with customers, manufacturers, suppliers, consultants, or partners; the applications are endless. 

How external data sharing works  

Sharing data across tenants is as simple as any other share operation in Fabric. To share data, navigate to the item to be shared, click on the context menu, and then click on External data share . Select the folder or table you want to share and click Save and continue . Enter the email address and an optional message and then click Send . 

The data consumer will receive an email containing a share link. They can click on the link to accept the share and access the data within their own tenant. 

Click here for more details about external data sharing . 

Following the release of OneLake data access roles in public preview, the OneLake team is excited to announce the availability of APIs for managing data access roles. These APIs can be used to programmatically manage granular data access for your lakehouses. Manage all aspects of role management such as creating new roles, editing existing ones, or changing memberships in a programmatic way.  
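
As a sketch of what programmatic role management might look like, the helpers below build the request URL and a minimal role payload. Both the path and the payload shape are assumptions for illustration, not taken from the official API reference; verify them against the documentation before use.

```python
def data_access_roles_url(workspace_id: str, item_id: str) -> str:
    """URL for listing or updating OneLake data access roles on an item.
    The exact path is an assumption for illustration only."""
    return (f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
            f"/items/{item_id}/dataAccessRoles")

def make_role(name: str, members: list[str]) -> dict:
    """Minimal role payload (assumed shape): a named role plus member ids."""
    return {"name": name, "members": members}

# Example (placeholder ids):
# url = data_access_roles_url("<workspace-id>", "<lakehouse-id>")
# role = make_role("FinanceReaders", ["<entra-object-id>"])
```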

Do you have data stored on-premises or behind a firewall that you want to access and analyze with Microsoft Fabric? With OneLake shortcuts, you can bring on-premises or network-restricted data into OneLake, without any data movement or duplication. Simply install the Fabric on-premises data gateway and create a shortcut to your S3 compatible, Amazon S3, or Google Cloud Storage data source. Then use any of Fabric’s powerful analytics engines and OneLake open APIs to explore, transform, and visualize your data in the cloud. 

Try it out today and unlock the full potential of your data with OneLake shortcuts! 

Data Warehouse 

We are excited to announce Copilot for Data Warehouse in public preview! Copilot for Data Warehouse is an AI assistant that helps developers generate insights through T-SQL exploratory analysis. Copilot is contextualized to your warehouse’s schema. With this feature, data engineers and data analysts can use Copilot to: 

  • Generate T-SQL queries for data analysis.  
  • Explain and add in-line code comments for existing T-SQL queries. 
  • Fix broken T-SQL code. 
  • Receive answers regarding general data warehousing tasks and operations. 

There are three areas where Copilot is surfaced in the Data Warehouse SQL Query Editor: 

  • Code completions when writing a T-SQL query. 
  • Chat panel to interact with the Copilot in natural language. 
  • Quick action buttons to fix and explain T-SQL queries. 

Learn more about Copilot for Data Warehouse: aka.ms/data-warehouse-copilot-docs. Copilot for Data Warehouse is currently only available in the Warehouse. Copilot in the SQL analytics endpoint is coming soon. 

Unlocking Insights through Time: Time travel in Data warehouse (public preview)

As data volumes continue to grow in today’s rapidly evolving world of Artificial Intelligence, it is crucial to reflect on historical data. It empowers businesses to derive valuable insights that aid in making well-informed decisions for the future. Preserving multiple historical data versions not only incurred significant costs but also presented challenges in upholding data integrity, resulting in a notable impact on query performance. So, we are thrilled to announce the ability to query the historical data through time travel at the T-SQL statement level which helps unlock the evolution of data over time. 

The Fabric warehouse retains historical versions of tables for seven calendar days. This retention allows for querying the tables as if they existed at any point within the retention timeframe. A time travel clause can be included in any top-level SELECT statement. For complex queries that involve multiple tables, joins, stored procedures, or views, the timestamp is applied just once for the entire query instead of specifying the same timestamp for each table within the same query. This ensures the entire query is executed with reference to the specified timestamp, maintaining the data’s uniformity and integrity throughout the query execution. 
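
Since the timestamp is applied once at the statement level, appending the clause to a query is mechanical. Here is a small Python helper sketching the syntax; the table name and timestamp are placeholders, so check the time travel documentation for the exact timestamp format your warehouse expects.

```python
def time_travel_query(select_sql: str, timestamp_utc: str) -> str:
    """Append Fabric Warehouse's statement-level time travel clause to a
    top-level SELECT. The single timestamp governs every table, join, and
    view referenced by the query."""
    return (f"{select_sql.rstrip().rstrip(';')} "
            f"OPTION (FOR TIMESTAMP AS OF '{timestamp_utc}');")

# Example (placeholder table and timestamp):
# time_travel_query("SELECT * FROM dbo.Sales", "2024-03-01T12:00:00")
```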

From historical trend analysis and forecasting to compliance management, stable reporting and real-time decision support, the benefits of time travel extend across multiple business operations. Embrace the capability of time travel to navigate the data-driven landscape and gain a competitive edge in today’s fast-paced world of Artificial Intelligence. 

We are excited to announce not one but two new enhancements to the Copy Into feature for Fabric Warehouse: Copy Into with Entra ID Authentication and Copy Into for Firewall-Enabled Storage!

Entra ID Authentication  

When authenticating storage accounts in your environment, the executing user’s Entra ID will now be used by default. This ensures that you can leverage Access Control Lists (ACLs) and Role-Based Access Control (RBAC) to authenticate to your storage accounts when using Copy Into. Currently, only organizational accounts are supported.  

How to Use Entra ID Authentication  

  • Ensure your Entra ID organizational account has access to the underlying storage and can execute the Copy Into statement on your Fabric Warehouse.  
  • Run your Copy Into statement without specifying any credentials; the Entra ID organizational account will be used as the default authentication mechanism.  
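
Because Entra ID is now the default, a Copy Into statement simply omits any CREDENTIAL clause. Here is a small Python helper sketching the shape of such a statement; the table name and storage URL are placeholders, and the WITH options beyond FILE_TYPE are omitted for brevity.

```python
def copy_into_statement(target_table: str, storage_url: str,
                        file_type: str = "PARQUET") -> str:
    """Build a COPY INTO statement for Fabric Warehouse. Omitting a
    CREDENTIAL clause means the executing user's Entra ID organizational
    account is used by default."""
    return (f"COPY INTO {target_table} FROM '{storage_url}' "
            f"WITH (FILE_TYPE = '{file_type}')")

# Example (placeholder table and URL):
# copy_into_statement("dbo.Region", "https://<account>.blob.core.windows.net/<container>/data/")
```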

Copy into firewall-enabled storage

The Copy Into for firewall-enabled storage leverages the trusted workspace access functionality ( Trusted workspace access in Microsoft Fabric (preview) – Microsoft Fabric | Microsoft Learn ) to establish a secure and seamless connection between Fabric and your storage accounts. Secure access can be enabled for both blob and ADLS Gen2 storage accounts. Secure access with Copy Into is available for warehouses in workspaces with Fabric Capacities (F64 or higher).  

To learn more about Copy Into, please refer to COPY INTO (Transact-SQL) – Azure Synapse Analytics and Microsoft Fabric | Microsoft Learn  

We are excited to announce the launch of our new feature, Just in Time Database Attachment, which will significantly enhance your first experience, such as when connecting to the Data Warehouse or SQL analytics endpoint, or simply opening an item. These actions trigger the workspace resource assignment process, during which, among other actions, we attach all the necessary metadata of your items (Data Warehouses and SQL analytics endpoints), which can be a long process, particularly for workspaces that have a high number of items.  

This feature is designed to attach your desired database during the activation process of your workspace, allowing you to execute queries immediately and avoid unnecessary delays. However, all other databases will be attached asynchronously in the background while you are able to execute queries, ensuring a smooth and efficient experience. 

Data Engineering 

We are advancing Fabric Runtime 1.3 from an Experimental Public Preview to a full Public Preview. Our Apache Spark-based big data execution engine, optimized for both data engineering and science workflows, has been updated and fully integrated into the Fabric platform. 

The enhancements in Fabric Runtime 1.3 include the incorporation of Delta Lake 3.1, compatibility with Python 3.11, support for Starter Pools, integration with Environment and library management capabilities. Additionally, Fabric Runtime now enriches the data science experience by supporting the R language and integrating Copilot. 


We are pleased to share that the Native Execution Engine for Fabric Runtime 1.2 is currently available in public preview. The Native Execution Engine can greatly enhance the performance for your Spark jobs and queries. The engine has been rewritten in C++ and operates in columnar mode and uses vectorized processing. The Native Execution Engine offers superior query performance – encompassing data processing, ETL, data science, and interactive queries – all directly on your data lake. Overall, Fabric Spark delivers a 4x speed-up on the sum of execution time of all 99 queries in the TPC-DS 1TB benchmark when compared against Apache Spark.  This engine is fully compatible with Apache Spark™ APIs (including Spark SQL API). 

It is seamless to use with no code changes: activate it and go. Enable it in your environment for your notebooks and your Spark Job Definitions (SJDs). 


This feature is in public preview, and at this stage of the preview there is no additional cost associated with using it. 

We are excited to announce the Spark Monitoring Run Series Analysis features, which allow you to analyze the run duration trend and performance comparison for Pipeline Spark activity recurring run instances and repetitive Spark run activities from the same Notebook or Spark Job Definition.   

  • Run Series Comparison: Users can compare the duration of a Notebook run with that of previous runs and evaluate the input and output data to understand the reasons behind prolonged run durations.  
  • Outlier Detection and Analysis: The system can detect outliers in the run series and analyze them to pinpoint potential contributing factors. 
  • Detailed Run Instance Analysis: Clicking on a specific run instance provides detailed information on time distribution, which can be used to identify performance enhancement opportunities. 
  • Configuration Insights: Users can view the Spark configuration used for each run, including auto-tuned configurations for Spark SQL queries in auto-tune enabled Notebook runs. 

You can access the new feature from the item’s recent runs panel and Spark application monitoring page. 


We are excited to announce that Notebook now supports the ability to tag others in comments, just like the familiar functionality of using Office products!   

When you select a section of code in a cell, you can add a comment with your insights and tag one or more teammates to collaborate or brainstorm on the specifics. This intuitive enhancement is designed to amplify collaboration in your daily development work. 

Moreover, you can easily configure permissions when tagging someone who doesn’t have permission, to make sure your code asset is well managed. 


We are thrilled to unveil a significant enhancement to the Fabric notebook ribbon, designed to elevate your data science and engineering workflows. 


In the new version, you will find the new Session connect control on the Home tab, and now you can start a standard session without needing to run a code cell. 


You can also easily spin up a High concurrency session and share the session across multiple notebooks to improve the compute resource utilization. And you can easily attach/leave a high concurrency session with a single click. 


The “View session information” option navigates you to the session information dialog, where you can find a lot of useful detailed information, as well as configure the session timeout. The diagnostics info is especially helpful when you need support for notebook issues. 


Now you can easily access the powerful Data Wrangler on the Home tab with the new ribbon! You can explore your data with the low-code experience of Data Wrangler; both pandas DataFrames and Spark DataFrames are supported.   


We recently made some changes to the Fabric notebook metadata to ensure compliance and consistency: 

Notebook file content: 

  • The keyword “trident” has been replaced with “dependencies” in the notebook content. This adjustment ensures consistency and compliance. 

Notebook Git format: 

  • The preface of the notebook has been modified from “# Synapse Analytics notebook source” to “# Fabric notebook source”. 
  • Additionally, the keyword “synapse” has been updated to “dependencies” in the Git repo. 

The above changes will be marked as ‘uncommitted’ one time if your workspace is connected to Git. No action is needed for these changes, and there won’t be any breaking scenario within the Fabric platform. If you have any further questions, feel free to share them with us. 

We are thrilled to announce that the environment is now a generally available item in Microsoft Fabric. During this GA timeframe, we have shipped a few new features of Environment. 

  • Git support  


The environment is now Git supported. You can check the environment into your Git repo and manipulate it locally with its YAML representation and custom library files. After updating the changes from local to the Fabric portal, you can publish them manually or through the REST API. 

  • Deployment pipeline  


Deploying environments from one workspace to another is supported.  Now, you can deploy the code items and their dependent environments together from development to test and even production. 

With the REST APIs, you can have a code-first experience with the same capabilities as the Fabric portal. We provide a set of powerful APIs to ensure efficiency in managing your environment. You can create new environments, update libraries and Spark compute, publish the changes, delete an environment, attach the environment to a notebook, and more; all actions can be done locally in the tools of your choice. The article – Best practice of managing environments with REST API – could help you get started with several real-world scenarios.  
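As an illustrative sketch (the endpoint paths, IDs, and token handling below are assumptions; check the Fabric REST API reference for the exact routes), creating and publishing an environment from a script might look like this:

```python
import json
import urllib.request

BASE = "https://api.fabric.microsoft.com/v1"  # Fabric REST API base URL

def fabric_request(method, path, token, body=None):
    """Build an authenticated request against the Fabric REST API; the
    caller sends it with urllib.request.urlopen(req)."""
    data = json.dumps(body).encode() if body is not None else None
    return urllib.request.Request(
        f"{BASE}{path}",
        data=data,
        method=method,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# Hypothetical workspace/environment IDs; the staging/publish route is an
# assumption based on the publish step described above.
ws, env = "<workspace-id>", "<environment-id>"
create = fabric_request("POST", f"/workspaces/{ws}/environments", "<token>",
                        {"displayName": "my-environment"})
publish = fabric_request("POST",
                         f"/workspaces/{ws}/environments/{env}/staging/publish",
                         "<token>")
```

The same pattern (build a request, send it, inspect the response) covers updating libraries, Spark compute, and deletion.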

  • Resources folder   


The Resources folder enables managing small resources in the development cycle. Files uploaded to the environment can be accessed from notebooks once they’re attached to the same environment. Manipulation of the files and folders of resources happens in real time, which can be especially powerful when you are collaborating with others. 


Sharing your environment with others is also available. We provide several sharing options. By default, the view permission is shared. If you want the recipient to have access to view and use the contents of the environment, sharing without permission customization is the best option. Furthermore, you can grant editing permission to allow recipients to update this environment or grant share permission to allow recipients to reshare this environment with their existing permissions. 

We are excited to announce REST API support for Fabric Data Engineering/Science workspace settings. Data Engineering/Science settings allow users to create/manage their Spark compute, select the default runtime/default environment, and enable or disable high concurrency mode or ML autologging.  


Now, with REST API support for the Data Engineering/Science settings, you are able to:  

  • Choose the default pool for a Fabric workspace 
  • Configure the max nodes for Starter pools 
  • Create/update/delete custom pools, including Autoscale and Dynamic allocation properties 
  • Choose the workspace default runtime and environment 
  • Enable or disable High Concurrency Mode 
  • Enable or disable ML autologging 

Learn more about the Workspace Spark Settings API in our API documentation Workspace Settings – REST API (Spark) | Microsoft Learn  
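As a hedged sketch (the payload field names below are assumptions inferred from the options listed above; verify them against the Workspace Settings API reference), a partial update body for the Spark settings might be assembled like this:

```python
import json

def spark_settings_patch(default_pool=None, high_concurrency=None,
                         ml_autolog=None):
    """Assemble a partial-update body for the workspace Spark settings API,
    including only the fields the caller wants to change. Field names are
    illustrative and should be checked against the API documentation."""
    body = {}
    if default_pool is not None:
        body["pool"] = {"defaultPool": {"name": default_pool,
                                        "type": "Workspace"}}
    if high_concurrency is not None:
        body["highConcurrency"] = {
            "notebookInteractiveRunEnabled": high_concurrency}
    if ml_autolog is not None:
        body["automaticLog"] = {"enabled": ml_autolog}
    return body

# Set a hypothetical custom pool as default and enable high concurrency:
payload = spark_settings_patch(default_pool="MyCustomPool",
                               high_concurrency=True)
print(json.dumps(payload, indent=2))
```

Sending only the changed fields keeps the request a true partial update, leaving the other workspace settings untouched.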

We are excited to give you a sneak peek at the preview of User Data Functions in Microsoft Fabric. User Data Functions gives developers and data engineers the ability to easily write and run applications that integrate with resources in the Fabric Platform. Data engineering often presents challenges with data quality or complex data analytics processing in data pipelines, and using ETL tools may present limited flexibility and ability to customize to your needs. This is where User data functions can be used to run data transformation tasks and perform complex business logic by connecting to your data sources and other workloads in Fabric.  

During preview, you will be able to use the following features:  

  • Use the Fabric portal to create new User Data Functions, view and test them.  
  • Write your functions using C#.   
  • Use the Visual Studio Code extension to create and edit your functions.  
  • Connect to the following Fabric-native data sources: Data Warehouse, Lakehouse and Mirrored Databases.   

You can now create a fully managed GraphQL API in Fabric to interact with your data in a simple, flexible, and powerful way. We’re excited to announce the public preview of API for GraphQL, a data access layer that allows us to query multiple data sources quickly and efficiently in Fabric by leveraging a widely adopted and familiar API technology that returns more data with less client requests.  With the new API for GraphQL in Fabric, data engineers and scientists can create data APIs to connect to different data sources, use the APIs in their workflows, or share the API endpoints with app development teams to speed up and streamline data analytics application development in your business. 

You can get started with the API for GraphQL in Fabric by creating an API, attaching a supported data source, then selecting specific data sets you want to expose through the API. Fabric builds the GraphQL schema automatically based on your data; you can test and prototype queries directly in our graphical in-browser GraphQL development environment (API editor), and applications are ready to connect in minutes. 

Currently, the following supported data sources can be exposed through the Fabric API for GraphQL: 

  • Microsoft Fabric Data Warehouse 
  • Microsoft Fabric Lakehouse via SQL Analytics Endpoint 
  • Microsoft Fabric Mirrored Databases via SQL Analytics Endpoint 

Click here to learn more about how to get started. 
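As a sketch of what a client call might look like (the endpoint URL, field names, and token below are hypothetical placeholders, not the schema Fabric would generate for your data), a standard GraphQL POST request can be built like this:

```python
import json
import urllib.request

def graphql_request(endpoint, query, token, variables=None):
    """Build a standard GraphQL-over-HTTP POST request; send it with
    urllib.request.urlopen(req) to execute the query."""
    body = json.dumps({"query": query, "variables": variables or {}}).encode()
    return urllib.request.Request(
        endpoint,
        data=body,
        method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )

# Hypothetical query against a hypothetical exposed data set:
query = """
query {
  customers(first: 5) {
    items { customerId name }
  }
}
"""
req = graphql_request("https://<graphql-endpoint>/graphql", query, "<token>")
```

The same request shape works from any language or HTTP client, which is part of what makes GraphQL endpoints easy to share with app development teams.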


Data Science 

As you may know, Copilot in Microsoft Fabric requires your tenant administrator to enable the feature from the admin portal. Starting May 20th, 2024, Copilot in Microsoft Fabric will be enabled by default for all tenants. This update is part of our continuous efforts to enhance user experience and productivity within Microsoft Fabric. This new default activation means that AI features like Copilot will be automatically enabled for tenants who have not yet enabled the setting.  

We are introducing a new capability to enable Copilot on Capacity level in Fabric. A new option is being introduced in the tenant admin portal, to delegate the enablement of AI and Copilot features to Capacity administrators.  This AI and Copilot setting will be automatically delegated to capacity administrators and tenant administrators won’t be able to turn off the delegation.   

We also have a cross-geo setting for customers who want to use Copilot and AI features while their capacity is in a different geographic region than the EU data boundary or the US. By default, the cross-geo setting will stay off and will not be delegated to capacity administrators automatically.  Tenant administrators can choose whether to delegate this to capacity administrators or not. 


Figure 1.  Copilot in Microsoft Fabric will be auto enabled and auto delegated to capacity administrators. 


Capacity administrators will see the “Copilot and Azure OpenAI Service (preview)” settings under Capacity settings/ Fabric Capacity / <Capacity name> / Delegated tenant settings. By default, the capacity setting will inherit tenant level settings. Capacity administrators can decide whether to override the tenant administrator’s selection. This means that even if Copilot is not enabled on a tenant level, a capacity administrator can choose to enable Copilot for their capacity. With this level of control, we make it easier to control which Fabric workspaces can utilize AI features like Copilot in Microsoft Fabric. 


To enhance privacy and trust, we’ve updated our approach to abuse monitoring: previously, we retained data from Copilot in Fabric, including prompt inputs and outputs, for up to 30 days to check for misuse. Following customer feedback, we’ve eliminated this 30-day retention. Now, we no longer store prompt related data, demonstrating our unwavering commitment to your privacy and security. We value your input and take your concerns seriously. 

Real-Time Intelligence 

This month includes the announcement of Real-Time Intelligence, the next evolution of Real-Time Analytics and Data Activator. With Real-Time Intelligence, Fabric extends to the world of streaming and high granularity data, enabling all users in your organization to collect, analyze, and act on this data in a timely manner, making faster and more informed business decisions. Read the full announcement from Build 2024. 

Real-Time Intelligence includes a wide range of capabilities across ingestion, processing, analysis, transformation, visualization and taking action. All of this is supported by the Real-Time hub, the central place to discover and manage streaming data and start all related tasks.  

Read on for more information on each capability and stay tuned for a series of blogs describing the features in more detail. All features are in Public Preview unless otherwise specified. Feedback on any of the features can be submitted at https://aka.ms/rtiidea    

Ingest & Process  

  • Introducing the Real-Time hub 
  • Get Events with new sources of streaming and event data 
  • Source from Real-Time Hub in Enhanced Eventstream  
  • Use Real-Time hub to Get Data in KQL Database in Eventhouse 
  • Get data from Real-Time Hub within Reflexes 
  • Eventstream Edit and Live modes 
  • Default and derived streams 
  • Route data streams based on content 

Analyze & Transform  

  • Eventhouse GA 
  • Eventhouse OneLake availability GA 
  • Create a database shortcut to another KQL Database 
  • Support for AI Anomaly Detector  
  • Copilot for Real-Time Intelligence 
  • Tenant-level private endpoints for Eventhouse 

Visualize & Act  

  • Visualize data with Real-Time Dashboards  
  • New experience for data exploration 
  • Create triggers from Real-Time Hub 
  • Set alert on Real-time Dashboards 
  • Taking action through Fabric Items 

Ingest & Process 

Real-Time hub is the single place for all data-in-motion across your entire organization. Several key features are offered in Real-Time hub: 

1. Single place for data-in-motion for the entire organization  

Real-Time hub enables users to easily discover, ingest, manage, and consume data-in-motion from a wide variety of sources. It lists all the streams and KQL tables that customers can directly act on. 

2. Real-Time hub is never empty  

All data streams in Fabric automatically show up in the hub. Also, users can subscribe to events in Fabric gaining insights into the health and performance of their data ecosystem. 

3. Numerous connectors to simplify data ingestion from anywhere to Real-Time hub  

Real-Time hub makes it easy for you to ingest data into Fabric from a wide variety of sources like AWS Kinesis, Kafka clusters, Microsoft streaming sources, sample data and Fabric events using the Get Events experience.  

There are 3 tabs in the hub:  

  • Data streams: This tab contains all streams that are actively running in Fabric that the user has access to. This includes all streams from Eventstreams and all tables from KQL Databases. 
  • Microsoft sources: This tab contains Microsoft sources (that the user has access to) that can be connected to Fabric. 
  • Fabric events : Fabric now has event-driven capabilities to support real-time notifications and data processing. Users can monitor and react to events including Fabric Workspace Item events and Azure Blob Storage events. These events can be used to trigger other actions or workflows, such as invoking a data pipeline or sending a notification via email. Users can also send these events to other destinations via Event Streams. 

Learn More  

You can now connect to data from both inside and outside of Fabric in just a few steps. Whether data is coming from new or existing sources, streams, or available events, the Get Events experience allows users to connect to a wide range of sources directly from Real-Time hub, Eventstreams, Eventhouse, and Data Activator.  

This enhanced capability allows you to easily connect external data streams into Fabric with an out-of-the-box experience, giving you more options and helping you get real-time insights from various sources. This includes Camel Kafka connectors powered by Kafka Connect to access popular data platforms, as well as Debezium connectors for fetching Change Data Capture (CDC) streams. 

Using Get Events, bring streaming data from Microsoft sources directly into Fabric with a first-class experience. Connectivity to notification sources and discrete events is also included; this enables access to notification events from Azure and other cloud solutions, including AWS and GCP. The full set of sources currently supported is: 

  • Microsoft sources : Azure Event Hubs, Azure IoT hub 
  • External sources : Google Cloud Pub/Sub, Amazon Kinesis Data Streams, Confluent Cloud Kafka 
  • Change data capture databases : Azure SQL DB (CDC), PostgreSQL DB (CDC), Azure Cosmos DB (CDC), MySQL DB (CDC)  
  • Fabric events : Fabric Workspace Item events, Azure Blob Storage events  


Learn More   

With enhanced Eventstream, you can now stream data not only from Microsoft sources but also from other platforms like Google Cloud, Amazon Kinesis, and database change data capture streams using our new messaging connectors. The new Eventstream also lets you acquire and route real-time data not only from stream sources but also from discrete event sources, such as Azure Blob Storage events and Fabric Workspace Item events. 

To use these new sources in Eventstream, simply create an eventstream and choose “Enhanced Capabilities (preview)”. 


You will see the new Eventstream homepage that gives you some choices to begin with. By clicking “Add external source”, you will find these sources in the Get Events wizard, which helps you set up the source in a few steps. After you add the source to your eventstream, you can publish it to stream the data into your eventstream.  

Use Eventstream with discrete sources to turn events into streams for more analysis. You can send the streams to different Fabric data destinations, like Lakehouse and KQL Database. After the events are converted, a default stream will appear in Real-Time hub. To convert them, click Edit on the ribbon, select “Stream events” on the source node, and publish your eventstream. 

To transform the stream data or route it to different Fabric destinations based on its content, click Edit on the ribbon to enter Edit mode. There you can add event processing operators and destinations. 

With Real-Time hub embedded in KQL Database experience, each user in the tenant can view and add streams which they have access to and directly ingest it to a KQL Database table in Eventhouse.  

This integration provides each user in the tenant with the ability to access and view the data streams they are permitted to see. They can now directly ingest these streams into a KQL Database table in Eventhouse, which simplifies data discovery and ingestion by allowing users to interact with the streams directly. Users can filter data based on Owner, Parent, and Location, and view additional information such as Endorsement and Sensitivity. 

You can access this by clicking on the Get Data button from the Database ribbon in Eventhouse. 


This will open the Get Data wizard with Real-Time hub embedded. 


You can use events from Real-Time hub directly in reflex items as well. From within the main reflex UI, click ‘Get data’ in the toolbar: 


This will open a wizard that allows you to connect to new event sources or browse Real-Time Hub to use existing streams or system events. 

Search new stream sources to connect to or select existing streams and tables to be ingested directly by Reflex. 


You then have access to the full reflex modeling experience to build properties and triggers over any events from Real-Time hub.  

Eventstream offers two distinct modes, Edit and Live, to provide flexibility and control over the development process of your eventstream. If you create a new Eventstream with Enhanced Capabilities enabled, you can modify it in Edit mode. Here, you can design stream processing operations for your data streams using a no-code editor. Once you complete the editing, you can publish your Eventstream and visualize how it starts streaming and processing data in Live mode.   


In Edit mode, you can:   

  • Make changes to an Eventstream without implementing them until you publish the Eventstream. This gives you full control over the development process.  
  • Avoid test data being streamed to your Eventstream. This mode is designed to provide a secure environment for testing without affecting your actual data streams. 

In Live mode, you can:  

  • Visualize how your Eventstream streams, transforms, and routes your data streams to various destinations after publishing the changes.  
  • Pause the flow of data on selected sources and destinations, providing you with more control over your data streams being streamed into your Eventstream.  

When you create a new Eventstream with Enhanced Capabilities enabled, you can now create and manage multiple data streams within Eventstream, which can then be displayed in the Real-Time hub for others to consume and perform further analysis.  

There are two types of streams:   

  • Default stream : Automatically generated when a streaming source is added to Eventstream. Default stream captures raw event data directly from the source, ready for transformation or analysis.  
  • Derived stream : A specialized stream that users can create as a destination within Eventstream. Derived stream can be created after a series of operations such as filtering and aggregating, and then it’s ready for further consumption or analysis by other users in the organization through the Real-Time Hub.  

The following example shows that when creating a new Eventstream a default stream alex-es1-stream is automatically generated. Subsequently, a derived stream dstream1 is added after an Aggregate operation within the Eventstream. Both default and derived streams can be found in the Real-Time hub.  


Customers can now perform stream operations directly within Eventstream’s Edit mode, instead of embedding in a destination. This enhancement allows you to design stream processing logics and route data streams in the top-level canvas. Custom processing and routing can be applied to individual destinations using built-in operations, allowing for routing to distinct destinations within the Eventstream based on different stream content. 

These operations include:  

  • Aggregate : Perform calculations such as SUM, AVG, MIN, and MAX on a column of values and return a single result. 
  • Expand : Expand array values and create new rows for each element within the array.  
  • Filter : Select or filter specific rows from the data stream based on a condition. 
  • Group by : Aggregate event data within a certain time window, with the option to group one or more columns.  
  • Manage Fields : Customize your data streams by adding, removing, or changing data type of a column.  
  • Union : Merge two or more data streams with shared fields (same name and data type) into a unified data stream.  
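As a plain-Python analogy (this is not Eventstream's actual engine, just an illustration of the semantics), the Filter, Union, Group by, and Aggregate operations above behave roughly like these list operations over batches of events:

```python
# Two hypothetical batches of sensor events sharing the same fields.
events_a = [
    {"sensor": "t1", "temp": 21.5}, {"sensor": "t1", "temp": 35.0},
    {"sensor": "t2", "temp": 19.0},
]
events_b = [{"sensor": "t3", "temp": 40.5}]

# Filter: select rows from the stream based on a condition.
hot = [e for e in events_a if e["temp"] > 20]

# Union: merge streams with shared fields (same name and type).
merged = events_a + events_b

# Group by + Aggregate: average temperature per sensor (in Eventstream this
# would also be scoped to a time window).
sums = {}
for e in merged:
    total, n = sums.get(e["sensor"], (0.0, 0))
    sums[e["sensor"]] = (total + e["temp"], n + 1)
avg_by_sensor = {s: total / n for s, (total, n) in sums.items()}
print(avg_by_sensor)
```

In the real Eventstream editor these operators are chained visually on the canvas, with each operator's output feeding the next node or a destination.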

Analyze & Transform 

Eventhouse, a cutting-edge database workspace meticulously crafted to manage and store event-based data, is now officially available for general use. Optimized for high granularity, velocity, and low latency streaming data, it incorporates indexing and partitioning for structured, semi-structured, and free text data. With Eventhouse, users can perform high-performance analysis of big data and real-time data querying, processing billions of events within seconds. The platform allows users to organize data into compartments (databases) within one logical item, facilitating efficient data management.  

Additionally, Eventhouse enables the sharing of compute and cache resources across databases, maximizing resource utilization. It also supports high-performance queries across databases and allows users to apply common policies seamlessly. Eventhouse offers content-based routing to multiple databases, full view lineage, and high granularity permission control, ensuring data security and compliance. Moreover, it provides a simple migration path from Azure Synapse Data Explorer and Azure Data Explorer, making adoption seamless for existing users. 


Engineered to handle data in motion, Eventhouse seamlessly integrates indexing and partitioning into its storing process, accommodating various data formats. This sophisticated design empowers high-performance analysis with minimal latency, facilitating lightning-fast ingestion and querying within seconds. Eventhouse is purpose-built to deliver exceptional performance and efficiency for managing event-based data across diverse applications and industries. Its intuitive features and seamless integration with existing Azure services make it an ideal choice for organizations looking to leverage real-time analytics for actionable insights. Whether it’s analyzing telemetry and log data, time series and IoT data, or financial records, Eventhouse provides the tools and capabilities needed to unlock the full potential of event-based data. 

We’re excited to announce that OneLake availability of Eventhouse in Delta Lake format is Generally Available. 

Delta Lake  is the unified data lake table format chosen to achieve seamless data access across all compute engines in Microsoft Fabric. 

The data streamed into Eventhouse is stored in an optimized columnar storage format with full text indexing and supports complex analytical queries at low latency on structured, semi-structured, and free text data. 

Enabling data availability of Eventhouse in OneLake means that customers can enjoy the best of both worlds: they can query the data with high performance and low latency in their  Eventhouse and query the same data in Delta Lake format via any other Fabric engines such as Power BI Direct Lake mode, Warehouse, Lakehouse, Notebooks, and more. 

To learn more, please visit https://learn.microsoft.com/en-gb/fabric/real-time-analytics/one-logical-copy 

A database shortcut in Eventhouse is an embedded reference to a source database. The source database can be one of the following: 

  • (Now Available) A KQL Database in Real-Time Intelligence  
  • An Azure Data Explorer database  

The behavior exhibited by the database shortcut is similar to that of a follower database.  

The owner of the source database, the data provider, shares the database with the creator of the shortcut in Real-Time Intelligence, the data consumer. The owner and the creator can be the same person. The database shortcut is attached in read-only mode, making it possible to view and run queries on the data that was ingested into the source KQL Database without ingesting it.  

This helps with data sharing scenarios where you can share data in-place either within teams, or even with external customers.  

AI Anomaly Detector is an Azure service for high quality detection of multivariate and univariate anomalies in time series. While the standalone version is being retired in October 2026, Microsoft open-sourced the anomaly detection core algorithms, and they are now supported in Microsoft Fabric. Users can leverage these capabilities in the Data Science and Real-Time Intelligence workloads. AI Anomaly Detector models can be trained in Spark Python notebooks in the Data Science workload, while real-time scoring can be done by KQL with inline Python in Real-Time Intelligence. 

We are excited to announce the Public Preview of Copilot for Real-Time Intelligence. This initial version includes a new capability that translates your natural language questions about your data to KQL queries that you can run and get insights.  

Your starting point is a KQL Queryset that is connected to a KQL Database or to a standalone Kusto database:  


Simply type the natural language question about what you want to accomplish, and Copilot will automatically translate it to a KQL query you can execute. This is extremely powerful for users who may be less familiar with writing KQL queries but still want to get the most from their time-series data stored in Eventhouse. 


Stay tuned for more capabilities from Copilot for Real-Time Intelligence!   

Customers can increase their network security by limiting access to Eventhouse at a tenant-level, from one or more virtual networks (VNets) via private links. This will prevent unauthorized access from public networks and only permit data plane operations from specific VNets.  

Visualize & Act 

Real-Time Dashboards have a user-friendly interface, allowing users to quickly explore and analyze their data without the need for extensive technical knowledge. They offer a high refresh frequency, support a range of customization options, and are designed to handle big data.  

A range of visual types is supported, and each can be customized with the dashboard’s user-friendly interface. 


You can also define conditional formatting rules to format the visual data points by their values using colors, tags, and icons. Conditional formatting can be applied to a specific set of cells in a predetermined column or to entire rows, and lets you easily identify interesting data points. 

Beyond the supported visuals, Real-Time Dashboards provide several capabilities that allow you to interact with your data, performing slice-and-dice operations for deeper analysis and different viewpoints. 

  • Parameters are used as building blocks for dashboard filters and can be added to queries to filter the data presented by visuals. Parameters can be used to slice and dice dashboard visuals either directly by selecting parameter values in the filter bar or by using cross-filters. 
  • Cross filters allow you to select a value in one visual and filter all other visuals on that dashboard based on the selected data point. 
  • Drillthrough capability allows you to select a value in a visual and use it to filter the visuals in a target page in the same dashboard. When the target page opens, the value is pushed to the relevant filters.    
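
The cross-filter behavior above can be sketched conceptually: selecting a data point in one visual filters the data behind every other visual. The rows and field names below are illustrative; real dashboards configure this in the UI.

```python
# Conceptual sketch of cross-filtering (not the dashboard engine).
sales_rows = [
    {"region": "West", "product": "A", "amount": 100},
    {"region": "West", "product": "B", "amount": 250},
    {"region": "East", "product": "A", "amount": 300},
]

def cross_filter(rows, field, value):
    """Keep only the rows matching the selected data point."""
    return [r for r in rows if r[field] == value]

# A user clicks the "West" bar in one visual; other visuals re-query:
filtered = cross_filter(sales_rows, "region", "West")
print(len(filtered))  # → 2
```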

Real-Time Dashboards can be shared broadly and allow multiple stakeholders to view dynamic, real time, fresh data while easily interacting with it to gain desired insights. 

Directly from a real-time dashboard, users can refine their exploration using a user-friendly, form-like interface. This intuitive and dynamic experience is tailored for insights explorers craving insights based on real-time data. Add filters, create aggregations, and switch visualization types without writing queries to easily uncover insights.  

With this new feature, insights explorers are no longer bound by the limitations of pre-defined dashboards. As independent explorers, they have the freedom for ad-hoc exploration, leveraging existing tiles to kickstart their journey. Moreover, they can selectively remove query segments, and expand their view of the data landscape.  

Dive deep, extract meaningful insights, and chart actionable paths forward, all with ease and efficiency, and without having to write complex KQL queries.  

Data Activator allows you to monitor streams of data for various conditions and set up actions to be taken in response. These triggers are available directly within the Real-Time hub and in other workloads in Fabric. When the condition is detected, an action will automatically be kicked off such as sending alerts via email or Teams or starting jobs in Fabric items.  
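
Conceptually, a trigger pairs a condition with an action over a stream of events. The sketch below is not the actual Reflex engine; the event fields and threshold are assumptions:

```python
# Conceptual sketch of a Data Activator-style trigger (not the actual
# Reflex engine): scan a stream of events for a condition and invoke an
# action for each match.

def watch(events, condition, action):
    """Fire `action` for every event that satisfies `condition`."""
    return [action(e) for e in events if condition(e)]

events = [{"device": "d1", "temp": 65}, {"device": "d2", "temp": 82}]
alerts = watch(
    events,
    condition=lambda e: e["temp"] > 80,                       # condition to detect
    action=lambda e: f"Alert: {e['device']} at {e['temp']}",  # e.g. send a Teams message
)
print(alerts)  # → ['Alert: d2 at 82']
```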

When you browse the Real-Time Hub, you’ll see options to set triggers in the detail pages for streams. 

Selecting this will open a side panel where you can configure the events you want to monitor, the conditions you want to look for in the events, and the action you want to take while in the Real-Time hub experience. 

Completing this pane creates a new reflex item with a trigger that monitors the selected events and condition for you. Reflexes need to be created in a workspace supported by a Fabric or Power BI Premium capacity – this can be a trial capacity so you can get started with it today! 

Data Activator has been able to monitor Power BI report data since it was launched, and we now support monitoring of Real-Time Dashboard visuals in the same way.

From a real-time dashboard tile, you can click the ellipsis (…) button and select “Set alert”.

This opens the embedded trigger pane, where you can specify the conditions you are looking for. You can choose whether to send an email or a Teams message as the alert when those conditions are met.

When creating a new reflex trigger, from the Real-Time hub or within the reflex item itself, you’ll notice a new ‘Run a Fabric item’ option in the Action section. This creates a trigger that starts a new Fabric job whenever its condition is met, kicking off a pipeline or notebook computation in response to Fabric events. A common scenario is monitoring Azure Blob Storage events via the Real-Time hub and running data pipeline jobs when Blob Created events are detected. 

This capability is extremely powerful, moving Fabric from a schedule-driven platform to an event-driven platform.  

Pipelines, spark jobs, and notebooks are just the first Fabric items we’ll support here, and we’re keen to hear your feedback to help prioritize what else we support. Please leave ideas and votes on https://aka.ms/rtiidea and let us know! 

Real-Time Intelligence, along with the Real-Time hub, revolutionizes what’s possible with real-time streaming and event data within Microsoft Fabric.  

Learn more and try it today https://aka.ms/realtimeintelligence   

Data Factory 

Dataflow Gen2 

We are thrilled to announce that the Power Query SDK is now generally available in Visual Studio Code! This marks a significant milestone in our commitment to providing developers with powerful tools to enhance data connectivity and transformation. 

The Power Query SDK is a set of tools that allow you as the developer to create new connectors for Power Query experiences available in products such as Power BI Desktop, Semantic Models, Power BI Datamarts, Power BI Dataflows, Fabric Dataflow Gen2 and more. 

This new SDK has been in public preview since November 2022, and we’ve been hard at work improving the experience, which goes beyond what the previous Power Query SDK in Visual Studio had to offer.  

The biggest of these improvements was the introduction of the Test Framework in March 2024, which solidifies the developer experience within Visual Studio Code and the Power Query SDK for creating a Power Query connector. 

The Power Query SDK extension for Visual Studio will be deprecated by June 30, 2024, so we encourage you to give the new Power Query SDK in Visual Studio Code a try today if you haven’t already.  

To get started with the Power Query SDK in Visual Studio Code, simply install it from the Visual Studio Code Marketplace. Our comprehensive documentation and tutorials are available to help you harness the full potential of your data. 

Join our vibrant community of developers to share insights, ask questions, and collaborate on exciting projects. Our dedicated support team is always ready to assist you with any queries. 

We look forward to seeing the innovative solutions you’ll create with the Power Query SDK in Visual Studio Code. Happy coding! 

Introducing a convenient enhancement to the Dataflows Gen2 Refresh History experience! Alongside the familiar “X” button in the Refresh History screen, you’ll now find a shiny new Refresh button. This small but mighty addition lets you refresh your dataflow’s refresh history status without the hassle of exiting and reopening the screen. Simply click the Refresh button, and voilà! Your dataflow’s refresh history is updated, keeping you in the loop with minimal effort. Say goodbye to unnecessary clicks and hello to streamlined monitoring! 

  • [New] OneStream : The OneStream Power Query Connector enables you to seamlessly connect Data Factory to your OneStream applications by simply logging in with your OneStream credentials. The connector uses your OneStream security, allowing you to access only the data you have based on your permissions within the OneStream application. Use the connector to pull cube and relational data along with metadata members, including all their properties. Visit OneStream Power BI Connector to learn more. Find this connector in the other category. 

Data workflows  

We are excited to announce the preview of ‘Data workflows’, a new feature within Data Factory that revolutionizes the way you build and manage your code-based data pipelines. Powered by Apache Airflow, Data workflows offer a seamless authoring, scheduling, and monitoring experience for Python-based data processes defined as Directed Acyclic Graphs (DAGs). This feature brings a SaaS-like experience to running DAGs in a fully managed Apache Airflow environment, with support for autoscaling, auto-pause, and rapid cluster resumption to enhance cost-efficiency and performance.  

It also includes native cloud-based authoring capabilities and comprehensive support for Apache Airflow plugins and libraries. 
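
The core idea behind a DAG is that tasks run in dependency order. A real Data workflows DAG file would use the Airflow API and operators; the pure-Python sketch below (with illustrative task names) only demonstrates the execution model:

```python
# Conceptual sketch of the DAG execution model behind Airflow.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on (illustrative names)
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "notify": {"load"},
}

# Resolve an execution order that respects every dependency.
order = list(TopologicalSorter(dag).static_order())
print(order)  # → ['extract', 'transform', 'load', 'notify']
```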

To begin using this feature: 

1. In the Microsoft Fabric Admin Portal, navigate to Tenant Settings and, under Microsoft Fabric, locate and expand the ‘Users can create and use Data workflows (preview)’ section. Note: This action is necessary only during the preview phase of Data workflows. 

2. Create a new Data workflow within an existing or new workspace. 

3. Add a new Directed Acyclic Graph (DAG) file via the user interface. 

4.  Save your DAG(s). 

5. Use Apache Airflow monitoring tools to observe your DAG executions. In the ribbon, click on Monitor in Apache Airflow. 

For additional information, please consult the product documentation. If you’re not already using Fabric capacity, consider signing up for the Microsoft Fabric free trial to evaluate this feature. 

Data Pipelines 

We are excited to announce a new feature in Fabric that enables you to create data pipelines to access your firewall-enabled Azure Data Lake Storage Gen2 (ADLS Gen2) accounts. This feature leverages the workspace identity to establish a secure and seamless connection between Fabric and your storage accounts. 

With trusted workspace access, you can create data pipelines to your storage accounts with just a few clicks. Then you can copy data into Fabric Lakehouse and start analyzing your data with Spark, SQL, and Power BI. Trusted workspace access is available for workspaces in Fabric capacities (F64 or higher). It supports organizational accounts or service principal authentication for storage accounts. 

How to use trusted workspace access in data pipelines  

Create a workspace identity for your Fabric workspace. You can follow the guidelines provided in Workspace identity in Fabric . 

Configure resource instance rules for the Storage account that you want to access from your Fabric workspace. Resource instance rules for Fabric workspaces can only be created through ARM templates. Follow the guidelines for configuring resource instance rules for Fabric workspaces here . 

Create a data pipeline to copy data from the firewall-enabled ADLS Gen2 account to a Fabric Lakehouse. 

To learn more about how to use trusted workspace access in data pipelines, please refer to Trusted workspace access in Fabric . 

We hope you enjoy this new feature for your data integration and analytics scenarios. Please share your feedback and suggestions with us by leaving a comment here. 

Introducing Blob Storage Event Triggers for Data Pipelines 

A very common use case among data pipeline users in a cloud analytics solution is to trigger your pipeline when a file arrives or is deleted. We have introduced Azure Blob storage event triggers as a public preview feature in Fabric Data Factory Data Pipelines. This utilizes the Fabric Reflex alerts capability that also leverages Event Streams in Fabric to create event subscriptions to your Azure storage accounts. 

Parent/Child pipeline pattern monitoring improvements

Today, in Fabric Data Factory data pipelines, when you call another pipeline using the Invoke Pipeline activity, the child pipeline is not visible in the monitoring view. We have updated the Invoke Pipeline activity so that you can view your child pipeline runs. This requires an upgrade to any existing pipelines in Fabric that use the current Invoke Pipeline activity: you will be prompted to upgrade when you edit your pipeline, and then provide a connection to your workspace to authenticate. Another new capability that lights up with this update is the ability to invoke pipelines across workspaces in Fabric. 
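
As a hedged sketch of how a pipeline run can also be started programmatically: the URL shape below follows the Fabric on-demand job (Job Scheduler) REST API, but treat it as an assumption and verify it against current documentation; the IDs and bearer token are placeholders.

```python
# Hedged sketch: building the request for an on-demand Fabric pipeline
# run. Endpoint shape is an assumption to verify against current docs;
# the IDs and token below are placeholders.

def pipeline_run_request(workspace_id: str, item_id: str):
    """Build the URL and headers to start a pipeline job on demand."""
    url = (
        "https://api.fabric.microsoft.com/v1/"
        f"workspaces/{workspace_id}/items/{item_id}/jobs/instances"
        "?jobType=Pipeline"
    )
    headers = {"Authorization": "Bearer <access-token>"}  # placeholder
    return url, headers

url, headers = pipeline_run_request("ws-guid", "pipeline-guid")
print(url)
# To actually start the run, POST this URL, e.g. with `requests`:
# requests.post(url, headers=headers)
```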

We are excited to announce the availability of the Fabric Spark job definition activity for data pipelines. With this new activity, you will be able to run a Fabric Spark Job definition directly in your pipeline. Detailed monitoring capabilities of your Spark Job definition will be coming soon!  

To learn more about this activity, read https://aka.ms/SparkJobDefinitionActivity  

We are excited to announce the availability of the Azure HDInsight activity for data pipelines. The Azure HDInsight activity allows you to execute Hive queries, invoke a MapReduce program, execute Pig queries, run a Spark program, or run a Hadoop streaming program. Any of these five operations can be invoked from a single Azure HDInsight activity, using your own or an on-demand HDInsight cluster. 

To learn more about this activity, read https://aka.ms/HDInsightsActivity  

We are thrilled to share the new Modern Get Data experience in Data Pipeline, which empowers users to intuitively and efficiently discover the right data, connection info, and credentials.   

In the data destination tab, users can easily set a destination by creating a new Fabric item, creating another destination, or selecting an existing Fabric item from the OneLake data hub. 

In the source tab of the Copy activity, users can conveniently choose recently used connections from the dropdown or create a new connection using the “More” option to interact with the Modern Get Data experience. 

Microsoft Power BI Blog

Copilot in Microsoft Fabric is now generally available in the Power BI experience

By Kim Manis

In the right hands, new innovations have always sparked transformation. From the combustion engine to the creation of the web, these innovations reshaped how we interact with the world around us. And today, we are seeing the next great shift with AI. It’s one of the most exciting innovations of our generation, and its impact is already being felt across individuals, entire teams, and industries. Use cases range from customer service chat to better knowledge mining, to content generation and speech analytics.

One of the biggest opportunities we’re seeing is in accelerating the path to a data-rich culture by enhancing the productivity of your data teams and making analytics more accessible to every user. That’s why we’ve invested heavily in bringing to life Copilot in Microsoft Fabric —an AI experience embedded directly into Microsoft Fabric that can help every user unlock the full potential of their data. We are thrilled to share that Copilot in Fabric is now generally available, starting with the Microsoft Power BI experience.

Elcome, one of the world’s largest marine electronics companies, used Microsoft Fabric and Copilot in Fabric to build a new service called “Welcome” that helps maritime crews stay connected to their families and friends.

“With a three-person development team, we were able to leverage Microsoft Fabric and Copilot to build out an ISP-grade monitoring and management platform that is scalable, flexible, and future-proof. Leveraging Microsoft Fabric and Copilot allows them to focus on customer safety and connectivity while at sea, without having to be data scientists,” said Jimmy Grewal, Managing Director of Elcome.  

Copilot in Fabric is now generally available

With Copilot in Fabric, you can use simple conversational language to integrate, transform, prepare, and visualize your data. In the Power BI experience, Copilot can help you create stunning reports and summarize your insights into narrative summaries in seconds. You can simply provide a high-level prompt, and Copilot in Fabric will create an entire report page for you by identifying the tables, fields, measures, and charts relevant to your prompt and adding visuals that best highlight insights in your data. It’s a fast and easy way to get started with a report.

You can also generate custom summaries and answers to questions about the data visualized across the report, taking analysis from hours or even days down to seconds. Previously, Copilot could only summarize or answer questions about one page at a time, but we are expanding its capabilities to support full-report summaries and questions.

The general availability of Copilot in Fabric for the Power BI workload will be rolling out over the coming weeks to all customers with Power BI Premium capacity (P1 or higher) or Fabric capacity (F64 or higher).

While the report creation experience is now generally available, there are other Copilot experiences in Power BI that are available in preview. For example, the Copilot-powered “Data overview” button in the Explore experience provides a summary of your semantic model to help you get started and can even generate synonyms and linguistic relationships to ready your model for natural language interpretation. You can also use Copilot to help you write and explain DAX queries in the DAX query view. And, when creating a report, you can add the narrative visual to summarize the insights on the report page into written form, helping you communicate the insights that matter most about the data.

We are also excited to share an expansion of Copilot’s capabilities to answer your data-driven questions. Copilot will not only answer the questions about the visuals on your report pages but can search across the entire semantic model. This feature is available in preview in both edit and view modes in the Power BI service.

The Copilot in Fabric experiences for Data Factory, Data Engineering, Data Science, Data Warehouse, and Real-Time Intelligence are also all still available in preview. Copilot in Fabric is also now enabled on-by-default for all eligible tenants, including the preview experiences.

Create Q&A experiences with AI skills

Model explorer and DAX query view are now generally available

We are excited to help analysts work even faster and more effectively by releasing the model explorer into general availability.

The model explorer in the model view in Power BI provides end-to-end visibility of your semantic model in a single tree view, helping you find items in your data fast. With the model explorer, you can find items in the search bar, expand and collapse the tree view, see how many of each item are in your semantic model, and get single-pane access to create or edit semantic model items. You can also use the model explorer to create calculation groups and reduce the number of measures by reusing calculation logic and simplifying semantic model consumption.

The DAX query view in Power BI Desktop gives users the ability to write, edit, and see the results of Data Analysis Expressions (DAX) queries on data in their semantic model. Similar to the Explore feature, users working with a model can validate data and measures without having to build a visual or use an additional tool. With the DAX query view, you can run a quick query to preview data or show summary statistics to help you better understand the data without creating visuals. You can also see multiple measures at once in the editor, make changes, run the query to test them, and update the model, all from the same view. You can debug visuals by seeing the DAX query a visual uses to get its data. And finally, you can create and run your own DAX queries, and even define and use measures and variables for that DAX query that do not exist in the model.

Explore all the updates across Microsoft Fabric

We are excited to announce additions to the Microsoft Fabric workloads that will make Microsoft Fabric’s capabilities even more robust and customizable to meet the unique needs of each organization. These enhancements include:

  • A completely redesigned workload, Real-Time Intelligence, that brings together and enhances Synapse Real-Time Analytics and Data Activator to help you analyze and act on high-volume, high-granular event streaming data and even explore your organization’s real-time data in the new Real-Time hub. To learn more about the Real-Time Intelligence workload, watch the “ Ingest, analyze and act in real time with Microsoft Fabric ” session at Microsoft Build 2024.
  • A new Microsoft Fabric Workload Development Kit, currently in preview, that makes it easier for software developers and customers to design, build, and interoperate applications within Microsoft Fabric. Applications built with this kit will appear as a native workload within Fabric, and software developers can publish and monetize their custom workloads through the Microsoft Azure Marketplace. Learn more by watching the “ Extend and enhance your analytics applications with Microsoft Fabric ” Microsoft Build session.
  • We are empowering developers with API for GraphQL and user data functions in Microsoft Fabric. API for GraphQL is a flexible and powerful RESTful API that allows data professionals to access data from multiple sources in Fabric with a single query API. User data functions are user-defined functions built for Microsoft Fabric experiences across all data services, such as notebooks, pipelines, or event streams. You can watch these features in action in the “ Introducing API for GraphQL and User Data Functions in Microsoft Fabric ” Microsoft Build session.
  • A new feature in the Data Factory experience called data workflows, powered by Apache Airflow, that can help you author, schedule, and monitor workflows or data pipelines using Python.
  • An expansion of OneLake shortcuts to connect to data from on-premises and network-restricted data sources beyond just Azure Data Lake Service Gen2, now in preview. To learn more about this announcement, watch the Microsoft Build session “ Unify your data with OneLake and Microsoft Fabric. “
  • New Copilot in Fabric experience for Real-Time Intelligence, currently in preview, that enables users to explore real-time data with ease. Starting with a Kusto Query Language (KQL) Queryset connected to a KQL Database in an Eventhouse or a standalone Azure Data Explorer database, you can type your question in conversational language and Copilot will automatically translate it to a KQL query you can execute.
  • An expansion of our existing partnership with Snowflake to create interoperability between Snowflake and Fabric’s OneLake, including support for the Apache Iceberg standard in OneLake and bi-directional data access. Coming first, you will be able to shortcut data from Snowflake and all Apache Iceberg data sources into OneLake.
  • Coming soon, you will be able to access Microsoft Azure Databricks Unity Catalog tables directly in Fabric, making it even easier to unify Azure Databricks with Fabric. Also coming soon, Fabric users will be able to access Fabric data items like lakehouses as a catalog in Azure Databricks.

Check out this video to see a sneak peek of all these announcements:

To learn more about these announcements and others coming to Fabric, check out the Microsoft Fabric blog .

Watch these announcements in action at Microsoft Build 2024

Join us at  Microsoft Build  from May 21 to 23, 2024, to see all of these announcements in action across the following sessions:

  • “ Microsoft Build opening keynote ” with Satya Nadella, Kevin Scott, and Rajesh Jha.
  • “ Next generation AI for developers with the Microsoft Cloud ” with Scott Guthrie.
  • “ Microsoft Fabric: What’s new and what’s next ” with Arun Ulagaratchagan and Amir Netz.
  • “ Ingest, analyze and act in real time with Microsoft Fabric ” with Yitzhak Kesselman and Tessa Kloster.
  • “ Extend and enhance your analytics applications with Microsoft Fabric ” with Dipti Borkar and Teddy Bercovitz.
  • “ GraphQL API and User Defined Data Functions in Microsoft Fabric ” with Monica Boris and Dipti Borkar.
  • “ Unify your data with OneLake and Microsoft Fabric ” with Josh Caplan and Priya Sathy.
  • “ Accelerate insights with real-time Azure SQL data and Microsoft Fabric ” with Anna Hoffman and Bob Ward.
  • “ Integrating Azure AI and Microsoft Fabric for Next-Gen AI Solutions ” with Nellie Gustafsson and Facundo Santiago.
  • “ Activate enterprise data in AI-enabled business applications ” with Rakesh Krishnan and Srikumar Nair.

You can also try out these new capabilities and everything Fabric has to offer yourself by signing up for a free 60-day trial—no credit card information required. To start your free trial , sign up for a free account (Power BI customers can use their existing account), and once signed in, select start trial within the account manager tool in the Fabric app. Existing Power BI Premium customers can already access Microsoft Fabric by simply turning on Fabric in their Fabric admin portal. Learn more on the Fabric get started page .

Join us at the Power Platform and Fabric Community Conferences

If you want to get hands-on experience with everything Power BI and Fabric have to offer, please join us at these upcoming Power Platform and Fabric Community conferences:

  • European Power Platform Conference held in Brussels, Belgium, from June 11 to 13, 2024.
  • Power Platform Conference held in Las Vegas, Nevada, from September 18 to 20, 2024.
  • European Microsoft Fabric Community Conference held in Stockholm, Sweden, from September 23 to 26, 2024.

At all three events, you will hear from leading Microsoft and community experts from around the world and get hands-on experiences with the latest features from Microsoft Fabric, Power BI, the rest of Microsoft Power Platform, and more. You can learn directly from the people behind the products while having the chance to interact with your peers and share your story. Register for the events today and see how cutting-edge technologies from Microsoft can enable your business success.

Explore additional resources for Power BI and Microsoft Fabric

If you want to learn more about Power BI and Microsoft Fabric:

  • Sign up for the Microsoft Fabric free trial or, if you already have Power BI Premium capacity, simply turn on Microsoft Fabric in your admin portal.
  • View the updated Microsoft Fabric Roadmap .
  • Visit the Power BI and Microsoft Fabric websites.
  • Join the Fabric community .
  • Read more in-depth, technical blogs on the Microsoft Fabric Updates blog .
  • Explore the Power BI and Microsoft Fabric technical documentation.

IMAGES

  1. Configure and manage capacities in Power BI Premium

    power bi premium capacity assignment

  2. Connect to Power BI Premium Capacity Metrics

    power bi premium capacity assignment

  3. Configure and manage capacities in Power BI Premium

    power bi premium capacity assignment

  4. Microsoft Power BI: Premium capacity

    power bi premium capacity assignment

  5. Monitor Power BI Premium capacities with the Premium metrics app

    power bi premium capacity assignment

  6. Monitor Power BI Premium capacities with the Premium metrics app

    power bi premium capacity assignment

VIDEO

  1. Power BI Licenses: Everything you need to know!! #powerbilicenses #powerbi #powerbitutorial #bcp

  2. PL 300 Test Prep

  3. Demystifying Power BI Licenses Confusion

  4. Change Power BI Workspace from hosted via Power BI PRO to run under Azure Premium Capacity

  5. 23-05-2024 Livestream Microsoft ngừng cung cấp Power BI Premium

  6. POWER BI FIRST TO END OVERVIEW || Microsoft Power BI Training || Certification

COMMENTS

  1. Deploying and Managing Power BI Premium Capacities

    An overview of Power BI Premium, covering the basics of reserved capacities, supported workloads, unlimited content sharing, and other features. Detailed information about configuring and managing capacities and workloads. Answers to questions around purchase and licensing, features, and common scenarios.

  2. Power BI Premium

    Power BI P remium capacities give enterprises access to their own dedicated resources in the cloud. With this access enterprise s can distribute Power BI content as wide ly as they choose; it also lets them unlock greater scale, better performance and advanced capabilities that are not available to Power BI users on shared capacities. All of these are possible because you can choose how a ...

  3. Power BI Premium Capacity Settings

    The Capacity Portal Allows for the following configurations: Capacity Admin Assignment: Current Capacity Admins are configured here by a Power BI Administrator.These users can edit settings, allocate virtual cores and workloads/resources within the capacity, manage assignment permissions and restart the capacity.

  4. Power BI Premium Per User

    Premium Per User is the lowest entry-point for Power BI Premium features. It's built upon the Premium platform with built-in mechanisms ensuring that PPU users can use the platform's ability to scale. PPU is designed to support enterprise workloads including Power BI items with size limits equivalent to that of a P3.

  5. PDF Power BI Premium Planning and Deployment

    In order for any interaction to happen on an imported dataset, it must be loaded into Premium capacity's in-memory engine. Let's drill into the details of what happens when a model is loaded into memory: Moving the data from data source to Power BI The first step is moving the data from the data source to Power BI, this doesn't take any

  6. Demystifying Power BI

    A Power BI Premium capacity node can be considered a virtual machine: a specified amount of computing resources to be used by one customer within the Power BI service. Unlike shared capacities ...

  7. Assign Power BI workspaces to a capacity automatically

    A dedicated capacity (Power BI Premium, Embedded, or Premium per User license) Python skills; Understanding REST APIs; Setting the scene. For my demo purpose I'm going to use a Power BI Embedded capacity - a so-called A-SKU - from the Azure Portal. If you're interested in how to create an Embedded capacity, follow this link.

  8. How To Use The Power BI Premium Capacity Metrics App

    Installing the PBI Premium Capacity Metrics App is easy and can be done directly from Power BI Service. Step 1: Open the "Apps" page using the left panel and click the green "Get apps" button on the top right. Step 2: Navigate to the "Template apps" section and search Premium Capacity Utilization and click on the app icon (note ...

  9. Managing Power BI Premium capacity

    In this video, Adam looks at how to manage the Power BI Premium capacity that you have purchased. He looks at what Global Admins can see, how you can assign capacity admins and users with workspace assignment, along with how to assign workspaces to a Power BI Premium capacity. asaxton 2017-07-11T12:48:42-05:00. Share This Story, Choose Your ...

  10. A Complete Guide to Power BI Pricing and Capacity Management

    Suppose an organization contains a total of 4500 users who'll have access to Power BI. Let's divide these users into 3 categories - 20% pro users, 35% frequent users, and 45% occasional users. Based on the pricing calculator, the total cost for 4500 users will be $23,976/month.

  11. PowerBI premium capacity administration: dataset monitoring and tuning

    Recently a team of developers managed to bring down an entire Power BI Premium capacity by publishing a single report without testing it, and without knowing how to monitor its performance on the Power BI portal after publishing it. ... Over more than 20 years Fabriano has worked on assignments involving database architecture, Microsoft SQL ...

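As noted above, the SP has to be an admin of every workspace we want to move, and the admin "add user to group" endpoint accepts a Service Principal as the principal. A sketch of that step, assuming the SP's Azure AD object ID is at hand (`add_workspace_admin` and `admin_add_user_payload` are hypothetical helper names):

```python
import json
import urllib.request

BASE = "https://api.powerbi.com/v1.0/myorg"


def admin_add_user_payload(sp_object_id: str) -> dict:
    """Request body that grants a Service Principal admin rights on a workspace."""
    return {
        "identifier": sp_object_id,       # object ID of the SP in Azure AD
        "principalType": "App",           # "App" marks the principal as an SP
        "groupUserAccessRight": "Admin",  # admin rights are required for the move
    }


def add_workspace_admin(token: str, workspace_id: str, sp_object_id: str) -> None:
    """Call the admin groups/users endpoint to add the SP as workspace admin."""
    url = f"{BASE}/admin/groups/{workspace_id}/users"
    data = json.dumps(admin_add_user_payload(sp_object_id)).encode()
    req = urllib.request.Request(
        url,
        data=data,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    urllib.request.urlopen(req).close()
```

Note that this admin endpoint must itself be called with credentials that have tenant admin rights; once the SP is an admin of the workspaces, the SP's own token can be used for the capacity assignment.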

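Finally, the assignment itself. The "Groups - AssignToCapacity" REST call takes the workspace ID in the URL and the capacity ID in the body, so a simple loop over workspace IDs automates the bulk move. A sketch reusing the token from the earlier step (helper names are mine, not from any SDK):

```python
import json
import urllib.request

BASE = "https://api.powerbi.com/v1.0/myorg"


def assign_to_capacity_request(workspace_id: str, capacity_id: str):
    """URL and JSON body for the 'Groups - AssignToCapacity' call."""
    url = f"{BASE}/groups/{workspace_id}/AssignToCapacity"
    body = {"capacityId": capacity_id}
    return url, body


def assign_workspaces(token: str, workspace_ids, capacity_id: str) -> None:
    """Move every workspace in the list onto the given dedicated capacity."""
    for ws_id in workspace_ids:
        url, body = assign_to_capacity_request(ws_id, capacity_id)
        req = urllib.request.Request(
            url,
            data=json.dumps(body).encode(),
            method="POST",
            headers={
                "Authorization": f"Bearer {token}",
                "Content-Type": "application/json",
            },
        )
        urllib.request.urlopen(req).close()
```

To reverse a move, the documented convention is to pass the all-zeros GUID (`00000000-0000-0000-0000-000000000000`) as `capacityId`, which returns the workspace to shared capacity.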