At the end of May, Microsoft started rolling out the Power BI deployment pipelines feature in public preview. The ability to deploy Power BI artifacts to different deployment stages in a managed and governed way was an important feature missing from the Power BI service portfolio.
In this insight, we will clarify what out-of-the-box deployment pipelines are all about and what the feature can do for your organization.
What are deployment pipelines in Power BI?
Power BI deployment pipelines are automated processes that deploy dashboards, reports and datasets to different deployment stages. During a deployment, the feature copies the metadata of the Power BI artifacts from the source stage to the target stage. Note that only the metadata is copied, not the data within the Power BI datasets. Therefore, after each deployment, the dataset needs to be refreshed before it shows the latest data.
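Because only metadata moves, teams typically script the post-deployment refresh. As a minimal sketch, the Power BI REST API exposes a refresh endpoint for datasets in a workspace; the workspace ID, dataset ID and Azure AD token below are placeholders you would supply yourself:

```python
# Sketch: build the REST call that queues a dataset refresh after a
# deployment. Workspace ID, dataset ID and the Azure AD bearer token
# are placeholders; obtaining the token is out of scope here.
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def refresh_request(workspace_id: str, dataset_id: str, token: str) -> urllib.request.Request:
    """POST /groups/{workspace}/datasets/{dataset}/refreshes queues a refresh."""
    return urllib.request.Request(
        f"{API}/groups/{workspace_id}/datasets/{dataset_id}/refreshes",
        data=b"",  # the refresh endpoint needs no request body
        headers={"Authorization": f"Bearer {token}"},
        method="POST",
    )
```

Passing the built request to `urllib.request.urlopen` queues the refresh; the service answers 202 Accepted when the request has been accepted.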
The Power BI artifacts move through the Development and Test stages before they are eventually delivered to the Production stage.
How do I get started?
Before you can start using deployment pipelines, you need to provision Power BI Embedded capacity in the Azure portal or have a Power BI Premium license.
For production workspaces you need dedicated capacity (minimum A4 Power BI Embedded capacity or Power BI Premium capacity) and for development and test workspaces you need at least shared Power BI Embedded capacity. Deployment pipelines do not work with Power BI Pro licenses.
How can deployment pipelines help you streamline the Power BI deployment process?
Deployment pipelines help analytics teams release deployment-ready objects to the different deployment stages much faster and in a well-governed way. Deployments are also less error-prone because of the way Power BI handles the comparison between the artifacts and the dataset sources.
In the past, Power BI users had to plan and design their Power BI environment and manually "copy-paste" the objects between Power BI workspaces, or develop PowerShell/REST API scripts to automate the deployment process. Not to mention the hassle of the data source connections that had to be changed manually within the Power BI dataset every time a deployment was executed.
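For illustration, such a pre-pipelines script often leaned on the documented Clone Report endpoint of the Power BI REST API to copy a report into another workspace. A rough sketch; all IDs, the report name and the token are placeholders, not values from a real tenant:

```python
# Sketch: the kind of call a pre-pipelines deployment script made by hand,
# using the documented Clone Report endpoint of the Power BI REST API.
# All IDs and the token are illustrative placeholders.
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def clone_report_request(report_id: str, new_name: str,
                         target_workspace_id: str, token: str) -> urllib.request.Request:
    """POST /reports/{report}/Clone copies a report into a target workspace."""
    body = {"name": new_name, "targetWorkspaceId": target_workspace_id}
    return urllib.request.Request(
        f"{API}/reports/{report_id}/Clone",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
```

One such call was needed per report, which is exactly the kind of bookkeeping the new feature takes off your hands.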
Deploy content from one stage to another
There are two ways of deploying the Power BI artifacts between deployment stages: deployment to an empty stage or deployment to an existing stage.
When a deployment to an empty stage is triggered, a new workspace is created on the premium capacity and the metadata is copied into it. Users or user groups who are assigned to these workspaces can see the visualizations. The workspace name is generated automatically, but it can be renamed afterwards.
Deployments always need to be triggered manually. You can also deploy content backwards, from a later stage in the deployment pipeline to an earlier one.
Create dataset rules
Within the deployment stages you can specify different data source connections, which comes in handy if your data warehouse environments follow the same deployment stages. Not only data source connections can be adapted, but also the parameters that you use in your Power BI datasets.
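A dataset rule automates what previously had to be done by hand, for example with the documented Update Parameters endpoint of the Power BI REST API. A hedged sketch of that manual step; the `ServerName` parameter, the IDs and the token are illustrative:

```python
# Sketch: pointing a dataset's parameters at the target stage by hand,
# via the documented Update Parameters endpoint, i.e. the manual step
# that a dataset rule replaces. The ServerName parameter, IDs and token
# are illustrative placeholders.
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def update_parameters_request(workspace_id: str, dataset_id: str,
                              params: dict, token: str) -> urllib.request.Request:
    """POST .../Default.UpdateParameters sets new values for dataset parameters."""
    body = {"updateDetails": [{"name": k, "newValue": v} for k, v in params.items()]}
    return urllib.request.Request(
        f"{API}/groups/{workspace_id}/datasets/{dataset_id}/Default.UpdateParameters",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
```

With a dataset rule in place, the pipeline applies the equivalent change automatically on every deployment to that stage.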
Is there an integration with Azure DevOps?
At the moment there's no way of integrating the deployment pipelines into Azure DevOps Pipelines. It would be great to call REST APIs that trigger the deployment process, but for now it's not on the feature agenda.
What's coming and what's missing?
- Version management: for now, it's not possible to see who changed the Power BI artifacts or to roll back to an earlier state.
- Integration with Azure DevOps would let Power BI deployments fit into the bigger data warehouse deployment process.
- Only Power BI datasets, reports and dashboards can be deployed right now. Deployment pipelines support for paginated reports and dataflows is scheduled for a future release.