What is the PowerBI.Dataflows(null) connector?
The dataflows connector in Power BI Desktop is a default connector. Once connected, you get an overview of all your workspaces, including their dataflows and all underlying entities.
Let's assume I have three Power BI workspaces, each containing a dataflow. All entities have the same name and the same underlying table name. Since the workspace name already identifies the environment, I always use the same entity and table names across all DTAP stages.

As an example, I selected the table PimpYourDataflowsConnector in the development workspace and loaded it. When we inspect the generated code in the Power Query Advanced Editor, it looks something like this:
let
    Source = PowerBI.Dataflows(null),
    #"12a34567-1ab2-1234-a123-1a2b34cde567" = Source{[workspaceId="12a34567-1ab2-1234-a123-1a2b34cde567"]}[Data],
    #"9z8yx765-zyx9-98z7-9zy8-zyxw987vu654" = #"12a34567-1ab2-1234-a123-1a2b34cde567"{[dataflowId="9z8yx765-zyx9-98z7-9zy8-zyxw987vu654"]}[Data],
    PimpYourDataflowsConnector = #"9z8yx765-zyx9-98z7-9zy8-zyxw987vu654"{[entity="PimpYourDataflowsConnector"]}[Data]
in
    PimpYourDataflowsConnector
Looking at this generated code, my workspace and dataflow IDs are hard-coded. The first line of this code, where the function PowerBI.Dataflows(null) is called, is probably the most interesting one. This step returns all dataflows the logged-in user has access to at tenant level.
Make it flexible!
Power Query is an expression-based language. Every step has its own result, which serves as input for the step after it. Looking at the code, we also see some filters applied, for example in the Source step, where we filter the workspace column on the value referenced afterwards. Besides that, we see [Data] referenced after the filter. With this, Power Query only returns the result of the column called Data.

Now that we know this, we can build our own logic that makes it easier to switch between the dataflows in different workspaces. We will do that by following the steps below:
As a first step, we call the connector PowerBI.Dataflows(null). As already stated, this results in a list of all dataflows across the tenant to which the logged-in user has access.

Second, we create a parameter for our workspace name, which we will use for filtering in the next step. The parameter must be of type text to make the filter work. Of course, it can be convenient to add all possible values as a list inside the parameter, but keep in mind that you won't be able to see this list when you want to change the parameter in the Power BI Service.
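In the Advanced Editor, such a parameter is just a named query with parameter metadata. A minimal sketch, assuming the parameter is called WorkspaceFilter (the name is my choice; the current value "Development" is a placeholder for your own workspace name):

```
// Text parameter holding the workspace name to filter on
WorkspaceFilter = "Development" meta [IsParameterQuery = true, Type = "Text", IsParameterQueryRequired = true]
```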
Read more about changing parameters in the Power BI Service in my previous blog post, Parameterize your data source.

Next, we apply the created parameter as a filter to our table containing all dataflows. We can do that by simply adding a filter with the dropdown on the workspace name column, and then applying the following:
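When you add the filter through the dropdown and swap the hard-coded value for the parameter, the generated step should look roughly like this (assuming the parameter is named WorkspaceFilter):

```
// Keep only the rows for the workspace named in the parameter
#"Filtered Rows" = Table.SelectRows(Source, each [workspaceName] = WorkspaceFilter)
```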
Now we only have one workspace left. We want the tables included in the Data column. We can get them by drilling down on the Data column, but if we do so, Power Query will still apply a filter based on the workspace. To avoid that, we can right-click the Data column and choose Remove Other Columns.

Only the Data column is left so far. As a next step, we expand the Data column to get all underlying content, expanding all columns.
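These last two steps typically generate code along these lines (the metadata column names come from the dataflows connector itself):

```
// Drop everything except the nested Data column, then expand it
#"Removed Other Columns" = Table.SelectColumns(#"Filtered Rows", {"Data"}),
#"Expanded Data" = Table.ExpandTableColumn(#"Removed Other Columns", "Data",
    {"dataflowId", "dataflowName", "dataflowDescription", "Data", "ItemKind", "ItemName", "IsLeaf"},
    {"dataflowId", "dataflowName", "dataflowDescription", "Data.1", "ItemKind", "ItemName", "IsLeaf"})
```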
- As a result, we see all available dataflows for the specified workspace. In this case there is only one dataflow, but this could return multiple dataflows, so multiple rows.
- We can do an easy drill-down now by clicking Table in the column called Data.1. If we do, Power Query will create a step with the following code:
- #"Expanded Data"{0}[Data.1]
- The part where you see 0 means it returns the result of the first row (Power Query starts counting at zero). For now, this works, but we might add new dataflows in the future, which could change the order of the rows. As a result, the first row might be different and return a different dataset. To avoid problems like this, first add a filter that selects only the required dataflow. You can do this again by using a parameter. I simply hard-coded a filter on the dataflow name here, because my dataflow has the same name in every stage (Dev / Acc / Prd).
- After filtering, we only have one dataflow left. Now we can easily click the Table item in the Data.1 column to expand it.
- This results in a list of all underlying entities.
- For entities, the same story applies as explained for dataflows in step 6. It is better to first apply a filter based on the entity you want to extract before clicking the Table object in the Data column. In my example, I applied a filter on the entity column to equal the entity called "PimpYourDataflowsConnector". This step results in only one row left. That row contains the entity you want to extract.
- The final step is to expand the Table object in the Data column. By clicking the Table object, Power Query will again apply a hard-coded filter in the code, which we don't want. In this case that doesn't matter much, because this filter will be the same as the one we applied in step 8. If we want to optimize this, we can first apply an extra step to remove all columns except the Data column. If you expand the Table item afterwards, this results in the table included in the entity.
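Taken together, the bullet steps above map onto generated steps like the following (DemoEntity and PimpYourDataflowsConnector are the dataflow and entity names from my example):

```
// Filter down to one dataflow, drill into its entities, then filter to one entity
#"Filtered Dataflow" = Table.SelectRows(#"Expanded Data", each [dataflowName] = "DemoEntity"),
#"Expand Data.1 Column" = #"Filtered Dataflow"{0}[Data.1],
#"Filtered Entity" = Table.SelectRows(#"Expand Data.1 Column", each [entity] = "PimpYourDataflowsConnector"),
#"Removed Other Columns1" = Table.SelectColumns(#"Filtered Entity", {"Data"})
```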
Wrap things up
In the preceding nine steps, we've optimized our dataflows connector to easily switch between workspaces. This comes in handy when you want to change your data source while moving to the next stage in your development process.

Looking at the code we generated, there are still some things hard-coded: the dataflow name and entity name. Besides those items, there are hard-coded column names for expanding in the code as well. In the Expanded Data step, that doesn't make a big difference, since this is all metadata of the dataflows. But our last step, Expanded Data1, contains all column names included in our entity. In case we add another column to our entity in the future, it won't show up. To make this more flexible, please read my blog about Dynamically expanding and appending tables with Power Query.
let
    Source = PowerBI.Dataflows(null),
    #"Filtered Rows" = Table.SelectRows(Source, each [workspaceName] = WorkspaceFilter),
    #"Removed Other Columns" = Table.SelectColumns(#"Filtered Rows", {"Data"}),
    #"Expanded Data" = Table.ExpandTableColumn(#"Removed Other Columns", "Data", {"dataflowId", "dataflowName", "dataflowDescription", "Data", "ItemKind", "ItemName", "IsLeaf"}, {"dataflowId", "dataflowName", "dataflowDescription", "Data.1", "ItemKind", "ItemName", "IsLeaf"}),
    #"Filtered Dataflow" = Table.SelectRows(#"Expanded Data", each [dataflowName] = "DemoEntity"),
    #"Expand Data.1 Column" = #"Filtered Dataflow"{0}[Data.1],
    #"Filtered Entity" = Table.SelectRows(#"Expand Data.1 Column", each [entity] = "PimpYourDataflowsConnector"),
    #"Removed Other Columns1" = Table.SelectColumns(#"Filtered Entity", {"Data"}),
    #"Expanded Data1" = Table.ExpandTableColumn(#"Removed Other Columns1", "Data", {"Environment"}, {"Environment"})
in
    #"Expanded Data1"
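To get rid of the hard-coded column list in that last expand step, one common pattern (a sketch of the general technique, not necessarily the exact approach from the blog post mentioned above) is to read the column names from the nested table at run time:

```
// Expand whatever columns the nested table happens to have
#"Expanded Data1" = Table.ExpandTableColumn(
    #"Removed Other Columns1",
    "Data",
    Table.ColumnNames(#"Removed Other Columns1"[Data]{0})
)
```

This way, a column added to the entity later will show up automatically, at the cost of a schema that can change between refreshes.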
Template
To make it even easier for you, I generated a template file. Please feel free to download this template file and reuse it. Since I don't know which column names your dataflow consists of, I've added some extra logic to the Power Query to dynamically expand the table. As already stated in the section earlier, you can read how to do that in my previous blog post.

When opening the template file, you will be prompted to enter three parameters: your workspace name, dataflow name, and entity name. After providing this information, it will load your table.
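For reference, a sketch of how such a fully parameterized query could look. I'm assuming the parameter names WorkspaceName, DataflowName, and EntityName here; the template's actual parameter names may differ:

```
let
    Source = PowerBI.Dataflows(null),
    // Drill down one level per parameter: workspace -> dataflow -> entity
    Workspace = Table.SelectRows(Source, each [workspaceName] = WorkspaceName){0}[Data],
    Dataflow = Table.SelectRows(Workspace, each [dataflowName] = DataflowName){0}[Data],
    Entity = Table.SelectRows(Dataflow, each [entity] = EntityName){0}[Data]
in
    Entity
```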