Import Dynamics 365 F&O Aggregate Measures from Azure Data Lake Gen 2 to Power BI

Recently we were tasked with importing Dynamics 365 Finance & Operations data into the Power BI service for external reporting of application data. Because the business requirements called for integration with external data sources, the data had to be exported out of the application, which ruled out the use of Embedded Power BI.




Sourcing


Prerequisites

To enable the import of D365 F&O Aggregate Measures into Power BI, you will need to ensure that D365 F&O and your Azure Data Lake Gen2 storage have been configured correctly. It's a lengthy list, but take the time to work through it before continuing. If you have an Azure Administrator at your company, now would be the time to call them in to help!


The process to configure Aggregate Measure exports from Dynamics 365 F&O to Azure Data Lake Gen 2 storage is documented here: Make Entity store available as a Data Lake


Note: At the time of writing, the export of Entities (tables) from D365 F&O to Azure Data Lake Gen2 storage is not supported in Production - only the export of Aggregate Measures. The export of Entities (tables) to the Data Lake is still in Preview. This method is documented here: Finance and Operations apps data in Azure Data Lake


Once your D365 F&O application is set up to export the Aggregate Measures to your Azure Data Lake Gen 2 storage, you will see the data residing in a folder structure similar to what is shown below.

The (ADLS Gen2) text after the name of the Storage Account confirms that it is Azure Data Lake Gen 2 storage.


Note: Before continuing, ensure that the user account you use to log in to the Power BI service has read and execute permissions on all files.
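
If you would like to verify those permissions programmatically, here is a minimal Python sketch, assuming the azure-storage-file-datalake and azure-identity packages; the storage account, container, and folder names are placeholders you will need to substitute. It lists the exported files and prints each path's ACL so you can confirm the Power BI login has read (r) and execute (x) access:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# <storage-account>, <container>, and <folder> are placeholders.
service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("<container>")

# Walk the exported folder and print the ACL on every directory and file.
for path in fs.get_paths("<folder>/AggregateMeasurements", recursive=True):
    client = (fs.get_directory_client(path.name) if path.is_directory
              else fs.get_file_client(path.name))
    acl = client.get_access_control()
    # Look for read (r) and execute (x) in the entries covering the
    # account used to sign in to the Power BI service.
    print(path.name, "->", acl["acl"])
```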


Consuming


Now that your Aggregate Measures have been exported, they are ready to be consumed by Power BI!

  • From Azure Storage Explorer, right-click the appropriate model.json file, select Copy URL, and paste the URL into Notepad.



  • Next, edit the URL's host name to replace 'blob' with 'dfs'.

  • Also replace any '%2F' strings in the URL with '/'.

An example is shown below:

from:


https://xxxxxxxxxxx.blob.core.windows.net/xxxxxxxxxxxxxx/xxxxx%2FAggregateMeasurements%2Fxxxxxxx%2Fmodel.json

to:

https://xxxxxxxxxxx.dfs.core.windows.net/xxxxxxxxxxxxxx/xxxxx/AggregateMeasurements/xxxxxxx/model.json
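
If you have many model.json URLs to convert, a small standard-library Python helper that applies the same two edits might look like this (the example URL is a placeholder):

```python
def to_dfs_url(blob_url: str) -> str:
    """Apply the two edits above: 'blob' endpoint -> 'dfs' endpoint,
    and encoded '%2F' path separators -> '/'."""
    return (blob_url
            .replace(".blob.core.windows.net", ".dfs.core.windows.net")
            .replace("%2F", "/"))

# Example with placeholder values:
# to_dfs_url("https://mystore.blob.core.windows.net/container/x%2FAggregateMeasurements%2Fy%2Fmodel.json")
# -> "https://mystore.dfs.core.windows.net/container/x/AggregateMeasurements/y/model.json"
```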


  • Next, log in to powerbi.com and navigate to the workspace where you would like to create your dataflow, then click New -> Dataflow.



  • On the Attach a Common Data Model folder connector option, click Create and attach.


Note: this feature is still in Preview, so proceed with caution if using it in a Production environment. Have a backup plan in case the connection fails and you have to wait for Microsoft to resolve the issue!


  • Next, paste in the URL of the model.json file from above and name the dataflow after the Aggregate Measure as shown below, then hit Create and attach.



  • The dataflow will now appear within your workspace under the Datasets + dataflows section.



Note that because these dataflows read directly from Azure Data Lake Gen2 storage, they cannot be refreshed; all of the data sits in the Data Lake as CSV files.
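
As a quick sanity check of what a dataflow is actually reading, you can open one of those CSV files directly. A hedged sketch, assuming the pandas and adlfs packages; the account, container, access key, and file path are all placeholders:

```python
import pandas as pd

df = pd.read_csv(
    "abfss://<container>@<storage-account>.dfs.core.windows.net/"
    "<folder>/AggregateMeasurements/<measure>/<file>.csv",
    header=None,  # CDM-folder CSV exports carry no header row; the column
                  # definitions live in the accompanying model.json
    storage_options={"account_name": "<storage-account>",
                     "account_key": "<access-key>"},
)
print(df.head())
```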

  • From Power BI Desktop it is now possible to add this dataflow as a data source and use it for report development.

  • Import all of your Aggregate Measures into a shared dataset .pbix file and, once published, set up a scheduled refresh (see the sketch below if you also need to trigger refreshes on demand).
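
Scheduled refresh is configured in the Power BI service, but the Power BI REST API also exposes a refreshes endpoint for queuing a refresh of the published shared dataset on demand. A minimal sketch, assuming you already have an Azure AD access token and the workspace (group) and dataset IDs:

```python
import requests

def refresh_dataset(token: str, group_id: str, dataset_id: str) -> None:
    """Queue an on-demand refresh of a published dataset."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/datasets/{dataset_id}/refreshes")
    resp = requests.post(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()  # HTTP 202 means the refresh request was queued
```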


And that is it! You now have your D365 F&O Aggregate Measure data imported from your Azure Data Lake Gen2 Storage to Power BI!



