Back again with another article in the Azure Logic Apps and D365 FinOps integration series. This time we will continue exploring the data integration scenarios within D365 FinOps and see how to create and use Azure Logic Apps as an intermediate connector, or data bridge, between different systems in a data import scenario with D365FinOps.
One of the primary use cases of Azure Logic Apps is data integration. It provides a set of connectors and tools to connect with various data sources, such as Azure SQL Database, Azure Blob Storage, SharePoint, Salesforce, and many others. These connectors allow data to be accessed, transformed, and moved across different systems and applications.
Azure Logic Apps can be used for a variety of data integration scenarios, including:
Overall, Azure Logic Apps is a powerful tool for data integration that can help streamline processes, improve efficiency, and reduce errors. It provides a flexible and scalable platform that can adapt to changing business needs and support complex data integration scenarios.
Importing data into D365 FinOps can be a complex and time-consuming process. However, using Azure Logic Apps can simplify the process and automate data import tasks. With Azure Logic Apps, it is possible to create workflows that extract data from various sources, transform it into the required format, and load it into D365FinOps.
In this scenario we will use the following components:
Azure Blob Storage as a cloud storage solution, which will host the XML files to be imported later into D365FinOps via Azure Logic Apps.
In this example, our XML files have already been uploaded to the blob container.
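For reference, the XML files can be pushed to the container with any tool (Azure portal, Storage Explorer, AzCopy, or a small script). Below is a minimal Python sketch using the azure-storage-blob SDK; the connection string, container name, and local folder are placeholder values for this example.

```python
from pathlib import Path
from azure.storage.blob import BlobServiceClient, ContentSettings

# Placeholder values - replace with your own storage account details.
CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"
CONTAINER_NAME = "inbound-xml"          # container the Logic App will read from
LOCAL_FOLDER = Path("./xml-to-import")  # local folder holding the XML files

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client(CONTAINER_NAME)

# Upload every XML file so the Logic App can pick it up on its next run.
for xml_file in LOCAL_FOLDER.glob("*.xml"):
    with xml_file.open("rb") as data:
        container.upload_blob(
            name=xml_file.name,
            data=data,
            overwrite=True,
            content_settings=ContentSettings(content_type="application/xml"),
        )
    print(f"Uploaded {xml_file.name} to container '{CONTAINER_NAME}'")
```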
Azure Logic Apps plays a crucial role in this data integration scenario, acting as a data bridge that automates the data integration tasks between the blob storage (source) and the D365FinOps application (target). In this scenario, Azure Logic Apps will be used to extract and transform data from XML files stored in Azure Blob Storage and integrate it into D365FinOps.
Read data from XML files stored in Azure Blob Storage:
Import data from XML files stored in Azure Blob Storage to D365FinOps:
In this scenario we have created a custom data entity in D365FinOps that maps to the structure of the XML data being imported from Azure Logic Apps, via the Data management workspace in D365FinOps.
To import data from Azure Logic Apps into Dynamics 365 for Finance and Operations (D365FinOps) via a recurring data import job, the following configurations are required:
Before importing an XML file into a data entity in Dynamics 365 for Finance and Operations via Azure Logic Apps, there are several constraints the XML file must respect for the import to succeed. Here are some important things to consider:
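To make this concrete, here is a hedged sketch of how such an XML payload could be built programmatically. The root and entity element names and the field names below are purely illustrative; the real element names must match your custom data entity and the XML source data format configured in the D365FinOps data project.

```python
import xml.etree.ElementTree as ET

# Hypothetical root/entity element names - align them with your own custom
# data entity and the source data format defined in your data project.
ENTITY_ELEMENT = "MYCUSTOMIMPORTENTITY"

def build_import_xml(records: list[dict]) -> bytes:
    """Build an XML document with one element per record to import."""
    root = ET.Element("Document")
    for record in records:
        entity = ET.SubElement(root, ENTITY_ELEMENT)
        for field_name, value in record.items():
            ET.SubElement(entity, field_name).text = str(value)
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

# Example: one record with illustrative field names.
payload = build_import_xml([{"ITEMID": "A0001", "ITEMNAME": "Sample item", "UNITPRICE": "25.50"}])
print(payload.decode("utf-8"))
```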
Now let’s explore each action in the Logic App in more detail.
The first step in any Logic App workflow is to define a trigger, the event that starts the workflow. In our data integration case with D365FinOps, we will use « Recurrence » as the trigger of the workflow. You could also use « When a new blob is added or modified », which fires when a new file is added to the cloud storage service, or any other event that initiates the workflow.
In our case, the workflow is triggered at predefined times.
The next step is to retrieve the data that needs to be integrated with D365FinOps. We use the « Lists blobs in a container » action to retrieve the list of blobs from our target container in Azure Blob Storage. This action is critical in the workflow: it retrieves the list of XML files stored in Blob Storage so that further actions can be performed on each file.
To use the « List Blobs » action in Azure Logic Apps, you need to specify the following parameters:
The « List Blobs » action returns a JSON array that contains information about each blob in the specified container, including the blob name, size, content type, last modified date, etag, and other metadata. You can use this information to perform further actions on each blob, such as downloading, copying, or deleting the blob.
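For comparison, the sketch below shows roughly what the same listing step looks like with the Python storage SDK; the connection string and container name are placeholders, and the printed properties mirror the metadata returned by the « List Blobs » action.

```python
from azure.storage.blob import BlobServiceClient

# Placeholder values for this example.
CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."
CONTAINER_NAME = "inbound-xml"

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client(CONTAINER_NAME)

# Each blob item carries the same kind of metadata the « List Blobs » action
# exposes: name, size, content type, last modified date and etag.
for blob in container.list_blobs():
    print(blob.name, blob.size, blob.content_settings.content_type,
          blob.last_modified, blob.etag)
```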
The « For Each » action in Azure Logic Apps is used to iterate over a collection of items and perform a set of actions on each item. Here we are looping over the list of Azure blobs extracted in the previous step. This action is useful when you need to perform a repetitive task on a set of data, such as processing a list of files, sending multiple emails, or inserting data into a database. In our case, we import the files one by one inside the « For Each » loop, as sketched below.
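Conceptually, the « For Each » block behaves like the short Python sketch below: iterate over the listing result and hand each XML file to the per-file import logic, one file at a time. The import_single_file function is a hypothetical placeholder for the actions described in the next sections.

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string and container name.
container = BlobServiceClient.from_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."
).get_container_client("inbound-xml")

def import_single_file(blob_name: str) -> None:
    """Placeholder for the per-file actions: get the blob content,
    POST it to D365FinOps, poll the message status, then move the file."""
    ...

# Equivalent of the « For Each » action: process the XML files one by one.
for blob in container.list_blobs():
    if blob.name.lower().endswith(".xml"):
        import_single_file(blob.name)
```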
Here are the main parameters of the « For Each » action in Azure Logic Apps:
The « Encapsulate » action in Azure Logic Apps is used to group a set of actions into a single block, making it easier to manage and reuse the logic in your workflow. This action can be useful when you have a complex set of actions that are used in multiple places in your workflow, or when you want to simplify the logic by grouping related actions together.
The advantage of using the « Encapsulate » action is that you can create a reusable block of logic that can be called from other parts of your workflow. This helps simplify your logic by grouping related actions together and reducing the number of actions that need to be managed in your workflow.
In our case, we are encapsulating all the D365FinOps data import actions in the same block.
Here are the main features of the « Encapsulate Import Into D365 data project » action in Azure Logic Apps:
Retrieving blob contents using an ID in Azure Logic Apps involves accessing a specific blob in a storage account and extracting the contents of that blob.
The output of this action is a combination of different metadata plus the binary data that represents the contents of the specified blob.
The « body » output is the actual content of the blob that we retrieved. We will use this output in the next step of the workflow, to start the data import via an HTTP Post Call.
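In SDK terms, this step simply downloads the blob and keeps its raw bytes for the next HTTP call. A minimal sketch, again with placeholder account, container, and file names:

```python
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."  # placeholder
)
blob_client = service.get_blob_client(container="inbound-xml", blob="customers.xml")

# Equivalent of « Get blob content »: the raw bytes become the body
# of the HTTP POST call in the next step of the workflow.
xml_payload = blob_client.download_blob().readall()
print(f"Downloaded {len(xml_payload)} bytes from {blob_client.blob_name}")
```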
The HTTP action in Azure Logic Apps can be used to import data into Dynamics 365 Finance and Operations (D365FinOps) by sending a POST request to the Data Management API. The Data Management API is a RESTful API that allows you to create, read, update, and delete data in D365FinOps.
Here are the key steps involved in using the HTTP action to import data into D365FinOps:
The authentication type for the HTTP action in Azure Logic Apps can be specified in the « Authentication » tab of the action configuration.
Azure Logic Apps supports several authentication methods for the HTTP action, including « None », « Basic », « Client certificate », « Active Directory OAuth », « Raw », and « Managed identity ». In our scenario, « Active Directory OAuth » is the natural fit, since the D365FinOps APIs are secured by Azure Active Directory.
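The sketch below illustrates, with placeholder tenant, application, and environment values, what the HTTP action does under the hood: it acquires an Azure AD token with the client credentials flow and then POSTs the XML payload to the recurring integrations enqueue endpoint of the data management framework. The activity ID is the one generated for your recurring data job; take the exact URL and parameters from your own environment and Microsoft’s documentation.

```python
import requests

# Placeholder values - replace with your own Azure AD and D365FinOps details.
TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-registration-client-id>"
CLIENT_SECRET = "<client-secret>"
D365_URL = "https://yourenvironment.operations.dynamics.com"
ACTIVITY_ID = "<recurring-data-job-activity-id>"
ENTITY_NAME = "My custom entity"  # hypothetical entity name

def get_access_token() -> str:
    """Client credentials flow, i.e. what « Active Directory OAuth » does for the HTTP action."""
    response = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token",
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "resource": D365_URL,
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["access_token"]

def enqueue_xml(xml_payload: bytes) -> str:
    """POST the XML file to the enqueue endpoint of the recurring data job.
    The response contains the message ID used later by GetMessageStatus."""
    response = requests.post(
        f"{D365_URL}/api/connector/enqueue/{ACTIVITY_ID}",
        params={"entity": ENTITY_NAME},
        headers={
            "Authorization": f"Bearer {get_access_token()}",
            "Content-Type": "application/xml",
        },
        data=xml_payload,
        timeout=120,
    )
    response.raise_for_status()
    return response.text  # message ID returned by the data management framework
```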
As we did with the import actions, here we are again encapsulating another block of actions: the execution message status check and a « Delay » action live inside the same block.
When we encapsulate a set of actions, we have to define inputs and outputs for the encapsulated logic app, which can be used by other actions in the workflow. This makes it easy to reuse the same logic in multiple places without having to repeat the same actions.
Now let’s discover the actions inside.
Before getting the final result of your data import to D365FinOps, you have to add a « Do Until » action to verify the current message status of the data batch job.
The « Do Until » action in Azure Logic Apps plays the role of a loop control action that repeats a set of actions until a specific condition is met. In our case, the final status of the data import has to be « Processed »; otherwise the import is not successful and we have to handle the different exception results.
The « Delay » action allows you to pause the execution of the workflow for a specified amount of time. It is used here to introduce a pause between each iteration of the « Do Until » loop and the « Get import execution status » action, so the workflow waits before executing the next check.
This is useful in our case to let the « Execute action » complete its operation before proceeding with the rest of the workflow.
This is the important step inside the « Do Until » block: the goal here is to get the data import job status.
The « DataManagementDefinitionGroups-GetMessageStatus » action is one of the actions available in the « Execute Action » connector for D365 FinOps in Azure Logic Apps.
In our case it will be used to retrieve the status of the message that was submitted to the data management framework in D365 FinOps.
Once the import job message is submitted to the data management framework, it will be processed asynchronously in the background. The « DataManagementDefinitionGroups-GetMessageStatus » action allows you to check the status of a specific message to see if it has completed processing or if there are any errors or warnings that need to be addressed.
To use the « DataManagementDefinitionGroups-GetMessageStatus » action, you need to provide the message ID of the message whose status you want to retrieve. In our case, the message ID was generated in the previous step, when we submitted the HTTP POST call to the data management framework.
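Put together, the « Do Until », « Delay », and « GetMessageStatus » actions behave like the polling loop sketched below. The OData action path and the status values are the ones commonly exposed by the data management framework, but verify them against your own environment; get_access_token is the helper from the previous sketch.

```python
import time
import requests

# Illustrative terminal statuses - check the exact values in your environment.
TERMINAL_STATUSES = {"Processed", "ProcessedWithErrors", "PreProcessingError", "PostProcessingError"}

def wait_for_message_status(message_id: str, access_token: str, d365_url: str) -> str:
    """Poll GetMessageStatus until the data management framework reports a
    final state, mirroring the « Do Until » + « Delay » pattern of the workflow."""
    while True:
        response = requests.post(
            f"{d365_url}/data/DataManagementDefinitionGroups/"
            "Microsoft.Dynamics.DataEntities.GetMessageStatus",
            headers={"Authorization": f"Bearer {access_token}"},
            json={"messageId": message_id},
            timeout=60,
        )
        response.raise_for_status()
        status = response.json().get("value", "")
        print(f"Current message status: {status}")
        if status in TERMINAL_STATUSES:
            return status
        time.sleep(60)  # equivalent of the « Delay » action between two checks
```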
Here is an example of an output message ID:
The final step in our workflow is to manage the import execution results coming from the previous step; the result can be « Error », « Processed with errors », and so on.
In our case, we have added a conditional control action to evaluate the different conditions and statuses and execute different actions based on the result of each condition, pushing the XML file into the « Success container » or the « Error container ».
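The branching itself can be pictured as the small sketch below: depending on the final status, the processed XML file is copied to a success or error container and removed from the inbound one. Container names are placeholders, and since the SDK copy is asynchronous, a production version would also wait for the copy to complete before deleting the source blob.

```python
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."  # placeholder
)

def route_processed_blob(blob_name: str, final_status: str) -> None:
    """Equivalent of the conditional control: move the file to the
    success or error container based on the import result."""
    target_container = "success-container" if final_status == "Processed" else "error-container"
    source = service.get_blob_client(container="inbound-xml", blob=blob_name)
    destination = service.get_blob_client(container=target_container, blob=blob_name)

    destination.start_copy_from_url(source.url)  # server-side copy within the account
    source.delete_blob()                         # remove the file from the inbound container
```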
Recurring data job import in D365FinOps allows you to set up a scheduled job that automatically imports data from a file or an external source on a regular basis. This is useful in our case to keep the D365FinOps data entities up to date with data from the XML files on a regular schedule.
Once your Logic Apps workflow has been executed successfully, you will be able to see the result of your recurring data job directly from your data import project.
Original Post https://dynvision365.com/2023/02/28/import-data-in-d365-finops-via-azure-logic-apps/