Hey there! It's been a while since I last shared something here, but here we are again, and today I have something really exciting to talk about. Working with AS/400 (IBM i) can feel like stepping back in time, yet many businesses still depend on it for critical operations. So how do we bring this legacy system into the modern cloud world of Microsoft Fabric and D365 FinOps?
What better way to return than by tackling a real challenge? Bringing AS/400 data into the modern world with Microsoft Fabric, AI, and D365 FinOps! This isn't just about moving data; it's about transforming it, making it smarter, and ensuring it integrates seamlessly with today's cloud-powered ERP systems like D365 FinOps. So, let's dive in and bridge the gap between the old and the new!
Today's article provides a comprehensive guide on integrating a local IBM AS/400 emulator (PUB400) with Microsoft Fabric to facilitate data transformation, AI-driven analytics, and seamless integration into Dynamics 365 FinOps. Many enterprises continue to rely on IBM AS/400 because of its reliability, stability, and ability to manage complex enterprise operations, making modernization crucial for improved efficiency and cloud compatibility.
Today's integration journey will highlight how to bridge the gap between legacy systems and Dynamics 365 FinOps by using Microsoft Fabric as an advanced data processing hub: extracting data from PUB400, applying AI-powered transformation in the Fabric Lakehouse, and migrating it seamlessly into D365 FinOps.
Throughout this article, we will explore:

- Setting up a free IBM i environment on PUB400 and preparing sample product data
- Connecting AS/400 to Microsoft Fabric through ODBC and the on-premises data gateway
- Staging and transforming the data in a Fabric Lakehouse with Dataflows Gen2
- Migrating the cleaned data into D365 FinOps with Fabric data pipelines and DMF
- Monitoring the flow with Data Activator and getting help from Copilot along the way
By the end of this guide, you will have a structured approach to transforming legacy data systems into modern, cloud-compatible solutions, ensuring smooth integration with Microsoft's modern ecosystem tools.
Many customers today continue to rely on IBM AS/400 systems to manage their critical business processes, including inventory management, finance, and related operations. While these systems are stable and secure, they were not designed to integrate seamlessly with modern cloud platforms like Microsoft Fabric and D365 FinOps.
In the next section, we will explore the scenario of extracting "Products" data from AS/400 (PUB400) using ODBC, bringing it into Microsoft Fabric's Lakehouse, applying transformations to clean and enhance the data, and finally migrating it into D365 FinOps using DMF and Fabric Data Factory. This step-by-step process will showcase how legacy data can be modernized and integrated into D365 FinOps.
Before diving into the technical implementation, let's first understand the overall architecture of this integration. The goal is to:

- Extract product data from AS/400 (PUB400) through ODBC and the on-premises data gateway
- Stage and transform that data in a Microsoft Fabric Lakehouse
- Push the cleaned data into D365 FinOps through a Fabric data pipeline and DMF
You have seen the word "PUB400" in the previous sections, but what exactly is PUB400, and how will we use it?
PUB400 is a free IBM i (AS/400) cloud-based emulator, allowing developers and IT professionals to work with IBM i (AS/400) technology without requiring a physical AS/400 server. It provides an environment similar to a real IBM system, making it ideal for testing and development.
Want to learn more about PUB400? Check out the official website here: PUB400 Official Website
In this article, we are using PUB400 as an AS/400 emulator instead of a production IBM i system because:

- It is free and cloud-hosted, so no physical AS/400 server is required
- It provides an environment very close to a real IBM i system
- It is ideal for testing, development, and demonstrations like this one
In our case, we will use an IBM profile in PUB400. This means that instead of having a physical AS/400 system, we will access a hosted IBM i environment via PUB400. The credentials generated during this process will be essential for configuring the local ODBC connection and for authenticating from Microsoft Fabric.
NB: These credentials will be required later when configuring ODBC and connecting to Microsoft Fabric, so keep them safe!
Before proceeding with the integration, let's discuss the Product data stored in our AS/400 system. This table is crucial for our scenario as it contains inventory data that will be migrated to and utilized in D365 FinOps. It is the equivalent of the "Released Products" data entity in D365 FinOps, which we will use later for the integration.
NB: For this demonstration, we will not use production tables but instead create our own sample PRODUCTS table in AS/400 (PUB400). This allows us to showcase the integration process and migrate the data into Fabric and then to D365 FinOps.
We used SQL commands in the IBM i 5250 terminal or IBM i Access Client Solutions (ACS) to create the table and insert sample data; an illustrative version of those statements is shown after the field list below.
The product data is stored in a DB2 database within the PUB400 emulator. The table, named ReleasedProductsAS400, consists of the following fields:
| Field Name | Data Type | Description |
| --- | --- | --- |
| ProductID | INTEGER | Unique identifier for each product |
| ProductName | VARCHAR(100) | Name of the product |
| Category | VARCHAR(50) | Product category (e.g., Electronics) |
| Price | DECIMAL | Product price in currency format |
| StockQuantity | INTEGER | Number of units available in stock |
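For reference, here is a minimal sketch of that step, with the CREATE TABLE and INSERT statements wrapped in Python via pyodbc so they can also be run remotely over ODBC. The library name MYLIB, the credentials, the DECIMAL(10, 2) precision, and the sample rows are placeholders and assumptions; the same SQL can be pasted directly into ACS Run SQL Scripts.

```python
import pyodbc

# DSN-less connection to PUB400 over the IBM i Access ODBC Driver.
# SYSTEM, UID and PWD come from your PUB400 profile (placeholders here).
conn = pyodbc.connect(
    "DRIVER={IBM i Access ODBC Driver};SYSTEM=PUB400.COM;UID=MYUSER;PWD=MYPASSWORD;",
    autocommit=True,
)
cur = conn.cursor()

# Create the sample products table in your personal library (MYLIB is a placeholder).
cur.execute("""
    CREATE TABLE MYLIB.ReleasedProductsAS400 (
        ProductID     INTEGER NOT NULL PRIMARY KEY,
        ProductName   VARCHAR(100),
        Category      VARCHAR(50),
        Price         DECIMAL(10, 2),
        StockQuantity INTEGER
    )
""")

# Insert a few illustrative sample rows.
cur.executemany(
    "INSERT INTO MYLIB.ReleasedProductsAS400 VALUES (?, ?, ?, ?, ?)",
    [
        (1, "Wireless Mouse", "Electronics", 19.99, 150),
        (2, "Office Chair", "Furniture", 89.50, 40),
        (3, "USB-C Cable", "Electronics", 7.25, 500),
    ],
)
conn.close()
```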
To establish smooth data integration between AS/400 and Microsoft Fabric, we will divide the setup into two key steps:
The AS/400 (PUB400) environment does not have native cloud connectivity. To integrate it with Microsoft Fabric, we use ODBC as a bridge to pull data into the Fabric Lakehouse.
Now that we have established our IBM profile in PUB400 and prepared our local configuration, it's time to set up the cloud-side integration in Microsoft Fabric. This step ensures that our data has a proper storage and processing pipeline before we integrate it into D365 FinOps.
Before we can begin ingesting and processing our AS/400 data, we need a dedicated workspace in Microsoft Fabric to manage the dataflows, pipelines, and all the other items we are going to use for the data migration project.
The workspace will serve as our central hub for managing our data ingestion, transformation, and migration pipeline.
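If you prefer scripting over clicking through the portal, workspace creation can also be automated through the Fabric REST API. A minimal sketch, assuming you already hold an Azure AD access token for the Fabric API and that the workspace name AS400-Migration is just an example:

```python
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"
ACCESS_TOKEN = "<aad-access-token>"  # token for the Fabric API scope (placeholder)

# Create a workspace to host the Lakehouse, dataflows and pipelines of the migration project.
resp = requests.post(
    f"{FABRIC_API}/workspaces",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"displayName": "AS400-Migration"},
)
resp.raise_for_status()
print("Created workspace:", resp.json().get("id"))
```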
The Lakehouse plays a crucial role in our architecture as the staging area for AS/400 data before it is transformed and migrated to D365 FinOps. It acts as a centralized data storage where raw data from AS/400 is ingested, structured and processed.
Since PUB400 (AS/400) does not have native cloud connectivity, we will use ODBC to extract data into the Fabric Lakehouse.
To do this, we need to set up a data ingestion pipeline using Dataflows Gen2.
The ODBC connection setup allows Microsoft Fabric to communicate with your IBM i system (PUB400) via ODBC. Here is a breakdown of each field and its purpose:
| Field Name | Description | Details |
| --- | --- | --- |
| ODBC Connection String | The connection string that specifies the ODBC driver, system (server), and authentication details used to reach the IBM i server | e.g., driver={IBM i Access ODBC Driver};system=xxxxx;port=23 |
| Connection Name | The name used to identify the connection inside Microsoft Fabric | e.g., AS400Connection |
| Data Gateway | The Microsoft on-premises data gateway used to securely bridge the local ODBC connection with Microsoft Fabric | Detected automatically in Microsoft Fabric once the data gateway is installed on your machine |
| Authentication Kind | Defines how Microsoft Fabric authenticates with your IBM i system (PUB400 / AS400) | Here we use Basic authentication with the username and password created during the IBM profile creation process on PUB400.com |
| Username | The PUB400 user ID used to authenticate and access the AS/400 system | The username provided when registering on PUB400 |
| Password | The corresponding password for the PUB400 user | Set during the account creation process on PUB400 |
| Privacy Level | Defines how Fabric handles the connection's security | Options: None, Organizational, or Private |
Once the ODBC connection is successfully set up, you will be able to see all the tables available in your IBM i system database. This allows you to browse the available tables, select the PRODUCTS table we created earlier, and preview its contents before loading it into the Lakehouse.
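Before configuring the connection in Fabric, it can also help to verify it locally from the machine that runs the on-premises data gateway. A small sketch, assuming the IBM i Access ODBC Driver is installed and reusing the placeholder credentials and MYLIB library from earlier:

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={IBM i Access ODBC Driver};SYSTEM=PUB400.COM;UID=MYUSER;PWD=MYPASSWORD;",
    autocommit=True,
)
cur = conn.cursor()

# List the tables exposed in your personal library (MYLIB is a placeholder).
for row in cur.tables(schema="MYLIB", tableType="TABLE"):
    print(row.table_name)

# Preview a few rows from the sample products table.
cur.execute("SELECT * FROM MYLIB.ReleasedProductsAS400 FETCH FIRST 5 ROWS ONLY")
for row in cur.fetchall():
    print(row)

conn.close()
```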
Once the AS/400 data has been extracted using ODBC and loaded, we will use Dataflow Gen2 to apply the necessary transformations.

In our scenario, we applied the following transformations:

- Adding custom fields required for our final destination:
  - D365 FinOps Product long name: a concatenation of ProductName and ProductDesc.
  - Valid: a calculated date field used to track the product validation date.

Once you have finished all your transformations, the transformed AS/400 data is cleaned, structured, and ready for the next step.

Verify the output of your Gen2 dataflow before publishing it into the Lakehouse.
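To make the shaping concrete, here is roughly equivalent logic written as a PySpark sketch you could run in a Fabric notebook instead of Dataflow Gen2. The Lakehouse table names and the Valid calculation are illustrative assumptions:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Raw AS/400 data previously ingested into the Lakehouse (table name is an assumption).
df = spark.read.table("ReleasedProductsAS400_raw")

transformed = (
    df
    # D365 FinOps Product long name: concatenation of ProductName and ProductDesc.
    .withColumn("ProductLongName", F.concat_ws(" - ", "ProductName", "ProductDesc"))
    # Valid: a calculated date field used to track the product validation date.
    .withColumn("Valid", F.current_date())
    # Basic cleanup: drop rows without a product identifier.
    .dropna(subset=["ProductID"])
)

# Persist the cleaned, structured output for the next pipeline step.
transformed.write.mode("overwrite").saveAsTable("ReleasedProductsAS400_clean")
```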
Now that our data has been extracted and transformed, the next logical step is to automate and streamline its movement to ensure efficient integration with D365 FinOps.
A data pipeline in Microsoft Fabric allows us to orchestrate, schedule, and automate the movement of data between different sources and destinations. In our case, we will use it to copy the transformed data out of the Lakehouse into a staging location and push it into D365 FinOps through DMF on a recurring schedule.
When migrating data from an AS/400 system into D365 FinOps, careful data mapping and format alignment are crucial to ensure the legacy data fits the FinOps data structures. This involves mapping each source field in the IBM i database to the correct target data entity and field in FinOps and transforming the data as needed. Some key considerations include:
Create a detailed mapping document that links AS/400 fields to FinOps entities. For example, an AS/400 customer master record must be broken out into the appropriate FinOps CustomerV3 entity fields (name, address, customer ID, and so on). This mapping should cover all relevant tables and columns from the legacy system.
Ensure all data is converted to the formats FinOps expects: for example, AS/400 systems often store dates in formats that need conversion to standard date formats, and numeric fields and text encoding may need similar treatment.
Map legacy codes or enumerations to their FinOps equivalents. For instance, an AS/400 system might represent order status codes differently than D365 FinOps, such as open transaction or invoice statuses like "Open/Closed/…". These values must be translated to the corresponding FinOps options during migration.
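As a rough illustration of what such rules can look like in code, here is a small Python sketch. The FinOps field names, the CYYMMDD date format, and the status values are assumptions used only for the example:

```python
from datetime import datetime

# Field mapping: AS/400 column name -> D365 FinOps entity field (illustrative only).
FIELD_MAP = {
    "ProductID": "ItemNumber",
    "ProductName": "ProductName",
    "Category": "ProductGroupId",
    "Price": "UnitCost",
    "StockQuantity": "OnHandQuantity",
}

# Legacy status codes -> FinOps option values (illustrative only).
STATUS_MAP = {"O": "Open", "C": "Closed", "H": "OnHold"}

def convert_cyymmdd(value: str) -> str:
    """Convert an AS/400 CYYMMDD date (e.g. 1250311 = 2025-03-11) to ISO format."""
    century = 19 + int(value[0])            # 0 -> 19xx, 1 -> 20xx
    year = century * 100 + int(value[1:3])
    return datetime(year, int(value[3:5]), int(value[5:7])).date().isoformat()

def map_record(legacy_row: dict) -> dict:
    """Rename legacy fields and translate enumerations for the FinOps entity."""
    record = {FIELD_MAP[k]: v for k, v in legacy_row.items() if k in FIELD_MAP}
    if "Status" in legacy_row:
        record["Status"] = STATUS_MAP.get(legacy_row["Status"], "Open")
    return record

print(convert_cyymmdd("1250311"))  # -> 2025-03-11
```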
The standard DMF validation mechanism in FinOps will catch format mismatches or missing required values during import. Always test before importing high volumes of data, to ensure that the mapping and transformations produce correctly formatted data in FinOps. Verifying data quality post-import is also essential to confirm that no data was lost during conversion.
NB: In this article we will not cover every possible scenario; we will only highlight several effective options you can consider based on common integration requirements.
When integrating AS/400 data into Dynamics 365 FinOps through Microsoft Fabric, choosing the right integration pattern for your specific data volume and update frequency is critical. Here's a detailed overview of each method to guide the decision.
In this article we will not cover how to connect the F&O Dataverse environment with Microsoft Fabric, but for export scenarios, Microsoft's Synapse Link / Fabric Link to Dataverse provides an advanced way to expose D365 FinOps data. This integration is fundamentally one-way: it brings your data into Azure OneLake for analytics and reporting.
This is the case for today's architecture.
NB: For scenarios that require near real-time data transfer and lower-volume integration, FinOps exposes OData REST APIs for all public data entities, which can be called from a Fabric data pipeline.
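As a hedged illustration of that OData pattern, the sketch below acquires an Azure AD token with the client-credentials flow and posts a single record to the standard ReleasedProductsV2 entity. The environment URL, app registration values, and field values are placeholders to replace with your own:

```python
import requests

# Placeholders: replace with your own environment and app registration details.
FINOPS_URL = "https://yourenv.operations.dynamics.com"
TENANT_ID = "<tenant-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"

# Client-credentials token for the FinOps environment.
token = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": f"{FINOPS_URL}/.default",
    },
).json()["access_token"]

# Create one released product through the OData data entity (field values are illustrative).
resp = requests.post(
    f"{FINOPS_URL}/data/ReleasedProductsV2",
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json={
        "dataAreaId": "usmf",
        "ItemNumber": "AS400-0001",
        "ProductName": "Wireless Mouse",
    },
)
resp.raise_for_status()
print("Created:", resp.status_code)
```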
Fabric Data Pipelines are essentially Azure Data Factory pipelines hosted in the Fabric environment. They allow complex ETL workflows, multi-step pipelines, and integration with different sources.
For medium to large data volumes on a regular schedule, an automated pipeline feeding into DMF is recommended. In this pattern, data files or data packages can be created and pushed via a data pipeline from Microsoft Fabric.
Before creating this pipeline, you can add a Copy data activity to take the transformed and cleaned data from your Lakehouse and push it into an Azure Data Lake Storage Gen2 account for traceability reasons.
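If you script that staging step outside the pipeline, a minimal sketch of uploading the cleaned extract to ADLS Gen2 with the azure-storage-file-datalake SDK could look like this; the account, container, and file paths are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholders for the traceability storage account and container.
ACCOUNT_URL = "https://yourdatalake.dfs.core.windows.net"
CONTAINER = "as400-staging"

service = DataLakeServiceClient(ACCOUNT_URL, credential=DefaultAzureCredential())
fs = service.get_file_system_client(CONTAINER)

# Keep a dated copy of the transformed extract before it is pushed to D365 FinOps.
file_client = fs.get_file_client("releasedproducts/2025-03-11/products_clean.csv")
with open("products_clean.csv", "rb") as data:
    file_client.upload_data(data, overwrite=True)
```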
In the Lookup activity you can iterate over the records from your products file and then push everything into D365 FinOps via a DMF import project with its recurring integration job activated.
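For that last hop, here is a hedged sketch of calling the recurring integrations enqueue endpoint exposed by a DMF import project once its recurring job is activated. The activity ID, entity name, and file are placeholders, and the bearer token is acquired with the same client-credentials flow shown in the OData example:

```python
import requests

FINOPS_URL = "https://yourenv.operations.dynamics.com"
ACTIVITY_ID = "<recurring-job-activity-id>"   # shown on the DMF recurring data job
ENTITY_NAME = "Released products V2"          # data project entity name (placeholder)
ACCESS_TOKEN = "<aad-access-token>"           # acquired as in the OData example above

# Push the staged file into the recurring DMF import job's queue.
with open("products_clean.csv", "rb") as payload:
    resp = requests.post(
        f"{FINOPS_URL}/api/connector/enqueue/{ACTIVITY_ID}",
        params={"entity": ENTITY_NAME},
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "text/csv",
        },
        data=payload,
    )
resp.raise_for_status()
print("Enqueued message:", resp.text)
```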
Another Fabric pipeline option is to leverage the Dataverse connector in pipelines, using the Copy data activity to write to F&O Dataverse tables.
Azure Logic Apps can be used as an orchestrator to push the transformed and cleaned data files from your ADLS into D365 FinOps.
See the following video to learn more about data integration in D365 FinOps with Azure Logic Apps.
Staying informed about data changes matters: real-time notifications for new or updated records can improve data accuracy and increase operational efficiency.

To efficiently monitor and react to data changes, Microsoft Fabric provides a built-in feature for exactly this purpose.

Data Activator, an event-driven service included with Microsoft Fabric, monitors your data and enables real-time actions and notifications.

It can instantly trigger predefined actions (like email notifications, alerts, or even automated workflows) based on configured conditions.
In our case, we used Data Activator to send alerts whenever new or updated product records land in the Lakehouse during the migration.
Copilot in Microsoft Fabric, like most Microsoft products, is enabled by default and can be disabled from the admin portal. According to Microsoft, a set of tenant and capacity requirements must be met before Copilot can be used.
The power of Microsoft Fabric and AI-driven automation is revolutionizing data migration and integration. This article was one example of that: by combining intelligent pipelines, Copilot-powered insights, and real-time tracking, Microsoft Fabric and Copilot can bridge the gap between these two systems and ensure a seamless transition from AS/400 into D365 FinOps with greater accuracy, efficiency, and minimal manual effort.
Thank you.
Original Post https://dynvision365.com/2025/03/11/unlocking-ibm-i-as-400-data-potential-with-microsoft-fabric-and-d365-finops/