Unlocking IBM i (AS/400) Data Potential with Microsoft Fabric and D365 FinOps

Summary – Time to Bridge the Gap!

Hey there! It's been a while since I last shared something here, but here we are again, and today I have something really exciting to talk about! Working with AS/400 (IBM i) can feel like stepping back in time, yet many businesses still depend on it for critical operations. So how do we bring this legacy system into the modern cloud world of Microsoft Fabric and D365 FinOps?

What better way to return than by tackling a real challenge? Bringing AS/400 data into the modern world with Microsoft Fabric, AI, and D365 FinOps! This isn't just about moving data; it's about transforming it, making it smarter, and ensuring it integrates seamlessly with today's cloud-powered ERP systems like D365 FinOps. So, let's dive in and bridge the gap between the old and the new!

Today's article provides a comprehensive guide on integrating a local IBM AS/400 emulator (PUB400) with Microsoft Fabric to facilitate data transformation, AI-driven analytics, and seamless integration into Dynamics 365 FinOps. Many enterprises continue to rely on IBM AS/400 for its reliability, stability, and ability to manage complex enterprise operations, making modernization crucial for improved efficiency and cloud compatibility.

Today's integration journey will show how to bridge the gap between legacy systems and Dynamics 365 FinOps by leveraging Microsoft Fabric as an advanced data processing hub: extracting data from PUB400, applying AI-powered transformations in a Fabric Lakehouse, and migrating the result seamlessly into D365 FinOps.

Throughout this article, we will explore:

  • The significance of AS/400 in modern enterprises and why many organizations still depend on it
  • The suggested architecture for this scenario (from PUB400 > Microsoft Fabric > D365 FinOps)
  • What exactly PUB400 is
  • How to establish ODBC connectivity between PUB400 and Microsoft Fabric
  • How Microsoft Fabric's Lakehouse can serve as a staging area for data cleansing, enrichment, and AI-driven insights
  • Step-by-step migration of structured data from AS/400 to D365 FinOps using DMF
  • The broader role of AI in enhancing data quality, automation, and anomaly detection during migration

By the end of this guide, you will have a structured approach to transforming legacy data systems into modern, cloud-compatible solutions, ensuring smooth integration with Microsoft's modern ecosystem tools.

Today's Business Scenario: The Need for AS/400 Modernization

Many customers today continue to rely on IBM AS/400 systems to manage critical business processes, including inventory management, finance, and related operations. While these systems are stable and secure, they were not designed to integrate seamlessly with modern cloud platforms like Microsoft Fabric and D365 FinOps.

In the next sections, we will walk through extracting "Products" data from AS/400 (PUB400) using ODBC, bringing it into Microsoft Fabric's Lakehouse, applying transformations to clean and enhance the data, and finally migrating it into D365 FinOps using DMF and Fabric Data Factory. This step-by-step process will showcase how legacy data can be modernized and integrated into D365 FinOps.

Architecture of our Integration Scenario

Before diving into the technical implementation, let's first understand the overall architecture of this integration. The goals are:

  • Move data from an AS/400 (PUB400) system into Microsoft Fabric via the Microsoft On-premises Data Gateway and an ODBC connection
  • Apply data transformations via dataflows and data pipelines
  • Push the result into D365 FinOps

Key components of our architecture

AS/400 Emulator (PUB400) – Data Source

  • This is where our structured "Released products" data is stored, in a DB2 database
  • We extract this data using ODBC connectivity

Microsoft Data Gateway – Secure Connectivity

  • The On-Premises Data Gateway acts as a bridge to securely connect the AS/400 (PUB400) to Microsoft Fabric
  • It enables real-time data access from AS/400
  • It ensures network security and compliance

Microsoft Fabric – Data Processing & Transformation Layer

  • We set up a Workspace in Fabric to manage the data pipeline
  • We create a Lakehouse to store and stage AS/400 data
  • AI-powered Dataflows and Pipelines will clean, enhance, and transform the data

D365 FinOps – Final Data Destination

  • Once data is processed in Fabric, it is pushed into D365 FinOps
  • We use DMF for bulk imports

Understanding PUB400 and Its Role in our Integration

You have seen the term "PUB400" in the previous sections, but what exactly is PUB400, and how will we use it?

What is PUB400?

PUB400 is a free IBM i (AS/400) cloud-based emulator, allowing developers and IT professionals to work with IBM i (AS/400) technology without requiring a physical AS/400 server. It provides an environment similar to a real IBM system, making it ideal for testing and development.

Want to learn more about PUB400? Check out the official website here: PUB400 Official Website

Why Are We Using PUB400?

In this article, we are using PUB400 as an AS/400 emulator instead of a production IBM i system because:

  • We need an accessible and flexible test environment to simulate real-world AS/400 operations
  • It provides a DB2 database that mirrors the structure of an actual AS/400 system
  • It allows us to extract, transform, and integrate data without affecting a live production system

Creating an IBM Profile in PUB400

In our case, we will use an IBM profile in PUB400. This means that instead of having a physical AS/400 system, we will access a hosted IBM i environment via PUB400. The credentials generated during this process will be essential for:

  1. Logging into the PUB400 emulator
  2. Connecting to the DB2 database via ODBC
  3. Setting up integration with Microsoft Fabric

Steps to Create your IBM Profile in PUB400

  • Go to the PUB400 registration page and complete the required details
  • Provide your information: email and IBM username
  • Wait for account approval: PUB400 will review your request
  • Receive your credentials: once approved, you'll receive the login credentials that you will use to
    • Access the 5250 terminal emulator for AS/400 commands
    • Connect to the DB2 database using ODBC
    • Integrate PUB400 with Microsoft Fabric
  • Log in and explore: you can start using PUB400 like a real AS/400 system by running SQL queries, testing data extraction, and preparing for integration.

NB: These credentials will be required later when configuring ODBC and connecting to Microsoft Fabric, so keep them safe!

Scenario: Extracting and Transforming Products Data from AS/400 for Integration

Before proceeding with the integration, let's discuss the product data stored in our AS/400 system. This table is crucial for our scenario, as it contains the inventory data that will be migrated to and used in D365 FinOps. It is the equivalent of the "Released products" data entity in D365 FinOps, which we will use later for the integration.

NB: For this demonstration, we will not use production tables; instead, we will create our own sample PRODUCTS table in AS/400 (PUB400). This allows us to showcase the integration process and migrate the data into Fabric and then into D365 FinOps.

We used SQL commands in the IBM i 5250 terminal or IBM i Access Client Solutions (ACS) to create and populate the table; a sample script is shown after the field list below.

Structure of the Product Table

The product data is stored in a DB2 database within the PUB400 emulator. The table, named "ReleasedProductsAS400", consists of the following fields:

  • ProductID – INTEGER – Unique identifier for each product
  • ProductName – VARCHAR(100) – Name of the product
  • Category – VARCHAR(50) – Product category (e.g., Electronics)
  • Price – DECIMAL – Product price in currency format
  • StockQuantity – INTEGER – Number of units available in stock
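
Inserting Sample Data

A minimal sketch of the DDL and INSERT statements we ran (the library name MYLIB is hypothetical; replace it with the library assigned to your PUB400 profile):

```sql
-- Create the sample table in your user library (MYLIB is a placeholder)
CREATE TABLE MYLIB.ReleasedProductsAS400 (
    ProductID     INTEGER        NOT NULL PRIMARY KEY,
    ProductName   VARCHAR(100),
    Category      VARCHAR(50),
    Price         DECIMAL(10, 2),
    StockQuantity INTEGER
);

-- Populate it with a few illustrative rows
INSERT INTO MYLIB.ReleasedProductsAS400
    (ProductID, ProductName, Category, Price, StockQuantity)
VALUES
    (1001, 'Wireless Mouse', 'Electronics',  24.99, 150),
    (1002, 'USB-C Cable',    'Electronics',   9.50, 400),
    (1003, 'Office Chair',   'Furniture',   129.00,  35);
```

Both statements can be run from the 5250 terminal (STRSQL) or from the ACS "Run SQL Scripts" window.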

Configuring the Integration: Cloud and Local Setup

To establish a smooth data integration between AS/400 and Microsoft Fabric, we will divide the setup into two key steps:

  • Local or on-premises side configuration
    • Microsoft On-Premises Data Gateway, the ODBC driver, and the necessary installations
  • Cloud-side configuration
    • Microsoft Fabric workspace, Dataflow Gen2, Fabric data pipeline, Lakehouse setup, Data Activator, and Copilot

Step 1: Local-Side Configuration (ODBC & Data Gateway Setup)

The AS/400 (PUB400) does not have native cloud connectivity. To integrate it with Microsoft Fabric, we use ODBC as a bridge to pull data into the Fabric Lakehouse.

Setting up the Connection:

Install IBM i Access ODBC Driver

Install and Configure Microsoft On-Premises Data Gateway

  • Download the Microsoft On-Premises Data Gateway from Microsoft's official site
  • Install it on the machine that has network access to the AS/400 system (here we are using the local emulator)
  • Log in with your Microsoft Fabric account and configure the gateway to enable secure communication between AS/400 (PUB400) and Microsoft Fabric

Configure ODBC Data Source (System DSN)

  • Open the ODBC Data Source Administrator
  • Add a new System DSN using the IBM i Access ODBC Driver
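
Once the DSN is created, a quick way to confirm that the driver and credentials work, before involving Fabric at all, is to run a test query from the ACS "Run SQL Scripts" window or any ODBC client. The library name below is the hypothetical one from our sample:

```sql
-- Sanity check through the new DSN: fetch a handful of rows
SELECT *
FROM MYLIB.ReleasedProductsAS400
FETCH FIRST 5 ROWS ONLY;
```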

Step 2: Cloud-Side Configuration (Microsoft Fabric Setup)

Now that we have established our IBM profile in PUB400 and prepared our local configuration, it's time to set up the cloud side of the integration in Microsoft Fabric. This step ensures that our data has a proper storage and processing pipeline before we integrate it into D365 FinOps.

Step 2.1: Create a Workspace in Microsoft Fabric

Before we can begin ingesting and processing our AS/400 data, we need a dedicated workspace in Microsoft Fabric to manage the dataflows, pipelines, and all the other items we will use for the data migration project.

  • Here I have created my "AS400Ingestion" workspace with all associated items.

The workspace will serve as our central hub for managing our data ingestion, transformation, and migration pipeline.

Step 2.2: Set Up a Lakehouse in Fabric

The Lakehouse plays a crucial role in our architecture as the staging area for AS/400 data before it is transformed and migrated to D365 FinOps. It acts as centralized storage where raw data from AS/400 is ingested, structured, and processed.

Data Ingestion: How Will We Get the Data?

Since PUB400 (AS/400) does not have native cloud connectivity, we will use ODBC to extract data into the Fabric Lakehouse.

Step 2.3: Use Dataflow Gen2 to Extract & Transform Your Data

To do this, we need to set up a data ingestion pipeline using Dataflow Gen2:

  • Configure a new Dataflow Gen2 to pull data from AS/400
  • Choose the source:
    • Select the ODBC connector from the list of suggested data sources in Fabric (you can search for it in the search pane or find it under the "Other" connections category)

The ODBC connection setup allows Microsoft Fabric to communicate with your IBM i system (PUB400) via ODBC. Here is a breakdown of each field and its purpose:

  • ODBC Connection String – The connection string that specifies the ODBC driver, system (server), and authentication details used to connect to the IBM server, e.g. driver={IBM i Access ODBC Driver};system=xxxxx;port=23
  • Connection Name – The name that identifies the connection inside Microsoft Fabric, e.g. AS400Connection
  • Data Gateway – The Microsoft On-Premises Data Gateway used to securely bridge the local ODBC connection with Microsoft Fabric; it is detected automatically inside Fabric once the gateway is installed on your system
  • Authentication Kind – Defines how Microsoft Fabric authenticates with your IBM system (PUB400/AS400); here we are using Basic authentication with the username and password created during the IBM profile creation process on PUB400.com
  • Username – The PUB400 user ID used to authenticate and access the AS/400 system; this is the username provided when registering on PUB400
  • Password – The corresponding password for the PUB400 user, set during the account creation process in PUB400
  • Privacy Level – Defines how Fabric handles the connection's security; options are None, Organizational, or Private

Once the ODBC connection is successfully set up, you will be able to see all the tables available in your IBM i system database. This allows you to:

  • Browse the list of tables and understand the available data structure
  • Execute SQL queries via the SQL analytics endpoint
  • Visualize your tables via model layouts
  • Create and build Power BI reports on top of your tables
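
For example, once the data has landed in the Lakehouse, the SQL analytics endpoint (which uses the T-SQL dialect) lets you profile the staged table. A quick sketch, assuming the table keeps its source name:

```sql
-- Profile the staged products by category (T-SQL dialect)
SELECT TOP (10)
       Category,
       COUNT(*)   AS ProductCount,
       AVG(Price) AS AvgPrice
FROM ReleasedProductsAS400
GROUP BY Category
ORDER BY ProductCount DESC;
```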

Configuring Dataflow Gen2 for Data Transformation

Once the AS/400 data has been extracted via ODBC and loaded, we use Dataflow Gen2 to apply the necessary transformations.

Data Transformations Applied in Dataflow Gen2

In our scenario, we applied the following transformations:

Adding custom fields: creating the custom fields required for our final destination (the equivalent logic is sketched in SQL after this list):

  • Custom Field 1: D365 FinOps product long name (a concatenation of ProductName + ProductDesc)
  • Custom Field 2: Valid (a calculated date field to track the product validation date)
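
We built these columns in the Dataflow Gen2 editor rather than in code, but the equivalent logic, sketched as a SQL query over the staged table, looks like this (ProductDesc is the description column the concatenation refers to; it is not part of the minimal sample schema shown earlier):

```sql
-- Equivalent of the two Dataflow Gen2 custom columns, sketched in SQL
SELECT p.*,
       p.ProductName || ' - ' || p.ProductDesc AS ProductLongName, -- Custom Field 1
       CURRENT_DATE                            AS Valid            -- Custom Field 2
FROM MYLIB.ReleasedProductsAS400 AS p;
```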

Validating and Publishing the Dataflow

Once you have finished all your transformations, the AS/400 data is cleaned, structured, and ready for the next step.

Verify the output in your Dataflow Gen2 before publishing it to the Lakehouse.

Step 2.4: Configuring the Data Pipeline for AS/400 to D365 FinOps Integration

Now that our data has been extracted and transformed, the next logical step is to automate and streamline its movement to ensure efficient integration with D365 FinOps.

What Is a Fabric Data Pipeline, and Why Do We Need It?

A data pipeline in Microsoft Fabric allows us to orchestrate, schedule, and automate the movement of data between different sources and destinations. In our case, we will use it to:

  • Automate the extraction of data from AS/400 via ODBC (the Copy data activity)
  • Ensure cleaned and transformed data is moved efficiently to D365 FinOps
  • Manage scheduled refreshes to keep D365 FinOps updated with the latest AS/400 product data

Step 3: Data Mapping Between AS/400 and D365 FinOps (Format Alignment)

When migrating data from an AS/400 system into D365 FinOps, careful data mapping and format alignment are crucial to ensure the legacy data fits the FinOps data structures. This involves mapping each source field in the IBM i database to the correct target data entity and field in FinOps, and transforming the data as needed. Some key considerations include:

Field Mapping:

Create a detailed mapping document that links AS/400 fields to FinOps entities. For example, an AS/400 customer master record must be broken out into the appropriate FinOps CustomerV3 entity fields (name, address, customer ID…). This mapping should cover all relevant tables and columns from the legacy system.
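
Applied to our products scenario, the mapping document boils down to a projection like the one below. The target column names are indicative of the Released products entity; confirm them against the staging fields of your DMF import project:

```sql
-- Illustrative source-to-target projection for the Released products entity
-- (target names are assumptions; verify them in your DMF entity mapping)
SELECT CAST(ProductID AS VARCHAR(20)) AS ITEMNUMBER,
       ProductName                    AS PRODUCTNAME,
       ProductName                    AS SEARCHNAME
FROM MYLIB.ReleasedProductsAS400;
```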

Data Type and Format Conversion:

Ensure all data is converted to the formats FinOps expects: date fields (AS/400 often stores dates in formats that need conversion to standard dates), numeric fields, and text encoding.
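
Our sample table does not contain one, but a very common AS/400 pattern is a date stored as an 8-digit numeric column (YYYYMMDD). A sketch of the conversion in Db2 for i SQL, with hypothetical table and column names:

```sql
-- Turn a DECIMAL(8,0) YYYYMMDD value into a real DATE
SELECT DATE(
         SUBSTR(DIGITS(LEGACYDATE), 1, 4) || '-' ||
         SUBSTR(DIGITS(LEGACYDATE), 5, 2) || '-' ||
         SUBSTR(DIGITS(LEGACYDATE), 7, 2)
       ) AS ConvertedDate
FROM MYLIB.SOMELEGACYTABLE; -- hypothetical legacy table
```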

Code & Value Translations:

Map legacy codes or enumerations to their FinOps equivalents. For instance, an AS/400 system might represent order status codes differently than D365 FinOps, for example "Open transactions" or "Invoices" statuses such as "Open/Closed/…". These values must be translated to the corresponding FinOps options during migration.
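
A CASE expression is usually enough for this kind of translation. The codes and target labels below are illustrative only:

```sql
-- Map legacy single-character status codes to the values FinOps expects
SELECT OrderID,
       CASE OrderStatus
            WHEN 'O' THEN 'Open'
            WHEN 'C' THEN 'Closed'
            WHEN 'H' THEN 'On hold'
            ELSE 'Unknown' -- surface unmapped codes for review
       END AS FinOpsOrderStatus
FROM MYLIB.ORDERS; -- hypothetical legacy table
```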

Validations and Testing:

The standard DMF validation mechanism in FinOps will catch format mismatches or missing required values during import. Always test before importing high volumes of data, to ensure that the mapping and transformations produce correctly formatted data in FinOps. Verifying data quality post-import is also essential to confirm that no data was lost during conversion.
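
Two cheap pre-import checks that catch the most common DMF rejections up front, run against the staged products table (names as in our sample):

```sql
-- 1) Rows missing a required value that DMF would reject
SELECT COUNT(*) AS MissingProductNames
FROM MYLIB.ReleasedProductsAS400
WHERE ProductName IS NULL OR ProductName = '';

-- 2) Duplicate keys that would collide during the import
SELECT ProductID, COUNT(*) AS Occurrences
FROM MYLIB.ReleasedProductsAS400
GROUP BY ProductID
HAVING COUNT(*) > 1;
```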

Step 4: Data Integration Options into D365 FinOps (by Volume & Frequency)

NB: In this article we will not cover every possible scenario; we will only highlight effective options you can consider based on common integration requirements.

When integrating AS/400 data into Dynamics 365 FinOps through Microsoft Fabric, choosing the right integration pattern for your specific data volume and update frequency is critical. Here's an overview of each method to guide the decision.

For Export Scenarios:

Option 1: Dynamics 365 FinOps – Dataverse Link to Microsoft Fabric

In this article we will not cover how to connect an F&O Dataverse environment with Microsoft Fabric, but for export scenarios, Microsoft's Synapse Link / Fabric Link to Dataverse provides an advanced way to expose D365 FinOps data. This integration is fundamentally one-way: it brings your data into Azure OneLake for analytics and reporting.

For Import Scenarios:

This is the case for today's architecture.

Option 1: Fabric Data Pipeline to DMF (Batch Imports for Larger Volumes)

NB: For scenarios that require near real-time data transfer and lower-volume integration, FinOps exposes OData REST APIs for all public data entities, which can be called from a Fabric data pipeline.

Fabric Data Pipelines are essentially Azure Data Factory pipelines hosted in the Fabric environment. They allow complex ETL workflows, multi-step pipelines, and integration with different sources.

For medium to large data volumes on a regular schedule, an automated pipeline feeding into DMF is recommended. In this pattern, data files or data packages are created and pushed from Microsoft Fabric via a data pipeline.

Before creating this pipeline, you can add a Copy data activity to take your transformed and cleaned data from the Lakehouse and push it into an Azure Data Lake Storage Gen2 account for traceability reasons.

In the Lookup activity, you can iterate over the records from your products file and then push everything into D365 FinOps via a DMF import project with a recurring integration job activated.
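
The query behind the Lookup activity is plain SQL against the Lakehouse's SQL analytics endpoint. A sketch, assuming a cleaned staging table and a hypothetical flag column marking rows that still need to be exported:

```sql
-- Rows the pipeline still needs to push to D365 FinOps (T-SQL dialect)
SELECT ITEMNUMBER, PRODUCTNAME, SEARCHNAME
FROM CleanedReleasedProducts  -- hypothetical Lakehouse table
WHERE ExportedToFinOps = 0;   -- hypothetical "not yet exported" flag
```

Keep in mind that the Lookup activity is designed for modest result sets; for large volumes, prefer the file/data-package route described above.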

Option 2: Fabric-Dataverse Connector to Copy Lakehouse Data to D365 FinOps Dataverse Tables/Entities

Another Fabric pipeline option is to leverage the Dataverse connector in Fabric, used in pipelines via the Copy data activity, to write to F&O Dataverse tables.

Option 3: Azure Logic Apps As an Orchestrator

Azure Logic Apps can be used as an orchestrator to push the transformed and cleaned data files from your ADLS account into D365 FinOps.

See the following video to learn more about data integration in D365 FinOps with Azure Logic Apps.

Step 5: Enhancing Visibility with Fabric Data Activator

Staying informed about data changes matters: real-time notifications for new or updated records can improve data accuracy and increase operational efficiency.

To efficiently monitor and react to data changes, Microsoft Fabric provides a built-in feature: Data Activator.

Data Activator is an event-driven service included with Microsoft Fabric that monitors your data and enables real-time actions and notifications.

It can instantly trigger predefined actions (such as email notifications, alerts, or even automated workflows) based on configured conditions.

In our case, we used Data Activator to:

  • Monitor product data stored in the Lakehouse by activating six event alerts
  • Automatically detect new product entries or changes in the Lakehouse
  • Immediately notify relevant stakeholders through email or Microsoft Teams
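
Data Activator conditions are configured in its UI rather than written as code, but conceptually one of these alerts evaluates something like the following on each refresh (table and column names are hypothetical):

```sql
-- Conceptual equivalent of a "new products arrived" alert (T-SQL dialect)
SELECT COUNT(*) AS NewProducts
FROM CleanedReleasedProducts
WHERE LoadedDateTime > DATEADD(hour, -1, CURRENT_TIMESTAMP);
```

When the count rises above zero, the alert fires and notifies the configured recipients.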

Step 6: Role and Usage of AI & Copilot in Microsoft Fabric

Copilot in Microsoft Fabric, as in other Microsoft products, is enabled by default and can be disabled from the admin portal. According to Microsoft, a number of requirements must be met to use Copilot; see the link below for details.

For more details about Copilot features included in Microsoft Fabric, you can visit Microsoft Copilot Page

Conclusion

The power of Microsoft Fabric and AI-driven automation is revolutionizing data migration and integration. By combining intelligent pipelines, Copilot-powered insights, and real-time tracking, this article showed how Microsoft Fabric and Copilot can bridge the gap between these two systems and ensure a seamless transition from AS/400 to D365 FinOps with greater accuracy, efficiency, and minimal manual effort.

Thank you.

Original Post https://dynvision365.com/2025/03/11/unlocking-ibm-i-as-400-data-potential-with-microsoft-fabric-and-d365-finops/
