When working with Dynamics 365 Business Central (and, generally speaking, on a lot of other serverless projects) it’s quite common to need to handle and process blob files (JSON, XML, EDI, CSV, etc.) for integration scenarios.
In a modern cloud solution (serverless to the max) this integration requirement is often handled by using low-code services like Azure Logic Apps.
The Standard plan in Azure Logic Apps has a predictable, flat-rate pricing structure: a fixed fee for the hosting plan, which can accommodate multiple Logic App instances irrespective of how often they run or how many actions they perform. It’s nowadays the modern low-code platform to use in Azure for integration tasks (and I will never stop saying “no Power Automate for those tasks“).
The Azure Logic Apps platform has a Blob Storage connector that easily permits you to read and process blob files in a fully low-code way. The platform also permits you to handle scalability and reliability of your workflows, so it’s a recommended first approach to follow.
Let’s check this piece of Logic Apps workflow:
The workflow reads two blob files from an Azure Blob Storage container using the Azure Blob Storage connector, for subsequent parsing of those files in the next actions (not displayed here).
In Azure Logic Apps, connectors are available in either a built-in version, managed version, or both:
All built-in connectors run natively on the Azure Logic Apps runtime.
Managed connectors are instead deployed, hosted, and managed in Azure by Microsoft. They mostly provide a proxy or a wrapper around the API that the underlying service or system uses to communicate with Azure Logic Apps.
Usually the built-in version of a connector provides better performance than its managed counterpart, which is why built-in connectors are proposed as the default.
If you monitor the Azure Logic Apps workflow defined above for CPU and memory usage, you will see that when the files are small, CPU and (especially) memory usage are low. But if you start loading big files, memory usage increases a lot:
This is because the built-in connectors load all the information into memory by design.
Now let’s imagine using Azure Logic Apps Standard on the WS1 plan (1 vCPU and 3.5 GB of memory, the recommended starting point) to host your production workloads, and loading two blob files of 500 MB each.
Loading those two files consumes approximately 500 MB * 2 = 1 GB of memory. Plus you need to consider the memory required by default to host and run the workflow runtime (usually around 300 MB), which means that your workflow execution takes around 1.5 GB of memory.
And what happens if you have other actions using the file, or maybe other files to load (concurrent executions)? You can get an out-of-memory exception!
In an Azure Logic Apps workflow the memory is released after the workflow run completes, but when you have multiple executions and different actions running at the same time, this problem can occur.
The first response is: can you ask your integration partner to provide smaller files?
Seriously speaking, I think this is not an option…
To avoid out-of-memory exceptions when processing very large files, you need to check different aspects. First of all, activate monitoring of your workflows and check the workflow concurrency, in order to discover the memory usage of your runs.
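If you want to pull those numbers programmatically instead of (or in addition to) the Azure Portal metrics blade, here is a minimal sketch using the azure-monitor-query Python SDK. The resource ID is a placeholder, and I’m assuming the MemoryWorkingSet metric of the underlying Microsoft.Web/sites resource is a reasonable proxy for the memory used by your runs:

```python
# Sketch: read the hourly peak memory working set of a Logic Apps Standard app
# (a Microsoft.Web/sites resource) over the last 24 hours.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricAggregationType, MetricsQueryClient

# Placeholder resource ID of the Logic Apps Standard app.
resource_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Web/sites/<logic-app-name>"
)

client = MetricsQueryClient(DefaultAzureCredential())

response = client.query_resource(
    resource_id,
    metric_names=["MemoryWorkingSet"],            # memory used by the app, in bytes
    timespan=timedelta(days=1),                   # last 24 hours
    granularity=timedelta(hours=1),               # one data point per hour
    aggregations=[MetricAggregationType.MAXIMUM], # take the peak per hour
)

# Print the hourly peaks in MB, to spot runs getting close to the plan limit
# (3.5 GB on WS1).
for metric in response.metrics:
    for series in metric.timeseries:
        for point in series.data:
            if point.maximum is not None:
                print(f"{point.timestamp}: {point.maximum / (1024 * 1024):.0f} MB")
```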
As a second option (usually not a problem, but I’ve recently discovered that it can be an issue on workflows created a long time ago), check the platform settings of your Logic Apps Standard instance. Your Logic Apps instance needs to run on a 64-bit platform (which allows higher memory usage); old workflows could still be running on a 32-bit platform, so please change this setting:
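You can change this directly from the Azure Portal (the Platform option in the app’s Configuration blade), but if you prefer to script the check and the fix, here is a minimal sketch with the azure-mgmt-web Python SDK, under the assumption that a Logic Apps Standard app can be managed like any other Microsoft.Web/sites resource (subscription, resource group and app names are placeholders):

```python
# Sketch: check the worker platform of a Logic Apps Standard app and switch it
# to 64 bit if it is still running on 32 bit.
from azure.identity import DefaultAzureCredential
from azure.mgmt.web import WebSiteManagementClient

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
app_name = "<logic-app-name>"

client = WebSiteManagementClient(DefaultAzureCredential(), subscription_id)

# Read the current site configuration.
config = client.web_apps.get_configuration(resource_group, app_name)
print(f"Running on 32 bit: {config.use32_bit_worker_process}")

# Switch to the 64-bit platform (the app is restarted when the config changes).
if config.use32_bit_worker_process:
    config.use32_bit_worker_process = False
    client.web_apps.update_configuration(resource_group, app_name, config)
    print("Platform switched to 64 bit.")
```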
A third option that you need to evaluate when working with very large files in Logic Apps workflows is to delegate the file processing to a function.
An Azure Logic Apps Standard workflow can invoke an Azure Function:
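To give an idea of what such a function could look like, here is a minimal sketch of an HTTP-triggered Azure Function (Python v2 programming model) that the workflow can call with the container and blob names; it processes the blob in chunks with the azure-storage-blob SDK instead of loading it entirely into memory. The parameter names, the app setting and the line-counting logic are placeholders for your real parsing:

```python
# Sketch: HTTP-triggered Azure Function that a Logic Apps workflow can call.
# The blob is streamed in chunks, so the whole file never sits in memory.
import json
import os

import azure.functions as func
from azure.storage.blob import BlobServiceClient

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)


@app.route(route="process-blob", methods=["POST"])
def process_blob(req: func.HttpRequest) -> func.HttpResponse:
    body = req.get_json()
    container_name = body["container"]   # e.g. "inbound-files"
    blob_name = body["blob"]             # e.g. "orders-2025-06.csv"

    service = BlobServiceClient.from_connection_string(
        os.environ["STORAGE_CONNECTION_STRING"]  # placeholder app setting
    )
    blob_client = service.get_blob_client(container_name, blob_name)

    # download_blob() returns a StorageStreamDownloader: iterate its chunks
    # instead of calling readall(), which would load the whole blob in memory.
    downloader = blob_client.download_blob()

    total_bytes = 0
    line_count = 0
    for chunk in downloader.chunks():
        total_bytes += len(chunk)
        line_count += chunk.count(b"\n")  # placeholder "processing" step

    result = {"blob": blob_name, "bytes": total_bytes, "lines": line_count}
    return func.HttpResponse(json.dumps(result), mimetype="application/json")
```

The workflow only passes the container and blob names to the function and gets back a small JSON result, so the heavy payload never flows through the workflow run.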
or it can also invoke a function defined in code inside the workflow itself (if any of you attended the “High performance low-code workflows” session I did two years ago at Directions EMEA, or the other Azure conferences and events I’ve done last year around this topic, like WPC Conference in Milan this year, you know what I mean):
These techniques will permit you to efficiently handle large files and they really help improve workflow performance (see here for a real-world scenario successfully solved using those techniques). Yes, you need to write a bit of code, but the results will be a game changer.
Original Post https://demiliani.com/2025/06/25/azure-logic-apps-are-you-handling-large-blobs-keep-memory-usage-under-control/