In the last two months I was invited to speak at two partner events, and during those events I had the chance to talk with many partners about their open issues on Dynamics 365 Business Central SaaS projects.
Several of them raised the same question: we have some companion tables (like tables used for integrations, logs, data archiving, etc.) that are huge in size. How can we delete the records in those tables? Unfortunately, doing that in AL seems impossible (the deletion process runs for hours and never finishes). Is this normal?
(P.S. some partners shared their numbers with me, and I recently saw tables larger than 50 GB that needed to be deleted.)
Unfortunately, there’s no single “DO THIS” answer to that question. Data deletion has never been one of the most performant operations in Dynamics 365 Business Central, and the same was true in the Dynamics NAV era.
But when doing large data deletions, there’s a big difference between Dynamics 365 Business Central SaaS and on-premises…
In Dynamics 365 Business Central SaaS you have no access to the database, so you cannot perform operations at the database (T-SQL) level. The only way to delete data from a large table (obviously, when force-uninstalling an extension is not an option) is to use an AL procedure (possibly scheduled).
When doing a data deletion in AL, you can use YourTable.DeleteAll() or you can loop through records and delete them. But be sure that this will take ages for very large tables, even if you avoid running triggers (e.g. DeleteAll(false))…
If you need to delete data from a giant table in AL, delete rows in smaller batches to avoid locking the table for too long and to manage transaction log growth. This means that you should:
- filter the set of records to delete and process it in chunks of a fixed size (e.g. a few thousand records per batch);
- commit after each batch, so that every deletion runs in its own short transaction;
- schedule the procedure (e.g. via the job queue) so that it runs in the background, outside working hours.
This way you have more control over the deletion process and, under the hood (Azure SQL), more control over resource usage (CPU, I/O, log space).
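As a reference, here is a minimal AL sketch of this batching pattern (the “My Log Entry” table, the one-year retention filter, and the batch size are hypothetical placeholders to adapt to your scenario):

local procedure DeleteOldLogEntriesInBatches()
var
    MyLogEntry: Record "My Log Entry"; // hypothetical companion table
    Counter: Integer;
begin
    // Hypothetical retention filter: delete everything older than one year
    MyLogEntry.SetFilter(SystemCreatedAt, '<%1', CreateDateTime(CalcDate('<-1Y>', Today()), 0T));
    if MyLogEntry.FindSet() then
        repeat
            MyLogEntry.Delete(false); // false = don't run the OnDelete trigger
            Counter += 1;
            if Counter mod 10000 = 0 then
                Commit(); // close the transaction after each batch: shorter locks, less log pressure
        until MyLogEntry.Next() = 0;
end;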
If even with this approach your data deletion process never finishes, you will be forced to go through Microsoft support.
With Dynamics 365 Business Central SaaS, we should have a way to perform the following table operations directly from AL:
1. TRUNCATE TABLE (instantly remove all rows from a table);
2. DROP TABLE (remove an obsolete table entirely);
or at least number 1 on this list.
Having these SQL operations available in the AL language would help a lot in handling massive data deletions, even if they came with some limitations (if a table has SIFT indexes, using TRUNCATE could be a problem, but probably only 1% of the tables where you need to perform a massive deletion fall into this scenario).
I think that having at least the TRUNCATE option is becoming a must nowadays (especially since SaaS databases grow a lot after years of usage), so I hope that the product team will start thinking about providing an option to execute a TRUNCATE operation on a Business Central table.
If any of you attended the session I did with Duilio at the latest Directions EMEA about large cloud customers, you probably remember a rule that I shared with attendees and that I want to relaunch here (engrave it in stone):
A planned Data Retention Strategy should be MANDATORY.
The tasks that a good Dynamics 365 Business Central project manager should always remember to include in the project planning phase are the following:
- identify the tables that will grow over time (logs, integration tables, archive tables, etc.);
- define and enable retention policies for those tables during the implementation phase;
- monitor table growth over time and plan data archiving (e.g. moving old data to external storage) when needed.
Also tattoo this in your mind: enable retention policies at the beginning of the project, during the implementation phase. Doing it later can lead to the problems discussed above. Remember to calculate each table’s growth rate (e.g. X GB/year or GB/month) by monitoring storage and the Table Information page at different times.
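As a reference, this is a minimal sketch of how to register a custom table for retention policies from an install codeunit (the codeunit ID and the “My Log Entry” table are hypothetical; the “Reten. Pol. Allowed Tables” codeunit comes from the System Application):

codeunit 50100 "My Retention Setup"
{
    Subtype = Install;

    trigger OnInstallAppPerCompany()
    var
        RetenPolAllowedTables: Codeunit "Reten. Pol. Allowed Tables";
    begin
        // Register the (hypothetical) companion table as an allowed table,
        // so that a retention policy can be defined for it on the
        // Retention Policy Setup page
        RetenPolAllowedTables.AddAllowedTable(Database::"My Log Entry");
    end;
}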
If you are on Dynamics 365 Business Central on-premises, you have full access to the database (SQL Server or Azure SQL) and in this case things change a lot.
For performing a very large data deletion in an on-premises environment, I don’t suggest going via the AL language. Deleting data from a massive table via AL code takes ages in an on-premises environment too, and in this case the best way is to go directly via SQL.
Deleting a very large table in SQL Server or Azure SQL Database requires careful consideration to minimize performance impact, avoid long-running transactions, and manage resource usage effectively.
Here are the possible options/techniques that I recommend for a massive data deletion (in order of my personal priority).
Option 1: TRUNCATE TABLE
The TRUNCATE TABLE operation (as said before) removes all rows from the table instantly without logging individual row deletions, making it extremely fast and efficient. It’s probably the best option to choose when you need to delete all data in the table and don’t need to keep any of it.
To use it, simply execute the following T-SQL:
TRUNCATE TABLE [TableName];
If you can use TRUNCATE, this is absolutely the best option for speed and efficiency.
Option 2: DROP TABLE
The DROP TABLE operation completely removes the table and its data from the database. This is a good option to use when you no longer need the table structure or the data.
To use it, simply execute the following T-SQL:
DROP TABLE [TableName];
If you’re decommissioning the table entirely, this is probably the simplest and fastest approach.
Option 3: Batch Deletion
With this option you delete rows in smaller batches to avoid locking the table for too long and to manage transaction log growth. This is a useful technique when you need to delete specific rows (e.g., based on a condition) or can’t use TRUNCATE due to foreign keys or other constraints.
To do that:
- delete rows in fixed-size batches (e.g. DELETE TOP (N)) inside a loop;
- repeat while rows matching the condition still exist;
- optionally wait a short delay between batches to reduce resource contention.
This is an example of a T-SQL script that performs this technique:
WHILE EXISTS (SELECT 1 FROM [TableName] WHERE [Condition])
BEGIN
    DELETE TOP (10000) FROM [TableName]
    WHERE [Condition];

    WAITFOR DELAY '00:00:01'; -- Optional: this is just to reduce resource contention
END;
Option 4: Partition Switching
This is an advanced technique that I have honestly used only a few times in my life, but in some scenarios with really huge tables it can help a lot, so it’s worth mentioning. With this technique, you use table partitioning to “switch out” data to a staging table, and then drop the staging table. The requirement is that the table must already be partitioned (you have previously applied table partitioning on SQL).
The steps to perform in this case are the following (see the T-SQL sketch after this list):
1. Create an empty staging table with the same columns, indexes, and constraints as the partitioned table, on the same filegroup as the partition to remove.
2. Switch the partition containing the data to delete out to the staging table (a metadata-only operation, so it’s nearly instant).
3. Truncate or drop the staging table.
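A minimal T-SQL sketch of the technique (the table names and the partition number are hypothetical):

-- Assumptions: [TableName] is already partitioned, partition 2 holds the rows
-- to delete, and [TableName_Staging] is an empty table with the same columns,
-- indexes and constraints, created on the same filegroup as partition 2

-- Switch the partition out: a metadata-only operation, nearly instant
ALTER TABLE [TableName] SWITCH PARTITION 2 TO [TableName_Staging];

-- Then remove the switched-out data
DROP TABLE [TableName_Staging];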
Best practices to remember: test the switch on a copy of the database first, and make sure the staging table exactly matches the structure and filegroup of the partition you’re switching out, otherwise the SWITCH operation will fail.
Before starting a massive data deletion process in your Dynamics 365 Business Central on-premises database, please remember the following aspects:
- always take a full database backup (and verify it) before deleting anything;
- run the process outside working hours, ideally with the service tier stopped;
- monitor transaction log growth during the process (in full recovery model, take frequent log backups);
- keep in mind that deleting directly via SQL bypasses all AL business logic and triggers, so be sure no application logic depends on those records.
Yes, massive data deletion can be a problem on SaaS, and I really hope to have a TRUNCATE operation available in the cloud environment for AL developers in the near future. Dear BC Team, please think about it…
While we wait for the gift from the product team, the recommendation I can give you is TO PREVENT.
Avoid tables that grow out of proportion: apply data compression where possible, periodically delete data from log tables, integration tables, and archive tables, or move their data externally (Blob Storage, Azure SQL Database, or other). Remember to apply retention policies to those tables AT THE BEGINNING of the project (an important project phase not to forget).
For desperate cases, pray to the Microsoft support team…
Original Post https://demiliani.com/2025/03/25/dynamics-365-business-central-the-problem-of-large-data-deletion/