Dynamics 365 Business Central: the problem of large data deletion.

In the last two months I was invited to speak at two partner events, and during those events I had the chance to talk with many partners about their open issues on Dynamics 365 Business Central SaaS projects.

Some of them raised the same question: we have some companion tables (like tables used for integrations, logs, data archiving, etc.) that are huge in size. How can we delete the records in those tables? Unfortunately, doing that in AL seems not to be possible (the deletion process runs for hours and never finishes). Is this normal?

(P.S. some partners shared their numbers with me, and I saw tables of > 50 GB in size that had to be deleted). Some examples I saw recently:

  • Change Log Entry table with > 60 million records.
  • Custom integration tables with > 50 GB of space and > 90 million records.
  • Document Attachment table with > 30 GB of space (and deleting a single entry here takes many minutes).

Unfortunately, there’s no single “DO THIS” answer to that question. Data deletion in Dynamics 365 Business Central has never been one of the most performant operations you can do, right from the beginning of its history, and the same was true in the Dynamics NAV era.

But when doing large data deletions, there’s a big difference between Dynamics 365 Business Central SaaS and on-premises…

Large data deletion on Dynamics 365 Business Central SaaS.

In Dynamics 365 Business Central SaaS you have no access to the database, so you cannot perform operations at the database (T-SQL) level. The only way to perform a data deletion of a large table (obviously, when force-uninstalling an extension is not an option) is to use an AL procedure (possibly scheduled).

When doing a data deletion in AL, you can use YourTable.DeleteAll() or you can loop through the records and delete them. But rest assured that this will take ages for very large tables, even if you avoid running the triggers… 🐌🐌

If you want to do data deletion in the AL language for a giant table, start by deleting rows in smaller batches to avoid locking the table for too long and to manage transaction log growth (a minimal AL sketch follows the list below). This means that you should:

  1. Identify a condition or key to delete rows incrementally.
  2. Use a loop to delete in batches (for example 10,000 rows at a time).
  3. Add delays or checkpoints to allow the transaction log to clear and reduce resource contention.
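
As a reference, here is a minimal AL sketch of this batched approach (“INT Integration Log” is a hypothetical companion table, and the date filter and batch size are just example values to adapt to your scenario):

codeunit 50100 "INT Batch Delete Log"
{
    // Minimal sketch: delete records older than one year in batches,
    // committing after each batch so locks are released and the
    // transaction stays small. It can be scheduled via the Job Queue.
    trigger OnRun()
    var
        IntegrationLog: Record "INT Integration Log";
        BatchSize: Integer;
        Counter: Integer;
    begin
        BatchSize := 10000;
        IntegrationLog.SetFilter(SystemCreatedAt, '<%1', CreateDateTime(CalcDate('<-1Y>', Today()), 0T));
        if IntegrationLog.FindSet() then
            repeat
                IntegrationLog.Delete(false); // false = do not run the OnDelete trigger
                Counter += 1;
                if Counter mod BatchSize = 0 then
                    Commit(); // checkpoint after each batch
            until IntegrationLog.Next() = 0;
    end;
}

Running this codeunit from a recurring Job Queue entry also gives you natural checkpoints: each run deletes what it can, and the next scheduled run continues from where the previous one left off.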

In this way you have more control over the deletion process and, under the hood (Azure SQL), more control over resource usage (CPU, I/O, log space).

If even with this approach your data deletion process never finishes, you will be forced to go through Microsoft support.

What is missing here?

With Dynamics 365 Business Central SaaS, we should have a way to perform the following table operations:

  1. TRUNCATE TABLE: removes all rows from the table instantly without logging individual row deletions, making it extremely fast and efficient.
  2. DROP TABLE: completely removes the table and its data from the database.

or at least number 1 on this list 😉

Having these SQL operations available in the AL language could help a lot in handling massive data deletions, even if they were provided with some limitations (if a table has SIFT indexes, using TRUNCATE could be a problem, but probably only 1% of the tables where you need to perform a massive deletion fall into this scenario).

I think that having at least the TRUNCATE option is becoming a must nowadays (especially when SaaS databases grow a lot after years of usage), so I hope that the product team will start thinking about providing an option to execute a TRUNCATE operation on a Business Central table.

What is mandatory to do in a Business Central SaaS project?

If any of you were at the session I did with Duilio at the latest Directions EMEA about large cloud customers, you probably remember a rule that I shared with attendees and that I want to relaunch here (engrave it in stone):

A planned Data Retention Strategy should be MANDATORY.

The tasks that a good Dynamics 365 Business Central project manager should always remember to include in the project planning phase are the following:

  1. Identify and categorize your data: identify the various types of data within your BC system and categorize this data based on its purpose and usage (log and temporary data, historical data, operational data, audit and compliance data).
  2. Determine the retention requirements (how long the data should be retained).
  3. Create retention policies: define the procedures for safely deleting or archiving data once it reaches the end of its retention period.

Also tattoo this in your mind: enable retention policies at the beginning of the project, during the implementation phase. Doing it later can lead to the previously discussed problems. Remember to calculate each table’s growth rate (e.g. X GB/year or /month) by monitoring storage and the Table Information page at different times. And remember that custom tables must be explicitly registered as allowed for retention policies, as in the sketch below.
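
As a reference, this is roughly how an extension can register a custom table for retention policies in AL (a minimal sketch based on the System Application’s “Reten. Pol. Allowed Tables” codeunit; “INT Integration Log” is the same hypothetical table used above):

codeunit 50101 "INT Reten. Pol. Install"
{
    Subtype = Install;

    trigger OnInstallAppPerDatabase()
    var
        RetenPolAllowedTables: Codeunit "Reten. Pol. Allowed Tables";
        IntegrationLog: Record "INT Integration Log";
    begin
        // Register the custom table so that a retention policy can be defined
        // for it, using SystemCreatedAt as the default date field.
        RetenPolAllowedTables.AddAllowedTable(Database::"INT Integration Log", IntegrationLog.FieldNo(SystemCreatedAt));
    end;
}

After that, the retention period and the policy itself are defined (by you or by the customer) in the Retention Policies page.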

Large data deletion on Dynamics 365 Business Central on-premises.

If you are on Dynamics 365 Business Central on-premises, you have full access to the database (SQL Server or Azure SQL) and in this case things change a lot.

For performing a very large data deletion in an on-premises environment, I don’t suggest going via the AL language. Deleting data from a massive table via AL code takes ages in an on-premises environment too, and in this case the best way is to go directly via SQL.

Deleting a very large table in SQL Server or Azure SQL Database requires careful consideration to minimize performance impact, avoid long-running transactions and manage resource usage effectively.

Here are the options/techniques that I recommend for a massive data deletion (in my personal order of priority).

🥇Option 1: TRUNCATE TABLE

The TRUNCATE TABLE operation (as said before) removes all rows from the table instantly without logging individual row deletions, making it extremely fast and efficient. It’s probably the best option to choose when you need to delete all data in the table and don’t need to keep any of it.

To use it, simply execute the following T-SQL:

TRUNCATE TABLE [TableName];
  • Advantages:
    • Minimal logging (only metadata changes are logged).
    • Resets identity columns (if any).
    • Very fast, even for large tables.
  • Limitations:
    • Cannot be used if the table is referenced by foreign keys (but this is not a problem in the Business Central database).
    • Deletes all rows (no selective deletion).
    • Contrary to a common misconception, it can be rolled back if executed inside an explicit transaction, but it’s all-or-nothing (no selective deletion to undo).

If you can use TRUNCATE, this is absolutely the best option for speed and efficiency.

🥈Option 2: DROP TABLE

The DROP TABLE operation completely removes the table and its data from the database. This is a good option to use when you don’t need the table structure and its data anymore.

To use it, simply execute the following T-SQL:

DROP TABLE [TableName];
  • Advantages:
    • Instantaneous for removing the table and all its data.
    • Frees up all associated storage (including indexes).
  • Limitations:
    • Cannot be used if you need to retain the table structure or some data.
    • Foreign key constraints must be dropped first if they reference the table.

If you’re decommissioning the table entirely, this is probably the simplest and fastest approach.

🥉Option 3: Batch Deletion

With this option you delete rows in smaller batches to avoid locking the table for too long and to manage transaction log growth. This is a useful technique when you need to delete specific rows (e.g., based on a condition) or can’t use TRUNCATE due to foreign keys or other constraints.

To do that:

  1. Identify a condition or key to delete rows incrementally.
  2. Use a loop to delete in batches (for example 10,000 rows at a time).
  3. Add delays or checkpoints to allow the transaction log to clear and reduce resource contention.

This is an example of a T-SQL script that performs this technique:

WHILE EXISTS (SELECT 1 FROM [TableName] WHERE [Condition])
BEGIN
    DELETE TOP (10000) FROM [TableName]
    WHERE [Condition];
    WAITFOR DELAY '00:00:01'; -- Optional: this is just to reduce resource contention
END;
  • Advantages:
    1. Controls resource usage (CPU, I/O, log space).
    2. Works with foreign keys if constraints are handled.
    3. Allows selective deletion.
  • Limitations:
    1. Slower than TRUNCATE or DROP.
    2. Generates more transaction log activity (can be mitigated with smaller batches).

🪵Option 4: Partition Switching

This is an advanced technique that, honestly, I have used only a few times in my life, but in some scenarios with really huge tables it can help a lot, so it’s worth mentioning. With this technique, you use table partitioning to “switch out” data to a staging table, and then drop the staging table. The requirement for using this technique is that the table must be partitioned (so you have previously applied table partitioning on SQL).

The steps to perform in this case are the following:

  1. Ensure the table is partitioned.
  2. Create a staging table with the same structure as the source table.
  3. Switch the partition to the staging table:
    ALTER TABLE [TableName] SWITCH PARTITION 1 TO [StagingTable];
  4. Drop the staging table:
    DROP TABLE [StagingTable];
  • Advantages:
    1. Extremely fast (metadata operation, similar to TRUNCATE).
    2. Minimal logging.
  • Limitations:
    1. Requires the table to be partitioned beforehand.
    2. More complex setup and maintenance.

Best practices to remember:

Before starting a massive data deletion process in your Dynamics 365 Business Central on-premises database, please remember the following aspects:

  • Monitor Log Space: large deletes can fill the transaction log. Ensure the database is in the Simple recovery model (if possible) or back up the log during batch operations:
    BACKUP LOG [DatabaseName] TO DISK = 'NUL';
    (note that backing up to NUL discards the backup and breaks the log backup chain, so use it only if you don’t rely on point-in-time restore).
  • Test First: if possible, run the data deletion operation in a non-production environment to estimate time and resource impact.
  • Schedule Off-Peak: perform data deletions during low-usage periods to minimize user impact.

Conclusion

Yes, massive data deletion can be a problem on SaaS, and I really hope to have a TRUNCATE operation available in the cloud environment for AL developers in the near future. Dear BC Team, please think about that… 🙏

While we wait for the gift from the product team, the recommendation I can give you is TO PREVENT.

Avoid tables that grow out of proportion, apply data compression where possible, periodically delete data from log tables, integration tables and archive tables, or move the data from these tables externally (Blob Storage, Azure SQL Database or other). Remember to apply retention policies to those tables AT THE BEGINNING of the project (this is an important project phase not to forget).

For desperate cases, pray to the MS support team…

Original Post https://demiliani.com/2025/03/25/dynamics-365-business-central-the-problem-of-large-data-deletion/
