It is often assumed that a one-time cleanup, for example as part of a migration, is sufficient. Data, however, changes continuously. Take the address field as an example: through street renamings, municipal incorporations, and similar events, an address changes over time. If the data is cleaned only once, at great expense and effort, and no ongoing data quality mechanisms are put in place, the same problems will reappear within a few years because the data becomes outdated again. The result is, once more, significant effort combined with high costs.
Permanent safeguard mechanisms are useful, for example, to validate data directly at the point of entry. It also makes sense to integrate such checks when transferring data between systems. Ideally, regular data updates are introduced to track changes and to reduce costs and effort. The system then remains "fit for use," and the data stays suitable for the processes that depend on it.
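As a minimal sketch of validation at the point of entry, the following checks an address record before it is accepted. The field names and the postal-code rule are illustrative assumptions, not a real schema:

```python
import re

def validate_address(record: dict) -> list[str]:
    """Return a list of quality problems found in an address record.

    Field names ("street", "postal_code", "city") and the five-digit
    postal-code rule are hypothetical, chosen for illustration only.
    """
    errors = []
    # Required fields must be present and non-empty.
    for field in ("street", "postal_code", "city"):
        if not record.get(field, "").strip():
            errors.append(f"missing field: {field}")
    # Illustrative rule: postal codes consist of exactly five digits.
    postal = record.get("postal_code", "")
    if postal and not re.fullmatch(r"\d{5}", postal):
        errors.append(f"invalid postal code: {postal!r}")
    return errors

# Rejecting bad input at entry avoids an expensive cleanup years later.
clean = validate_address({"street": "Hauptstr. 1", "postal_code": "10115", "city": "Berlin"})
dirty = validate_address({"street": "", "postal_code": "1011", "city": "Berlin"})
```

The same check can be reused at system boundaries, so data transferred between systems passes through the identical rules as data entered manually.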