Is it possible to write a trigger that, when the total number of records in a table exceeds 10,000 (for example), moves those records to another table and cleans out the first table?
For example, I have a Log table, and a Log_2019_11 table.
When the Log table holds more than 10,000 records, move those records to the Log_2019_11 table (same structure) and then clean out the Log table itself.
Thanks a lot.
I would write a scheduled job to do that. Run it daily.
Don't use a trigger for this. You have no idea when your threshold (10,000) will be reached, and the trigger's actions may cause a deadlock.
With a scheduled task you can decide when to run it in a controlled way.
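As a rough sketch, the archival step such a job could run looks like the following. This is an assumption about your schema: the column names (Id, Message, CreatedAt) and the archive table name Log_archive are illustrative, not from your post. The DELETE ... OUTPUT form is a T-SQL idiom that moves the rows in a single atomic statement:

```sql
-- Move everything from Log into the archive in one atomic statement;
-- rows deleted from Log are captured via OUTPUT and inserted into Log_archive.
-- Column names here are placeholders; adapt them to your real Log schema.
DELETE FROM Log
OUTPUT deleted.Id, deleted.Message, deleted.CreatedAt
INTO Log_archive (Id, Message, CreatedAt);
```

You would then schedule this statement (or a stored procedure wrapping it) as a daily SQL Server Agent job.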
Also, instead of a table with a date suffix, create a single table that includes an archived_date column.
Are you creating these logging tables manually? That would be unsustainable.
Yes, I create a new table every month with a SELECT INTO statement.
What is the maximum number of records in such a log backup table like Log_2019_11, Log_2019_10, Log_2019_09, ...?
It's about 180,000 to 200,000 records per month.
That's about 2.5 million records per year at most. I'd go with Yosiasz's suggestion and use one table.
With your current setup, I don't expect you to query last month's log frequently; keeping the table small mainly serves to speed up such queries. If you did query it often, you would most likely have chosen a sliding window: a table that holds only the logs of the last X days, say 30 days.
Is there a need to keep logs that are older than one year?
If not, I'd go with two tables: log_last_30_days and log_last_year.
In a scheduled job that runs daily, move all the records from the Log table into both the log_last_30_days and log_last_year tables. Then delete all records from the log_last_30_days table that are older than 30 days, and all records from the log_last_year table that are older than 365 days.
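The daily steps above could be sketched as follows. Again, the column names (Id, Message, CreatedAt) are assumptions about your schema; the copy and cleanup of Log run inside one transaction so no rows are lost between the insert and the delete:

```sql
-- Daily job: fan Log out into the two retention tables, then trim each
-- table to its window. Column names are placeholders for your real schema.
BEGIN TRANSACTION;

INSERT INTO log_last_30_days (Id, Message, CreatedAt)
SELECT Id, Message, CreatedAt FROM Log;

INSERT INTO log_last_year (Id, Message, CreatedAt)
SELECT Id, Message, CreatedAt FROM Log;

DELETE FROM Log;   -- Log is now empty until the next batch of writes

COMMIT TRANSACTION;

-- Trim each table to its retention window.
DELETE FROM log_last_30_days
WHERE CreatedAt < DATEADD(DAY, -30, SYSUTCDATETIME());

DELETE FROM log_last_year
WHERE CreatedAt < DATEADD(DAY, -365, SYSUTCDATETIME());
```

Note that if the application writes to Log while the job runs, the transaction around the copy-and-delete matters: without it, rows inserted between the INSERTs and the DELETE would be silently discarded.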
If you rarely query the log data, you can skip the small log_last_30_days table and use only the log_last_year table.