I've been reading through the Data Archiving best practices (https://success.outsystems.com/documentation/best_practices/architecture/data_archiving/) and am curious to hear from those who have implemented something like this before how effective it was in delivering the documented benefits, specifically when it comes to performance.
My main question is: at what point is it necessary to start archiving to actually see a benefit? 10 000 records? 100 000 records? 1 000 000 records? When would you actually start seeing a difference in performance?
Of course there are many factors involved in this question, but let's talk about a general 'rule of thumb'.
If archiving is needed in your MVP, I think you should implement it from the start, even when you only have 10 records.
It is difficult to say anything general about performance, because it depends on other things: whether your data model is correct, whether your indexes are correct, and the queries that you perform.
So I think in some situations an application works perfectly with 1 000 000 records and not at all well with 1 000 records.
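As a rough illustration of the index point (a sketch only; the OrderRecord table and its columns are invented for the example), a query that filters on an unindexed date column has to scan the whole table and gets slower as the table grows, while a supporting index lets SQL Server seek straight to the matching rows even at millions of records:

```sql
-- Hypothetical example: OrderRecord is an invented table name.
-- Without a supporting index, this query scans every row,
-- so it degrades as the record count grows.
SELECT Id, CustomerId, CreatedOn, Total
FROM   OrderRecord
WHERE  CreatedOn >= DATEADD(MONTH, -3, GETUTCDATE());

-- With an index on the filter column (including the selected columns),
-- the same query can stay fast at millions of rows.
CREATE INDEX IX_OrderRecord_CreatedOn
    ON OrderRecord (CreatedOn)
    INCLUDE (CustomerId, Total);
```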
Deciding when to archive data is something you learn from experience and do not need to develop ahead of time; it all depends on how many database resources you have and the indexes you're using.
When you have performance issues, you first have to check what the cause is; if you find out it is due to the number of records, you can think about archiving to a different entity (a rough sketch of what that could look like is below). In my experience with SQL Server, you can query millions of records easily when you have the correct indexes and the query is optimized.
So the decision to archive is based on experience and a feel for your environment; it can't be expressed as a preset number.
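To make the "archiving to a different entity" idea concrete, here is a minimal sketch at the database level, assuming an invented OrderRecord table, an OrderRecordArchive table with the same structure, and a 2-year cut-off chosen purely for illustration; in OutSystems you would typically drive something like this from a Timer:

```sql
-- Hypothetical example: move records older than 2 years to an archive table.
-- Table names and the cut-off are assumptions, not a prescribed implementation.
BEGIN TRANSACTION;

-- Copy the old records into the archive entity.
INSERT INTO OrderRecordArchive (Id, CustomerId, CreatedOn, Total)
SELECT Id, CustomerId, CreatedOn, Total
FROM   OrderRecord
WHERE  CreatedOn < DATEADD(YEAR, -2, GETUTCDATE());

-- Remove them from the hot table so day-to-day queries stay small.
DELETE FROM OrderRecord
WHERE  CreatedOn < DATEADD(YEAR, -2, GETUTCDATE());

COMMIT TRANSACTION;
```

In practice you would usually move the records in smaller batches to limit lock time and transaction log growth, but the principle is the same: keep the frequently queried entity small and push cold data to a separate one.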
Thanks everyone for your input. Interesting to hear the different approaches.