Suppose I'm creating a multitenant application with a logging feature that records certain user actions, and the application also needs to search that log entity for specific information. At some point in the future, the table holding those logs will reach millions of rows.

What would be the best approach in this scenario? How can we ensure the application won't become slow, if that's possible? If Tenant A has created millions of rows, is it possible to ensure Tenant B won't experience slowdowns? Is archiving old data something that can be done, and what are good ways to do it? (I say "good" because I can imagine some ways to do it, but I'm unsure whether they're the best way to go.)


Hi Michael,

Good question. With a well-chosen index on your table, you won't see significant performance issues until the table grows really big. When you reach that point, you can either purge data that is no longer needed or move old records into a second, archive table that doesn't need to perform well for queries. This can be a scheduled job that runs at times when a temporary decrease in performance is acceptable.
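As a minimal sketch of both ideas, here is an example using SQLite via Python purely for illustration; the table and column names (`logs`, `logs_archive`, `tenant_id`, `created_at`) are hypothetical, and the same pattern applies to any relational database. The composite index leads with `tenant_id`, so one tenant's lookups don't scan other tenants' rows; the archiving job moves old rows into the archive table in a single transaction.

```python
import sqlite3
from datetime import datetime, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE logs (
    id INTEGER PRIMARY KEY,
    tenant_id INTEGER NOT NULL,
    action TEXT NOT NULL,
    created_at TEXT NOT NULL)""")
# Archive table with the same columns; it gets no index because
# it is rarely queried and doesn't need the read performance.
conn.execute("CREATE TABLE logs_archive AS SELECT * FROM logs WHERE 0")

# Composite index leading with tenant_id: lookups for one tenant
# stay within that tenant's slice of the index.
conn.execute("CREATE INDEX idx_logs_tenant_time ON logs (tenant_id, created_at)")

# Seed data: tenant 1 has one old row and one recent row.
now = datetime(2024, 6, 1)
rows = [
    (1, "login",  (now - timedelta(days=400)).isoformat()),
    (1, "login",  now.isoformat()),
    (2, "export", now.isoformat()),
]
conn.executemany(
    "INSERT INTO logs (tenant_id, action, created_at) VALUES (?, ?, ?)", rows)

def archive_old_logs(conn, cutoff):
    """Move rows older than cutoff into logs_archive, atomically."""
    with conn:  # single transaction: copy, then delete
        conn.execute(
            "INSERT INTO logs_archive SELECT * FROM logs WHERE created_at < ?",
            (cutoff,))
        conn.execute("DELETE FROM logs WHERE created_at < ?", (cutoff,))

# The scheduled job would call this, e.g. nightly, with a retention cutoff.
archive_old_logs(conn, (now - timedelta(days=365)).isoformat())

live = conn.execute("SELECT COUNT(*) FROM logs").fetchone()[0]
archived = conn.execute("SELECT COUNT(*) FROM logs_archive").fetchone()[0]
print(live, archived)  # → 2 1
```

In a production database you would run the copy-and-delete in batches to keep lock times short, but the shape of the job is the same.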

Kind regards, Remco

Thanks Remco!