Reading Millions of Records from External DB and Storing in Entity

We are starting a project where we connect to a master table (a single table in Oracle) that has millions of records, and we need to copy it into OutSystems entities. What is the best way to do this? My plan is to create a timer that reads 5,000 records, commits, and then wakes the same timer again. But I'm worried about what happens if the timer fails in the middle or the database connection is lost: how do we resume from the row where it stopped? Any ideas or suggestions from anyone who has worked on a similar activity before?

Your proposed solution with the timer is the right approach for this.

Instead of a fixed count of 5,000 records, you can work against an increased timeout: on each loop iteration, check how much time is left until the timeout, and when you get close, commit and restart your timer.

The last problem is knowing which record was processed last. You need a sort order, of course; for instance, an auto-number field in the external database, or perhaps a date/time column. You can then check in your OutSystems table which record was the last one imported and start from there.
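The resume-from-last-record idea above can be sketched as follows. This is a minimal, hypothetical example using SQLite as a stand-in for both the external Oracle table and the OutSystems entity; the table names (`master_table`, `target_table`), the `id` auto-number sort key, and the batch size are all assumptions, not your actual schema:

```python
import sqlite3
import time


def copy_next_batch(src: sqlite3.Connection,
                    dst: sqlite3.Connection,
                    batch_size: int = 5000) -> int:
    """Copy one batch from the external table into the local one, resuming
    from the highest key already present in the destination. Returns the
    number of rows copied."""
    # Checkpoint = MAX(id) already imported (0 when the target is empty).
    last_id = dst.execute(
        "SELECT COALESCE(MAX(id), 0) FROM target_table"
    ).fetchone()[0]

    # Keyset pagination: strictly greater than the checkpoint, sorted by key.
    rows = src.execute(
        "SELECT id, payload FROM master_table "
        "WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, batch_size),
    ).fetchall()

    if rows:
        dst.executemany(
            "INSERT INTO target_table (id, payload) VALUES (?, ?)", rows
        )
        dst.commit()  # commit per batch: a crash loses at most one batch
    return len(rows)


def run_until_deadline(src: sqlite3.Connection,
                       dst: sqlite3.Connection,
                       budget_seconds: float,
                       batch_size: int = 5000) -> None:
    """Loop copy_next_batch until the time budget is nearly spent, then stop
    (an OutSystems timer would wake itself again at this point)."""
    start = time.monotonic()
    while time.monotonic() - start < budget_seconds:
        if copy_next_batch(src, dst, batch_size) < batch_size:
            return  # source exhausted: the copy is complete
```

Because the checkpoint is derived from the destination table itself (the max imported key) rather than stored separately, a timer that dies mid-run simply re-reads `MAX(id)` on the next wake-up and continues after the last committed batch.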


Hi Somesh,

How do you currently get the next five thousand records?

Are there any columns that can work as a sort key? If so, you can get the max key already imported into the OutSystems database and use it in a WHERE condition.

Also, maybe you already know this, but please use the BulkInsert component from the Forge.
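To illustrate why a bulk-insert component matters at this volume: issuing one INSERT (and one commit) per row is far slower than a single batched insert inside one transaction, which is the idea such components implement. A rough sketch in Python with SQLite as a stand-in (the table name `t` and row shape are made up; the actual BulkInsert Forge component runs inside OutSystems against your entity's database):

```python
import sqlite3


def insert_per_row(conn: sqlite3.Connection, rows) -> None:
    # Anti-pattern for large volumes: one statement and one commit per row.
    for r in rows:
        conn.execute("INSERT INTO t (id, payload) VALUES (?, ?)", r)
        conn.commit()


def insert_bulk(conn: sqlite3.Connection, rows) -> None:
    # One batched statement inside a single transaction: the idea behind
    # bulk-insert components. `with conn` commits on success and rolls
    # back on error, so a failed batch leaves nothing half-written.
    with conn:
        conn.executemany("INSERT INTO t (id, payload) VALUES (?, ?)", rows)
```

The all-or-nothing transaction in `insert_bulk` also pairs well with the resume-by-max-key approach: a batch either lands completely or not at all, so the checkpoint never points into a partially written batch.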

Kind regards,
