It would be hard to tell you which solution is 'best' as I'm not privy to all the details of your environment and application. I can tell you that in my work with OutSystems, because I'm using the Personal Edition, I don't have direct access to the database and have to use an OutSystems-based solution. I've had excellent results with the CSVUtil component from the Forge. Importing new data is very straightforward: I typically create a structure with a Text attribute for each field and move the raw data into that structure, then do the necessary validation and conversion as I move it into the entity record. This process lets you handle error conditions that would abort the import if you used an external tool (invalid dates, characters in a numeric field, etc.).
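OutSystems logic is built visually, but the all-Text-structure pattern described above can be sketched in Python. Everything here (the field names, the date format, the sample data) is a made-up illustration, not part of CSVUtil itself: parse every field as text first, then validate and convert row by row, collecting errors instead of aborting.

```python
import csv
import io
from datetime import datetime

# Hypothetical raw CSV; in OutSystems this would arrive via CSVUtil
# into a structure with one Text attribute per field.
RAW = """name,birth_date,amount
Alice,1990-03-15,100.50
Bob,not-a-date,200
Carol,1985-07-01,abc
"""

def load_rows(text):
    """Parse every field as plain text, mirroring the all-Text structure."""
    return list(csv.DictReader(io.StringIO(text)))

def validate_and_convert(row):
    """Convert text fields to typed values; return (record, errors)."""
    errors = []
    record = {"name": row["name"].strip()}
    try:
        record["birth_date"] = datetime.strptime(row["birth_date"], "%Y-%m-%d").date()
    except ValueError:
        errors.append(f"invalid date: {row['birth_date']!r}")
    try:
        record["amount"] = float(row["amount"])
    except ValueError:
        errors.append(f"not numeric: {row['amount']!r}")
    return record, errors

good, bad = [], []
for row in load_rows(RAW):
    record, errors = validate_and_convert(row)
    (bad if errors else good).append((record, errors))
```

Only the clean rows move on to the entity; the bad rows are kept with their error messages so you can report them instead of having the whole import blow up.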
Updating existing records is more challenging, as you need a unique key such as a social security number, but it is still very doable in OutSystems.
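The match-on-a-unique-key idea can be sketched like this (the SSN values and record fields are invented for illustration; in OutSystems you'd do a filtered aggregate followed by a CreateOrUpdate action):

```python
# Hypothetical in-memory "entity": existing records keyed by a unique
# business key (here SSN).
existing = {
    "111-22-3333": {"ssn": "111-22-3333", "name": "Alice", "city": "Lisbon"},
}

incoming = [
    {"ssn": "111-22-3333", "name": "Alice", "city": "Porto"},  # should update
    {"ssn": "444-55-6666", "name": "Bob",   "city": "Braga"},  # should insert
]

def upsert(store, record, key="ssn"):
    """Insert the record, or update the existing one matched on the key."""
    action = "updated" if record[key] in store else "created"
    store[record[key]] = record
    return action

actions = [upsert(existing, r) for r in incoming]
```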
If you need any help, let me know.
Curt
Do the conversion in OutSystems, but in a separate eSpace with its own staging tables.
So the raw CSV goes, as is, into the staging table.
Then you convert it however you like into the real tables.
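The staging-to-real move can be sketched as follows. The column names and date format are assumptions for the example; the point is that the staging table holds everything as text, and only rows that convert cleanly make it into the typed real table, while the rest stay behind for inspection.

```python
from datetime import datetime

# Hypothetical staging rows: everything stored as text, exactly as imported.
staging = [
    {"id": "1", "signup": "2020-01-05"},
    {"id": "2", "signup": "05/01/2020"},  # wrong format: left in staging
]

real = []    # stands in for the real, typed table
failed = []  # rows that didn't convert, kept for review

for row in staging:
    try:
        real.append({
            "id": int(row["id"]),
            "signup": datetime.strptime(row["signup"], "%Y-%m-%d").date(),
        })
    except ValueError:
        failed.append(row)
```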
I have worked with big CSV files in OutSystems before.
The best way, IMHO, to handle it is:
- import the CSV via an extension, XXX lines at a time (to start with, 50K)
- bulk-import that batch into a staging table (probably a custom extension, but you can look at the Forge for an example)

Repeat this until you have it all in the staging table.
Then you can do whatever you want without having too much trouble with time-outs.
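The batching loop above can be sketched like this (the tiny chunk size and sample data are just for illustration; in practice CHUNK would be your 50K, and the `staging.extend` line would be a bulk insert done in an extension):

```python
import csv
import io
import itertools

CHUNK = 2  # illustration only; start with ~50K in real use
RAW = io.StringIO("id,value\n1,a\n2,b\n3,c\n4,d\n5,e\n")

reader = csv.DictReader(RAW)
staging = []  # stands in for the staging table

def next_batch(rows, n):
    """Read up to n rows; an empty batch means the file is done."""
    return list(itertools.islice(rows, n))

batches = 0
while True:
    batch = next_batch(reader, CHUNK)
    if not batch:
        break
    staging.extend(batch)  # in OutSystems: a bulk INSERT via extension
    batches += 1
```

Because each iteration only touches a bounded number of lines, no single request runs long enough to hit the platform time-out.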
This can all be done via a timer, of course.
Any more questions, let me know :)