Large File Data Import Sample

Stable version 1.0.0 (Compatible with OutSystems 11)
Uploaded on 18 June 2019

Details
Sample showing an Action that can be used with a Timer to import large CSV file data into an Entity.

Allows data from a large CSV file to be imported into an Entity.

The upload step can be replaced with other methods (FileSystem, SFTP, etc.), but this sample demonstrates how to read a large file's data and import it asynchronously using a Timer (batch processing). The large file can be placed anywhere, as long as it can be accessed.
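
The same chunked-read idea can be sketched outside OutSystems. Below is a minimal Python sketch, assuming a plain CSV file on disk; the name read_csv_chunk is illustrative and not part of the component:

    import csv
    import itertools

    def read_csv_chunk(path, start_row, max_rows, delimiter=",", encoding="utf-8"):
        """Read up to max_rows data rows, starting at the 0-based start_row."""
        with open(path, newline="", encoding=encoding) as f:
            reader = csv.reader(f, delimiter=delimiter)
            chunk = list(itertools.islice(reader, start_row, start_row + max_rows))
        # A short read (len(chunk) < max_rows) means the end of the file was reached.
        return chunk, len(chunk)

Returning the count of rows actually read lets the caller advance its Start Row and detect when the file is finished, which is the same role cntNum plays in the modified CSVUtil action described below.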


SCREENS:

  • CSV_Upload provides a File Upload widget, the list of Files to be Processed (and their status), and the data currently imported from those files.
  • The "Wipe ProcessFiles" button erases the contents of the list of Files to be Processed.
  • The "Wipe Data" button erases the data imported from the files.


SITE PROPERTIES:

  • CSVdelimiter - Delimiter of the CSV file (default: ",")
  • CSVencoding - Encoding of the CSV file (default: "utf-8")
  • LogMessageMod - Create a General Log entry every n inserted rows (default: 1000)
  • RowsToProcess - Number of CSV rows to read and insert per Commit in the internal processing loop (default: 10000); see the configuration sketch after this list
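
As a rough illustration only (not part of the component), the four Site Properties map onto a small configuration object; the Python names below are made up to mirror the properties above:

    from dataclasses import dataclass

    @dataclass
    class ImportConfig:
        csv_delimiter: str = ","      # CSVdelimiter
        csv_encoding: str = "utf-8"   # CSVencoding
        log_message_mod: int = 1000   # LogMessageMod: log every n inserted rows
        rows_to_process: int = 10000  # RowsToProcess: rows read and inserted per Commit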


EXPLANATION:

  1. Define the CSV data format in a Structure (e.g. Sheet1_Data) and in the destination Entity (e.g. the Sheet1 Entity). A sample CSV file in Data/Resources (Sample_Data_0000100_UTF-8.csv) is used for this example.
  2. In the CSV_Upload screen, select the file and click Upload. If no new rows appear in the bottom Table, the file may be too large for the Application Server's configured File Upload Size Limit; in that case, retrieving the file(s) via SFTP or directly from the FileSystem (e.g. a network drive) are alternative options.
  3. You should see a new entry in the "Files to Process" list, showing the original filename, the temporary file location, the Start Row (0) from which the file-processing Action will start reading, and whether that file has been completely processed (imported) yet.
  4. Click Process and the ProcessCSV Server Action will execute, processing the list of files one by one (a sketch of this batch loop follows the list).
  5. To run the same process asynchronously via a Timer (instead of manually through the Process button), set up the Schedule for Processes/Timers/Timer_ProcessFiles (Service Studio allows 15-minute increments, while Service Center allows 5-minute increments). If necessary, adjust the Timer's Timeout in Minutes (default: 20 minutes), and make sure the interval between Schedule runs is larger than the Timeout, so that multiple Timers do not read from the same file and insert duplicate and/or out-of-order data. (This could be improved by adding a "Processing" status to the ProcessFiles table, and/or by allowing multiple Timers when the order of the data does not matter.)
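
For readers who want the batch loop from steps 4 and 5 in plain code, here is a hedged Python sketch. It reuses read_csv_chunk and ImportConfig from the earlier sketches; process_file, insert_rows, and commit are illustrative stand-ins for the ProcessCSV Server Action, the Entity INSERT, and the database Commit, not actual component names:

    def insert_rows(rows, log_every):
        # Placeholder for the real INSERT into the destination Entity (e.g. Sheet1).
        for i, row in enumerate(rows, start=1):
            if i % log_every == 0:
                print(f"inserted {i} rows in this chunk")

    def commit():
        pass  # Placeholder: the component commits once per chunk of RowsToProcess rows.

    def process_file(entry, config):
        """Process one 'Files to Process' entry in chunks, committing after each chunk."""
        while not entry["IsProcessed"]:
            rows, cnt = read_csv_chunk(
                entry["TempFilePath"],
                start_row=entry["StartRow"],
                max_rows=config.rows_to_process,
                delimiter=config.csv_delimiter,
                encoding=config.csv_encoding,
            )
            insert_rows(rows, log_every=config.log_message_mod)
            entry["StartRow"] += cnt            # resume point if the Timer times out
            if cnt < config.rows_to_process:    # short read: end of file reached
                entry["IsProcessed"] = True
            commit()

    def process_all(files, config):
        """Rough equivalent of ProcessCSV / Timer_ProcessFiles: one file at a time."""
        for entry in files:
            if not entry["IsProcessed"]:
                process_file(entry, config)

Storing the Start Row after every commit is what makes the import resumable: if the Timer hits its Timeout, the next run simply continues from the last committed row.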


NOTE: Includes a modified CSVUtil 1.10.3 from the Forge:

Updated LoadCSV2RecordList Action (see the sketch after this list):

  • StartIndex (int-->long)
  • MaxNum (int-->long)
  • Added output parameter cntNum (long) for the number of rows actually read
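
The actual action is implemented inside the CSVUtil extension, not in Python; the following is only a rough approximation of its inputs and outputs, with transliterated names (cnt_num corresponds to cntNum):

    import csv
    import io
    import itertools
    from typing import List, Tuple

    def load_csv_to_record_list(
        csv_text: str,
        start_index: int,   # was int, widened to long in the modified action
        max_num: int,       # was int, widened to long in the modified action
        delimiter: str = ",",
    ) -> Tuple[List[List[str]], int]:
        """Return (records, cnt_num), where cnt_num is the number of rows actually read."""
        reader = csv.reader(io.StringIO(csv_text), delimiter=delimiter)
        records = list(itertools.islice(reader, start_index, start_index + max_num))
        return records, len(records)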


Release notes (1.0.0)
Reviews (1)
2021-01-14, in version 1.0.0
Good structure to use for processing big CSV files. There are still a couple of improvements that could be made, such as avoiding the ForEach while creating data and perhaps using the BulkInsert extension.
Category
Demos & samples, Files & documents
Support options
This asset is not supported by OutSystems. You may use the discussion forums to leave suggestions or obtain best-effort support from the community, including from the creator of this asset.
Dependencies
This asset has 3 dependencies.
Application Objects
Large File Data Import Sample has 10 AOs.
Compatible with
Version 11
Database: All
Asset consumers
No consumers yet.