[BulkInsert] Insert from Excel

Published on 25 Jun (4 weeks ago) by Marcelo Ferreira

Hello all,

I'm a newbie with this technology. I'm trying to do something that seems quite simple, but I don't seem to get it quite right…

I want to read an Excel file and insert it into a table. The file has 55,543 records, but the system runs out of memory. The log message is: "GC overhead limit exceeded"

Can anyone help me, please?

Thanks.





Hi Joseph,

Loading such a huge amount of data from the frontend always results in poor performance, timeouts, or memory-related issues. Maybe you can use SQL*Loader to read your Excel data and insert it into the table.
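For reference, SQL*Loader (`sqlldr`) is Oracle's command-line bulk loader, so it applies when the database is Oracle. A minimal sketch, assuming the sheet is first exported to CSV — the table, columns, file names, and credentials are all placeholders:

```shell
# Placeholder control file; column list must match the CSV layout.
cat > records.ctl <<'EOF'
LOAD DATA
INFILE 'records.csv'
APPEND INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(id, name, amount)
EOF

# Invoke SQL*Loader against the Oracle database (credentials are placeholders).
sqlldr userid=scott/tiger@mydb control=records.ctl log=records.log
```

Because `sqlldr` runs server-side and streams the file, it avoids holding all the rows in application memory at once.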

Sachin

Sachin Mahawar wrote:

Hi Joseph,

Loading such a huge amount of data from the frontend always results in poor performance, timeouts, or memory-related issues. Maybe you can use SQL*Loader to read your Excel data and insert it into the table.

Sachin

Hi Sachin, thanks for your response.

I have an Excel file of 20 MB with 60,000 records. When uploading the file in the enterprise environment, the system runs out of memory and becomes inaccessible; in my personal environment, however, the task completes in 2 minutes.

Enterprise Environment:
-J2EE, Oracle

Personal Environment:
-.Net, SQL Server

How do I use SQL*Loader in OutSystems?

I'm sorry, I'm a newbie.


Hi Joseph,

There is an alternative in case you do not need to insert the records "at runtime".

Just follow these steps:

1) First, create a table (EXCEL_IMPORT) to store the binary of the Excel file. The table should include a binary field (to store the Excel binary), a boolean field (to indicate whether the record has been processed), and a text field (to record any errors that occur).

2) Then create a timer that processes the records from the new table that have the boolean field set to false. The timer reads the binary and imports the records into the target table.

Since you are using a timer and not importing through the web browser you should be ok.

Importing through the web browser has several limitations that do not exist when using a timer.
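The two steps above could be sketched in ordinary code as follows. This is only a rough illustration, using Python with sqlite3 standing in for the platform's entities, CSV data standing in for the Excel binary, and made-up table and field names (`excel_import`, `target`):

```python
import csv
import io
import sqlite3

# In-memory database standing in for the application's entities.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE excel_import (
    id           INTEGER PRIMARY KEY,
    file_binary  BLOB,                -- the uploaded file
    is_processed INTEGER DEFAULT 0,   -- the timer flips this to 1
    error_text   TEXT DEFAULT ''      -- failures are recorded, not fatal
);
CREATE TABLE target (name TEXT, amount REAL);
""")

# Step 1: the upload screen only stores the raw bytes, nothing else.
raw = b"name,amount\nwidget,9.5\ngadget,12.0\n"
conn.execute("INSERT INTO excel_import (file_binary) VALUES (?)", (raw,))

def process_pending(conn, batch_size=1000):
    """Step 2, the timer's job: pick up unprocessed uploads, parse them,
    insert the rows in batches, then flag each upload as done."""
    pending = conn.execute(
        "SELECT id, file_binary FROM excel_import WHERE is_processed = 0"
    ).fetchall()
    for row_id, blob in pending:
        try:
            reader = csv.DictReader(io.StringIO(blob.decode("utf-8")))
            batch = []
            for rec in reader:
                batch.append((rec["name"], float(rec["amount"])))
                if len(batch) >= batch_size:
                    conn.executemany("INSERT INTO target VALUES (?, ?)", batch)
                    batch.clear()
            if batch:
                conn.executemany("INSERT INTO target VALUES (?, ?)", batch)
        except Exception as exc:  # record the failure instead of crashing
            conn.execute("UPDATE excel_import SET error_text = ? WHERE id = ?",
                         (str(exc), row_id))
        conn.execute("UPDATE excel_import SET is_processed = 1 WHERE id = ?",
                     (row_id,))
        conn.commit()

process_pending(conn)
```

The point of the design is that the user's request finishes as soon as the bytes are stored; the expensive parsing and inserting happens later, in the background, in bounded batches.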

Hope this is helpful.

If you have any more questions, I can try to set up an example .oml for you.

Best regards 

Rodrigo Henriques

Rodrigo Henriques wrote:

Hi Joseph,

There is an alternative in case you do not need to insert the records "at runtime".

Just follow these steps:

1) First, create a table (EXCEL_IMPORT) to store the binary of the Excel file. The table should include a binary field (to store the Excel binary), a boolean field (to indicate whether the record has been processed), and a text field (to record any errors that occur).

2) Then create a timer that processes the records from the new table that have the boolean field set to false. The timer reads the binary and imports the records into the target table.

Since you are using a timer and not importing through the web browser you should be ok.

Importing through the web browser has several limitations that do not exist when using a timer.

Hope this is helpful.

If you have any more questions, I can try to set up an example .oml for you.

Best regards 

Rodrigo Henriques

Hi Rodrigo, Thanks for your reply.

I did what you said, but the system goes down.

If I insert "at runtime", the error is a memory error, but with a Timer the system collapses.

I would appreciate it if you shared an example .oml with me.


Thank you very much


Hi there,

The bulk insert component may use several times the size of the file in memory (3x-6x, depending on the number of columns and data types), since it copies the input list while processing it. Still, running out of memory for a 20MB file makes it sound like your system needs more memory. OutSystems platform installations on the Java stack have very high memory requirements.  

Rodrigo's suggestion is definitely good practice when dealing with large files, but it may not solve the memory issue if your system runs out of memory just from reading the file into memory. By the way, the bulk insert component may still be useful with the timer approach, because the timer may still need to insert large amounts of data when it processes the file, but you should try removing it first and see if that solves the memory issue.

If I insert "at runtime", the error is a memory error, but with a Timer the system collapses.

What do you mean by "collapses"?

Hi Joao.

Any request to the server after running the process remains in limbo.

I can't access the service center. 

The system responds in about 30 minutes.


Thanks so much for your reply.



Joseph Espinosa wrote:

Hi Joao.

Any request to the server after running the process remains in limbo.

I can't access the service center. 

The system responds in about 30 minutes.


Thanks so much for your reply.



Hi Joseph,

I've attached an example .oml.

However, I think your problem is the volume of data you are trying to import in a single file.

How many elements are you trying to import?

Can you split the file into multiple files and then use the timer (please see the example I'm sending) to import each file separately?
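The splitting Rodrigo suggests can even be done before uploading. A rough sketch, assuming the sheet has been exported to CSV — the chunk size and column names here are arbitrary:

```python
import csv
import io

def split_csv(source_lines, rows_per_chunk):
    """Split one large CSV into smaller ones, repeating the header row
    in every chunk so each file can be imported independently."""
    reader = csv.reader(source_lines)
    header = next(reader)
    chunks, current = [], []
    for row in reader:
        current.append(row)
        if len(current) == rows_per_chunk:
            chunks.append([header] + current)
            current = []
    if current:  # leftover rows form the final, smaller chunk
        chunks.append([header] + current)
    return chunks

# 60,000 data rows split into chunks of 10,000 rows each.
big = io.StringIO("a,b\n" + "".join(f"{i},{i * 2}\n" for i in range(60000)))
parts = split_csv(big, 10000)
```

Each chunk can then be written to its own file and queued for the timer, so no single import has to hold the whole 60,000 rows in memory.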

Best regards

Rodrigo Henriques

I have been able to import 70 thousand records with 25 columns into the database in 10 minutes. I just used the set timeout action and it worked well.