[BulkInsert] Unable to BulkInsert a large table

Forge Component
Published on 2018-11-10 by João Pedro Abreu
13 votes

Hi all,

I’m a newbie in the OutSystems world. I’m trying to do something that seems quite simple, but I can’t quite get it right…

I want to copy a table with approximately 2 million records into another table. I’m using the BulkInsert XIF after first trying it with a ForEach. For small tables it works like a charm, but with a “large” table I get a timeout error in the timer.

Can anyone enlighten me?


Hi Martijn,

When an insert takes too long, it's usually because the target table has many indexes and/or foreign keys. If that's the case, there's not much you can do. It is possible to disable both the indexes and the foreign keys and re-enable them after the insert is done, which will save you some time, but that requires higher database access permissions than the platform user has, so consider carefully whether that's something you can do.
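To make the idea concrete, here is a minimal sketch of the drop-and-recreate pattern. It uses an in-memory SQLite database purely for illustration; the database behind an OutSystems environment (e.g. SQL Server or Oracle) has its own syntax for disabling indexes and constraints, and as mentioned, doing so needs elevated permissions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, payload TEXT)")
cur.execute("CREATE INDEX idx_payload ON target (payload)")

rows = [(i, f"record-{i}") for i in range(100_000)]

# 1. Drop (or disable) secondary indexes before the bulk insert...
cur.execute("DROP INDEX idx_payload")

# 2. ...do the insert while the table is unindexed, so each row
#    doesn't pay the cost of maintaining the index...
cur.executemany("INSERT INTO target (id, payload) VALUES (?, ?)", rows)

# 3. ...then rebuild the index once, after all rows are in place.
cur.execute("CREATE INDEX idx_payload ON target (payload)")
conn.commit()

count = cur.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(count)  # 100000
```

Rebuilding an index once over the full table is far cheaper than updating it incrementally for every inserted row, which is where the time saving comes from.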

If you don't mind the insert taking long and just want the timer not to time out, you have another option:

  1. Save the current datetime to a variable as soon as the timer starts;
  2. create a loop where you insert just a few thousand records per iteration;
  3. after each iteration, check how much time has passed since the timer started;
  4. if you are close to reaching the timeout (the default is 20 minutes), call the timer's Wake action and then exit the action; the instance of the timer you were running will end and a new one will start, resetting the timeout; just make sure you keep track of the records you already inserted, so the new instance doesn't insert duplicate records.
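The steps above can be sketched in plain Python. Everything here is a hypothetical stand-in: `source` plays the role of the source table, `inserted` the target table, and `checkpoint` the persisted count of records that earlier timer instances already handled, so a fresh instance never re-inserts them.

```python
import time

TIMER_TIMEOUT_SECONDS = 20 * 60      # platform default timer timeout
SAFETY_MARGIN_SECONDS = 60           # hand off to a new instance well before it
BATCH_SIZE = 5_000                   # a few thousand records per iteration

source = list(range(23_456))         # stand-in for the 2M-record source table
inserted = []                        # stand-in for the target table
checkpoint = 0                       # records already copied by earlier instances

def timer_instance(checkpoint):
    """One run of the timer: returns the new checkpoint and whether a
    wake (i.e. a fresh timer instance) is still needed."""
    started_at = time.monotonic()    # step 1: note when this instance started
    while checkpoint < len(source):
        batch = source[checkpoint:checkpoint + BATCH_SIZE]
        inserted.extend(batch)       # step 2: the per-iteration bulk insert
        checkpoint += len(batch)
        elapsed = time.monotonic() - started_at   # step 3: time check
        if elapsed > TIMER_TIMEOUT_SECONDS - SAFETY_MARGIN_SECONDS:
            return checkpoint, True  # step 4: would call the timer's Wake here
    return checkpoint, False         # all records copied, no wake needed

needs_wake = True
while needs_wake:                    # the platform would do this loop via Wake
    checkpoint, needs_wake = timer_instance(checkpoint)

print(len(inserted))  # 23456
```

The essential part is that `checkpoint` survives across instances (in a real implementation it would live in the database, not a variable), which is what prevents duplicate inserts after each wake.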

Hi João,

Thanks for your quick response! The target table has no indexes or foreign keys; it's a newly created table. The time frame doesn’t allow long insert times, so I’m looking for a quicker solution. It looks like once I exceed a certain number of rows, the insert stops working properly.

Do you know if the target table is being accessed exclusively by this process? Such a large bulk insert will most likely request a table lock, which it won't be able to obtain if anything else is accessing the table.

The target table is exclusively used by this process... 

I've solved this with the loop you mentioned, João. But still…