Hi everyone,
I'm working on OutSystems Developer Cloud (ODC) and facing a performance issue while updating data in 3 different entities using a loop. The number of iterations depends on a dynamic count.
Initially, I used 3 separate service actions for each update. I then merged the logic into a single service action, thinking it would reduce the execution time. However, even with this change, the performance remains the same.
Currently, it takes around 5 seconds for just 2 iterations, which seems quite high. Unfortunately, I'm unable to share the ODC module due to restrictions.
Could anyone please suggest how I can reduce the execution/render time in ODC? Are there any performance best practices for handling loops and entity updates?
NOTE: For the table updates I am already using a filter.
Warm regards,
Dinesh M
Hi Dinesh,
It would help if you could share your code (OML), so we can give better advice. Based on the little information you shared, here are some suggestions that might help you improve execution time.
Here are some interesting articles to read, to get a better understanding of database bulk operations:
Regards,
Daniel
Hi Dinesh M, you may use this as a reference for some optimization checks.
Best Regards,
Philip Paolo
Hi @Dinesh Murugan,
@Daniël Kuhlmann has already covered the relevant best practices; here is what I would add:
1- Re-assess your DB architecture so that use cases like this can be handled hassle-free.
2- If the DB architecture is good enough and no refactor is needed, then for the situation where you want to save the data of 3 different packets in one call, I would recommend handling it asynchronously with a timer via a custom job approach: send the entire JSON payload to the server and let a timer process it. Keep in mind that this may affect your UX.
https://success.outsystems.com/documentation/outsystems_developer_cloud/building_apps/use_timers/create_and_run_timers/
Good Luck
Hi @Dinesh Murugan ,
To improve execution time when processing multiple entity updates, avoid calling service actions or performing entity updates inside loops. Instead, accumulate the required changes in local data structures (e.g., lists or records) during the loop execution, and apply updates in bulk after the loop completes.
Also, minimize row-by-row database operations by using bulk update patterns where possible. This reduces database transaction overhead and improves scalability.
Additionally, ensure all necessary data is fetched once before the loop begins. Avoid repeated synchronous reads (like GetEntityById) within loops, as these can significantly degrade performance.
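To make the "accumulate in the loop, update in bulk afterwards" idea concrete, here is a minimal plain-SQL sketch. The entity and attribute names ({Order}, [Status], [Id]) are made up for illustration, and the exact syntax depends on your Advanced SQL setup; the point is replacing N single-row updates with one set-based statement.

```sql
-- Row-by-row (what an update inside a loop effectively does): N round-trips
-- UPDATE {Order} SET [Status] = @Status WHERE [Id] = @Id;   -- repeated N times

-- Set-based alternative: one round-trip for the whole batch
UPDATE {Order} AS Target
SET [Status] = Source.NewStatus
FROM (VALUES
    (101, 'Processed'),
    (102, 'Shipped')
) AS Source(Id, NewStatus)
WHERE Target.[Id] = Source.Id;
```

The VALUES list here would be built from the local list you accumulated during the loop; the next answer in this thread shows a variant that passes the whole batch in as JSON instead.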
Thanks,
Senthil Nathan
Hi @Dinesh Murugan, adding to @Daniël Kuhlmann's response:
I'd suggest going straight to Advanced SQL and bypassing the "For Each" loop entirely.
The lag might be caused by the round-trips between your app and the database. Even inside one Service Action, running Entity Actions in a loop will always be slow in a cloud environment like ODC.
Since you're in ODC, you can do a set-based update using PostgreSQL’s JSON functions. This way, the database handles everything in one go.
The Workflow:
Use JSON Serialize to turn your list into a string.
Pass that string to an Advanced SQL block.
Use jsonb_to_recordset to join that JSON directly to your entity.
Sample SQL (ODC):
UPDATE {YourEntity} AS Target
SET [Status] = Source."Status",
    [ModifiedDate] = Source."ModifiedDate"
FROM jsonb_to_recordset(@JSONInput::jsonb) AS Source(
    "Id" BIGINT,
    "Status" TEXT,
    "ModifiedDate" TIMESTAMP
)
WHERE Target.[Id] = Source."Id";
What the @JSONInput looks like:
[
  { "Id": 101, "Status": "Processed", "ModifiedDate": "2026-05-06T10:00:00Z" }
]
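Before wiring the UPDATE into your flow, you can sanity-check that jsonb_to_recordset parses your payload as expected by running a standalone SELECT with the sample JSON inlined as a literal (column names are case-sensitive and must match the JSON keys exactly):

```sql
SELECT *
FROM jsonb_to_recordset(
    '[{ "Id": 101, "Status": "Processed", "ModifiedDate": "2026-05-06T10:00:00Z" }]'::jsonb
) AS Source(
    "Id" BIGINT,
    "Status" TEXT,
    "ModifiedDate" TIMESTAMP
);
-- Should return one row: Id = 101, Status = 'Processed', plus the parsed timestamp
```

If a column comes back NULL, the usual cause is a key/column name mismatch (e.g. "id" vs "Id").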
This usually drops execution time from seconds to milliseconds because you're only hitting the database once. Worth a try!