Timeout error while fetching data on screen
Discussion

We all face timeout issues; they're common but not easy to solve, and I think we can all agree on that.

I want to collect all the relevant links here so they can help everyone in our community.

My issue:
I'm getting a timeout while fetching data from a Service Action in the Core module into a Data Action in the UI module.
In the Service Action, the data comes from two tables: I'm trying to fetch the Property Name of every property with no meeting scheduled in the last 3 months, along with the last meeting date and last assignee name, if any.
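This kind of "no meeting in the last 3 months, plus last meeting date and assignee" requirement can usually be expressed as a single query rather than a per-record loop. A minimal sketch with hypothetical table and column names, using Python/SQLite only for illustration (in OutSystems this would be an Advanced SQL query or an Aggregate with a join):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Property (Id INTEGER PRIMARY KEY, Name TEXT);
CREATE TABLE Meeting (
    Id INTEGER PRIMARY KEY,
    PropertyId INTEGER,
    HeldOn TEXT,           -- ISO date of the meeting
    AssigneeName TEXT
);
INSERT INTO Property VALUES (1, 'Alpha Tower'), (2, 'Beta Plaza');
-- Alpha Tower's only meeting is older than 3 months; Beta Plaza has none.
INSERT INTO Meeting VALUES (1, 1, '2025-01-10', 'John');
""")

cutoff = '2025-03-19'  # "today" minus 3 months

# One query: properties with no meeting since the cutoff, plus their
# last meeting date and last assignee (NULL when no meeting ever happened).
rows = conn.execute("""
    SELECT p.Name,
           (SELECT m.HeldOn FROM Meeting m
             WHERE m.PropertyId = p.Id
             ORDER BY m.HeldOn DESC LIMIT 1) AS LastMeeting,
           (SELECT m.AssigneeName FROM Meeting m
             WHERE m.PropertyId = p.Id
             ORDER BY m.HeldOn DESC LIMIT 1) AS LastAssignee
      FROM Property p
     WHERE NOT EXISTS (SELECT 1 FROM Meeting m
                        WHERE m.PropertyId = p.Id AND m.HeldOn >= ?)
     ORDER BY p.Name
""", (cutoff,)).fetchall()
```

Pushing the filter into the query means the database returns only the qualifying rows, instead of the action looping over every record and filtering in memory.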

2025-09-04 06-33-37
Nikhil___Vijay

Hello @Ayushi Kumari
If you're encountering server timeout issues, you can adjust the 'Server Request Timeout' setting. By default, it's set to 10 seconds, but you can customize it based on your requirements to prevent such timeouts.
Regards 
Nikhil Kumar Vijay

2025-06-19 10-02-53
Ayushi Kumari

Hi @Nikhil___Vijay
Thanks for suggesting the solutions.
I have more than 3,000 records.
In the Service Action I'm running a loop that covers these 3,000+ records and appends them to a list based on filters.
I need to show this list in a table that can be sorted dynamically.

2026-02-26 06-29-24
Rahul
 
MVP

Hi @Ayushi Kumari ,

In the Service Action, are you using an Aggregate or SQL?
And have you set indexes on the attributes you're using in the filter condition?

How many records are you selecting at a time? Are you using any pagination, such as Max Records?

If not, try it. It also reduces the load on the query.


Regards

Rahul

2025-06-19 10-02-53
Ayushi Kumari

Hi @Rahul Sahu
Inside the Service Action, I'm using an Aggregate.
I have approximately 4,000 records.
In the Service Action I'm running a loop that covers these 3,000+ records and appends them to a list based on filters.
Currently I'm using a client action (ListSort) to sort the table, because of the timeout error.
But I'm looking for a more efficient way to resolve this kind of issue in the future.

For Max Records I'm using a Site Property with a default value of 50,000.

2024-01-04 09-21-21
Venkatesaiya

Hi @Ayushi Kumari ,

Please refer to this documentation: timeouts_under_the_hood. It provides information about timeout issues of all types.

Thanks

2025-06-19 10-02-53
Ayushi Kumari

Hi @Venkatesaiya
Thanks for sharing the link, I'll go through it.

Regards
Ayushi

2025-08-07 06-30-56
Amit J
Champion

Hi

Below are some points you can check to optimise performance:

* Use filters directly in the Aggregate instead of using loops to filter records.

* Add sorting inside the Aggregate itself instead of using `ListSort` in the client action.

* Avoid loading all 4000+ records at once. Use pagination in the UI to fetch limited records per page.

* Reduce the `MaxRecords` from 50000 to a practical number like 1000–5000 unless absolutely needed.

* Replace the loop and `ListAppend` logic with either:

  * a well-filtered Aggregate, or
  * Advanced SQL with proper conditions and sorting.

* If complex logic is required, write it in Advanced SQL rather than looping over aggregate results.

* Use the `StartIndex` and `LineCount` parameters in Aggregates for pagination, or the equivalent in an advanced query, if required.

* Ensure indexes are created on frequently filtered and joined columns in the database.

* Only return the required fields instead of fetching all fields in the aggregate.

* Reuse pre-processed or cached data if applicable instead of fetching and filtering live every time.
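Several of the points above (sort inside the query, index the sort/filter column, and page through the data instead of loading all 4,000 records) can be sketched together. A minimal illustration with hypothetical names, using Python/SQLite to stand in for an Aggregate's `StartIndex`/`LineCount` or SQL's LIMIT/OFFSET:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Property (Id INTEGER PRIMARY KEY, Name TEXT)")
conn.executemany("INSERT INTO Property (Name) VALUES (?)",
                 [(f"Property {i:04d}",) for i in range(4000)])
# Index the column used for filtering/sorting (see the indexing point above).
conn.execute("CREATE INDEX idx_property_name ON Property (Name)")

def fetch_page(start_index: int, line_count: int):
    """Mimics an Aggregate's StartIndex/LineCount: sort in the query
    and return only one page instead of all 4,000 records."""
    return conn.execute(
        "SELECT Name FROM Property ORDER BY Name LIMIT ? OFFSET ?",
        (line_count, start_index),
    ).fetchall()

page = fetch_page(50, 25)  # one page of 25 records, already sorted
```

Because the page arrives sorted from the database, the client-side `ListSort` step (and its cost on the UI) is no longer needed.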


Let me know if you'd like help converting a specific logic to Aggregate or SQL.




2025-06-19 10-02-53
Ayushi Kumari

Hi @Amit J

Due to security restrictions, I'm not able to use the table directly in an Aggregate in the UI module.
Can you suggest a reference case, if possible, to help me understand better?

Regards,
Ayushi

2019-11-11 17-10-24
Manish Jawla
 
MVP

Hi @Ayushi Kumari ,

Two things:

  • You need to increase the timeout property of your module.
  • Can you avoid loop in your logic & replace it with list filter?

https://success.outsystems.com/documentation/11/reference/outsystems_apis/system_actions/#ListFilter 

I believe the loop is taking more time and causing this issue.

Since you're looking for a long-term solution that can handle more than 50,000 records, the suggestions above may not be effective for your use case.

Have you considered creating a temporary table and using a timer to process this data on a daily basis or at regular intervals? You could then fetch data directly from this temp table for better performance.

While I’m not fully aware of your business context, I would recommend that for datasets larger than 1–2k records, the processing should be handled asynchronously in the background—rather than on-the-fly through a service or server action.

It’s worth considering whether you can create an additional table to store the filtered records, along with a timestamp field like CreatedAt. This would also allow you to manage the data by deleting older records or updating them each time the timer runs.
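The timer-plus-temporary-table pattern described above could look roughly like this (hypothetical schema and names, sketched in Python/SQLite; in OutSystems the `refresh_summary` step would be the Timer's server action, and the screen would only ever read the summary table):

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Property (Id INTEGER PRIMARY KEY, Name TEXT);
CREATE TABLE PropertySummary (      -- the pre-computed "temp" table
    PropertyId INTEGER PRIMARY KEY,
    Name TEXT,
    CreatedAt TEXT                  -- the timestamp field suggested above
);
INSERT INTO Property VALUES (1, 'Alpha Tower'), (2, 'Beta Plaza');
""")

def refresh_summary():
    """Runs on a schedule (a Timer in OutSystems): delete the old
    snapshot and rebuild it with the heavy filtering logic."""
    now = datetime.now(timezone.utc).isoformat()
    with conn:  # commit both statements as one transaction
        conn.execute("DELETE FROM PropertySummary")
        conn.execute(
            "INSERT INTO PropertySummary "
            "SELECT Id, Name, ? FROM Property",  # heavy filter would go here
            (now,),
        )

refresh_summary()
# The screen's query is now a cheap read of the small summary table.
rows = conn.execute("SELECT Name FROM PropertySummary ORDER BY Name").fetchall()
```

The expensive work happens in the background on the Timer's schedule, so the screen never waits on it and the request timeout stops being a factor.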

Hope this helps.

Regards,

Manish Jawla

2025-06-19 10-02-53
Ayushi Kumari

Hi @Manish Jawla
 
Thanks for the suggestion, I think it will resolve the issue.
I will try this approach.

Regards,
Ayushi
