181 Views · 11 Comments
How to load massive amounts of data on an OutSystems dashboard

I'm currently developing a dashboard in OutSystems, where the charts rely on various aggregates to fetch data. The charts work seamlessly in the development (dev) and quality assurance (QA) environments. However, upon deploying to the production environment, I encounter the 'error executing query' issue, likely due to timeouts caused by the substantial volume of data in production compared to dev and QA.

I'm seeking advice on optimizing the queries. One consideration was caching the data, but it might not be feasible due to the frequent data changes. It's worth mentioning that I load all aggregates at the start, not on demand.

Any insights or suggestions on speeding up queries in a production environment would be greatly appreciated.

2025-04-17 05-42-16
Ajit Kurane

Hi,

I would suggest checking any calculated attributes you're using in your aggregates or in aggregate filters.

It would also be great if you could share the Service Center error log details.

Along with that, please go through the document below:

https://success.outsystems.com/documentation/best_practices/performance_and_monitoring/performance_best_practices_queries/

Thanks.

2023-09-02 22-03-06
Tristan Steed

Thanks Ajit, there seems to have been a problem with the data type conversion.


2025-04-17 05-42-16
Ajit Kurane

Did you resolve it?

If not, the conversation below may help you get it done; you can take hints from it. Also check the format of your date-time values: you can use the FormatDateTime(date_time, text_format) function to correct them.

https://stackoverflow.com/questions/20838344/sql-the-conversion-of-a-varchar-data-type-to-a-datetime-data-type-resulted-in

Thanks.
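As an illustration of the kind of conversion problem discussed above, here is a minimal Python sketch (the format list and sample values are hypothetical) that normalizes mixed date strings to ISO 8601 before they reach the database, instead of letting the database fail on the conversion:

```python
from datetime import datetime

# Hypothetical sample of date strings as they might arrive from a CSV export.
raw_values = ["2023-09-02 22:03:06", "02/09/2023", "N/A", ""]

# Formats we are willing to accept, tried in order.
KNOWN_FORMATS = ["%Y-%m-%d %H:%M:%S", "%d/%m/%Y"]

def to_iso(value):
    """Normalize a raw date string to ISO 8601, or return None if unparseable."""
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(value, fmt).isoformat(sep=" ")
        except ValueError:
            continue
    return None

for v in raw_values:
    print(repr(v), "->", to_iso(v))
```

Rows where the function returns None can be reported or quarantined rather than bootstrapped, which surfaces bad data before it causes a runtime query error.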

2021-09-06 15-09-53
Dorine Boudry
 
MVP

Can you start by checking the full error in Service Center? I would expect a different message in the case of timeouts.

2023-09-02 22-03-06
Tristan Steed

Hello Dorine, thank you for your response. I've investigated the Service Center logs, and it appears that there was an issue with data type conversion. 

2021-09-06 15-09-53
Dorine Boudry
 
MVP

Is this an external database?

If not, are you using SQL widgets with 'select *'?

2023-09-02 22-03-06
Tristan Steed

Hello @Dorine Boudry, we are utilizing the OutSystems built-in database, but our production data is sourced from a legacy PostgreSQL database. We exported the PostgreSQL data into CSV files and subsequently converted them into Excel files for the purpose of bootstrapping the data.

2023-09-02 22-03-06
Tristan Steed

"If not, are you using sql widgets with 'select *' ?" - We are solely using Aggregates in the dashboard charts, no advanced SQL queries.

2021-09-06 15-09-53
Dorine Boudry
 
MVP

I suspect something is not going quite right in your data loading process, but it's going to be a hard job finding out which records in which entity are causing this.

It will probably be a case of datetime values that are valid in PostgreSQL but not in SQL Server, though I'm not really seeing how they could have passed the bootstrapping (if it was done with OutSystems).

So you'll have to investigate your datetime fields and see if you find any odd formats. You'll probably have to improve your conversion process or, if that is not repeatable, write some scripts / timers to fix your data.

But first comes the tough job of finding which data is bad.
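The hunt for bad datetime fields can be scripted before the data is bootstrapped. Here is a minimal Python sketch, assuming the exported data is still available as CSV (the column name and sample rows are made up), that flags values SQL Server's DATETIME type would reject, either because they are unparseable or because they fall before the type's 1753-01-01 minimum:

```python
import csv
import io
from datetime import datetime

# SQL Server's DATETIME type rejects anything before this date.
SQLSERVER_DATETIME_MIN = datetime(1753, 1, 1)

def find_bad_rows(csv_text, column, fmt="%Y-%m-%d %H:%M:%S"):
    """Return (line_number, value, reason) for every row whose datetime
    column is unparseable or out of range for SQL Server DATETIME."""
    bad = []
    reader = csv.DictReader(io.StringIO(csv_text))
    for line_no, row in enumerate(reader, start=2):  # header is line 1
        value = row[column]
        try:
            parsed = datetime.strptime(value, fmt)
        except ValueError:
            bad.append((line_no, value, "unparseable"))
            continue
        if parsed < SQLSERVER_DATETIME_MIN:
            bad.append((line_no, value, "before 1753-01-01"))
    return bad

# Hypothetical export with one valid and two problematic rows.
sample = """id,created_at
1,2023-09-02 22:03:06
2,0001-01-01 00:00:00
3,not-a-date
"""
print(find_bad_rows(sample, "created_at"))
```

Running this over each exported CSV narrows "what data is bad" down to exact line numbers, which makes the fix-up scripts or timers much easier to write.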


2021-09-06 15-09-53
Dorine Boudry
 
MVP

Alternatively, you may be using some datetime parameters for the aggregates that come from settings / site properties, and the value of one of those is not yet set to a correct date in production.

2025-01-09 14-56-57
IQ78

Hi, I think the best practice is:

1. Create an entity for all the summaries used in the dashboard and populate it with a timer (at midnight).

2. Since the dashboard is real time, you only need to summarize the current day's transactions and then join them with the already-summarized entity from step 1. I found this approach performs very well.

Regards
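The two-step pattern above can be sketched outside OutSystems. Here is a minimal Python/sqlite3 illustration (table and column names are hypothetical) of a nightly summary entity combined with a live aggregate over only the current day's rows:

```python
import sqlite3

# In-memory stand-in for the two entities: a raw transaction table and a
# pre-aggregated daily summary that the midnight timer would populate.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE txn (day TEXT, amount REAL);
CREATE TABLE daily_summary (day TEXT, total REAL);

-- Historical rows, already rolled up by the nightly job.
INSERT INTO daily_summary VALUES ('2025-01-07', 300.0), ('2025-01-08', 150.0);

-- Today's transactions, not yet summarized.
INSERT INTO txn VALUES ('2025-01-09', 40.0), ('2025-01-09', 60.0);
""")

# The dashboard query: a cheap union of the summary entity with a live
# aggregate over only today's transactions.
rows = con.execute("""
SELECT day, total FROM daily_summary
UNION ALL
SELECT day, SUM(amount) FROM txn WHERE day = '2025-01-09' GROUP BY day
ORDER BY day
""").fetchall()
print(rows)
```

The expensive scan over historical transactions happens once per night in the timer; the dashboard itself only ever aggregates one day of data.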
