Massive loading time of charts due to huge data volume

PS: This is not a coding question, but something that may be useful for others, as most applications consist of hundreds of thousands, if not millions, of records.

I'm building a KPI-type application that generates charts (12 charts) based on an entity created from an Excel file. There is a multi-select dropdown for project selection, with a 'Select all' option that generates charts based on data from all projects.

The issue is the loading time. The longest the charts took to load was around 12 seconds, for a project with around 3,000 records. That is fine, considering the timeout is 20 seconds and 12 seconds isn't that long to wait. But when users select the 'Select all' option, which covers around 40 projects, chart generation takes more than 50 seconds. The longest it took was 73 seconds (possibly due to network traffic). At first I even got a connection timeout with the 20-second limit, so I increased it to 120 seconds (just to see how long the loading takes).

Is there a better way, whether according to coding standards/guidelines or any idea that comes to mind, to reduce the loading time? Or some way to let users know that there are a lot of records, which take time to convert into charts?

Or is there any way to make chart generation faster? I'm using dynamic Highcharts.


Aman Devrath

Hi Aman Devrath,

The amount of time it takes to fetch data depends on how you are fetching it. Are you using aggregates and loops to generate your data, or Advanced SQL?

The best approach I can suggest is to write proper SQL logic and fetch only the columns you need. Limiting the number of records fetched can also help.
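As a rough illustration of this principle (in Python with SQLite rather than OutSystems Advanced SQL; the `Ticket` table and its columns are invented for the example), aggregating in the database and selecting only the needed columns returns a handful of summary rows instead of thousands of records:

```python
import sqlite3

# In-memory database standing in for the application's data model;
# the Ticket table and its columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Ticket (Id INTEGER, Project TEXT, Priority TEXT, IsOpen INTEGER)")
conn.executemany(
    "INSERT INTO Ticket VALUES (?, ?, ?, ?)",
    [
        (1, "Alpha", "High", 1),
        (2, "Alpha", "Low", 0),
        (3, "Alpha", "High", 1),
        (4, "Beta", "Medium", 1),
    ],
)

# Instead of fetching every row and counting in a loop, let the database
# aggregate and return only the columns the chart actually needs.
summary = conn.execute(
    """
    SELECT Priority, COUNT(*) AS Cnt
    FROM Ticket
    WHERE Project = ?
    GROUP BY Priority
    ORDER BY Priority
    """,
    ("Alpha",),
).fetchall()
print(summary)  # [('High', 2), ('Low', 1)]
```

The chart then receives two rows per project instead of the full record set.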

See the best practices for reference.

Data Model Best Practices

Queries Best Practices

SQL Server Best Practices



Hi @Pranav Pandey,

This is my process for generating charts.

  1. I upload an Excel file that has all the data. This Excel file is converted into an entity.
  2. Then I pick data from this entity and create 7 entities, e.g. backlogtickets, prioritypercentage, resourcewiseallocation, etc.
  3. Further, I have a dropdown on screen that shows the project list. On selection of a project, or of all projects (using the 'Select all' option), I loop through the 7 entities and run some SQL queries to generate lists, text, or values that can be passed into Highcharts as variables.
  4. These variables are then passed into Highcharts dynamically.
  5. So I have 21 server actions, 7 client actions, 11 data actions, and around 40 variables (excluding variables inside the server, client, and data actions).

I can't remove or reduce these variables or server actions. The only feasible option is to reduce some loops, if possible.

Thanks & Regards,
Aman Devrath


Hi Aman,

When creating a report takes a long time, the typical approach is to use a Timer or a BPT process to generate the report, and signal the user when it is ready (e.g. by sending an e-mail). Then all data will be in the database, and the user can look at the report without delays.
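A minimal stand-in sketch of that pattern in plain Python (not OutSystems code; the report structure and result values are made up): the slow work runs in the background while the request returns immediately, and the user is notified once the result is stored.

```python
import threading
import time

# Shared state standing in for a report record in the database.
report = {"status": "pending", "data": None}

def build_report():
    # Stand-in for the slow aggregation over all 40 projects.
    time.sleep(0.1)
    report["data"] = {"open_tickets": 123}  # hypothetical result
    report["status"] = "ready"
    # In a real app, this is where you would e-mail the user
    # or flag the UI that the report can now be viewed.

worker = threading.Thread(target=build_report)
worker.start()           # the user's request returns immediately
print(report["status"])  # typically still "pending" at this point
worker.join()
print(report["status"])  # "ready" - data is now in the "database"
```

In OutSystems terms, the thread corresponds to the Timer/BPT process and the shared record to a staging table the screen reads from.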

However, in your original question you ask about "chart generation". So is it the actual loading of the chart alone that's slow (i.e. Highcharts is just slow with a huge amount of data), or is it the processing leading up to the chart being available?

Hi @Kilian Hekhuis ,

This is the log from after the submit button is clicked, i.e., right from the creation of entities, to assigning values to variables, then passing the variables to Highcharts, and finally chart generation.

Apparently, the loading of charts is a client action that is called through OnRender, and it is looping multiple times, so I'm not able to figure out the time for just the loading of the charts. (Image below for reference.)


Aman Devrath

Why are you using the 'OnRender' event to display chart data? Whenever there is a change on the screen, 'OnRender' will trigger, and maybe that is why you are seeing that loop. Can you try 'OnReady' instead and check again?

Hello Aman,

It is hard to advise without more details, but you could look into preparing your data points when loading your data. I'm assuming you are not actually showing 3,000 data points in a single chart, but some sort of aggregated data, such as totals per month and the like.

So try to look at all the calculations that go on and see if you can pull them forward from the time of presentation to the time of loading the data.

This means storing lots of redundant information for the sake of performance.

I can't make it more specific than this without more information.
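One way to picture "pulling calculations forward" (again a SQLite stand-in; table and column names are invented): at Excel-load time, aggregate once into a summary table, so that at chart time the screen only reads stored totals instead of re-aggregating the raw records.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Ticket (Project TEXT, Priority TEXT)")
conn.executemany(
    "INSERT INTO Ticket VALUES (?, ?)",
    [("Alpha", "High"), ("Alpha", "High"), ("Alpha", "Low"), ("Beta", "Medium")],
)

# Load time: aggregate once and store the (redundant) totals.
conn.execute("CREATE TABLE TicketSummary (Project TEXT, Priority TEXT, Cnt INTEGER)")
conn.execute(
    """
    INSERT INTO TicketSummary
    SELECT Project, Priority, COUNT(*)
    FROM Ticket
    GROUP BY Project, Priority
    """
)

# Display time: a cheap read of precomputed rows, no re-aggregation.
rows = conn.execute(
    "SELECT Priority, Cnt FROM TicketSummary WHERE Project = ? ORDER BY Priority",
    ("Alpha",),
).fetchall()
print(rows)  # [('High', 2), ('Low', 1)]
```

The cost is the redundant `TicketSummary` data; the gain is that the display-time query touches a few summary rows instead of the full ticket history.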


Hi Dorine,

Yes, I'm showing only the aggregated data, like the count of low/medium/high tickets (date-wise), the number of open tickets, the number of closed tickets, etc. These are all calculated while performing the SQL operations.

"So try to look at all the calculations that go on and see if you can pull them forward from the time of presentation to the time of loading the data."
Sorry, I could not understand your statement above. Could you please help? Especially the underlined part.

I'm attaching the logs in case you need anything else from me.
Sorry, I don't know what else I can share.

loadapplicationdata is a client action that gets triggered on the submit button click.
start loadapplicationdata is printed in the logs at the start of the flow, and end loadapplicationdata at the end.
totalopen is a server action.



Hi Aman,

It's not possible to really help with performance problems without details like action flows, data models, and the SQL executed.

But from your replies so far, it looks like there is a complete process happening, from first loading data into your database from the Excel source all the way up to displaying the charts. Is that the case? Every time? Or is the data loaded once and the charts displayed many times with the same data?

Your post title suggests that you have a performance problem in retrieving data for the charts. In that case, you could improve performance by storing the data in a more aggregated, chart-ready form (for example, storing the low/medium/high counts at loading time, so you only have to read them instead of re-aggregating them at display time).

That's what I meant by "pull them forward".

And looking at your initial post, I think this might be an option, so that every time a user changes the project filter, your SQL just has to retrieve different precalculated totals, averages, etc., instead of aggregating them again.

If, however, you are doing the whole process of loading + displaying every time, the problem might lie in the loading rather than the displaying. Then the real questions are: what is the source of your data, why are you moving large amounts of data around between systems manually, can you load delta information instead of everything, etc.
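The "delta information" idea can be sketched like this (SQLite stand-in with an invented schema): instead of re-importing the whole Excel sheet every time, only rows newer than the last loaded date are inserted.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Ticket (Id INTEGER PRIMARY KEY, CreatedOn TEXT)")
conn.executemany(
    "INSERT INTO Ticket VALUES (?, ?)",
    [(1, "2024-01-01"), (2, "2024-01-05")],
)

# Incoming batch: a mix of already-loaded and new rows.
incoming = [(2, "2024-01-05"), (3, "2024-01-09"), (4, "2024-01-12")]

# Find the newest date already loaded (ISO dates sort lexicographically),
# then insert only the rows that are newer.
(last_loaded,) = conn.execute("SELECT MAX(CreatedOn) FROM Ticket").fetchone()
delta = [row for row in incoming if row[1] > last_loaded]
conn.executemany("INSERT INTO Ticket VALUES (?, ?)", delta)

count = conn.execute("SELECT COUNT(*) FROM Ticket").fetchone()[0]
print(count)  # 4: the 2 existing rows plus the 2 genuinely new ones
```

Loading 2 new rows instead of re-loading everything is what makes repeated imports cheap.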


Hi Aman Devrath,

Looking at your process steps, the only thing I can suggest at the moment is to reduce the loop logic and, if possible, merge all the aggregate and SQL logic into a single SQL query or aggregate. Make use of ListAppendAll instead of ListAppend if you are merging two lists to generate a result.

If you cannot shorten your current logic, another approach I have used is to limit the data fetched to generate the graphs. Show data on a weekly, monthly, or quarterly (3-month) basis, or between two dates with a smaller difference in days (say, at most 3 months apart).
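A small sketch of that date-window limit (SQLite stand-in, invented schema; a real query would derive the window from today's date rather than hard-coding it):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Ticket (Id INTEGER, CreatedOn TEXT)")
conn.executemany(
    "INSERT INTO Ticket VALUES (?, ?)",
    [(1, "2023-06-01"), (2, "2024-01-10"), (3, "2024-02-20"), (4, "2024-03-05")],
)

# Fixed 3-month window for the example.
start, end = "2024-01-01", "2024-03-31"

# The chart query only considers tickets inside the window,
# instead of scanning the full history.
in_window = conn.execute(
    "SELECT COUNT(*) FROM Ticket WHERE CreatedOn BETWEEN ? AND ?",
    (start, end),
).fetchone()[0]
print(in_window)  # 3 of the 4 tickets fall inside the quarter
```

Across 40 projects and years of history, that window can cut the scanned rows dramatically.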

Hi Pranav,

Thank you for this. I'll definitely try this. 

Thanks & Regards,

