59 Views · 5 Comments · Solved

I am facing an issue: cannot load data from an aggregate
Question

Hi all,

I have a table that gets its data from an aggregate, filtered by a given ID. For some IDs, the data cannot be loaded and I get this error (please refer to the attached image). The table also has pagination, showing only 100 records at a time.

I would really appreciate any ideas that could help me solve this problem.

2020-11-05 04-47-48
Nghia Hoang
Solution

Hi @Cuong Su, please make sure you have a proper index on the entities. Hope this helps.
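
To illustrate why a missing index makes some IDs slow, here is a minimal sketch using Python and SQLite. The `Orders` table and `CustomerId` column are hypothetical stand-ins for the poster's entity, not the actual schema: without an index, an equality filter forces a full table scan; with one, the engine jumps straight to the matching rows.

```python
import sqlite3

# Hypothetical table standing in for the entity behind the aggregate.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Orders (Id INTEGER, CustomerId INTEGER, Total REAL)")
conn.executemany(
    "INSERT INTO Orders VALUES (?, ?, ?)",
    [(i, i % 1000, i * 1.5) for i in range(100_000)],
)

def query_plan():
    # Ask SQLite how it would execute a lookup for one CustomerId.
    row = conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM Orders WHERE CustomerId = ?", (42,)
    ).fetchone()
    return row[-1]  # the human-readable plan description

plan_without_index = query_plan()  # reports a table scan
conn.execute("CREATE INDEX idx_orders_customer ON Orders (CustomerId)")
plan_with_index = query_plan()     # reports a lookup via the new index

print(plan_without_index)
print(plan_with_index)
```

The same principle applies to an OutSystems entity: indexing the attribute the aggregate filters on avoids scanning the whole table, which is why only some IDs (the ones requiring long scans) were timing out.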

Cuong Su

It works now, thanks for your support 👍

2024-09-17 08-54-53
Jerome Rajadurai J

Hi @Cuong Su,

I think the aggregate is processing a large set of data, so try to optimize the query. As a workaround, you could increase the module's server request timeout, but that is not a best practice. I'm suggesting it only to test whether the data can be extracted from the aggregate at all.

Cuong Su

I think the reason might be that I store all the data in one table, and when I use the aggregate to get data (only a few records, based on the given ID), some of those IDs happen to be near the end of the table, so it takes time to scan through the table to find the records.

Splitting the table would not be a good option because the data is now quite large and too many users are on it. Do you have any ideas?

I also increased the timeout to 900, but it is still not working.

Nani

Hi @Cuong Su,

If you share your functionality, we might understand your problem better and come up with a solution.

Are there any binary attributes in the aggregate? If the data is large, it takes time to transfer from server to client, and sometimes the server raises an error when the thread can't handle the size of the data.
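
A minimal sketch of that point, again in Python/SQLite with a hypothetical `Users` table (not the poster's schema): fetching every column drags any binary attribute across the wire, while projecting only the columns the screen needs leaves the blob on the server side.

```python
import sqlite3

# Hypothetical entity with a binary attribute (Photo).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Users (Id INTEGER, Name TEXT, Photo BLOB)")
conn.execute("INSERT INTO Users VALUES (1, 'Ann', ?)", (b"\x00" * 1_000_000,))

# SELECT * pulls the ~1 MB blob along with every row fetched...
wide = conn.execute("SELECT * FROM Users WHERE Id = 1").fetchone()

# ...while selecting only the needed columns skips it entirely.
narrow = conn.execute("SELECT Id, Name FROM Users WHERE Id = 1").fetchone()

print(len(wide[2]), narrow)
```

In an aggregate, the equivalent is to avoid selecting entities with binary attributes when the screen doesn't display them.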


