Hi,
We have a large data volume in our database. Below is a description of our query.
This query takes approximately 2 minutes to execute.
We have already tried SQL query optimization.
We want to know: is this time expected, or is there scope for improvement?
Below is a sample masked SQL query; please ignore the syntax:
SELECT E.[A], E.[B], G.[C], G.[D],
SUM(CASE WHEN (I.[A] = 0) THEN 0 ELSE 1 END),
(CASE WHEN (SUM(CASE WHEN (I.[B] = 0) THEN 0 ELSE 1 END) >= CONVERT(bigint, 1)) THEN 1 ELSE 0 END) E,
SUM(I.[B]) F,
SUM(I.[C]) G,
SUM(I.[D]) - SUM(I.[E]) H,
SUM(I.[E]) Ii,
SUM(I.[F]) J,
SUM(P.[A]) K,
SUM(IP.[PRT_ALL_VL]) L,
E.[ID] ID,
0,
@ProjectId PROJECTID,
COUNT(*) OVER()
FROM {ENT} E
INNER JOIN {GCC_ENT} GE ON (E.[ID] = GE.[ENID])
LEFT JOIN {GCC} G ON (GE.[GCCID] = G.[ID])
LEFT JOIN {IN_EN} IE ON (E.[ID] = IE.[ENTID])
INNER JOIN {INST} I ON (IE.[INST_ID] = I.[ID])
LEFT JOIN {INS_PROTEC} IP ON (IE.[INST_ID] = IP.[INST_ID])
LEFT JOIN {PROT} P ON (IP.[PRT_ID] = P.[ID])
INNER JOIN {CNTRY} CN ON (E.[CNTRY] = CN.[ID])
LEFT JOIN {IMPR_ST} IMPS ON (I.[IMP_ST] = IMPS.[ID])
WHERE (E.[ProjectID] = @ProjectId
AND (E.[INTG_CD] = @InteRat OR @InteRat = 0)
AND (E.[ECN_AC_TXT] = @FilterEcActivity OR @FilterEcActivity = '')
AND (@EcoClause)
AND (@IntClause)
AND (@CnClause)
AND (@ImClause)
AND (@InfClause)
AND (IMPS.[IMP_NM] = @ImpStatus OR @ImpStatus = '')
AND (E.[EN_NM] = @Ent OR @Ent = '')
AND ((@HasDiff = 1 AND E.[INTL_R_CD] <> E.[INT_R_P_CD]) OR @HasDiff = 0))
GROUP BY E.[A], E.[B],
G.[C],
G.[D],
E.[E], E.[F], E.[G], E.[H] - E.[Ii],
E.[J],
CN.[K], E.[L],
IMPS.[M],
E.[ID]
HAVING (((@AcRange = 0 OR SUM(I.[ACC_INSMNT]) >= @AcRange)
AND (@AcmImpRange = 0 OR SUM(I.[ACC_INSMNT]) <= @AcmImpRange))
AND ((@OutAmtStartRange = 0 OR SUM(I.[OTST_INST]) >= @OutAmtStartRange)
AND (@OutAmtEndRange = 0 OR Sum(I.[OTST_INST]) <= @OutAmtEndRange)))
ORDER BY @OrderBy
OFFSET @Skip ROWS
FETCH NEXT @PageSize ROWS ONLY
Best Regards,
Nitin
Hi @Nitin Chavan ,
Have you tried limiting the number of records fetched, as recommended in the link below?
https://success.outsystems.com/documentation/11/managing_the_applications_lifecycle/manage_technical_debt/code_analysis_patterns/control_the_number_of_records_fetched_from_the_db/
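In addition, your WHERE clause relies on catch-all predicates such as (E.[EN_NM] = @Ent OR @Ent = ''). With this pattern, SQL Server often caches a single plan that must work for every filter combination, which can be far from optimal for any specific one. A quick, low-risk experiment (not a guaranteed fix) is to request a fresh plan per execution with OPTION (RECOMPILE); here is a sketch against your masked query:

```sql
-- Hypothetical sketch: the same query as above, with a RECOMPILE hint
-- appended. OPTION (RECOMPILE) compiles a plan for the actual parameter
-- values, so predicates like (@ImpStatus = '' OR ...) can be simplified
-- away at compile time. The trade-off is compilation cost per execution.
SELECT E.[A], E.[B], G.[C], G.[D]   -- ... same select list as above
FROM {ENT} E
-- ... same joins, WHERE, GROUP BY, HAVING, and ORDER BY as above ...
OFFSET @Skip ROWS
FETCH NEXT @PageSize ROWS ONLY
OPTION (RECOMPILE);
```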
Regards,
Manish Jawla
Hi @Manish Jawla ,
We are already using pagination, with a page size of 10.
With 40,000,000 records, you are computing several SUM() values and calculating everything live. I think that will never be very fast. If near-live data is acceptable, you could create a new table with all the calculated results and populate it using a database job.
If you query that table from your application, it will be much faster, although the data could be a few minutes old.
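A minimal sketch of that approach, with hypothetical table and column names (the real summary table would mirror all the aggregates in your SELECT list):

```sql
-- Hypothetical summary table holding pre-calculated aggregates per entity.
CREATE TABLE ENT_SUMMARY (
    ENT_ID       BIGINT        NOT NULL,
    PROJECT_ID   BIGINT        NOT NULL,
    SUM_B        DECIMAL(19,4) NOT NULL,  -- mirrors SUM(I.[B])
    SUM_C        DECIMAL(19,4) NOT NULL,  -- mirrors SUM(I.[C])
    REFRESHED_AT DATETIME2     NOT NULL,
    CONSTRAINT PK_ENT_SUMMARY PRIMARY KEY (PROJECT_ID, ENT_ID)
);

-- A database job, scheduled every few minutes, repopulates the table:
TRUNCATE TABLE ENT_SUMMARY;

INSERT INTO ENT_SUMMARY (ENT_ID, PROJECT_ID, SUM_B, SUM_C, REFRESHED_AT)
SELECT E.[ID], E.[ProjectID], SUM(I.[B]), SUM(I.[C]), SYSUTCDATETIME()
FROM {ENT} E
INNER JOIN {IN_EN} IE ON E.[ID] = IE.[ENTID]
INNER JOIN {INST} I  ON IE.[INST_ID] = I.[ID]
GROUP BY E.[ID], E.[ProjectID];
```

The screen query then reads ENT_SUMMARY filtered on PROJECT_ID, a cheap primary-key seek instead of a 40-million-row live aggregation.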
Hi @Erwin van Rijsewijk ,
It's a nice idea to create a new table to store the calculated results.
But in our case, we provide filters on the screen, so these calculations are dynamic and depend on the filtered result.