276 Views · 9 Comments
Timer Best Practice

I was studying how to create a good timer. When I checked the implementation, I found that after the GetFileUploads aggregate the flow goes into the loop and then checks the If condition.

I was wondering whether this is the right solution, or whether we should first check GetFileUploads.List.Empty and then run the loop?

Please share your thoughts on this.

https://learn.outsystems.com/training/journeys/async-capabilities-649/implementing-a-good-timer-exercise/o11/811


Rahul
 
MVP

Hi @Navneet Garg ,

You can read this point 

But this diagram should be updated.

Also, if the list is empty, the flow reaches the If condition without entering the loop, and then goes to End.

Navneet Garg

Agreed, but I think the diagram would be simpler and the flow more readable if we used GetFileUploads.List.Empty just after the aggregate. It's just about best practice (clean code).

Jorge Martins
 
MVP

Hi @Navneet Garg,

The solution allows for scenarios where we already processed a batch of File Uploads, but there might be more available to process within the soft timeout limits:

  • First we process a batch of File Uploads (keeping tabs on the timeout)
  • Then, after the For Each node, we determine whether we should grab another batch of File Uploads to process or whether there are none left.

The last If node acts as an outer loop to process multiple batches of File Uploads (read it as DO ... UNTIL FileUploads.List.Empty, where the ... starts with the FileUploads aggregate and finishes when the For Each finishes)
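A minimal Python sketch of that DO ... UNTIL pattern, assuming an in-memory simulation: all names (get_file_uploads, process, the batch size, and the soft timeout) are hypothetical stand-ins for the OutSystems elements, not real APIs.

```python
import time

# Hypothetical stand-ins for the OutSystems elements in the exercise.
BATCH_SIZE = 3
SOFT_TIMEOUT_SECONDS = 60

pending = list(range(7))   # simulated FileUpload records waiting in the database
processed = []

def get_file_uploads(limit):
    """Plays the role of the GetFileUploads aggregate: a fresh read each time."""
    return pending[:limit]

def process(upload):
    """Plays the role of the per-record processing inside the For Each."""
    pending.remove(upload)
    processed.append(upload)

def run_timer():
    started = time.monotonic()
    while True:                                  # DO ...
        batch = get_file_uploads(BATCH_SIZE)     # the aggregate, re-executed each pass
        for upload in batch:                     # the For Each node
            process(upload)
            if time.monotonic() - started > SOFT_TIMEOUT_SECONDS:
                return "timeout"                 # stop; the next run picks up the rest
        if not batch:                            # ... UNTIL FileUploads.List.Empty
            return "done"

result = run_timer()
```

With seven simulated records and a batch size of three, the outer loop runs four times: three batches plus one final empty fetch that ends the timer.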

Nuno Reis
 
MVP

The solution suggested by default is:

The If Timeout should be outside the loop, going all the way back to the aggregate.

Give it enough time for the loop to finish.

In the approach you suggested:

You are only checking whether the loop is over, not whether there is more data, so you force the timer to restart to do a new aggregate. That is useful when you have no idea how long each record takes. Usually that is not the case.


Example:

Fetch 500 records at 1 second each (that is a lot): the batch is done in less than 10 minutes. Check the timeout: you still have about 10 more minutes to run before the standard 20 minutes are over.

If each record takes 5 seconds, process 100 records per batch, and so on.


If you have no idea of the time taken to process (for instance, it requires an API call that can take 2 to 20 seconds), your approach is safer.
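The arithmetic behind this sizing can be sketched in Python. The numbers and the 20-minute limit come from the example above; the helper name is made up for illustration.

```python
STANDARD_TIMEOUT_MIN = 20   # the standard timer timeout from the example

def batches_that_fit(batch_size, seconds_per_record, timeout_min=STANDARD_TIMEOUT_MIN):
    """How many full batches fit inside the timer's timeout, assuming a
    known and stable per-record cost (the case where checking the timeout
    between batches works well)."""
    batch_minutes = batch_size * seconds_per_record / 60
    return int(timeout_min // batch_minutes)

print(batches_that_fit(500, 1))   # 500 records at 1 s each ≈ 8.3 min per batch → 2
print(batches_that_fit(100, 5))   # 100 records at 5 s each ≈ 8.3 min per batch → 2
```

Either way, roughly two full batches fit inside the 20-minute window with time to spare; with an unpredictable per-record cost this calculation breaks down, which is why the safer exit condition matters there.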

ABHIJITH G
Champion

@Jorge Martins So this means that even if new file uploads happen while the loop is running, the timer will take care of those records as well, correct?

Nuno Reis
 
MVP

The aggregate always reads in real time. If more data appears while the timer is running, re-executing the aggregate will fetch it. No need to restart the timer.

Jorge Martins
 
MVP

Given the way the database and its transactions are handled in OutSystems:

Yes, if there are new records added to the database while the timer was processing a batch, they will be included in the timer's work (unless the timeout is reached, in which case they will be processed the next time the timer runs).

Nuno Reis
 
MVP

Having the If right before or after the For Each is not that relevant performance-wise. If you think about it, it only saves one instruction.


Imagine you have x loops to do.

x loops run, x Ifs are called, then the (x+1)th loop exits and calls one more If.

or

x Ifs are called, x loops run, then the (x+1)th If runs and exits.

The difference is only in the final empty loop.


Do it in whichever way you can read better. Usually a simple comment on the side makes either solution quite easy to interpret.
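This counting argument can be made concrete with a small Python simulation; the names are hypothetical, and both orderings do the same work, differing only by one entry into an empty For Each.

```python
def run(batches, check_first):
    """Simulate both flow orderings and count the work done.
    Returns (ifs_evaluated, foreach_entries)."""
    fetches = iter(batches + [[]])    # the aggregate yields each batch, then an empty one
    ifs = entries = 0
    while True:
        batch = next(fetches)         # the aggregate
        if check_first:               # If Empty placed right after the aggregate
            ifs += 1
            if not batch:
                return ifs, entries
        entries += 1                  # entering the For Each (even when the list is empty)
        for item in batch:
            pass                      # process the item
        if not check_first:           # If Empty placed after the For Each
            ifs += 1
            if not batch:
                return ifs, entries

# Two non-empty batches: both orderings evaluate the If three times;
# the check-after variant just enters one extra, empty For Each.
print(run([[1, 2], [3]], check_first=False))  # (3, 3)
print(run([[1, 2], [3]], check_first=True))   # (3, 2)
```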

Jorge Martins
 
MVP

(reposting since original thread was deleted) 

I agree that swapping the direction of those arrows would (marginally) improve the performance of the timer for the condition GetFileUploads.List.Empty, which is the scenario the best practice documentation describes. This is also what @Nuno Reis explained above.

But we are talking about a tiny performance gain at the level of CPU cycles, one more instruction executed at the end... this is relevant when optimising algorithms that run so often that these gains accumulate to the point where they become visible. In most enterprise applications, however (the bulk of applications built with OutSystems), the performance bottleneck is at the database level (aggregates and entity actions take a lot longer than an extra If or a For Each over an empty list) and in integrations with other systems (network latency, the other system's performance itself, etc.).

For instance, if we wanted to reduce the number of (potentially slow) queries to the database, we might change the If condition to something along the lines of GetFileUploads.List.Length < Site.FileUploadsBatchSize. That way, if the last aggregate didn't fetch all the records we asked for, we know we don't need to execute it once more, because there is no more work to do. This saves one unneeded database operation. And in this case, the order of execution would need to be the one in the original flow.
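A small Python sketch of the difference between the two exit conditions; the batch size and helper names are made up, and the point is simply counting aggregate executions.

```python
BATCH_SIZE = 3   # plays the role of Site.FileUploadsBatchSize

def drain(records, stop_when_short):
    """Count the aggregate executions needed to drain the records.
    stop_when_short=True checks List.Length < BatchSize;
    stop_when_short=False checks List.Empty."""
    queries = 0
    remaining = list(records)
    while True:
        queries += 1                      # one aggregate execution
        batch = remaining[:BATCH_SIZE]
        del remaining[:BATCH_SIZE]
        # ... process the batch here ...
        if stop_when_short:
            if len(batch) < BATCH_SIZE:   # short fetch: no more work, stop now
                return queries
        else:
            if not batch:                 # keep querying until an empty fetch
                return queries

print(drain(range(7), stop_when_short=False))  # 4 queries (3, 3, 1, then an empty one)
print(drain(range(7), stop_when_short=True))   # 3 queries (the short batch of 1 ends it)
```

Because the length check has to happen after a batch was fetched and processed, this variant requires the check to sit after the For Each, matching the original flow.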

As we become more senior and experienced developers, and face more complex scenarios, best practices become all about making the right trade-offs for the particular situation we are facing. And as you can imagine, the original flow might need adjustments to cater to our needs.
