Hey guys,

I have a design question.
There is a BPM process whose outcome is a call to an API. This API has a limit of 150 calls per 5 minutes. At night the BPM processes are started separately from each other. There are around 500 processes, so after 150 calls to the API the remaining BPM processes will get errors for 5 minutes. After 5 minutes another 150 will work, and so on.

Because the BPM process is started based on an event, it's not possible to combine the processes.

The external API returns an error when the limit is reached and indicates in how many minutes the call can be retried.

What would be a good approach to handle this?

You could use a message broker/queue in combination with timers to schedule the API calls in batches.
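As a rough illustration of that idea, here is a minimal Python sketch, with an in-memory queue standing in for the broker and the timer simulated as a function call; all names are illustrative, not platform APIs:

```python
from collections import deque

MAX_CALLS = 150      # API limit per throttling window
WINDOW_MINUTES = 5   # length of the window (the timer's periodicity)

queue = deque()      # pending API calls (event payloads)

def enqueue_event(payload):
    """Called when a BPM process fires: queue the work instead of
    calling the API immediately."""
    queue.append(payload)

def timer_tick(call_api):
    """Scheduled every WINDOW_MINUTES: drain at most MAX_CALLS items,
    so the batch can never exceed the API's limit."""
    batch = [queue.popleft() for _ in range(min(MAX_CALLS, len(queue)))]
    for payload in batch:
        call_api(payload)
    return len(batch)
```

With 500 queued events, the first tick sends 150, the next tick sends the following 150, and so on, which is exactly the pacing the API enforces.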


Regards,

Swatantra

Hi Freek,

One thing you can do, without much refactoring needed, is:

  • Instead of triggering the BPT immediately (I'm assuming you have set "Launch On" to Create<Entity>), just save the data to your entity;
  • Extend the entity and add a new attribute, named IsProcessed;
  • Create a new entity (e.g. QueuedEventsToProcess)
    • In this entity you need a FK to the entity holding the main data
  • Create a timer that
    • Runs every 5 minutes
    • Fetches 150 records from the main entity (the one holding the data)
    • Make sure to fetch only where IsProcessed is set to false
    • For each row, inserts the primary key into QueuedEventsToProcess
  • Update your BPT to
    • "Launch On: Create QueuedEventsToProcess"
    • Fetch the main data using the FK
    • Update the main record, setting IsProcessed to true

The timer periodicity can be adjusted. The batch size of 150 should be kept in a Site Property so you can easily change it if the API throttling settings happen to change.

One final note: bear in mind that you might have processes with active instances in error. These will be retried and might push you over the threshold of 150 calls per 5 minutes. Consider this while designing the solution.

What Ivo says!

The only thing is that there are multiple services we use, and each of those has a limit of 150 calls per 5 minutes. So maybe I should do something with timers and tables, and also store the next-try datetime that we receive back from the API when the limit is reached.
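Storing that next-try datetime per service could look something like this hedged sketch; `RateLimitedError` stands in for the API's throttling response, and the dict stands in for a table keyed by service name:

```python
import datetime as dt

class RateLimitedError(Exception):
    """Stand-in for the API's limit-reached response, which tells us
    in how many minutes the call can be retried."""
    def __init__(self, retry_after_minutes):
        self.retry_after_minutes = retry_after_minutes

# per-service "table": service name -> earliest next-try datetime
next_try_at = {}

def may_call(service, now):
    """Check the stored next-try datetime before dispatching a call."""
    return now >= next_try_at.get(service, now)

def call_service(service, do_call, now):
    if not may_call(service, now):
        return False  # leave it queued; a timer retries later
    try:
        do_call()
        return True
    except RateLimitedError as e:
        # store the moment the service says we may retry
        next_try_at[service] = now + dt.timedelta(
            minutes=e.retry_after_minutes)
        return False
```

Because the datetime is stored per service, one throttled service does not block calls to the others.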


For example, we make a call to their system to connect 2 objects to each other. Then we get a webhook back when the connection is done. When we receive the webhook we want to make a call to their system to fetch the object. So 1 connection is already 2 calls within 5 minutes. On top of that we have other processes running that are also making calls to them.
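One way to keep that second call under the same budget is to have the webhook handler queue the follow-up fetch instead of calling their system directly, so the shared timer spends whatever budget remains. A rough Python sketch (function and field names are made up for illustration):

```python
from collections import deque

fetch_queue = deque()  # fetch calls waiting for rate-limit budget

def on_connection_webhook(payload):
    """Webhook handler: do not call the external API directly; queue
    the follow-up fetch so it counts against the same
    150-per-5-minutes budget as every other call."""
    fetch_queue.append(payload["object_id"])

def drain(budget, fetch_object):
    """Called by the shared timer with whatever budget remains after
    the other processes' calls have been counted."""
    done = 0
    while fetch_queue and done < budget:
        fetch_object(fetch_queue.popleft())
        done += 1
    return done
```

Anything the budget cannot cover simply stays queued until the next 5-minute window.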