Duplicate rows in table during import
Application Type
Traditional Web
Service Studio Version
10.0.1116.0
Platform Version
10.0.1111.0

Problem statement

We have a web application where the user enters an order number and then clicks a button called "Import order". The application retrieves the order from another system and imports the data into the OutSystems database.

The problem is that in one case an order had more than 300 order item lines, and duplicate lines were found in the database entity.

Before going into the coding aspect, I wanted to know whether the strategy below to deal with this will work.

Background

There are multiple entities involved. Let's call them:

OrderHeader

OrderHeaderExtension1

OrderHeaderExtension2

OrderItems

OrderItemsExtension1

OrderItemsExtension2

OrderItemDelivery

and more

Multiple clicks on the button are already disallowed through JavaScript code.

I suspect that because the network is slow, and the number of items is large (as in this case), users are opening another tab or window to import again.

Strategy

I understand that in OutSystems, the DB commit only happens just before the HTTP response is sent back to the browser.

So I would need to do an explicit commit in this solution.

I was thinking of doing the following:

- Inside one server action - let's call it sa1 - insert the order number into the OrderHeader table and commit explicitly.

- In another server action - let's call it sa2 - insert into the other entities and let OutSystems auto-commit as it would by default.

- Now, if the user opened another tab and tried to import the same order number, sa1 would first query for the order number, find it, exit, and send an exception message to the user. (A rough sketch of this flow follows below.)
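For clarity, here is a very rough sketch of that flow in plain SQL (using Python's sqlite3 only to illustrate the idea; table and column names like order_header and order_number are made up, and this is of course not how server actions are built in Service Studio):

```python
# Illustrative only: simulates sa1 (reserve the order number, commit explicitly)
# and sa2 (insert the item lines). All table/column names are assumptions.
import sqlite3

conn = sqlite3.connect("orders_demo.db")
conn.execute("""CREATE TABLE IF NOT EXISTS order_header (
                    id INTEGER PRIMARY KEY,
                    order_number TEXT NOT NULL UNIQUE)""")
conn.execute("""CREATE TABLE IF NOT EXISTS order_items (
                    id INTEGER PRIMARY KEY,
                    order_header_id INTEGER NOT NULL,
                    product_code TEXT NOT NULL)""")
conn.commit()

def sa1_reserve_order(order_number):
    """sa1: check for the order number, insert it, and commit explicitly,
    so a second tab importing the same number fails here."""
    existing = conn.execute(
        "SELECT id FROM order_header WHERE order_number = ?",
        (order_number,)).fetchone()
    if existing is not None:
        raise RuntimeError("Order %s is already imported or being imported" % order_number)
    cur = conn.execute(
        "INSERT INTO order_header (order_number) VALUES (?)", (order_number,))
    conn.commit()  # explicit commit, so other sessions see the row immediately
    return cur.lastrowid

def sa2_import_items(header_id, product_codes):
    """sa2: insert the remaining rows; the commit happens only at the end,
    mirroring the platform's default commit at the end of the request."""
    conn.executemany(
        "INSERT INTO order_items (order_header_id, product_code) VALUES (?, ?)",
        [(header_id, code) for code in product_codes])
    conn.commit()
```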

Can anyone see any loopholes in this proposed solution?


Hi Anthony,

What I understand from your example is that OrderNumber should be a unique field. It might help if you mark that entity attribute as Unique in the Indexes section, so you also ensure from the data model perspective that the same OrderNumber will not be inserted twice under any circumstances.

Other than that, what you explained seems to be a good idea. Validating a record against certain business rules (uniqueness, etc.) before inserting into the DB, and not limiting this validation to the frontend, is a best practice.
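Just to illustrate what a unique index gives you at the database level (a small sketch in plain SQL via Python's sqlite3; the names are placeholders, and in Service Studio you would simply mark the index as Unique):

```python
# Illustrative only: a unique index on OrderNumber makes the database itself
# reject a second insert of the same number, whatever the application logic did.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE order_header (id INTEGER PRIMARY KEY, order_number TEXT NOT NULL)")
conn.execute("CREATE UNIQUE INDEX ux_order_header_number ON order_header (order_number)")

conn.execute("INSERT INTO order_header (order_number) VALUES ('NNN-001')")
try:
    conn.execute("INSERT INTO order_header (order_number) VALUES ('NNN-001')")
except sqlite3.IntegrityError as err:
    print("Duplicate rejected by the unique index:", err)
```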

Hi Anthony,

To solve the problem of importing a large number of records, what you can do is not process the file data immediately, but instead load it into a new database entity, saving it in a binary field. Also add a status field, and when you upload, set the status to Uploaded.

Then you can have a timer trigger that import as you are doing now, and when it finishes, set the status to Processed.

Another advantage of this approach is that you will not hang the screen until the operation ends.
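A minimal sketch of that staged-import pattern, assuming a staging entity with a binary payload and a Status attribute (all names here are illustrative, not actual OutSystems APIs):

```python
# Illustrative only: the screen action just stores the raw payload and returns;
# a timer later picks up 'Uploaded' records, imports them, and marks them 'Processed'.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE import_request (
                    id INTEGER PRIMARY KEY,
                    payload BLOB NOT NULL,
                    status TEXT NOT NULL DEFAULT 'Uploaded')""")

def upload(payload):
    # Screen action: store the data, commit, and return to the user immediately.
    cur = conn.execute("INSERT INTO import_request (payload) VALUES (?)", (payload,))
    conn.commit()
    return cur.lastrowid

def timer_process_pending():
    # Timer action: process everything still marked 'Uploaded'.
    pending = conn.execute(
        "SELECT id, payload FROM import_request WHERE status = 'Uploaded'").fetchall()
    for request_id, payload in pending:
        # ... parse the payload and insert into the order entities here ...
        conn.execute("UPDATE import_request SET status = 'Processed' WHERE id = ?",
                     (request_id,))
    conn.commit()

upload(b"raw order data from the external system")
timer_process_pending()
```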

Check out this post:

https://www.outsystems.com/forums/discussion/70362/excel-worksheets-data-import-asynchronously/

Hope that it helps you 

 

Thank you Ozan Cali and Alexandre.

Ozan Cali, yes, the order number is unique. But the order items entity does not seem to have a composite key, and that is why duplicate rows are being created.

The data is arriving from the remote system and populating a List of lists.

Alexandre, I figure your proposal is to first insert the order number into one entity and respond to the user that order number NNN is being imported. The entity would have the order number set with a unique constraint. Then let the timer job import and fill all the other entities. In this way, the timer maintains control, ensuring that duplicate entries do not happen. Is it like that?

MVP
Solution

Hi Anthony,

OutSystems does not allow composite keys, but it does allow (unique) indexes with more than one attribute.

Assuming your OrderItems is a junction entity (defining a many-to-many relationship between your OrderHeader entity and some hypothetical Product entity), you just need to create an index on it, set it to be unique, and add to it the reference attributes for the different entities that participate in that relationship (OrderHeaderId, ProductId, etc.).

This way the database will guarantee for you that there are no duplicate entries.
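As an illustration of what such a multi-attribute unique index does at the database level (plain SQL via Python's sqlite3; the quantity attribute and the exact names are assumptions, not the actual model):

```python
# Illustrative only: a unique index over (order_header_id, product_id) makes the
# database reject a duplicate order line even if the import runs twice.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE order_items (
                    id INTEGER PRIMARY KEY,
                    order_header_id INTEGER NOT NULL,
                    product_id INTEGER NOT NULL,
                    quantity INTEGER NOT NULL)""")
conn.execute("""CREATE UNIQUE INDEX ux_order_items_header_product
                ON order_items (order_header_id, product_id)""")

conn.execute("INSERT INTO order_items (order_header_id, product_id, quantity) VALUES (1, 42, 3)")
try:
    conn.execute("INSERT INTO order_items (order_header_id, product_id, quantity) VALUES (1, 42, 3)")
except sqlite3.IntegrityError as err:
    print("Duplicate order line rejected:", err)
```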

But if your data is large and takes time to process, the timer approach would still be useful: your users wouldn't be stuck and tempted to try submitting again.

Hi Anthony,

I missed that your data comes from an external source; this approach was for an Excel import with a large number of records.

Nevertheless, you can apply what you are suggesting, ensuring that the user does not wait until the insert into the other entities ends.

This will help you with the duplicate issue and, for sure, with the user experience.

Hope that it helps you 


Thank you everyone. Now I can plan for a robust solution.
