I have an action that inserts a large amount of data from a CSV file by building SQL INSERT commands in an extension. When inserting 6 million records into a table with about 30 columns, there is no error. However, when I insert 1 million records into a table with around 400 columns, I get the error: “Request failed with status 502.” This error cannot even be caught as an exception in the extension, so I have no idea what is causing it. Has anyone encountered this situation, and how did you resolve it? Thank you!
Hi @Duc Vuong ,
This might be an infrastructure issue; I would recommend contacting the OutSystems support team for official help with it.
You can also take a look at this article: Troubleshooting HTTP 502 bad gateway
Hello,
I think inserting 1 million records at once is too heavy for the backend. I'd use a timer to process the insertion in chunks: for example, insert 200k records at a time, commit the transaction, and loop until you approach the timer's timeout limit. You can then re-trigger the timer to continue the process, and of course add a retry mechanism to handle any failures that may occur.
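To make the chunking idea above concrete, here is a minimal sketch of the pattern outside OutSystems, using Python's stdlib `sqlite3` purely as a stand-in for the real database; the table, column names, and chunk size are hypothetical (in practice the chunk size would be on the order of the 200k suggested above, and a timer would resume from the last committed offset):

```python
import sqlite3

CHUNK_SIZE = 200  # illustrative; in the real scenario this would be ~200k records

def insert_in_chunks(conn, rows):
    """Insert rows in fixed-size chunks, committing after each chunk so a
    failure or timeout only loses the current chunk, not the whole load."""
    cur = conn.cursor()
    for start in range(0, len(rows), CHUNK_SIZE):
        chunk = rows[start:start + CHUNK_SIZE]
        cur.executemany("INSERT INTO demo (id, val) VALUES (?, ?)", chunk)
        conn.commit()  # per-chunk commit: a re-triggered timer can resume here
    return len(rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE demo (id INTEGER, val TEXT)")
rows = [(i, f"row-{i}") for i in range(500)]
print(insert_in_chunks(conn, rows))  # prints 500
```

In OutSystems the outer loop would live in a timer action that records the last committed offset, checks elapsed time against the timer timeout, and wakes itself to continue.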
In addition to others,
For very large data loads, avoid row-by-row inserts in OutSystems and use a database bulk insert instead. SqlBulkCopy (or the BulkInsert Forge component) sends data to SQL Server in optimized batches, drastically reducing overhead and helping prevent 502 errors.
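SqlBulkCopy itself is .NET-specific, but the underlying idea is language-neutral: hand the driver many rows per round trip instead of issuing one INSERT per row. A rough sketch of that contrast, again using Python's stdlib `sqlite3` with a hypothetical staging table only as an illustration:

```python
import sqlite3

def load_batched(conn, rows, batch_size=10_000):
    """Send rows to the database in large batches, mirroring what
    SqlBulkCopy does for SQL Server, instead of one statement per row."""
    cur = conn.cursor()
    for i in range(0, len(rows), batch_size):
        # executemany submits the whole batch in one call, cutting
        # per-statement overhead compared with a row-by-row loop
        cur.executemany("INSERT INTO staging (a, b) VALUES (?, ?)",
                        rows[i:i + batch_size])
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (a INTEGER, b TEXT)")
load_batched(conn, [(i, str(i)) for i in range(1000)], batch_size=250)
print(conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0])  # prints 1000
```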
Regards,
Manish Jawla