I am working on an application where I need to handle large Excel files (up to 40 MB). Here’s how I’m processing the files:
Upload and Chunking:
Processing with Timer:
Issue:
name: 'OutSystems.Application.ErrorHandling.ExtensionException', message: 'Input payload is too large (33.33MB), maximum allowed is 5.5MB.', stack: 'OutSystems.Application.ErrorHandling.ExtensionException'
Questions:
Additional Details:
Looking forward to suggestions and best practices for managing this scenario. Thank you!
Dear @Mayank Dharmpurikar,
Implement an external File Processing microservice to handle merging and processing the file:
1. OutSystems uploads the chunks to a cloud storage service (e.g., AWS S3, Azure Blob Storage).
2. Metadata for chunks (e.g., file URLs) is sent to a microservice.
3. The microservice merges and processes the file, returning the result to OutSystems.
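The merge step (3) can be sketched as follows — a minimal illustration assuming the microservice has already downloaded each chunk from cloud storage to a local file. All names here are hypothetical, not actual OutSystems or AWS APIs:

```python
import tempfile
from pathlib import Path

def merge_chunks(chunk_paths, output_path):
    """Concatenate chunk files, in order, into a single output file."""
    with open(output_path, "wb") as out:
        for chunk_path in chunk_paths:
            out.write(Path(chunk_path).read_bytes())

# Illustrative usage: three tiny fake "chunks" stand in for files
# downloaded from S3/Blob Storage via the URLs sent in the metadata.
tmp = Path(tempfile.mkdtemp())
parts = []
for i, payload in enumerate([b"part-0;", b"part-1;", b"part-2;"]):
    p = tmp / f"chunk_{i}.bin"
    p.write_bytes(payload)
    parts.append(p)

merged = tmp / "merged.xlsx"
merge_chunks(parts, merged)
print(merged.read_bytes())  # b'part-0;part-1;part-2;'
```

The key point is that the merge happens entirely outside OutSystems, so the 5.5MB payload limit only ever applies to the small metadata message, never to the file bytes.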
Please have a look at the "Use with large binary files" section of the documentation at https://success.outsystems.com/documentation/outsystems_developer_cloud/building_apps/extend_your_apps_with_external_logic_using_custom_code/external_libraries_sdk_readme/ . It discusses the issue you are facing and some workarounds.
Hi,
I am also experiencing the same issue, using the AWS SDK for a multipart upload to an S3 bucket.
We are already slicing the binary into 5MB (5,242,880-byte) chunks, but when we pass a chunk to the server action (external logic), it throws the error: Input payload is too large (6.67MB), maximum allowed is 5.50MB
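For reference, the slicing itself can be sketched like this in Python — an illustrative equivalent of our logic, not the actual client code:

```python
def slice_binary(data: bytes, chunk_size: int = 5_242_880) -> list[bytes]:
    """Split a binary into fixed-size chunks; the last chunk may be smaller."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

# A 12,000,000-byte binary yields two full 5MB chunks plus a remainder.
parts = slice_binary(b"\x00" * 12_000_000)
print([len(p) for p in parts])  # [5242880, 5242880, 1514240]
```

So each chunk we hand to the server action is exactly 5,242,880 bytes, yet the reported payload size is 6.67MB.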
Any idea for a workaround?
At the very least, is there any documentation or info on how ODC computes the input payload size?
Thank you!
FYI: the minimum part size AWS accepts for a multipart upload is 5MB, which is why I am chunking to 5MB. Ref: https://docs.aws.amazon.com/AmazonS3/latest/userguide/qfacts.html
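On how ODC might compute the payload: one possible explanation — an assumption on my part, not something confirmed by OutSystems documentation — is that the limit is checked against the base64-encoded request body. Base64 inflates binary data by roughly 4/3, which lines up exactly with both errors in this thread: a 5MB chunk encodes to 6.67MB, and a ~25MB payload to 33.33MB. The arithmetic is easy to check:

```python
import base64
import os

MB = 1024 * 1024

# A 5MB (5,242,880-byte) chunk, as in the multipart-upload attempt above.
chunk = os.urandom(5_242_880)
encoded = base64.b64encode(chunk)
print(f"{len(encoded) / MB:.2f}MB")  # 6.67MB -- matches the error message

# The original post's 33.33MB error is likewise consistent with a ~25MB binary.
print(f"{len(base64.b64encode(os.urandom(25 * MB))) / MB:.2f}MB")  # 33.33MB
```

If that assumption holds, a chunk passed through external logic could be at most about 5.5MB × 3/4 ≈ 4.1MB after accounting for encoding, which is below S3's 5MB minimum part size — another argument for routing the bytes through cloud storage directly rather than through the server action, as suggested above.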