[File Plugin] Get binary data (>30MB) from local file doesn't work

Forge Component
Published on 16 Oct (3 days ago) by Pedro Oliveira

Hello,


I'm currently using the "GetFileDataFromURI" action to get the binary data, which I then save temporarily in the database. It works for files under 30MB, but it doesn't return the binary for files larger than 30MB. Is there any workaround for this?


Many thanks,

André Paula

Hi André, could you specify which version of the platform server you are working on, and whether this is on Java or .NET? Could you also specify whether you would like this to work in a mobile app (in the case of P10) or a web app? Finally, could you describe your use case in more detail? What do you want to do with the downloaded file?


Greetings, pedro

Pedro Rodrigues wrote:

Hi André, could you specify which version of the platform server you are working on, and whether this is on Java or .NET? Could you also specify whether you would like this to work in a mobile app (in the case of P10) or a web app? Finally, could you describe your use case in more detail? What do you want to do with the downloaded file?


Greetings, pedro

Hey Pedro,

To clarify:

Use case: the user opens the mobile app and records a video or audio clip, storing it in the phone gallery (already working using the Media Capture Cordova plugin). Then, when pressing send, the application gets the file (via its file path) and sends it to an Amazon S3 bucket. Version: P10 Mobile.


I have two "working" versions; however, they both have the same problem: I can't send more than 30MB if I use the logic from the File Plugin.


Version 1: FilePlugin + C# Extension -> client + server side

I use the "GetFileDataFromURI" action of the File Plugin to get the binary data and a C# extension to send it to the server. It works if the file is less than 30MB.

If I pass the binary data of a 30MB+ video directly from the resources to the extension, it works. So the problem is probably in the blob or payload handling (I don't know which) of the File Plugin.


Version 2: File Plugin Blob creation logic + Amazon AWS cordova plugin -> full client side

I use the blob logic of the same action to get only the blob (I don't convert it to Base64 binary because it's not needed) and send it directly to the S3 bucket using the JavaScript API of the AWS plugin. It works if the file is less than 30MB; it gives a "NetworkError" if it's larger.


Do you have any insight on this? Thanks for your help.


Hi André, I can't see where the problem could lie. It could be that there is some maximum size limitation on the network you are on. You could try the same code on a different network. You could also upload your code here so others can check whether they can replicate your problem.

If the problem does not seem to have anything to do with the network, maybe you could give the File Transfer Plugin a go in order to upload the file from the device, either straight into S3 or first to the platform server and from there to S3. You can also download the File Transfer Sample to check how the plugin is supposed to be used. Check if this helps.

Let us know if you were able to make any progress.


Greetings, pedro

Pedro Rodrigues wrote:

Hi André, I can't see where the problem could lie. It could be that there is some maximum size limitation on the network you are on. You could try the same code on a different network. You could also upload your code here so others can check whether they can replicate your problem.

If the problem does not seem to have anything to do with the network, maybe you could give the File Transfer Plugin a go in order to upload the file from the device, either straight into S3 or first to the platform server and from there to S3. You can also download the File Transfer Sample to check how the plugin is supposed to be used. Check if this helps.

Let us know if you were able to make any progress.


Greetings, pedro

Hi Pedro,

Since I have a working solution with the Amazon API, I don't want to change that. Also, I already tried the File Transfer Sample the other week and it didn't meet the requirements.

I just need a way to store the file in a blob and then use it to send the data to the S3 bucket. I think it's a problem with the Read JavaScript function, which probably can't read more than 30MB at a time or so.

The test is really simple and takes about 5 minutes. Just install the application, save a video on the phone, and try to get the binary of the video with the "GetFileDataFromURI" action, using a file path like file:///..... For a file under 30MB it works; if greater, it doesn't.

There was another person with the "same" problem on this exact component, but they didn't post a solution.


Hi André, it looks as if the problem has to do with the fact that the File Plugin reads the complete file into memory (I just checked the JavaScript code of the plugin). The solution is to read the file into memory in chunks and send it chunk by chunk to the server (see https://stackoverflow.com/questions/25810051/filereader-api-on-big-files).
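The chunked approach from that Stack Overflow thread can be sketched roughly as follows. This is a hypothetical illustration, not the plugin's actual code; `computeChunks` and `readInChunks` are names I made up, and it assumes a browser/Cordova environment where `file` is a File/Blob object:

```javascript
// Pure helper: split a file of `totalSize` bytes into [start, end) ranges.
function computeChunks(totalSize, chunkSize) {
  var chunks = [];
  for (var start = 0; start < totalSize; start += chunkSize) {
    chunks.push([start, Math.min(start + chunkSize, totalSize)]);
  }
  return chunks;
}

// Sketch: read one chunk at a time so the whole file never has to fit
// in memory at once. `onChunk` receives an ArrayBuffer for each slice.
function readInChunks(file, chunkSize, onChunk, onDone) {
  var ranges = computeChunks(file.size, chunkSize);
  var i = 0;
  function next() {
    if (i >= ranges.length) { onDone(); return; }
    var start = ranges[i][0], end = ranges[i][1];
    i += 1;
    var reader = new FileReader();
    reader.onload = function (e) {
      onChunk(e.target.result, start, end);
      next();  // only start the next read after this one finishes
    };
    reader.readAsArrayBuffer(file.slice(start, end));
  }
  next();
}
```

Each slice() produces a small Blob, so only one chunk's worth of data is ever held in memory at a time.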

This is exactly what the File Transfer Plugin does in the UploadFile action if you set the chunkedMode input parameter to true. I would give it a try. You can either send the file straight to S3 (as S3 has a Multipart Upload REST API) or you could first send it to the platform server and from there to S3.

If you choose the latter, you can find in the FileTransferSample module an example of how to expose a REST service to receive the file (check the FileChomp REST API) sent using the UploadFile action. I would reconsider using the File Transfer Plugin :) I don't think it changes your client-side logic that much.
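For reference, a call with chunkedMode enabled might look roughly like this at the JavaScript level (a sketch based on the Cordova File Transfer plugin's API; `serverURL` and the video/mp4 MIME type are placeholder assumptions, and `fileNameFromURL` is a helper I made up):

```javascript
// Hypothetical helper: derive the upload file name from a file:// URL.
function fileNameFromURL(fileURL) {
  return fileURL.substr(fileURL.lastIndexOf('/') + 1);
}

// Sketch: upload a local file with the Cordova File Transfer plugin.
// `serverURL` would be e.g. the FileChomp REST endpoint from the sample.
function uploadWithFileTransfer(fileURL, serverURL) {
  var options = new FileUploadOptions();
  options.fileKey = 'file';
  options.fileName = fileNameFromURL(fileURL);
  options.mimeType = 'video/mp4';  // adjust to the captured media type
  options.chunkedMode = true;      // stream in chunks, not one big buffer

  var ft = new FileTransfer();
  ft.onprogress = function (e) {
    if (e.lengthComputable) {
      console.log('uploaded ' + e.loaded + ' of ' + e.total + ' bytes');
    }
  };
  ft.upload(fileURL, encodeURI(serverURL),
    function (r) { console.log('done, HTTP ' + r.responseCode); },
    function (err) { console.error('upload failed, code ' + err.code); },
    options);
}
```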

Hope this helps.


Greetings, Pedro 

Hi André, one more thought :) You could stream the file to S3 directly from the client using the JavaScript SDK available for AWS S3. Check the answer by Joomler (the one with 22 votes) here: https://stackoverflow.com/questions/17585881/amazon-s3-direct-file-upload-from-client-browser-private-key-disclosure

The problem of sending the file in chunks is then solved by the SDK itself.


Good luck! Pedro 

Hi Andre,

I also need to upload videos to S3.

Can you share your solution?

Thanks.

Harlin Setiadarma wrote:

Hi Andre,

I also need to upload videos to S3.

Can you share your solution?

Thanks.

Hello Harlin,

Here are the links that helped me create the specific solution needed:


https://github.com/Telerik-Verified-Plugins/Amazon-AWS/blob/master/doc/index.md - The plugin to reference


https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html - The API methods


https://gist.github.com/davidheyman/21d1348758cea83c2aa9 - The example of a multipart upload
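For completeness, the manual multipart flow that the gist above demonstrates can be sketched roughly like this (a hypothetical function I wrote for illustration: an SDK v2 `s3` client, bucket and key are passed in, parts are sent sequentially, and error handling is minimal):

```javascript
// Sketch: manual S3 multipart upload. Create the upload, send the blob
// in partSize slices, then ask S3 to assemble the parts into one object.
function multipartUpload(s3, blob, bucket, key, partSize, done) {
  s3.createMultipartUpload(
    { Bucket: bucket, Key: key, ContentType: blob.type },
    function (err, mp) {
      if (err) { done(err); return; }
      var parts = [];
      var partNumber = 1;
      function sendNext(offset) {
        if (offset >= blob.size) {
          // All parts sent: complete the upload with the collected ETags.
          s3.completeMultipartUpload({
            Bucket: bucket, Key: key, UploadId: mp.UploadId,
            MultipartUpload: { Parts: parts }
          }, done);
          return;
        }
        var chunk = blob.slice(offset, offset + partSize);
        s3.uploadPart({
          Bucket: bucket, Key: key, UploadId: mp.UploadId,
          PartNumber: partNumber, Body: chunk
        }, function (err, data) {
          if (err) { done(err); return; }
          parts.push({ ETag: data.ETag, PartNumber: partNumber });
          partNumber += 1;
          sendNext(offset + partSize);
        });
      }
      sendNext(0);
    }
  );
}
```

Note that S3 requires every part except the last to be at least 5MB, so partSize should be 5MB or more.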


Hope it helps,

André Paula

Hi André,

Thanks for the link.

For now, I'm uploading to S3 using the built-in TriggerOfflineDataSync (asynchronous), passing the binary to a Server Action and then uploading to S3.

I haven't hit a file size limit yet using this method.

I uploaded 90MB videos and it works fine.

The only thing I miss is that I can't get a progress bar (chunked mode).

But I can configure it to run automatically on app login, app resume, and when online.