
Azure Storage Connector Actions

Stable version 1.0.0 (Compatible with OutSystems 11)

Documentation

Azure_UploadFile

Input Parameters

Output Parameters

  • message : message returned by the process flow (success or error details)

  • file_url : URL of the uploaded file on the storage server

  • Is_success : whether the flow completed successfully

Sample Usage

The screenshot above shows a sample of uploading an image to a blob. It uses the Upload widget to obtain the required filename and binary data; the remaining input parameters must match the configuration of your Azure Storage account.
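
For reference, a minimal sketch of the Put Blob REST call that an upload like this typically boils down to is shown below. The function name, the use of Python's requests library, and the bearer-token authentication are illustrative assumptions, not the connector's actual implementation:

```python
import requests

def upload_blob(account: str, container: str, blob_name: str,
                data: bytes, token: str, xms_version: str = "2021-08-06") -> str:
    """Upload binary content as a block blob and return its URL (cf. file_url)."""
    url = f"https://{account}.blob.core.windows.net/{container}/{blob_name}"
    resp = requests.put(
        url,
        data=data,
        headers={
            "Authorization": f"Bearer {token}",   # Azure AD bearer token (assumed)
            "x-ms-version": xms_version,          # cf. the xms_version input
            "x-ms-blob-type": "BlockBlob",        # cf. the xms_blobtype input
        },
    )
    resp.raise_for_status()                        # 201 Created on success
    return url
```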

Azure_DeleteBlob

Input Parameters

Output Parameters

  • message : message returned by the process flow (success or error details)

  • Is_success : whether the flow completed successfully

Sample Usage

The screenshot above shows a sample of deleting an image blob; the remaining input parameters must match the configuration of your Azure Storage account.
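
Deleting a blob maps to a single Delete Blob REST call. A hedged sketch under the same assumptions as the upload sketch above (illustrative names, requests library, bearer token):

```python
import requests

def delete_blob(account: str, container: str, blob_name: str,
                token: str, xms_version: str = "2021-08-06") -> None:
    """Delete a blob; Azure answers 202 Accepted when the delete is queued."""
    url = f"https://{account}.blob.core.windows.net/{container}/{blob_name}"
    resp = requests.delete(
        url,
        headers={"Authorization": f"Bearer {token}", "x-ms-version": xms_version},
    )
    resp.raise_for_status()
```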


Azure_DownloadBlob

Input Parameters

Output Parameters

  • message : message returned by the process flow (success or error details)

  • Is_success : whether the flow completed successfully

  • file_content : binary content downloaded from the Azure blob

Sample Usage

The screenshot above shows a sample of downloading an image from the blob. The action returns binary data that you can download or use directly; the remaining input parameters must match the configuration of your Azure Storage account.
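
The download case corresponds to a Get Blob call whose response body is the binary content (cf. the file_content output). A minimal sketch, under the same assumptions as above:

```python
import requests

def download_blob(account: str, container: str, blob_name: str,
                  token: str, xms_version: str = "2021-08-06") -> bytes:
    """Fetch a blob and return its raw bytes (cf. the file_content output)."""
    url = f"https://{account}.blob.core.windows.net/{container}/{blob_name}"
    resp = requests.get(
        url,
        headers={"Authorization": f"Bearer {token}", "x-ms-version": xms_version},
    )
    resp.raise_for_status()
    return resp.content
```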


Azure_ListBlobs_With_Metadata

Input Parameters

  • max_record : maximum number of records to retrieve from Azure Blob Storage

  • next_mark : when there is a continuation (next page), populate this with the "NextMarker" value from the previous JSON response

  • resource : F = Files only; D = Directories only; A = All

  • prefix : filters the results to return only blobs whose names begin with the specified prefix

  • storage_account_name : Azure storage account name

  • root_container : root container of the blob

  • include blob metadata : whether to include each blob's metadata in the response

  • azure_header_id : Azure header credential for authentication (see the token sketch after this list)

    • tenant_id : Azure account tenant ID

    • client_id : Azure account client ID

    • secret_id : Azure account secret ID

  • xms_version : Azure Storage service version; using the latest is recommended: https://learn.microsoft.com/en-us/rest/api/storageservices/versioning-for-the-azure-storage-services

  • xms_blobtype : blob type of the file; see https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction for additional info
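
The tenant_id / client_id / secret_id triple suggests the connector authenticates with an Azure AD client-credentials flow. A minimal sketch of obtaining a bearer token for Azure Storage that way (assuming secret_id maps to the app registration's client secret; names are illustrative):

```python
import requests

def get_storage_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    """Client-credentials flow; the scope covers the Azure Storage data plane."""
    resp = requests.post(
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
        data={
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,    # assumed to be the secret_id value
            "scope": "https://storage.azure.com/.default",
        },
    )
    resp.raise_for_status()
    return resp.json()["access_token"]
```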

Output Parameters

  • message : message returned by the process flow (success or error details)

  • Is_success : whether the flow completed successfully

  • JSON_Output : JSON output converted from the XML response

Sample Usage

The sample request above returns the file list from your root container, meaning all files uploaded to that container will be returned, subject to the max_record value you set. The next_mark output is populated when there are more items to return that were not included in the current request. The action also returns JSON text, so you can shape the output you need, for example including the metadata of each blob. A sample JSON response is shown below.
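
Under the hood this corresponds to the List Blobs operation, which takes maxresults, marker, prefix and include=metadata query parameters and returns XML that the action then converts to JSON. A hedged sketch under the same assumptions as the earlier blob sketches:

```python
import requests

def list_blobs(account: str, container: str, token: str, max_results: int = 100,
               marker: str = "", prefix: str = "", include_metadata: bool = True,
               xms_version: str = "2021-08-06") -> str:
    """List blobs in a container; returns the raw XML the action turns into JSON."""
    params = {"restype": "container", "comp": "list", "maxresults": max_results}
    if marker:
        params["marker"] = marker        # cf. next_mark: continue from the previous page
    if prefix:
        params["prefix"] = prefix        # cf. prefix: name filter
    if include_metadata:
        params["include"] = "metadata"   # cf. the include blob metadata input
    resp = requests.get(
        f"https://{account}.blob.core.windows.net/{container}",
        params=params,
        headers={"Authorization": f"Bearer {token}", "x-ms-version": xms_version},
    )
    resp.raise_for_status()
    return resp.text                     # <EnumerationResults> XML, including NextMarker
```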

Azure_PostPut_Table_Storage

Input Parameters

Output Parameters

  • message : message returned by the process flow (success or error details)

  • Is_success : whether the flow completed successfully

Sample Usage
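
The action name suggests a single-entity insert (POST) or merge-update (MERGE) against the Table Storage REST API. A minimal sketch of such a call, under the same assumptions as the blob sketches above (requests library, Azure AD bearer token, illustrative names, not the connector's internals):

```python
import json
import requests

def post_or_merge_entity(account: str, table: str, entity: dict, token: str,
                         is_update: bool = False,
                         xms_version: str = "2021-08-06") -> None:
    """Insert a new entity (POST) or merge into an existing one (MERGE)."""
    base = f"https://{account}.table.core.windows.net"
    headers = {
        "Authorization": f"Bearer {token}",
        "x-ms-version": xms_version,
        "Accept": "application/json;odata=nometadata",
        "Content-Type": "application/json",
    }
    if is_update:
        # MERGE addresses one entity by its PartitionKey / RowKey pair
        url = (f"{base}/{table}(PartitionKey='{entity['PartitionKey']}',"
               f"RowKey='{entity['RowKey']}')")
        headers["If-Match"] = "*"
        resp = requests.request("MERGE", url, headers=headers, data=json.dumps(entity))
    else:
        resp = requests.post(f"{base}/{table}", headers=headers, data=json.dumps(entity))
    resp.raise_for_status()
```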

Azure_BatchPOST_Table_Storage

Input Parameters

  • request_body: JSON string that follows your Azure Table Storage "schema"-like format (your JSON request body; it must be a single JSON object, not a list)

  • azure_account_name: Azure storage account name

  • is_initial: flag indicating whether this is the initial call, used to set the BatchID and ChangeSet

  • is_final: flag indicating whether this is the final call, which triggers the actual POST/PUT of the request

  • in_batch_id: a unique identifier for the batch request

  • in_changeset: a unique identifier for grouping multiple operations inside a batch

  • azure_table_name: the Azure table where the record will be stored

  • in_partial_request_body: accumulator used to build the final request body for posting in the batch process

  • row_key: the row key is a unique identifier for an entity within a given partition (you can use your entity's primary ID if you are syncing records from OutSystems to Azure) https://learn.microsoft.com/en-us/rest/api/storageservices/understanding-the-table-service-data-model

  • partition_key: the partition key is a unique identifier for the partition within a given table (you can use the tenant name here, or whatever suits your flow) https://learn.microsoft.com/en-us/rest/api/storageservices/understanding-the-table-service-data-model

  • is_update: determines whether to POST (insert) or MERGE (update) the record

  • azure_header_id : Azure header credential for authentication

    • tenant_id : Azure account tenant ID

    • client_id : Azure account client ID

    • secret_id : Azure account secret ID

  • xms_version : Azure Storage service version; using the latest is recommended: https://learn.microsoft.com/en-us/rest/api/storageservices/versioning-for-the-azure-storage-services

Output Parameters

  • message : message returned by the process flow (success or error details)

  • Is_success : whether the flow completed successfully

  • out_partial_request_body : output used to build the final request body; hold it in a temporary flow variable and pass it back in on the next iteration

  • out_changeset : output used to build the final request body; hold it in a temporary flow variable and pass it back in on the next iteration

  • out_batchid : output used to build the final request body; hold it in a temporary flow variable and pass it back in on the next iteration

Sample Usage

Action (BulkUpload)

  • temp_counter (int) : counts how many records have been processed, since the maximum per batch is 100 rows

  • Temp_request_body (text) : holds the output value of the Azure_BatchPOST_Table_Storage action used to build the final request body

  • temp_changeset (text) : holds the current change_set of the iteration

  • temp_batchid (text) : holds the current batch_id of the iteration

  • TempUpload (text) : the structure I created for the upload (note: always include PartitionKey and RowKey in your structure)

  • is_initial (boolean) : flag indicating whether the flow is on its initial iteration

As seen in the screenshot above, the implementation is composed of multiple Assign widgets that build the final request body, plus some logic to handle the batching.

Let’s start with the loop. What you iterate over depends on your business requirements; as a sample I used the user table. Inside the iteration flow, the first step is to assign the “is_initial” value.

We set the is_initial flag based on the temp_counter, so on the first run it will always be True.

Next, assign the entity values based on your business criteria; in my example I am using the UserTable.

Next, serialize the structure into JSON text format, as in the sketch below.
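
For example, a serialized single-entity body (the field names here are hypothetical, but PartitionKey and RowKey must always be present) might look like this:

```python
import json

entity = {
    "PartitionKey": "TenantA",          # cf. partition_key, e.g. the tenant name
    "RowKey": "1024",                   # cf. row_key, e.g. the entity's primary ID
    "Name": "Jane Doe",                 # remaining fields come from your structure
    "Email": "jane.doe@example.com",
}
request_body = json.dumps(entity)       # one JSON object, not a list
```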

Now comes the action that does the actual processing and builds the request. As seen in the assignments, most of them are straightforward; the only addition is the “is_final” input parameter, because we need some logic here: first we check whether temp_counter has hit 99, which means we are almost at the Azure batch limit, and the other condition is whether the current row is the final row.


The final step in my sample is assigning the temporary values. Most of these assignments are also straightforward, except for “temp_counter”: we check whether we have hit the limit; if not, we increment temp_counter by 1, and once the counter reaches 99 the next iteration will be the final one and Azure_BatchPOST_Table_Storage will do the actual posting.
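
To make the moving parts concrete, here is a hedged sketch of an Azure Table Storage entity group transaction built the same way the flow does: operation parts are accumulated into a partial body, and the batch is only posted when is_final is true. Function names, the requests library, and bearer-token authentication are assumptions rather than the connector's internals; note that Azure limits a batch to 100 operations and all entities in one changeset must share the same PartitionKey.

```python
import json
import uuid
import requests

def build_batch_part(account: str, table: str, entity: dict, changeset: str) -> str:
    """One changeset part: an Insert Entity operation inside the batch."""
    return (
        f"--changeset_{changeset}\r\n"
        "Content-Type: application/http\r\n"
        "Content-Transfer-Encoding: binary\r\n\r\n"
        f"POST https://{account}.table.core.windows.net/{table} HTTP/1.1\r\n"
        "Content-Type: application/json\r\n"
        "Accept: application/json;odata=nometadata\r\n\r\n"
        f"{json.dumps(entity)}\r\n"
    )

def post_batch(account: str, token: str, batch_id: str, changeset: str,
               partial_body: str, xms_version: str = "2021-08-06") -> None:
    """Wrap the accumulated parts in the batch envelope and POST to $batch."""
    body = (
        f"--batch_{batch_id}\r\n"
        f"Content-Type: multipart/mixed; boundary=changeset_{changeset}\r\n\r\n"
        f"{partial_body}"
        f"--changeset_{changeset}--\r\n"
        f"--batch_{batch_id}--\r\n"
    )
    resp = requests.post(
        f"https://{account}.table.core.windows.net/$batch",
        data=body.encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "x-ms-version": xms_version,
            "Content-Type": f"multipart/mixed; boundary=batch_{batch_id}",
        },
    )
    resp.raise_for_status()

def bulk_upload(account: str, table: str, entities: list, token: str) -> None:
    """Mirror the OutSystems loop: accumulate up to 100 rows, then post the batch."""
    batch_id, changeset = str(uuid.uuid4()), str(uuid.uuid4())   # is_initial step
    partial_body, counter = "", 0
    for index, entity in enumerate(entities):
        partial_body += build_batch_part(account, table, entity, changeset)
        is_final = counter == 99 or index == len(entities) - 1
        if is_final:
            post_batch(account, token, batch_id, changeset, partial_body)
            batch_id, changeset = str(uuid.uuid4()), str(uuid.uuid4())
            partial_body, counter = "", 0
        else:
            counter += 1
```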