Azure DevOps CI/CD - YAML template instead of JSON
Application Type
Traditional Web
Service Studio Version
11.10.22 (Build 41777)
Platform Version
11.11.3 (Build 29602)

Hello Team,

We are building an OutSystems pipeline with Azure DevOps using the tutorial below:

https://github.com/OutSystems/outsystems-pipeline/wiki/Building-an-OutSystems-pipeline-with-Azure-DevOps

At one step, a Task Group is created by importing the following file:

windows_azure_task_group_deploy_to_destination_env.json

Due to a restriction in our organization we cannot import JSON; instead, they recommend using a YAML template.

Could anybody help us with this?


Thanks 

Ashish



Hi Ashish,


The OutSystems CI/CD pipelines are built on a Python package that provides functions to support the interactions between the LifeTime and the DevOps automation tool of choice, in your case Azure DevOps.

That means that you can use those functions in the way that suits you best.

So, instead of using the JSON Task Group approach, you can always extend the windows_azure_pipeline_ci template to incorporate the pipeline's remaining activities.


Note: When you call the deploy_latest_tags_to_target_env function for the first time, it will generate an artifact (deployment_manifest) which you can later use on future function calls to ensure you are promoting the same application versions throughout the pipeline execution.

You can see how the deployment manifest is used in this Jenkins template.


Hope this helps.


Stay safe and best regards,

Duarte Castaño


Hi Duarte,


Thank you for your reply.


Just want to clarify my understanding: you mentioned that windows_azure_pipeline_ci needs to be extended. Does that mean all the tasks from the Task Group created after importing the JSON need to be incorporated into windows_azure_pipeline_ci?

Is it possible to give an example of extending it?


Thanks & Regards

Ashish Akolkar



Hi Ashish,


There are many alternatives to achieve what you want.

The first thing you'll need to decide is the type of pipeline you want, because each has its own advantages and disadvantages. For example:

  • YAML Pipeline implementation: extend the template we provide by adding more tasks to run the python scripts.
    • Pros: you'll be using pipeline as code which will be way easier to maintain
    • Cons: there's no built-in feature for approval gates
  • UI-based Release Pipeline implementation: run the python scripts as standalone tasks.
    • Pros: you can take advantage of all Azure DevOps pipelines' built-in functionalities.
    • Cons: pipeline will be harder to maintain.


In any of the cases, if you want to perform a deployment between two environments you'll have to use the deploy_latest_tags_to_target_env python function and provide the necessary input parameters.

Please check the code inside the provided templates and identify how to call this function.
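As an illustration (not an official template), a YAML task invoking that function might look like the sketch below. The variable and environment names are assumptions borrowed from the tutorial's conventions, not confirmed values:

```yaml
# Hypothetical sketch: promote the latest tagged versions between two environments.
# All $(...) variable names are assumptions; adjust to your pipeline's variables.
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: >
      python3 -m outsystems.pipeline.deploy_latest_tags_to_target_env
      --artifacts "$(ArtifactsBuildFolder)"
      --lt_url $(LifeTimeHostname)
      --lt_token $(LifeTimeServiceAccountToken)
      --lt_api_version $(LifeTimeAPIVersion)
      --source_env "$(DevelopmentEnvironment)"
      --destination_env "$(QAEnvironment)"
      --app_list "$(ApplicationScope)"
  displayName: 'Deploy to QA Environment'
```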


Regards,

Duarte


 

Hi Duarte,

OK, I'm still facing some issues.

Is it possible to connect and discuss? That way I can briefly explain the scenario to you.


Thanks & Regards

Ashish Akolkar

Hi Ashish,

We are actively working on adding a multi-stage YAML pipeline template to the Azure DevOps examples section.

In the meantime, I suggest that you convert the TaskGroup JSON template into YAML so that it can be used as part of your CI/CD pipeline.


Regards,

Duarte

Hi Duarte,

Thank you for your reply 

I have tried to convert the JSON to a pipeline by using a template, but it didn't work. It would be great if you could give me an example.


Also, one more thing: the tutorial doesn't mention where to copy the python scripts in the repo.

Thanks & Regards

Ashish Akolkar

Hi,


I'm getting the error below after running this task:


- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      python -m outsystems.pipeline.fetch_lifetime_data --artifacts "$(ArtifactsBuildFolder)" --lt_url $(LifeTimeHostname) --lt_token $(LifeTimeServiceAccountToken) --lt_api_version $(LifeTimeAPIVersion)
      echo 'Fetch Lifetime Data'

Error


/usr/bin/python: Error while finding module specification for 'outsystems.pipeline.fetch_lifetime_data' (ModuleNotFoundError: No module named 'outsystems')

Fetch Lifetime Data

Could you please check and help me resolve it?

Hi Ashish,


It appears that the agent is unable to launch the OutSystems pipeline python scripts. Let me try to troubleshoot it with you.

  1. Is your agent Linux-based? If so, please use python3 to invoke the script.
  2. The ModuleNotFoundError is usually related to the PYTHONPATH environment variable.
    Did you use pip to get the pipeline code? Because pip eliminates the need to explicitly define PYTHONPATH.


Here's an example of a task sequence to include in the YAML pipeline:


- task: Bash@3
  inputs:
    targetType: 'inline'
    script: pip3 install -U outsystems-pipeline==$(OSPackageVersion)
    workingDirectory: $(System.DefaultWorkingDirectory)
  displayName: 'Install OutSystems Pipeline Package'

- task: Bash@3
  inputs:
    targetType: 'inline'
    script: python3 -m outsystems.pipeline.fetch_lifetime_data --artifacts "$(ArtifactsBuildFolder)" --lt_url $(LifeTimeHostname) --lt_token $(LifeTimeServiceAccountToken) --lt_api_version $(LifeTimeAPIVersion)
  displayName: 'Fetch Lifetime Data'


Regards,

Duarte


Hi Duarte,

Thank you for your reply and for helping me resolve the issues.

Yes, I am using a Linux agent and I used pip to get the pipeline code.


I tried with python3 but got the same error (ModuleNotFoundError: No module named 'outsystems').


Thanks & Regards

Ashish Akolkar

And are you fetching the package via pip? Or are you storing the python code in a local repo?

Hi,

Yes, I'm fetching the package via pip.

As mentioned above, I executed the 'Install OutSystems Pipeline Package' task first, and then executed the script below in the next task:

 script: python3 -m outsystems.pipeline.fetch_lifetime_data --artifacts "$(ArtifactsBuildFolder)" --lt_url  $(LifeTimeHostname) --lt_token $(LifeTimeServiceAccountToken) --lt_api_version $(LifeTimeAPIVersion)


I have copied the outsystems/pipeline folder, which contains the fetch_lifetime_data file, to the repo.

But it seems the script is not picking up the path to the outsystems folder.


Thanks & Regards

Ashish Akolkar

It seems you're having an issue with the modules' local import. I would suggest creating a Python virtual environment, then using pip to get the code package, and finally calling the function.

This way you'll be isolating the script execution.
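A minimal sketch of that suggestion as an inline Bash task follows; it is an assumption-based example (the $(...) variable names match the earlier examples, and the venv path is arbitrary), not an official template. Note that the activation only lasts for the duration of the task, so the install and the script call must live in the same inline script:

```yaml
# Hypothetical sketch: isolate the pip install and the script call in a venv.
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      python3 -m venv "$(Agent.TempDirectory)/os-venv"         # create an isolated environment
      source "$(Agent.TempDirectory)/os-venv/bin/activate"     # activate it for the rest of this step
      pip install -U outsystems-pipeline==$(OSPackageVersion)  # install the package inside the venv
      python -m outsystems.pipeline.fetch_lifetime_data --artifacts "$(ArtifactsBuildFolder)" --lt_url $(LifeTimeHostname) --lt_token $(LifeTimeServiceAccountToken) --lt_api_version $(LifeTimeAPIVersion)
  displayName: 'Fetch Lifetime Data (virtual environment)'
```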



OK. But I see that the outsystems package is successfully installed; the only problem is that when I give the physical path to the fetch_lifetime_data python file, it doesn't get recognised.


I will check if there are any restrictions on creating a Python virtual environment. Could you please give me some pointers about creating a virtual environment specifically for this? I will also google it in the meantime.


Thank you for helping me.

Hello, one update: after adding checkout: self,

at least the script is executing, but now I'm getting the error

ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed

while running the fetch_lifetime_data python script.


Thanks & Regards

Ashish Akolkar

 


Hi,

It's great to hear you've solved the problem and can now call the python scripts.


The SSL issue occurs because the python scripts require the use of a secure connection to communicate with LifeTime. 

We have a code branch that avoids that validation; however, with this strategy you cannot fetch the code via pip; instead you'll need to get it directly from GitHub or save it to a local repository.

Find it here.


Regards,

Duarte



Hi Duarte,

Thank you for helping us; your every response motivates us to resolve the issue :)

We have downloaded the git folders and stored them in a local repository in Azure DevOps.

Can you please explain how to use this instead of the pip package?


Thanks & Regards

Ashish Akolkar


Thank you for the feedback Ashish!


For fetching the code you can just do a simple checkout from your local repository.

Depending on your approach, you might need to install the package's dependencies using the "pip install -q -I -r build_requirements.txt" command, targeting this file.


After that, you can make the script call as you were doing before.
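For illustration, such a sequence might look like the sketch below. The repository resource name, Azure Repos path, and file locations are assumptions; adjust them to where you stored the code:

```yaml
# Hypothetical sketch: run the scripts from a local Azure Repos copy instead of pip.
# The repository name and paths below are assumptions, not official values.
resources:
  repositories:
  - repository: outsystems_scripts          # assumed name for your local copy
    type: git
    name: MyProject/outsystems-pipeline     # assumed Azure Repos path

steps:
- checkout: outsystems_scripts              # fetch the code into the workspace

- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      pip3 install -q -I -r build_requirements.txt
      python3 -m outsystems.pipeline.fetch_lifetime_data --artifacts "$(ArtifactsBuildFolder)" --lt_url $(LifeTimeHostname) --lt_token $(LifeTimeServiceAccountToken) --lt_api_version $(LifeTimeAPIVersion)
    # run from the checked-out repo root so the 'outsystems' package folder is importable
    workingDirectory: $(Build.SourcesDirectory)/outsystems-pipeline
  displayName: 'Install dependencies and fetch LifeTime data'
```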


Regards,

Duarte



Hi Duarte,


Some tasks are running fine, but some are failing; below is the description.

What I would like to do is create a CI/CD pipeline for deployment. Could you please check the sequence of tasks I am executing and let me know if something is missing?

I got confused because the tasks from the JSON are also incorporated in the YAML at the end.

Up to the 'Deploy to CI Environment' task, all tasks run fine, but the tasks after that are failing; please refer to the attached file.

Note: I have used the display names for reference.

tasks.txt

Hi Ashish,


Now you need to understand how the deploy_latest_tags_to_target_env function works.

Here are the descriptions of the most important arguments:

  • "--artifacts": Name of the artifacts folder
  • "--lt_url": URL for LifeTime environment, without the API endpoint. Example: https://<lifetime_host>
  • "--lt_token": Token for LifeTime API calls
  • "--lt_api_version": LifeTime API version number. If version <= 10, use 1, if version >= 11, use 2
  • "--source_env": Name, as displayed in LifeTime, of the source environment where the apps are
  • "--destination_env": Name, as displayed in LifeTime, of the destination environment where you want to deploy the apps
  • "--app_list": Comma separated list of apps you want to deploy. Example: "App1,App2 With Spaces,App3_With_Underscores"
  • "--manifest_file": (optional) Manifest file path, used to promote the same application versions throughout the pipeline execution


When you call this function it will make the necessary LifeTime API calls to create, validate and execute a deployment plan to a destination environment. That means all the hard work is still managed within LifeTime along with all the impact analysis as usual in a manual deployment.

If there's an issue with the deployment plan, the function throws an error and creates a new file, named DeploymentConflicts, which includes the error message feedback from LifeTime.

In the case of a successful deployment, it creates the deployment_manifest.cache file which contains the information of the applications and their versions. You can use this file as an argument for other deploy_latest_tags_to_target_env calls to make sure you promote the same application versions throughout the pipeline execution.
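As an illustration of that chaining (not from the official templates), a later promotion step can reuse the manifest produced by an earlier one. The variable names and the manifest file path below are assumptions; check where the function writes the manifest in your artifacts folder:

```yaml
# Hypothetical sketch: reuse the deployment manifest so the exact same
# application versions are promoted from QA to Production.
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: >
      python3 -m outsystems.pipeline.deploy_latest_tags_to_target_env
      --artifacts "$(ArtifactsBuildFolder)"
      --lt_url $(LifeTimeHostname)
      --lt_token $(LifeTimeServiceAccountToken)
      --lt_api_version $(LifeTimeAPIVersion)
      --source_env "$(QAEnvironment)"
      --destination_env "$(ProductionEnvironment)"
      --manifest_file "$(ArtifactsBuildFolder)/deployment_manifest.cache"
  displayName: 'Deploy to Production Environment'
```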



For your specific error, it happens because you are adapting the script from the Azure DevOps JSON, which receives the application list from a previous CI pipeline execution.

Please make sure you are providing the correct arguments for the application list and the deployment manifest file.


Best Regards,

Duarte


Hi Duarte,


Thanks for your explanation; we were able to solve the issue and the deployment is now successful.

I have a few questions related to this:

1. Is manually tagging the application/module necessary?

2. Is there any provision to tag versions automatically, or do we need to write some logic to tag single/multiple applications/modules?

3. I checked the Trigger Pipeline plugin, but it only identifies the latest tag and triggers the pipeline.

4. Is there any document available which lists the benefits of using Azure DevOps over LifeTime?


Thanks & Regards

Ashish

Hi Ashish,


LifeTime is the OutSystems Application Lifecycle Management console, and it will continue to be so even if you orchestrate deployments with a CI/CD tool such as Azure DevOps.

With a CI/CD tool, you may include other types of activities, which can improve the quality and delivery confidence of your applications.


In terms of versioning, the Trigger Pipeline LifeTime plugin already has the following features:

  • AutoTagging: applications are tagged automatically (i.e. on-the-fly) when triggering a pipeline 
  • EnforceAccessControl: enforces access control rules based on permissions assigned to your LifeTime users 
  • ForceTriggerPipeline: re-trigger a pipeline by bypassing the check for new application versions 
  • IgnorePendingChanges: trigger a pipeline whose applications have pending changes without having to tag these beforehand or on-the-fly 


NOTE: The AutoTagging feature relies on the LifeTime Deployment API, therefore a valid service account token must be provided (as a Site Property) and the correct API endpoint must be specified in the 'Integrations' tab in Service Center. 

To enable the EnforceAccessControl feature, please ensure that LifeTime version 11.5.0 (or higher) is installed and the service account used by the plugin has 'Manage Infrastructure and Users' permission.



Regards,

Duarte

Hi Duarte,


Thank you for explanation.


We have configured the trigger pipeline with the necessary information, but when we trigger the pipeline we get a 401 Unauthorized error & a 404 Page Not Found error.

The page not found error says it's not able to access the trigger pipeline dashboard.

Do you have any idea why these errors are occurring?


Thanks

Ashish

Hi Ashish,


In the Trigger Pipeline LifeTime plugin, there are 3 configuration types for calling Azure DevOps pipelines. Each one uses a distinct Azure DevOps API call, so you'll need to figure out which one is appropriate for the type of pipeline you're using.


Please ensure that the same pipeline name is configured in both the Trigger Pipeline LifeTime plugin and Azure DevOps Pipelines.


In terms of permissions, you must ensure that the configured user account has the necessary rights to trigger the pipelines.


I suggest that you trigger the pipeline outside of LifeTime first (using Postman, for example), and then proceed with the setup in the Trigger Pipeline LifeTime plugin.


NOTE: If you are using the YAML pipeline type, then you'll need to set the Site Property 'AzureDevOpsAPIVersion' to '6.0-preview.1'.


Regards,

Duarte

Hello Duarte,


Thank you for your reply.


Sorry, I was away from this topic for some time, but now I've started again.

As you mentioned setting a Site Property, where should I configure that?

About triggering the pipeline from outside, is there any example available?


Thanks & Regards

Ashish Akolkar




Hi Ashish,


You will need to access your LifeTime's environment Service Center to configure the site properties.

We don't have any examples of that. However, here is the Azure DevOps REST API documentation to help you understand how to trigger pipelines via REST.
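As a rough illustration (not from OutSystems documentation; the organization, project, pipeline id, and variable name are placeholders/assumptions), queuing a run of a YAML pipeline through the REST API looks roughly like this:

```
POST https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipelineId}/runs?api-version=6.0-preview.1
Authorization: Basic <base64 of ":<personal-access-token>">
Content-Type: application/json

{
  "variables": {
    "ApplicationScope": { "value": "App1,App2" }
  }
}
```

A 200 response with a run id indicates the pipeline was queued; this is also a convenient way to verify permissions and the pipeline id before configuring the plugin.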


Best Regards,

Duarte

Hi Duarte,


Thank you for your reply. I am able to reach the Azure DevOps pipeline, but now I'm getting the error below:


400 - BadRequest {"$id":"1","innerException":null,"message":"Unexpected parameter 'ApplicationScope'\nUnexpected parameter 'ApplicationScopeWithTests'\nUnexpected parameter 'TriggeredBy'","typeName":"Microsoft.Azure.Pipelines.WebApi.PipelineValidationException, Microsoft.Azure.Pipelines.WebApi","typeKey":"PipelineValidationException","errorCode":0,"eventId":3000}

I have declared these variables in the Variables section of the YAML pipeline, before the stages.


Thanks & Regards

Ashish 

Hi Ashish,


Have you declared the variables to be settable at queue time?

Please also check if you are using the correct type of pipeline inside the TriggerPlugin configuration as it will use different API endpoints.
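One pattern that may help here, offered as an assumption rather than a confirmed fix: this "Unexpected parameter" validation error is typically raised when the request passes template parameters that the YAML does not declare. Declaring them as runtime parameters (names taken from the error message) would look like:

```yaml
# Hypothetical sketch: declare the incoming values as runtime parameters
# so a Runs API request carrying them passes pipeline validation.
parameters:
- name: ApplicationScope
  type: string
  default: ''
- name: ApplicationScopeWithTests
  type: string
  default: ''
- name: TriggeredBy
  type: string
  default: ''
```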


Best Regards,

Duarte


Hi Duarte,


Yes, I have declared the variables to be set at queue time.

And I also selected the YAML type inside the TriggerPlugin configuration.

But the error still exists; I'm not sure what's going wrong.

Thanks & Regards

Ashish Akolkar
