Tag Archives: Build Automation

ALM For Power Platform/CDS In Azure-Devops/VSTS Using Power Platform Build Tools – Part 2

In the previous post, I showed you how to set up Azure DevOps and install the Power Platform Build Tools. In this blog post, I will show you how to export a solution from the Dynamics 365 CE instance and commit it to VSTS/Azure DevOps.

Below is the VSTS build pipeline, which will include the following steps.

How to setup VSTS Build Definition

In the build pipeline, we will export the CRM solution, unpack it using the Power Platform Build Tools and store the solution file in Artifacts. You can create a build definition directly from Visual Studio Online (VSTS/Azure DevOps) or from within Visual Studio. Firstly, I will show you how to create a build definition from within Visual Studio; navigate to the Builds tab in the Team Explorer:

Once there, you can click New Build Definition to be taken directly to Visual Studio Online. This is where you would start if you had decided to create the build definition directly from Visual Studio Online instead of starting in Visual Studio.

On the dialog box that pops up in the browser, we'll select Visual Studio as our build template, but you can see there are other templates available, such as Xamarin for Android or iOS and the Universal Windows Platform. The default settings for your build definition should be correct for the most part, but you will need to check the Continuous Integration checkbox. Here is what they look like for this example:

Because this is a simple example and we don't need the additional flexibility a custom queue provides, we can leave the default Hosted option selected in the Default agent queue field. See the Microsoft documentation for more information on the restrictions of the Hosted pool.

You can see the checkbox for CI at the bottom of the dialog. Enabling it tells Visual Studio Online to execute the build definition for each check-in. The build definition defines whether the built code is published to an environment. Since we want to continually build and deploy to our web environment, we will check this box.

We can also create the build definition from Azure DevOps by following the steps below.

Navigate to Pipelines -> Builds -> Click New Pipeline

Click the Visual Designer, which will allow you to build the pipeline using the GUI.

Select the Team Project, Repository and Branch, and click on the Continue button. For demo purposes, I have named the build definition ALM – PowerApps and Dynamics 365-CI.

In the next step, select an empty job as shown below:

After selecting an empty job, you can see the empty Agent Job; select the Agent Pool as Hosted.

The next step is to configure the VSTS build definition as shown in the diagram below.

1. PowerPlatform Tools Installer

Every pipeline that uses the Power Platform Build Tools must install them as its first step. This ensures that the tools are actually available on the agent.
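If you prefer to define the pipeline in YAML rather than in the classic designer, this step roughly corresponds to the task below. This is only a sketch: the task version (@0 here) depends on the version of the Power Platform Build Tools extension installed in your organization.

```yaml
steps:
# Installs the Power Platform Build Tools on the agent;
# must run before any other Power Platform task.
- task: PowerPlatformToolInstaller@0
  displayName: 'Power Platform Tool Installer'
```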

2. PowerPlatform Publish Customizations

As a second step, we need to publish all customizations. In this step, you only need to choose your connection.
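In YAML form, this step might look like the sketch below. The task version and input names are assumptions that may differ between versions of the extension, and MyServiceConnection is a hypothetical service connection name.

```yaml
# Publishes all customizations in the connected environment.
- task: PowerPlatformPublishCustomizations@0
  displayName: 'Power Platform Publish Customizations'
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'MyServiceConnection'   # hypothetical service connection name
```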

Please refer to my previous blog post or video to understand how to create the service connection for the Power Platform Build Tools.

Blog post Reference: https://d365dotblog.com/2020/06/01/alm-how-to-create-service-connection-for-the-powerapps-build-tools/

Video Reference: https://d365dotblog.com/2020/06/16/alm-power-platform-tips-tricks-create-service-connection-for-the-powerapps-build-tools-in-azure-devops-vsts/

3. PowerPlatform Export Solution

As a next step, we need to export the solution from the Power Platform source instance. We use the "Power Platform Export Solution" task to do this.

Note: Please update the solution name and solution output file in the Power Platform Export Solution task.

Solution Name: $(SolutionName)

Solution Output File: $(Build.ArtifactStagingDirectory)\$(SolutionName).zip
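Expressed in YAML, the export step could be sketched as follows. The task version and input names are assumptions that may differ between extension versions; MyServiceConnection is a hypothetical service connection name.

```yaml
# Exports the solution named in $(SolutionName) to the artifact staging folder.
- task: PowerPlatformExportSolution@0
  displayName: 'Power Platform Export Solution'
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'MyServiceConnection'   # hypothetical service connection name
    SolutionName: '$(SolutionName)'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'
```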

4. PowerPlatform Unpack Solution

As a next step, we are going to unpack the solution using the Power Platform Unpack Solution task as below:

The Solution Input File should be the same as the output file from the last step. In our case:


Target Folder to Unpack Solution should be the folder where you would like to store your unpacked solution in the repo. In our case, we will have a folder in the root which has the name of the solution.
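As a YAML sketch (again, the task version and input names are assumptions and may differ between extension versions):

```yaml
# Unpacks the exported zip into a folder named after the solution in the repo root.
- task: PowerPlatformUnpackSolution@0
  displayName: 'Power Platform Unpack Solution'
  inputs:
    SolutionInputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'
    SolutionTargetFolder: '$(Build.SourcesDirectory)\$(SolutionName)'
```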


5. Commit Solution to Repo

The final step is to add the extracted solution to our repo. To do this, we will add a standard "Command Line" step and add the following code to its "Script" field:

echo commit all changes
git config user.email "<email>"
git config user.name "Automatic Build"
git checkout master
git add --all
git commit -m "solution init"
echo push code to new repo
git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin master

You must replace “<email>” with the email of the user you would like to use to push your changes.
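If your pipeline is defined in YAML instead of the classic designer, the same command-line step can be sketched as an inline script (master is assumed as the target branch, as in the example above):

```yaml
# Commits the unpacked solution and pushes it back to the repo
# using the pipeline's OAuth token.
- script: |
    git config user.email "<email>"
    git config user.name "Automatic Build"
    git checkout master
    git add --all
    git commit -m "solution init"
    git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin master
  displayName: 'Commit solution to repo'
```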

6. General Pipeline Configuration

Below are some of the general configurations you need to enable in this pipeline.

For the agent job, we need to allow scripts to access the OAuth token. If this is not configured, our command-line script will not be able to authenticate to the repo and push our solution. The configuration should look like this:
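For reference, in a YAML pipeline the same effect is achieved by persisting credentials on the checkout step, so that later script steps can push with git (a sketch):

```yaml
steps:
- checkout: self
  persistCredentials: true   # keeps the OAuth token available to git commands in later steps
```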

In our Power Platform Export Solution step, we used a variable called "SolutionName". We need to make sure we update the solution name before running the VSTS build pipeline. Now you can test the pipeline by running it, either via "Queue" if you are still in edit mode or by using the "Run pipeline" button.

In my next blog, we will see how to pack the solution and deploy it in the Target instance using Power Platform Build Tools.

If you are interested in this topic and would like to do some further self-study, I encourage you to check out my blog on this.

ALM – Power Platform – Copy Azure DevOps Build/Release to New Project

Here is my next video on “Tips & Tricks in ALM with PowerApps Build Tools”.

Please do subscribe to my channel.

In this video, I have explained how to copy an Azure DevOps build/release pipeline to a new project.

Here is the link for my course on Power Platform – ALM Fundamentals https://powerspark.newzenler.com/

ALM – Power Platform – Tips & Tricks -1.Using secrets from Azure Key Vault in a pipeline

I have started my new series of videos, "Tips & Tricks in ALM with PowerApps Build Tools", starting this week.

Please do subscribe to my channel.

In this video, I have explained how to use secrets from Azure Key Vault in VSTS build and release definitions.

Here is the link for Power Platform – ALM Fundamentals https://powerspark.newzenler.com/

PowerShell scripts to export and import the Dynamics 365 CE solution

In this blog, we will see how to export and import a Dynamics 365 CE solution using a PowerShell script.

In my PowerShell script, I am using Microsoft.Xrm.Data.Powershell.


This function checks whether the Microsoft.Xrm.Data.Powershell module is present on your machine; if not, it installs it.


This function establishes the connection to CRM, taking the Dynamics 365 CE instance URL, username and password.

Before running the PowerShell script, please make sure you have updated the values below.

  1. $SolutionName – Name of the solution you want to export from the source instance
  2. $SolutionFilePath – Desired folder path to export the solution to
  3. $CRMSourceUserName – Username of the Dynamics 365 CE source instance
  4. $CRMDestinationUserName – Username of the Dynamics 365 CE destination instance
  5. $CRMDestinationPassword – Password of the Dynamics 365 CE destination instance
  6. $CRMSourcePassword – Password of the Dynamics 365 CE source instance
  7. $CRMSourceUrl – URL of the Dynamics 365 CE source instance
  8. $CRMDestinationUrl – URL of the Dynamics 365 CE destination instance

After updating the values in the script below, you can copy and paste it into Windows PowerShell and run it. This will establish a connection to the Dynamics 365 CE source instance, export the solution mentioned and import it into the Dynamics 365 CE destination instance.

$solutionName = "BuildAutomation"

# Update the placeholder values below before running the script
$SolutionFilePath = "C:\Temp"                                   # folder to export the solution to
$CRMSourceUserName = "admin@source.onmicrosoft.com"             # source instance username
$CRMSourcePassword = "sourcePassword"                           # source instance password
$CRMSourceUrl = "https://source.crm.dynamics.com"               # source instance URL
$CRMDestinationUserName = "admin@destination.onmicrosoft.com"   # destination instance username
$CRMDestinationPassword = "destinationPassword"                 # destination instance password
$CRMDestinationUrl = "https://destination.crm.dynamics.com"     # destination instance URL

Set-StrictMode -Version Latest

function InstallRequiredModule {
    Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass -Force
    $moduleName = "Microsoft.Xrm.Data.Powershell"
    $moduleVersion = "2.7.2"
    if (!(Get-Module -ListAvailable -Name $moduleName)) {
        Write-Host "Module not found, installing now"
        Install-Module -Name $moduleName -MinimumVersion $moduleVersion -Force
    }
    else {
        Write-Host "Module found"
    }
}

function EstablishCRMConnection {
    param(
        [string]$user,
        [string]$secpasswd,
        [string]$crmUrl
    )
    # Password is intentionally not echoed
    Write-Host "UserId: $user CrmUrl: $crmUrl"
    $securePassword = ConvertTo-SecureString -String $secpasswd -AsPlainText -Force
    Write-Host "Creating credentials"
    $credentials = New-Object System.Management.Automation.PSCredential ($user, $securePassword)
    Write-Host "Credentials object created"
    Write-Host "Establishing CRM connection next"
    $crm = Connect-CrmOnline -Credential $credentials -ServerUrl $crmUrl
    Write-Host "CRM connection established"
    return $crm
}

InstallRequiredModule

Write-Host "going to create source connection"
$CrmSourceConnectionString = EstablishCRMConnection -user "$CRMSourceUserName" -secpasswd "$CRMSourcePassword" -crmUrl "$CRMSourceUrl"
Write-Host "source connection created"
Set-CrmConnectionTimeout -conn $CrmSourceConnectionString -TimeoutInSeconds 1000

Write-Host "going to create destination connection"
$CrmSourceDestinationString = EstablishCRMConnection -user "$CRMDestinationUserName" -secpasswd "$CRMDestinationPassword" -crmUrl "$CRMDestinationUrl"
Write-Host "destination connection created"
Set-CrmConnectionTimeout -conn $CrmSourceDestinationString -TimeoutInSeconds 1000

Write-Host "Publishing customizations in source environment"
Publish-CrmAllCustomization -conn $CrmSourceConnectionString
Write-Host "Publishing completed in source environment."

Write-Host "Exporting solution"
Export-CrmSolution -conn $CrmSourceConnectionString -SolutionName "$solutionName" -SolutionFilePath "$SolutionFilePath" -SolutionZipFileName "$solutionName.zip"
Write-Host "Solution exported."

Write-Host "Importing solution"
Import-CrmSolution -conn $CrmSourceDestinationString -SolutionFilePath "$SolutionFilePath\$solutionName.zip"
Write-Host "Solution imported"

Write-Host "Publishing customizations in destination environment"
Publish-CrmAllCustomization -conn $CrmSourceDestinationString
Write-Host "Publishing completed in destination environment"

I hope this helps.

If you are interested in this topic and would like to do some further self-study, I encourage you to check out my blog on this.

CI/CD & Test Automation for Dynamics 365 in Azure DevOps/VSTS- Part 3 – Release Definition

In my previous blog, I wrote about how to set up a VSTS build definition. This blog will build on that by setting up a VSTS release pipeline in Azure DevOps/VSTS. I will assume you have a QA, a UAT and a production environment. The package (CRM solutions) from the build automation blog will be deployed to these environments. It will be set up in a very basic way; after that, I will elaborate on other options in Azure DevOps that you can include in your pipeline.

Dynamics CRM CI/CD process

The below diagram illustrates the basic flow of the Dynamics 365 CE Workflow process.

In this part, we are going to look at the highlighted step of the process in detail, using the following tools:


  • Visual Studio Team Services (VSTS)/ Azure DevOps
  • Dynamics 365 Build Tools by Wael Hamze

Setting up the variables for the connection to CRM

You can create variable groups in VSTS, which is useful for variables that belong together. We will use this to create a group for the credentials of your development environment. Later, if you implement automated deployment, you can store the credentials for other environments there too.

In VSTS go to “Build and Release” and select “Library”. Here you can create variable groups.

Next click on “+ Variable Group”. This will take you to a form where you can create a variable group.

Now give your variable group a name. I will assume you have a test, a UAT and a production environment; the package from the build automation blog will be deployed to these environments. So, we will create three connection strings, one each for the test, UAT and production environments.

  • Connection string – AuthType=$(AuthType);Username=$(Username);Password=$(Password);Url=$(Url)
  • URL- Enter your instance URL.
  • Username – Enter the username of your instance
  • Password – Enter the password of your instance
  • Authtype – Office365

Below is a sample connection string for your reference:

AuthType=Office365;Username=jsmith@contoso.onmicrosoft.com; Password=passcode;Url=https://contoso.crm.dynamics.com

Reference Link https://docs.microsoft.com/en-us/previous-versions/dynamicscrm-2016/developers-guide/mt608573(v=crm.8)?redirectedfrom=MSDN

VSTS Release Definition

This release definition will import our packed solution file and then publish the customizations to our Dynamics 365 CE Online sandbox instances (QA, UAT & production environments).

Once the build definition succeeds, the VSTS release definition will trigger automatically and perform the following steps:

You can create a release definition directly from Visual Studio Online (VSTS/Azure DevOps). To create a new pipeline, go to Pipelines and then Releases. After that, click New and then New release pipeline.

You will be asked to select a template. Those templates are really useful for setting up deployments to Azure, but for Dynamics CRM deployments there is no template, so select an empty job.

Next, you will be asked to provide a stage name and stage owner. A stage is an environment, so in our case we name it "QA". The owner can be left at the default (your account). After that, you can press the 'X' on the top right of the stage tab to close it.

Now you will see the pipeline as below.

  1. You can edit the name of the pipeline and provide a name of your choice. Below that is a navigation bar with different tabs.
  2. Artifacts: An artifact is usually the output of a VSTS build definition, but there are other options, such as an Azure Container Registry. If you click on "Add an artifact" you can see all the options available. In this blog, we will only use the default, which is a build output.
  3. Stages: Here you can copy and add new stages for the QA, UAT and production environments.

Adding an artifact

To add an artifact, click on the "Add an artifact" button. Now select the correct details: for Project, select your current team project; for Source, select the build pipeline you created based on the previous blog. Set Default version to "Default". Finally, provide the source alias name and click "Add" to finish.

Now, we need to add the task for stage QA. Click on stage “QA”. It will navigate you to the task page as follows:

Now you will be in the task editor. Here you can add tasks similar to the VSTS build definition task editor.

To add a task, click on the + icon next to "Agent Job". You will get a list of tasks; search for the "MSCRM Tool Installer" task and add it.

Next, add the MSCRM Import Solution task, click on it and fill in the details as below:

Display Name: Provide the name of your choice(Example: Import Solution to QA)
CRM Connection String: Provide the QA connection string variable which we have created earlier.
Solution File: Use the three dots at the right corner to select the correct solution (it will show the artifacts of the last successful build, so make sure your solution is there).
Checkboxes: Select the checkboxes below according to your needs

Then add the MSCRM Publish Customizations task as follows:

Setting up the environments

Now that we have set up the QA environment, we need to set up the same for the UAT and production environments as well. Click on "Pipeline" in the navigation bar to go back to the overview. The UAT environment is the same as the QA environment, just with different variables (remember we added those in the library). An easy way to create that environment is to clone the QA environment. To do that, hover over the QA stage and click the icon with the two papers, as shown below:

It will create a "Copy of QA" stage, connected to the QA stage. Click on it, change the name to "UAT" and close the tab by pressing the 'X' button. Now you have the UAT stage too. The fact that the two stages are connected means that the UAT deployment will automatically start when the QA stage is successfully deployed. Similarly, we need to clone the stage and update the values for the production environment too.

Connecting variables

One more thing we need to do is link the variable groups we created at the start to the stages. You do that by clicking 'Variables' in the navigation bar and then "Variable groups".

After that, click on "Link variable group". Select the variable group and click on the Link button as shown below:

The VSTS release definition setup is now done. Next, we will quickly explore the advanced options in the VSTS release definition.

Advanced Options


Triggers define when a stage will be deployed. You can open it by clicking the lightning icon next to a stage.

There are 3 options you can select:

After Release

This means the stage will be deployed right after the release is created. This is used to automatically start the first stage in the pipeline after a release is created. In build pipelines, you can set a release to be created automatically when the build completes. If your pipeline has both options enabled, then a successful build will automatically deploy to the first stage.

After Stage

This means your stage will deploy whenever one or more stages have successfully deployed. If you select multiple stages, every stage needs to complete successfully in order to start this stage. Optionally, you can select the checkbox to also deploy if the previous stages "partially succeeded" instead of just "succeeded".

Manual Only

This means the deployment of the stage has to be manually started.


In addition to triggers, you can set approvals. You can configure them by clicking the person icon next to a stage. There are two types.

Pre-deployment approvals

Somebody needs to approve that a stage will be deployed. This approval triggers when the deployment of a stage is about to start, either via an automatic trigger or a manual start. Approving starts the deployment of that stage; rejecting sets the status of the stage to "not deployed".

Post-deployment approvals

Somebody needs to approve that the deployment of a stage is successful. This approval will trigger after the last deployment task is successfully completed. Approving this results in a succeeded deployment and rejecting it in a failed deployment.

For both options, you can select the approvers and a timeout after which the approval is automatically rejected. You can also specify that if a user manually starts the deployment of a stage, that user cannot approve it. Finally, you can choose to skip the approval when the previous stage was approved by somebody who is an approver of this stage.


Sometimes you may want to start releases on a specific day or time. For that, you can set a schedule. There are two scheduling options, each resulting in different behavior.

Release Trigger Scheduling

You can set this schedule by clicking the scheduling button below the artifacts. A new release will be created at the specific times configured. To also deploy to the first stage, make sure that at least one stage has the "After Release" trigger; otherwise a new release will be created but nothing will be deployed. Also keep in mind that a new release will be created even if there is no new artifact available.

Stage Schedules

You can set this schedule by opening the pre-deployment conditions and enabling the 'Schedule' option. Here you can define one schedule for when to deploy this stage.

I hope you now have a clear picture of how to create the VSTS release definition.

In my next blog, we will see how to enable the gated check-in.

If you are interested in this topic and would like to do some further self-study, I encourage you to check out my blog on this.