Category Archives: Continuous Integration

Getting Started with Power BI Deployment Pipelines

In this blog, I will explain how we can use deployment pipelines for Power BI. Deployment pipelines are now generally available in Power BI.

Deployment pipelines help enterprise BI teams build an efficient and reusable release process by maintaining development, test, and production environments.

BI teams adopting deployment pipelines will enjoy:

  1. Improved productivity
  2. Faster delivery of content updates
  3. Reduced manual work and errors

Pre-requisites

You will be able to access the deployment pipelines feature if the following conditions are met:

You have one of the following Premium licenses:

Create a deployment pipeline

You can create a pipeline from the deployment pipelines tab, or from a workspace.

After the pipeline is created, you can share it with other users or delete it. Sharing a pipeline grants the recipients access to it; pipeline access enables users to view, share, edit, and delete the pipeline.

Create a pipeline from the deployment pipelines tab

To create a pipeline from the deployment pipelines tab, do the following:

  1. In Power BI service, from the navigation pane, select Deployment pipelines and then select Create pipeline.
  2. In the Create a deployment pipeline dialog box, enter a name and description for the pipeline, and select Create.

Create a pipeline from a workspace

You can create a pipeline from an existing workspace, provided you're an admin of a new-experience workspace.

  1. From the workspace, select Create a pipeline.
  2. In the Create a deployment pipeline dialog box, enter a name and description for the pipeline, and select Create.

Assign a workspace to a deployment pipeline

After creating a pipeline, you need to add the content you want to manage to the pipeline. Adding content to the pipeline is done by assigning a workspace to the pipeline stage. You can assign a workspace to any stage.

Only one workspace can be assigned to a deployment pipeline. The pipeline then creates clones of the workspace content, to be used in the different stages of the pipeline.

Follow these steps to assign a workspace in a deployment pipeline:

  1. In the newly created deployment pipeline, select Assign a workspace.
  2. In the Choose the workspace drop-down menu, select the workspace you want to assign to the pipeline.
  3. Select the stage you want to assign the workspace to.

Workspace assignment limitations

Deploying reports

Select the stage to deploy from and then select the deployment button. The deployment process creates a duplicate workspace in the target stage. This workspace includes all the content existing in the current stage.

In the diagram below, you can see content deployed from the Development stage to Test, and from Test to Production.

In upcoming blogs, I will explain selective deployment, backwards deployment, creating dataset rules, and overriding content.

If you are interested in this topic and would like to do some further self-study, I encourage you to check out my blog on this.

ALM For Power Platform/CDS In Azure-Devops/VSTS Using Power Platform Build Tools – Part 2

In the previous post, I showed you how to set up Azure DevOps and install the Power Platform Build Tools. In this blog post, I will show you how we can export a solution from a Dynamics 365 CE instance and commit it to VSTS/Azure DevOps.

Below is the VSTS build pipeline, which will include the following steps.

How to set up a VSTS Build Definition

In the build pipeline, we will export the CRM solution, unpack it using the Power Platform Build Tools, and store the solution file in Artifacts. You can create a build definition directly from Visual Studio Online (VSTS/Azure DevOps) or from within Visual Studio. First, I will show you how to create a build definition from within Visual Studio: navigate to the Builds tab in Team Explorer:

Once there, you can click New Build Definition to be taken directly to Visual Studio Online. This is where you would start if you had decided to create the build definition directly from Visual Studio Online instead of starting in Visual Studio.

On the dialog box that pops up in the browser, we’ll select Visual Studio as our build template, but you can see there are other templates for use, such as Xamarin for Android or iOS and the Universal Windows Platform. The default settings for your build definition should be correct for the most part, but you will need to check the Continuous Integration checkbox. Here is what they look like for this example:

Because this is a simple example and we don't need the additional flexibility the Default queue provides, we can leave the default Hosted option selected in the Default agent queue field. See the Azure DevOps documentation for more information on the restrictions of the Hosted pool.

 
You can see the checkbox for CI at the bottom of the dialog. This is enabled so that Visual Studio Online will execute the defined build definition for each check-in. The build definition will define whether this build code is published to an environment. Since we want to continually build and deploy to our web environment, we will check this box.
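For reference, the YAML-pipeline equivalent of that CI checkbox is a trigger block. Here is a minimal sketch, assuming your default branch is master:

# CI trigger sketch (YAML pipelines): run the pipeline on every push to master,
# mirroring the Continuous Integration checkbox described above.
trigger:
  branches:
    include:
    - master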

We can also create the build definition from Azure DevOps by following the steps below.

Navigate to Pipelines -> Builds -> Click New Pipeline

Click the visual designer option, which will allow you to build the pipeline using the GUI.

Select the team project, repository, and branch, and click the Continue button. For demo purposes, I have named the build definition ALM – PowerApps and Dynamics 365-CI.

In the next step, select an empty job as shown below:

After selecting an empty job, you will see the empty agent job; set the Agent Pool to Hosted.

Next, we need to configure the VSTS build definition as shown in the diagram below; a YAML sketch of steps 1–4 follows step 4.

1. Power Platform Tool Installer

Every pipeline that uses the Power Platform Build Tools must install them as a first step. This ensures the tools are actually available on the agent.

2. Power Platform Publish Customizations

As a second step, we need to publish all customizations. In this step, you only need to choose your connection.

Please refer to my previous blog or video to understand how to create the service connection for the Power Platform Build Tools.

Blog post Reference: https://d365dotblog.com/2020/06/01/alm-how-to-create-service-connection-for-the-powerapps-build-tools/

Video Reference: https://d365dotblog.com/2020/06/16/alm-power-platform-tips-tricks-create-service-connection-for-the-powerapps-build-tools-in-azure-devops-vsts/

3. Power Platform Export Solution

As a next step, we need to export the solution from the Power Platform source instance, using the "Export Solution" task.

Note: Please update the solution name and the solution output file in the Power Platform Export Solution task.

Solution Name: $(SolutionName)

Solution Output File: $(Build.ArtifactStagingDirectory)\$(SolutionName).zip

4. Power Platform Unpack Solution

Next, we are going to unpack the solution using the Power Platform Unpack Solution task, as below:

Solution Input File should be the same as the output in the last step. In our case:

$(Build.ArtifactStagingDirectory)\$(SolutionName).zip

Target Folder to Unpack Solution should be the folder where you would like to store your unpacked solution in the repo. In our case, we will have a folder in the root which has the name of the solution.

$(Build.ArtifactStagingDirectory)\$(SolutionName)
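If you prefer YAML pipelines over the visual designer, steps 1–4 could be sketched roughly as below. Treat this as a sketch, not a definitive definition: verify the task versions and input names against your installed version of the Power Platform Build Tools, and note that 'PowerPlatformConnection' is a placeholder service connection name.

# Sketch of build steps 1-4 using Power Platform Build Tools tasks (versions may differ).
steps:
# 1. Install the Power Platform Build Tools on the agent.
- task: PowerPlatformToolInstaller@0
# 2. Publish all customizations in the source environment.
- task: PowerPlatformPublishCustomizations@0
  inputs:
    authenticationType: PowerPlatformEnvironment
    PowerPlatformEnvironment: 'PowerPlatformConnection' # placeholder connection name
# 3. Export the solution into the artifact staging directory.
- task: PowerPlatformExportSolution@0
  inputs:
    authenticationType: PowerPlatformEnvironment
    PowerPlatformEnvironment: 'PowerPlatformConnection'
    SolutionName: '$(SolutionName)'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'
# 4. Unpack the exported solution so it can be committed to source control.
- task: PowerPlatformUnpackSolution@0
  inputs:
    SolutionInputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'
    SolutionTargetFolder: '$(Build.ArtifactStagingDirectory)\$(SolutionName)'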

5. Commit solution to repo

The final step is to add the extracted solution to our repo. To do this, we will add a standard "Command Line" step and add the following code to the "Script" field:

echo commit all changes
git config user.email "<email>"
git config user.name "Automatic Build"
git checkout master
git add --all
git commit -m "solution init"
echo push code to new repo
git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin master

You must replace “<email>” with the email of the user you would like to use to push your changes.
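In a YAML pipeline, the same step could be sketched as a Command Line task carrying the script above; note that the $(System.AccessToken) macro only works if the job is allowed to access the OAuth token, which is covered in the next section.

# Sketch: the commit step above expressed as a YAML Command Line task.
- task: CmdLine@2
  inputs:
    script: |
      echo commit all changes
      git config user.email "<email>"
      git config user.name "Automatic Build"
      git checkout master
      git add --all
      git commit -m "solution init"
      echo push code to new repo
      git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin master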

6. General pipeline configuration

Below are some of the general configurations you need to enable in this pipeline.

For the agent, we need to allow scripts to access the OAuth token. If this is not configured, our command-line script will not be able to connect to the repo and push our solution. The configuration should look like this:

In our Power Platform Export Solution step, we used a variable called "SolutionName". We need to make sure the solution name is updated before running the VSTS build pipeline. Now you can test the pipeline by running it, either via "Queue" if you are still in edit mode or by using the "Run pipeline" button.

In my next blog, we will see how to pack the solution and deploy it in the Target instance using Power Platform Build Tools.

If you are interested in this topic and would like to do some further self-study, I encourage you to check out my blog on this.

ALM – Power Platform – Tips & Tricks – Create Service Connection for the PowerApps Build Tools in Azure DevOps/VSTS

Here is my next video on “Tips & Tricks in ALM with PowerApps Build Tools”.

Please do subscribe to my channel.

In this video, I have explained how to Create Service Connection for the PowerApps Build Tools in Azure DevOps/VSTS.

Here is the link for my course on Power Platform – ALM Fundamentals https://powerspark.newzenler.com/

ALM – Power Platform – Copy Azure DevOps Build/Release to New Project

Here is my next video on “Tips & Tricks in ALM with PowerApps Build Tools”.

Please do subscribe to my channel.

In this video, I have explained how to Copy Azure DevOps Build/Release Pipeline to New Project.

Here is the link for my course on Power Platform – ALM Fundamentals https://powerspark.newzenler.com/

ALM- How to create Service Connection for the PowerApps Build Tools

In this blog, we will see how to create the service connection for PowerApps Build Tools.

Pre-requisite: Azure DevOps

You need a service endpoint for the source and the target environments that you want to export the solution from and import it to, for example https://powerappsbuildtools.crm.dynamics.com. Service endpoints can be defined under Service Connections -> Generic Service Connection in Project Settings.

Navigate to Project Settings -> Service connections -> Click on New service connection.

Select Generic as the service connection type.

Provide the following details for the generic service connection:

  • Username – Username for connecting to the endpoint
  • Server URL – for example, https://powerappsbuildtools.crm.dynamics.com
  • Password – Password/token key for connecting to the endpoint
  • Service connection name – Give the service connection a meaningful name
  • Description (optional)
  • Security – Grant access permission to all pipelines – Enable

NOTE: MFA is not yet supported. Microsoft may add MFA support to the service connection in an upcoming release.

Once the above step is done, you will see your service connection in the list. You can add further connections for other environments in the same way.

Once the connection is established, you will be able to see this service connection in the PowerApps Build Tools tasks, such as exporting and importing the solution, as follows:
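To give an idea of how the connection is consumed, here is a hedged YAML sketch of an export task referencing a generic username/password connection; the task version and the connection name 'Dynamics365Connection' are placeholders to verify against your environment.

# Sketch: referencing the service connection from a Power Platform Build Tools task.
- task: PowerPlatformExportSolution@0
  inputs:
    authenticationType: PowerPlatformEnvironment      # username/password connection
    PowerPlatformEnvironment: 'Dynamics365Connection' # placeholder connection name
    SolutionName: '$(SolutionName)'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'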

I hope this helps. If you have any challenges in creating the service connection, please feel free to reach out to me; I am always happy to assist with your queries.

If you are interested in this topic and would like to do some further self-study, I encourage you to check out my blog on this.

ALM – Power Platform – Tips & Tricks -1.Using secrets from Azure Key Vault in a pipeline

I am starting my new series of videos, "Tips & Tricks in ALM with PowerApps Build Tools", this week.

Please do subscribe to my channel.

In this video, I have explained how to use secrets from Azure Key Vault in VSTS build and release definitions.
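As a quick reference, the usual building block for this is the Azure Key Vault task, which downloads secrets and exposes them as pipeline variables. A minimal sketch, assuming a service connection named 'MyAzureSubscription' and a vault named 'my-keyvault' (both placeholders):

# Sketch: fetch secrets from Azure Key Vault for use in later steps.
- task: AzureKeyVault@1
  inputs:
    azureSubscription: 'MyAzureSubscription' # placeholder Azure service connection
    KeyVaultName: 'my-keyvault'              # placeholder vault name
    SecretsFilter: 'CrmPassword'             # secret names to fetch, or '*' for all
# Each fetched secret becomes a pipeline variable, e.g. $(CrmPassword),
# which later build/release tasks can consume.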

Here is the link for Power Platform – ALM Fundamentals https://powerspark.newzenler.com/

Powershell scripts to export and import the Dynamics 365 CE solution

In this blog, we will see how to export and import the Dynamics 365 CE solution using the PowerShell script.

In my PowerShell script, I am using the Microsoft.Xrm.Data.Powershell module.

InstallRequiredModule

This function checks whether the Microsoft.Xrm.Data.Powershell module is present on your machine; if it is not, it installs the module.

EstablishCRMConnection

This function establishes the connection to CRM using the Dynamics 365 CE instance URL, username, and password that are passed in.

Before running the PowerShell script, please make sure you have updated the values below.

$solutionName ="BuildAutomation"
$SolutionFilePath="C:\Users\DBALASU\Source\Workspaces\BuildAutomation\BuildAutomation.Solutions\BuildAutomation"
$CRMSourceUserName=""
$CRMSourcePassword=""
$CRMDestinationUserName=""
$CRMDestinationPassword=""
$CRMSourceUrl="https://instancename.crm.dynamics.com"
$CRMDestinationUrl="https://instancename.crm5.dynamics.com"
  1. $SolutionName – Name of the solution that you want to export from the source instance
  2. $SolutionFilePath – Desired folder path for the exported solution
  3. $CRMSourceUserName – Username of the Dynamics 365 CE source instance
  4. $CRMSourcePassword – Password of the Dynamics 365 CE source instance
  5. $CRMDestinationUserName – Username of the Dynamics 365 CE destination instance
  6. $CRMDestinationPassword – Password of the Dynamics 365 CE destination instance
  7. $CRMSourceUrl – URL of the Dynamics 365 CE source instance
  8. $CRMDestinationUrl – URL of the Dynamics 365 CE destination instance

After updating the values in the script below, you can copy and paste it into Windows PowerShell and run it. It will establish a connection to the Dynamics 365 CE source instance, export the specified solution, and import it into the Dynamics 365 CE destination instance.

$solutionName ="BuildAutomation"
$SolutionFilePath="C:\Users\DBALASU\Source\Workspaces\BuildAutomation\BuildAutomation.Solutions\BuildAutomation"
$CRMSourceUserName=""
$CRMSourcePassword=""
$CRMDestinationUserName=""
$CRMDestinationPassword=""
$CRMSourceUrl="https://instancename.crm.dynamics.com"
$CRMDestinationUrl="https://instancename.crm5.dynamics.com"

Set-StrictMode -Version latest

# Checks whether Microsoft.Xrm.Data.Powershell is available and installs it if not.
function InstallRequiredModule {
    Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass -Force
    $moduleName = "Microsoft.Xrm.Data.Powershell"
    $moduleVersion = "2.7.2"
    if (!(Get-Module -ListAvailable -Name $moduleName)) {
        Write-Host "Module not found, installing $moduleName $moduleVersion now"
        Install-Module -Name $moduleName -MinimumVersion $moduleVersion -Force
    }
    else {
        Write-Host "Module found"
    }
}

# Builds a credential object and connects to a Dynamics 365 CE instance.
function EstablishCRMConnection {
    param(
        [string]$CRMUserName,
        [string]$CRMSecPasswd,
        [string]$crmUrl)
    # Do not log the password; echo only the user id and URL.
    Write-Host "UserId: $CRMUserName CrmUrl: $crmUrl"
    $CRMSecPasswdString = ConvertTo-SecureString -String $CRMSecPasswd -AsPlainText -Force
    Write-Host "Creating credentials"
    $Credentials = New-Object System.Management.Automation.PSCredential ($CRMUserName, $CRMSecPasswdString)
    Write-Host "Credentials object created"
    Write-Host "Establishing CRM connection next"
    $crm = Connect-CrmOnline -Credential $Credentials -ServerUrl $crmUrl
    Write-Host "CRM connection established"
    return $crm
}

InstallRequiredModule

#Update Source CRM instance details below:
Write-Host "going to create source connection"
$CrmSourceConnectionString = EstablishCRMConnection -CRMUserName "$CRMSourceUserName" -CRMSecPasswd "$CRMSourcePassword" -crmUrl "$CRMSourceUrl"
Write-Host "source connection created"
Set-CrmConnectionTimeout -conn $CrmSourceConnectionString -TimeoutInSeconds 1000

Write-Host "going to create destination connection"
$CrmSourceDestinationString = EstablishCRMConnection -CRMUserName "$CRMDestinationUserName" -CRMSecPasswd "$CRMDestinationPassword" -crmUrl "$CRMDestinationUrl"
Write-Host "destination connection created"
Set-CrmConnectionTimeout -conn $CrmSourceDestinationString -TimeoutInSeconds 1000

Write-Host "Publishing Customizations in source environment"
Publish-CrmAllCustomization -conn $CrmSourceConnectionString
Write-Host "Publishing Completed in source environment."

Write-Host "Exporting Solution"
Export-CrmSolution -conn $CrmSourceConnectionString -SolutionName "$solutionName" -SolutionFilePath "$SolutionFilePath" -SolutionZipFileName "$solutionName.zip" 
Write-host "Solution Exported."

Write-host "Importing Solution"
Import-CrmSolution -conn $CrmSourceDestinationString -SolutionFilePath "$SolutionFilePath\$solutionName.zip"
Write-host "Solution Imported"

Write-Host "Publishing Customizations in destination environment"
Publish-CrmAllCustomization -conn $CrmSourceDestinationString
Write-Host "Publishing Completed in destination environment"

I hope this helps.

If you are interested in this topic and would like to do some further self-study I encourage you to check out my blog on this.

CI/CD & Test Automation for Dynamics 365 in Azure DevOps/VSTS -Part 5 – Master Data Deployment

In my previous blog, I wrote about how to set up a gated check-in. In this blog, we will see how to move the master data from the source to the target instance using our CI/CD pipeline.

Generally, we use the configuration migration tool to move the master data across multiple environments and organizations. Configuration data is used to define custom functionality in model-driven apps in Dynamics 365, such as Dynamics 365 Sales and Customer Service, and is typically stored in custom entities. Configuration data is different from end-user data (account, contacts, and so on). A typical example of configuration data is what you define in the Unified Service Desk for Dynamics 365 to configure a customized call center agent application. The Unified Service Desk entities, along with the configuration data that is stored in the entities, define an agent application.

Note: Disable plug-ins for all entities (or the selected entities) before exporting data, and then re-enable them on the target system after the import is complete.

Master data/Configuration data deployment

  1. Define the schema of the source data to be exported: The schema file (.xml) contains information about the data that you want to export, such as the entities, attributes, relationships, the definition of data uniqueness, and whether plug-ins should be disabled before exporting the data.

  2. Use the schema to export data: Use the schema file to export the data into a .zip file that contains the data and the schema of the exported data.

  3. Import the exported data: Use the exported data (.zip file) to import into the target environment. The data import is done in multiple passes: foundation data is imported first while dependent data is queued up, and the dependent data is then imported in subsequent passes to handle any data dependencies or linkages.

Instead of performing the above process manually, we are going to automate it using Azure DevOps.

Pre-requisites

  • Please make sure the latest configuration.xml (generated using the DataMigrationUtility.exe tool) has been generated and placed in the desired input location.
  • Please make sure the variables (correct connection string, CRM username, CRM password) have been updated in both the VSTS build and release definitions.

VSTS Build Definition

We have to create a separate build definition for moving the master data from the source to the target instance. Once the solution movement is done, the build definition below should trigger.

In my previous blog, I explained how to create a new build definition; please refer to that post.

What it will do – It connects to the source instance, exports the master data using configuration.xml, and pushes it to the artifacts repository.

In this Build Definition we have used the following MSCRM Build Tools tasks:

  • MSCRM Tool Installer – Installs the Dynamics 365 tools required by all of the tasks
  • MSCRM Export config migration data – Exports data from a CRM instance using a Configuration Migration schema file (How to prepare configuration schema file).

You have to update the connection string variable name and select the configuration.xml input location.

  • Publish build artifacts – Publishes the build artifacts to Azure Pipelines (see the YAML sketch below)
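Since Publish build artifacts is a standard Azure Pipelines task, its YAML form is a useful reference; a minimal sketch, assuming the exported data lands in the artifact staging directory:

# Sketch: publish the exported configuration data as the 'drop' artifact
# that the release definition picks up later.
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'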

Release Definition

Once the above build definition succeeds, this release definition will trigger automatically and perform the following tasks:

  • MSCRM Tool Installer – Installs the Dynamics 365 tools required by all of the tasks
  • MSCRM Import config migration data – Imports data exported using the Configuration Migration tool into a CRM instance

You have to update the connection string variable name and select the exported data zip file from the artifacts repository (drop).

This will import the master data/configuration data into our Dynamics Online sandbox instance.

In my next blog, we will see how to integrate the unit testing framework with the VSTS Build definition.

If you are interested in this topic and would like to do some further self-study I encourage you to check out my blog on this.

CI/CD & Test Automation for Dynamics 365 in Azure DevOps/VSTS -Part 4 – Gated Check-in

In my previous blog, I wrote about how to set up a VSTS release definition. In this blog, I am going to explain gated check-in, but before heading into it, we must know why we need gated check-in.

Gated check-in helps restrict developers from checking broken code into the source control system, and thus helps avoid blocking your team. With gated check-in, when a developer initiates a check-in, the project is built and the code is checked in only if the build succeeds. Gated check-in is suitable for projects whose overall build time is less than a few minutes.

For C# we have used StyleCop and FxCop, and for JavaScript and jQuery we have used JSHint.

We use StyleCop for custom code such as plugins, workflows, actions, and Web API.

Consider a group of developers working together, where ideally each one writes code in the exact same style.

More often than not, one style isn't better than another; it's just a matter of taste. In a team, or even in a single project, it's more important to be consistent than to choose the "right" style.

Agreeing on a style can be hard enough, but enforcing it shouldn't be something you do manually: that would be tedious and error-prone.

StyleCop is a tool that can automate this. Let’s have a look at how to set it up.

What is StyleCop?

StyleCop analyzes C# source code to enforce a set of style and consistency rules.

StyleCop used to be a Visual Studio plugin and a NuGet package, and you can still use it this way in Visual Studio 2019, 2017, etc.

Installing StyleCop

To add StyleCop to your project, right-click your project in Visual Studio's Solution Explorer, and choose "Manage NuGet Packages…":


Search for “StyleCop.Analyzers” and install the latest stable version:

Installing the NuGet package

Once it is installed, build the project solution. You might get StyleCop warnings such as the ones below.

  • Add XML comments
  • Generate an XML documentation file (this can be set in the project properties)
  • Add a file header (e.g., copyright information)
  • Put the “using” statements inside the “namespace” block
  • Put braces on a new line
  • Add an empty line between the two method definitions (Output2 and Output3)

Setup

The first step in integrating StyleCop into an MSBuild system is to obtain the default StyleCop MSBuild targets file. To do so, run the StyleCop installer, and select the MSBuild files option on the Custom Setup page. This will install the StyleCop MSBuild files into the {Program Files}\MSBuild\StyleCop folder.

Adding the Import Tag

Once the StyleCop MSBuild files are installed, the next step is to import the StyleCop targets file into your C# projects. This is done by adding an Import tag to each C# project file.

For example, to integrate StyleCop to the project SampleProject, open the project file SampleProject.csproj within your favorite text editor. Scroll down to the bottom of the file and add a new tag to import the StyleCop.targets file. This import tag should be added just below the import of Microsoft.CSharp.targets:

Code
<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  ...Contents Removed...
  <Import Project="$(MSBuildBinPath)\Microsoft.CSharp.targets" />
  <Import Project="$(ProgramFiles)\MSBuild\StyleCop\v4.4\StyleCop.targets" />
  ...Contents Removed...
</Project>

Save the modified .csproj file. The next time you build this project either within Visual Studio or on the command line, StyleCop will run automatically against all of the C# source files within the project.

Build Warnings Vs Errors

By default, StyleCop violations will show up as build warnings. To turn StyleCop violations into build errors, the flag StyleCopTreatErrorsAsWarnings must be set to false. This flag can be set as an environment variable on the machine, or within the build environment command window. Setting the flag this way will cause StyleCop violations to appear as build errors automatically for all projects where StyleCop build integration is enabled.
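For a one-off build, you can also pass the flag on the MSBuild command line instead of editing the environment or the project file. A sketch, assuming a Visual Studio Build task in the pipeline:

# Sketch: fail the build on StyleCop violations for this invocation only,
# by passing the property as an MSBuild argument.
- task: VSBuild@1
  inputs:
    solution: '**/*.sln'
    msbuildArgs: '/p:StyleCopTreatErrorsAsWarnings=false'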

Alternately, this flag can be set within the project file for a particular project. Open the .csproj file for your project again, and find the first PropertyGroup section within the file. Add a new tag to set the StyleCopTreatErrorsAsWarnings flag to false. For example:

Code
<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
    <Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
    <ProductVersion>8.0.50727</ProductVersion>
    <SchemaVersion>2.0</SchemaVersion>
    <ProjectGuid>{4B4DB6AA-A021-4F95-92B7-B88B5B360228}</ProjectGuid>
    <OutputType>WinExe</OutputType>
    <AppDesignerFolder>Properties</AppDesignerFolder>
    <RootNamespace>SampleProject</RootNamespace>
    <AssemblyName>SampleProject</AssemblyName>
    <StyleCopTreatErrorsAsWarnings>false</StyleCopTreatErrorsAsWarnings>
  </PropertyGroup>
  ...
</Project>

The configuration described above will suffice to enable StyleCop build integration on an individual development machine. However, development teams working within a well-defined development environment can set up the build integration in a more global way, so that each developer does not have to manually install StyleCop on their machine.

To do this, copy all of the files from {Program Files}\MSBuild\StyleCop into a custom folder within your build environment, and check all of these files into your source control system. Next, define an environment variable within your development environment which points to the location of the StyleCop targets file. For example:

set StyleCopTargets=%enlistmentroot%\ExternalTools\StyleCop\v4.4\StyleCop.targets

With this configuration in place, it is simply a matter of adding the following import tag to each .csproj file within your development environment:

Code
<Import Project="$(MSBuildBinPath)\Microsoft.CSharp.targets" />
<Import Project="$(StyleCopTargets)" />

StyleCop will automatically run each time this project is built, no matter which developer is building the project. There is no need for each developer to install StyleCop manually, since the StyleCop binaries are checked directly into your source control system and are centrally integrated into your build environment.

What is CodeAnalysis?

To integrate Code Analysis into the build, unload and edit the project and add the following tags. Note that the paths might differ depending on the solution configuration.

<ItemGroup>
    <CodeAnalysisDictionary Include="$(SolutionDir)\CodeAnalysisDictionary.xml" />
</ItemGroup>
<Import Project="$(SolutionDir)\ExternalDlls\StyleCop 4.7\StyleCop.targets" /> 

Code Analysis configuration

Configure the project's Debug configuration to use the Code Analysis rules in the solution root, and do the same for the Release configuration. The only difference is that Code Analysis should not run in Debug mode, because it slows down the build. We keep CA running in Release to get error reports from continuous integration and to allow easily turning CA on by switching the solution from Debug to Release.

Once the above step is done, please commit the project solution files in Azure DevOps/VSTS repository.

How to enable gated check-in in a VSTS build definition

Go to the build definition -> Triggers -> you can see the gated check-in option as follows:

Check the gated check-in checkbox. Now gated check-in is enabled for this particular build definition.

In my next blog, we will see how to move the master data from source to target instance using our CI/CD pipeline.

If you are interested in this topic and would like to do some further self-study I encourage you to check out my blog on this.
