ALM For Power Platform/CDS In Azure DevOps/VSTS Using Power Platform Build Tools – Part 2

In the previous post, I showed you how to set up Azure DevOps and install the Power Platform Build Tools. In this blog post, I will show you how to export a solution from a Dynamics 365 CE instance and commit it to VSTS/Azure DevOps.

Below is the VSTS build pipeline, which will include the following steps:

How to Set Up a VSTS Build Definition

In the build pipeline, we will export the CRM solution, unpack it using the Power Platform Build Tools, and store the solution file in the build artifacts. You can create a build definition directly from Visual Studio Online (VSTS/Azure DevOps) or from within Visual Studio. First, I will show you how to create a build definition from within Visual Studio. Navigate to the Builds tab in Team Explorer:

Once there, you can click New Build Definition to be taken directly to Visual Studio Online. This is where you would start if you had decided to create the build definition directly from Visual Studio Online instead of starting in Visual Studio.

On the dialog box that pops up in the browser, we’ll select Visual Studio as our build template, but you can see there are other templates available, such as Xamarin for Android or iOS and the Universal Windows Platform. The default settings for your build definition should be correct for the most part, but you will need to check the Continuous Integration checkbox. Here is what the settings look like for this example:

Because this is a simple example and we don’t need the additional flexibility the Default queue provides, we can leave the default Hosted option selected in the Default agent queue field. For more information, see the documentation on the restrictions of the Hosted pool.

You can see the checkbox for CI at the bottom of the dialog. Enabling it tells Visual Studio Online to execute the build definition for each check-in. The build definition determines whether the built code is published to an environment. Since we want to continually build and deploy to our web environment, we will check this box.

We can also create the build definition from Azure DevOps by following the steps below.

Navigate to Pipelines -> Builds and click New Pipeline.

Click the Visual Designer option, which allows you to create the pipeline using the GUI.

Select the Team Project, Repository, and Branch, and click the Continue button. For demo purposes, I have named the build definition ALM – PowerApps and Dynamics 365-CI.

In the next step, select an empty job as shown below:

After selecting an empty job, you will see the empty agent job. Set the Agent Pool to Hosted.

Next, we need to configure the VSTS build definition as shown in the diagram below.

1. PowerPlatform Tool Installer

Every pipeline that uses the Power Platform Build Tools must install them as its first step. This ensures the tools are actually present on the agent.
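
For reference, if you build this same pipeline in YAML rather than the classic designer, the installer step would look roughly like the sketch below. The task name and version follow the Power Platform Build Tools extension at the time of writing; verify them against the version installed in your organization:

steps:
- task: PowerPlatformToolInstaller@0
  displayName: 'Power Platform Tool Installer'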

2. PowerPlatform Publish Customizations

As a second step, we need to publish all customizations. In this step, you only have to choose your service connection.

Please refer to my previous blog post or video to learn how to create the service connection for the Power Platform Build Tools.

Blog post Reference: https://d365dotblog.com/2020/06/01/alm-how-to-create-service-connection-for-the-powerapps-build-tools/

Video Reference: https://d365dotblog.com/2020/06/16/alm-power-platform-tips-tricks-create-service-connection-for-the-powerapps-build-tools-in-azure-devops-vsts/
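
With the service connection in place, a rough YAML equivalent of this step is sketched below. 'My Power Platform Connection' is a placeholder for the service connection created above, and the exact input names can differ between extension versions and connection types (username/password vs. service principal):

- task: PowerPlatformPublishCustomizations@0
  displayName: 'Power Platform Publish Customizations'
  inputs:
    authenticationType: PowerPlatformEnvironment
    PowerPlatformEnvironment: 'My Power Platform Connection'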

3. PowerPlatform Export Solution

As the next step, we need to export the solution from the source Power Platform instance. We will use the “Power Platform Export Solution” task to do this.

Note: Please update the Solution Name and Solution Output File fields in the Power Platform Export Solution task.

Solution Name: $(SolutionName)

Solution Output File: $(Build.ArtifactsStagingDirectory)\$(SolutionName).zip

4. PowerPlatform Unpack Solution

As the next step, we are going to unpack the solution using the Power Platform Unpack Solution task as shown below:

Solution Input File should be the same as the output file from the previous step. In our case:

$(Build.ArtifactStagingDirectory)\$(SolutionName).zip

Target Folder to Unpack Solution should be the folder where you would like to store your unpacked solution in the repo. In our case, we will have a folder in the root of the repo named after the solution. Note that this folder must be inside the repository checkout (the sources directory) rather than the artifact staging directory; otherwise the commit step below would have nothing to add.

$(Build.SourcesDirectory)\$(SolutionName)
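
A matching YAML sketch for the unpack task, using the input file and target folder from above:

- task: PowerPlatformUnpackSolution@0
  displayName: 'Power Platform Unpack Solution'
  inputs:
    SolutionInputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'
    SolutionTargetFolder: '$(Build.SourcesDirectory)\$(SolutionName)'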

5. Commit solution to repo

The final step is to add the extracted solution to our repo. To do this, we will add a standard “Command Line” task and enter the following code in its “Script” field:

echo commit all changes
git config user.email "<email>"
git config user.name "Automatic Build"
git checkout master
git add --all
git commit -m "solution init"
echo push code to new repo
git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin master

You must replace “<email>” with the email of the user you would like to use to push your changes.

6. General Pipeline configuration

Below are some of the general configurations you need to enable in this pipeline.

For the agent job, we need to allow scripts to access the OAuth token. If this is not configured, our command-line script will not be able to authenticate to the repo and push our solution. The configuration should look like this:
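
If you later move this definition to a YAML pipeline, note that there is no such checkbox there; a rough equivalent is to map the token into the script’s environment yourself, as in this sketch:

- script: |
    git -c http.extraheader="AUTHORIZATION: bearer $SYSTEM_ACCESSTOKEN" push origin master
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)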

In our PowerPlatform Export Solution step, we used a variable called “SolutionName”. Make sure it is set to the unique name of your solution before running the VSTS build pipeline. Now you can test the pipeline by running it. This can be done either via the “Queue” option if you are still in edit mode or by using the “Run pipeline” button.
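
In the classic designer, this variable is defined on the pipeline’s Variables tab. In YAML it would be a variables block like the following, where ContosoCore is a hypothetical solution unique name:

variables:
  SolutionName: 'ContosoCore'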

In my next blog, we will see how to pack the solution and deploy it to the target instance using the Power Platform Build Tools.

If you are interested in this topic and would like to do some further self-study, I encourage you to check out my blog on this.

ALM for Power Platform/Dynamics 365 Projects in Azure DevOps/VSTS using Power Platform Build Tools – PART 1

A couple of months back, I wrote a post on how to implement CI/CD for Dynamics 365 CE. That post can be found here and is still relevant.

In this blog series, we will explore building out DevOps processes for Dynamics 365 Customer Engagement (CE) and Power Platform projects by utilizing the Power Platform Build Tools.

Problem Statement

I started working on ALM when I was assigned as a Platform Engineer on a Dynamics 365 CE and Power Platform implementation project. At that time, I had a few key challenges with deployments, which I have listed below:

  • Solution files were manually exported and imported into the target environment as the deployment process
  • Different deployment processes were followed across release environments. For example, in the Dev and SIT environments the solution was migrated manually, while in the UAT, Pre-Prod, and Prod environments a DB compare was used to promote changes
  • Master data was manually entered in each environment
  • Multiple developers working in the same organization were overwriting each other’s changes

To solve the above challenges, I started working on an ALM process. Before we start solving the problem, let us take a moment to look at some of the pre-requisites required for building the ALM process.

Pre-Requisites

How to Set Up Azure DevOps

You will need an Azure DevOps environment. Navigate to https://azure.microsoft.com/en-us/services/devops/ and click on the Start free option.

You will need to sign in with your Microsoft account. After logging in, it will take you to https://dev.azure.com/dharani1743/

Next, we can start by creating a new project in Azure DevOps. This will contain your pipeline and your source repository.

After creating the project, you will be able to see the below screen

Install PowerPlatform Build Tools

Next, we will see how we can install the PowerPlatform Build Tools into your Azure DevOps instance.

Choose your specific Azure DevOps organization (in case you have more than one).

Select Install and complete the installation. Now the PowerPlatform Build Tools are installed in your Azure DevOps instance.

In my next blog, we will see how to set up the repository and how to create VSTS build and release definitions to save a Power Platform solution to source control and import the solution into the target instance.

If you are interested in this topic and would like to do some further self-study, I encourage you to check out my blog on this.

Resource Scheduling Optimization (RSO) – Initial configuration steps – Part 1

In my previous blog, I explained “How to configure the RSO in Dynamics 365 Instance“.

In this blog, I will explain how to start with the Initial configuration steps of RSO.

Once the RSO solution has been deployed, you will be able to see the RSO Application as follows:

After installing RSO, a few initial configuration steps need to be completed inside your Dynamics 365 instance: we need to enable the functionality in the environment and define what the solution should optimize.

By default, RSO is not enabled (it is in a turned-off state), so it will need to be turned on.

This can be enabled from the Resource Scheduling Optimization app by selecting Administration > Resource Scheduling Parameters.

When RSO is installed, it adds a Resource Scheduling Optimization tab. From this tab, you can enable RSO by setting the Enable Resource Scheduling Optimization field to Yes.

If needed, you can define a default goal for the organization: select an existing default goal, or create your own goal from the Optimization goals tab.

How to configure Bing Maps in RSO?

RSO mainly uses map functionality to locate the closest resource to work on an item, so it is important that the organization has enabled maps.

By default, the Connect to Maps field will be set to no in the scheduling parameters. You will need to set this to yes to ensure the schedule board and schedule assistant will use maps to schedule items.

Navigate to Administration in the RSO Application, open the scheduling parameters and update the Connect to Maps field to Yes.

By default, it will use Bing Maps, but you can configure it to work with any map provider by entering that provider’s API key in the Map API Key field.

In the next blog, I will show you how to prepare data for optimization.

I hope this blog post helps you. If you are interested in knowing more about this topic, I encourage you to check out my blog on this.

How to configure the Resource Scheduling Optimization (RSO) in Dynamics 365 instance

In this blog post, I will explain how to configure Resource Scheduling Optimization (RSO) for your Dynamics 365 CE instance.

Organizations that require automated scheduling and optimization can purchase the Resource Scheduling Optimization (RSO) add-on solution.

  • Purchase Field Service. Then go to the Microsoft 365 (Office 365) Admin Center > Billing > Purchase Services and purchase Resource Scheduling Optimization (RSO) from Add-ons.

Once purchased, the solution can be deployed to a specific Dynamics 365 instance from the Power Platform Admin Center by navigating to https://admin.powerplatform.microsoft.com/resources/applications.

The RSO application will be listed in the available applications list. Once located, it can be configured for a specific instance by selecting the manage button.

Once you click Manage, it will navigate you to https://rsomanagement.dynamics.com/

When deploying the solution, you will need to provide the organization it should be deployed to and agree to the licensing agreement. When it is deployed, RSO creates a Microsoft-hosted Azure instance that hosts the optimization engine and service.

This instance is managed and maintained by Microsoft and is used only for the RSO deployment. After the RSO solution has been deployed, it can be managed from this same area moving forward.

After it is deployed, the RSO instance management screen provides the following capabilities:

Open CRM Organization: Allows you to access the Dynamics 365 organization that is associated with the RSO instance.

Delete current Deployment: This will delete the RSO Azure resources. The RSO solution will remain in your Dynamics 365 environment. It does not impact anything inside the Dynamics 365 organization.

It will take at least 15 minutes to deploy RSO into the Dynamics 365 CE instance. Once it is deployed, it will display as Configured on the resources page in the Power Platform Admin Center as follows:

After the solution has been deployed, it will need to be configured inside your Dynamics 365 instance. In the next blog, I will show you how to do the configuration in the Dynamics 365 CE instance.

I hope this blog post helps you. If you are interested in knowing more about this topic, I encourage you to check out my blog on this.

How to Integrate Dynamics 365 with Azure Service Bus

Here is the next blog, on “How to Integrate Dynamics 365 with Azure Service Bus” without writing any code.

Azure Service Bus and Dynamics 365

Azure Service Bus provides native integration with Dynamics 365. This means we can send messages to Azure Service Bus from CRM when an event occurs; these messages can then be used to integrate with several downstream applications.

SAS (Shared Access Signature) authentication is used to access Service Bus resources. SAS authentication involves configuring a cryptographic key with associated rights on a Service Bus resource. Clients such as Dynamics CRM can get access to that resource by presenting a SAS token.

Let us build a CRM to Azure Service Bus integration.

Pre-Requisites:

  1. Azure Subscription
  2. Dynamics 365 CE instance
  3. Dynamics 365 Plugin Registration Tool

Here is an overview of how to integrate Dynamics 365 CE with Azure Service Bus without writing any code:

  1. Create an Azure Service Bus in the Azure Portal
  2. Create a Queue on the Bus
  3. Create a Shared Access Key (SAS) for writing messages to the Bus
  4. Copy the Connection String
  5. Register a Service Endpoint in Dynamics with the Plugin Registration Tool
  6. Register a Step under the Service Endpoint for the Entity we want to send
  7. Perform the action in Dynamics 365 CE and check the results with Service Bus Explorer or the Azure Portal.

Step 1: Configure Azure Service Bus and obtain a connection string

You will need an active Microsoft Azure account. Browse to the Azure Portal and click Service Bus. We will start by creating a new namespace.

Next, create a new Service Bus instance by clicking the Add button shown below.

Next, create a new Service Bus messaging entity, such as a queue.

Finally, obtain the connection string by browsing to the newly created Shared Access Policy, or create your own Shared Access Policy.

After the SAS policy has been created, click the copy button next to the Primary Connection String. We will provide this value to Dynamics 365 CE in the Plugin Registration Tool.
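
For reference, the copied value generally has the following shape, where the namespace, policy name, and key are placeholders:

Endpoint=sb://yournamespace.servicebus.windows.net/;SharedAccessKeyName=SendPolicy;SharedAccessKey=<your-key>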

Step 2: Use the CRM Plugin Registration Tool to connect CRM to Azure Service Bus

Open the Dynamics 365 CE Plugin Registration Tool, create a new connection, and log in to your Dynamics 365 CE instance. Next, register a new Service Endpoint.

Copy and paste the connection string obtained in Step 1 into the highlighted textbox below.

The new Service Endpoint now appears in the list of registered plugins.

Step 3: Define events which will post messages to Azure Service Bus

Right-click the Service Endpoint and add a new step. Enter a trigger action such as Create or Delete in the ‘Message’ box, and enter the primary entity on which the action will occur. Also, mark the execution mode as ‘Asynchronous’.

Now that we have configured the endpoint in the Plugin Registration Tool, it is time to test it. Log in to your Dynamics 365 CE instance, go to Cases, and create a new case. This should trigger a message to the Service Endpoint.

Verify that a message is queued on the Azure Service Bus. Go to Queues -> Overview, where you can see all the messages in the queue.

And that’s how you can easily configure Azure Service Bus for Dynamics 365 CE without writing any code.

I hope this blog post helps you. If you are interested in knowing more about this topic, I encourage you to check out my blog on this.

How to enable Microsoft Teams integration in Dynamics 365 CE

Here is the next blog, on “How to enable Microsoft Teams integration in Dynamics 365 CE”.

  1. Sign in to Common Data Service as a System Administrator.
  2. Go to Settings > Administration > System Settings > General tab.
  3. To enable the basic collaboration experience, select Yes for Enable Basic Microsoft Teams Integration.
  4. To enable the enhanced collaboration experience, select Yes for Enable Enhanced Microsoft Teams Integration.
    • When you select Yes for Enable Enhanced Microsoft Teams Integration, two consent permission pop-up boxes will display. If you have a pop-up blocker and you don’t see the second consent dialog, you need to disable the pop-up blocker in your browser.
    • On the second consent dialog box, select the checkbox for Consent on behalf of organization and then select Accept.

Note:

If you don’t select the Consent on behalf of organization option, then when another user tries to pin an entity record or view to Microsoft Teams and shares the tab with another user, they will get the error message: “The admin has not consented to use user sync feature, you can add them manually.”

  • After the second consent is accepted, select Finish and then select OK on the System Settings screen. If you don’t select OK on the System Settings screen, you will lose your changes.

Once it is enabled, you need to install the Microsoft Teams app and set up the Microsoft Teams collaboration channel tab. In my upcoming post, I will show you how to install and set up Dynamics 365 in Microsoft Teams.

I hope this blog post helps you. If you are interested in knowing more about this topic, I encourage you to check out my blog on this.

How to Enable Power BI dashboards in Dynamics 365 CE

Here is the next blog, on “How to Enable Power BI dashboards in Dynamics 365 CE”.

To add Power BI visualizations to personal dashboards in your model-driven app, you must:

  • Enable Power BI visualizations for your organization in Settings > Administration > System Settings > Reporting tab > Allow Power BI visualization embedding.

Note:

  • You must have a Power BI account and access to at least one Power BI dashboard.
  • Avoid adding Power BI visualizations to system dashboards; it is not supported.

I hope this blog post helps you. If you are interested in knowing more about this topic, I encourage you to check out my blog on this.

How to Change User Session Timeout Settings in Dynamics 365 CE

Here is the next blog, on “How to Change User Session Timeout Settings in Dynamics 365 CE.”

In Microsoft Dynamics 365 CE, the default timeout for a user’s session is 24 hours, which means a user is not required to re-enter their login credentials for up to 24 hours. We can update these session timeout settings as per our requirements.

To enable this feature, navigate to Settings > Administration > System Settings > General tab > Set session timeout.

In the Set session timeout section, set “Session timeout settings” to “Set custom”. This allows us to change the maximum session length and how long before the session expires a timeout warning is shown.

Once it is configured, this will automatically log out users after a period of inactivity.

These settings will only apply to that particular (current) instance.

If we have multiple instances in our organization, these settings must be changed in each instance.

I hope this blog post helps you. If you are interested in knowing more about this topic, I encourage you to check out my blog on this.

How to enable Plugin Trace logs in Dynamics 365 CE

Here is the next blog, on how to enable Plugin Trace Logs in Dynamics 365 CE.

We must follow the steps below in order to enable the Plugin Trace Log in Dynamics 365 CE:

  1. Navigate to Settings -> Administration.
  2. On the Administration page, click “System Settings”.
  3. In System Settings, navigate to the “Customization” tab.
  4. Under “Plug-in and custom workflow activity tracing”, set Enable logging to plug-in trace log to All, as follows:

I hope this blog post helps you. If you are interested in knowing more about this topic, I encourage you to check out my blog on this.
