Author Archives: Dharani

How to create a custom connector to get the flow run history

Here are the steps to create a custom connector that gets the run history of a specified flow in a specified environment.

Swagger file:

swagger: '2.0'
info: {title: Flow Run History, description: '', version: '1.0'}
host: us.api.flow.microsoft.com
basePath: /
schemes: [https]
consumes: []
produces: []
paths:
  /providers/Microsoft.ProcessSimple/environments/{environment_name}/flows/{flow_name}/runs:
    get:
      responses:
        default:
          description: default
          schema: {}
      summary: Get Flow Run History
      description: Get the run history of specified flow in specified environment
      operationId: GetFlowRunHistory
      x-ms-visibility: important
      parameters:
      - {name: environment_name, in: path, required: true, type: string, default: Default-82e5f1fb-9ffd-49c1-b0e1-1aae52374ff1,
        x-ms-visibility: important, description: Environment ID}
      - {name: flow_name, in: path, required: true, type: string, default: 39190573-dbe9-40a2-a99f-183bb2a3fd31,
        x-ms-visibility: important, description: Flow ID}
      - {name: api-version, in: query, required: true, type: string, default: '2016-11-01'}
definitions: {}
parameters: {}
responses: {}
securityDefinitions:
  oauth2_auth:
    type: oauth2
    flow: accessCode
    authorizationUrl: https://login.windows.net/common/oauth2/authorize
    tokenUrl: https://login.windows.net/common/oauth2/authorize
    scopes: {}
security:
- oauth2_auth: []
tags: []

Please make sure you add the Activity.Read.All and Flows.Read.All permissions.

  • Next, navigate to make.powerapps.com. In the left navigation pane, go to Dataverse->Custom Connectors and create a new custom connector (Create from blank).
  • Provide a name for the custom connector and click Continue.
  • On the page that opens, click the Swagger editor option.
  • In the editor pane on the left, paste the swagger code shown above, then click Update connector. You will be taken to the Security tab.
  • Under Authentication type, click Edit, enter the Client ID, Client Secret, and the Resource URL https://service.flow.microsoft.com/, and click Continue. You will be taken to the Definition tab.
  • In the Definition tab, you can see the GetFlowRunHistory action. The request takes the environment ID, flow ID, and api-version as inputs.

You can test this custom connector by passing the environment ID, flow ID, and API version (the default value is 2016-11-01). After testing, you can leave the custom connector, navigate to Power Automate, and add this connector as an action in your flow.
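
For reference, here is the raw request that this custom connector wraps. This is only a sketch using curl; the bearer token is assumed to be an Azure AD access token issued for https://service.flow.microsoft.com/, and the placeholders correspond to the swagger parameters above:

# Sketch: call the flow runs endpoint directly (replace the placeholders with your own values)
curl "https://us.api.flow.microsoft.com/providers/Microsoft.ProcessSimple/environments/<environment_id>/flows/<flow_id>/runs?api-version=2016-11-01" \
  -H "Authorization: Bearer <access_token>"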

Thanks for reading.

If you are interested in this topic and would like to do some further self-study, I encourage you to check out my blog on this.

Post Message to Microsoft Teams using Incoming Webhook

In this blog, we will see how to post a message to Microsoft Teams using an Incoming Webhook.

What is an Incoming Webhook?

Basically, it is a URL provided by Teams for any service to use to post content with the goal of sharing that content in your team’s channel. 

When you configure it, you get a URL to which you can post a JSON request. (JSON is a lightweight, text-based format for exchanging structured data.)

The final output, or the post in Teams, is a pretty card, like below:

How to configure Incoming Webhook in Microsoft Teams:

  1. Navigate to the channel where you want to add the webhook and select (•••) More Options from the top navigation bar.
  2. Choose Connectors from the drop-down menu and search for Incoming Webhook.
  3. Select the Configure button, provide a name, and, optionally, upload an image avatar for your webhook.
  4. The dialog window will present a unique URL that maps to the channel. Make sure that you copy and save the URL; you will need to provide it to the outside service.
  5. Select the Done button. The webhook will be available in the team channel.

Flow action looks like this:

JSON (Message Card):

{
    "@type": "MessageCard",
    "@context": "http://schema.org/extensions",
    "themeColor": "0076D7",
    "summary": "Dharanidharan Balasubramaniam created a new task",
    "sections": [{
        "activityTitle": "Dharanidharan Balasubramaniam created a new task",
        "activitySubtitle": "On Malaysia BizApps & Power Platform User Group",
        "activityImage": "https://teamsnodesample.azurewebsites.net/static/img/image5.png",
        "facts": [{
            "name": "Assigned to",
            "value": "Unassigned"
        }, {
            "name": "Due date",
            "value": "Mon May 01 2017 17:07:18 GMT-0700 (Pacific Daylight Time)"
        }, {
            "name": "Status",
            "value": "Not started"
        }],
        "markdown": true
    }],
    "potentialAction": [{
        "@type": "ActionCard",
        "name": "Add a comment",
        "inputs": [{
            "@type": "TextInput",
            "id": "comment",
            "isMultiline": false,
            "title": "Add a comment here for this task"
        }],
        "actions": [{
            "@type": "HttpPOST",
            "name": "Add comment",
            "target": "https://docs.microsoft.com/outlook/actionable-messages"
        }]
    }, {
        "@type": "ActionCard",
        "name": "Set due date",
        "inputs": [{
            "@type": "DateInput",
            "id": "dueDate",
            "title": "Enter a due date for this task"
        }],
        "actions": [{
            "@type": "HttpPOST",
            "name": "Save",
            "target": "https://docs.microsoft.com/outlook/actionable-messages"
        }]
    }, {
        "@type": "OpenUri",
        "name": "Learn More",
        "targets": [{
            "os": "default",
            "uri": "https://docs.microsoft.com/outlook/actionable-messages"
        }]
    }, {
        "@type": "ActionCard",
        "name": "Change status",
        "inputs": [{
            "@type": "MultichoiceInput",
            "id": "list",
            "title": "Select a status",
            "isMultiSelect": "false",
            "choices": [{
                "display": "In Progress",
                "value": "1"
            }, {
                "display": "Active",
                "value": "2"
            }, {
                "display": "Closed",
                "value": "3"
            }]
        }],
        "actions": [{
            "@type": "HttpPOST",
            "name": "Save",
            "target": "https://docs.microsoft.com/outlook/actionable-messages"
        }]
    }]
}
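
If you would like to test the card without building the flow first, you can post it straight to the webhook URL from the command line. A minimal sketch using curl, where <webhook-url> is the URL you copied during configuration and card.json contains the message card JSON above:

# Sketch: post the full message card saved as card.json
curl -H "Content-Type: application/json" -d @card.json "<webhook-url>"

# The simplest possible payload is a plain text message
curl -H "Content-Type: application/json" -d '{"text": "Hello from the webhook"}' "<webhook-url>"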

After the flow runs, it will post the message in Microsoft Teams like below:

If you are interested in this topic and would like to do some further self-study, I encourage you to check out my blog on this.

Getting started with Power BI Deployment Pipelines

In this blog, I will explain how we can use deployment pipelines for Power BI. Deployment pipelines are now generally available in Power BI.

Deployment pipelines help enterprise BI teams build an efficient and reusable release process by maintaining development, test, and production environments.

BI teams adopting deployment pipelines will enjoy:

  1. Improved productivity
  2. Faster delivery of content updates
  3. Reduced manual work and errors

Pre-requisites

You will be able to access the deployment pipelines feature if the following conditions are met:

You have one of the following Premium licenses:

Create a deployment pipeline

You can create a pipeline from the deployment pipelines tab, or from a workspace.

After the pipeline is created, you can share it with other users or delete it. Users you share the pipeline with are given access to it; pipeline access enables them to view, share, edit, and delete the pipeline.

Create a pipeline from the deployment pipelines tab

To create a pipeline from the deployment pipelines tab, do the following:

  1. In Power BI service, from the navigation pane, select Deployment pipelines and then select Create pipeline.
  2. In the Create a deployment pipeline dialog box, enter a name and description for the pipeline, and select Create.
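
If you prefer scripting to the UI, pipeline creation is also exposed through the Power BI REST API. Here is a hedged sketch using curl, assuming you already hold an Azure AD access token with the Pipeline.ReadWrite.All scope:

# Sketch: create a deployment pipeline via the Power BI REST API
curl -X POST "https://api.powerbi.com/v1.0/myorg/pipelines" \
  -H "Authorization: Bearer <access_token>" \
  -H "Content-Type: application/json" \
  -d '{"displayName": "My deployment pipeline"}'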

Create a pipeline from a workspace

You can create a pipeline from an existing workspace, provided you're an admin of a new workspace experience workspace.

  1. From the workspace, select Create a pipeline.
  2. In the Create a deployment pipeline dialog box, enter a name and description for the pipeline, and select Create.

Assign a workspace to a deployment pipeline

After creating a pipeline, you need to add the content you want to manage to the pipeline. Adding content to the pipeline is done by assigning a workspace to the pipeline stage. You can assign a workspace to any stage.

You can assign only one workspace to a deployment pipeline. Deployment pipelines will create clones of the workspace content to be used in the different stages of the pipeline.

Follow these steps to assign a workspace in a deployment pipeline:

  1. In the newly created deployment pipeline, select Assign a workspace.
  2. In the Choose the workspace drop-down menu, select the workspace you want to assign to the pipeline.
  3. Select the stage you want to assign the workspace to.

Workspace assignment limitations

Deploying reports

Select the stage to deploy from and then select the deployment button. The deployment process creates a duplicate workspace in the target stage. This workspace includes all the content existing in the current stage.

In the diagram below, you can see deployments from the Development stage to Test and from Test to Production.
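
Deployments can also be triggered programmatically through the Power BI REST API. A sketch, assuming <pipeline_id> is the ID of your pipeline and the token has the required scope; stage order 0 is Development, so this promotes everything from Development to Test:

# Sketch: deploy all content from the Development stage (order 0) to the next stage
curl -X POST "https://api.powerbi.com/v1.0/myorg/pipelines/<pipeline_id>/deployAll" \
  -H "Authorization: Bearer <access_token>" \
  -H "Content-Type: application/json" \
  -d '{"sourceStageOrder": 0, "options": {"allowCreateArtifact": true, "allowOverwriteArtifact": true}}'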

In an upcoming blog, I will explain selective deployment, backwards deployment, creating dataset rules, overriding content, and more.

If you are interested in this topic and would like to do some further self-study, I encourage you to check out my blog on this.

ALM For Power Platform/CDS In Azure-Devops/VSTS Using Power Platform Build Tools – Part 2

In the previous post, I showed you how to set up Azure DevOps and install the Power Platform Build Tools. In this blog post, I will show you how to export a solution from a Dynamics 365 CE instance and commit it to VSTS/Azure DevOps.

Below is the VSTS build pipeline, which will include the following steps.

How to set up a VSTS Build Definition

In the build pipeline, we will export the CRM solution, unpack it using the Power Platform Build Tools, and store the solution file in Artifacts. You can create a build definition directly from Visual Studio Online (VSTS/Azure DevOps) or from within Visual Studio. First, I will show you how to create a build definition from within Visual Studio: navigate to the Builds tab in the Team Explorer.

Once there, you can click New Build Definition to be taken directly to Visual Studio Online. This is where you would start if you had decided to create the build definition directly from Visual Studio Online instead of starting in Visual Studio.

On the dialog box that pops up in the browser, we’ll select Visual Studio as our build template, but you can see there are other templates for use, such as Xamarin for Android or iOS and the Universal Windows Platform. The default settings for your build definition should be correct for the most part, but you will need to check the Continuous Integration checkbox. Here is what they look like for this example:

Because this is a simple example and we don't need the additional flexibility the Default queue provides, we can leave the default Hosted option selected in the Default agent queue field. See the Azure DevOps documentation for more information on the restrictions of the Hosted pool.

 
You can see the checkbox for CI at the bottom of the dialog. This is enabled so that Visual Studio Online will execute the defined build definition for each check-in. The build definition will define whether this build code is published to an environment. Since we want to continually build and deploy to our web environment, we will check this box.

We can also create the build definition from Azure DevOps by following the steps below.

Navigate to Pipelines -> Builds -> Click New Pipeline

Click Use the visual designer, which allows you to build the pipeline using the GUI.

Select the team project, repository, and branch, and click the Continue button. For demo purposes, I have named the build definition ALM – PowerApps and Dynamics 365-CI.

In the next step, select an empty job as shown below:

After selecting an empty job, you will see the empty agent job; set the Agent pool to Hosted.

Next, we need to configure the VSTS build definition as shown in the diagram below.

1. PowerPlatform Tools Installer

Every pipeline that uses the Power Platform Build Tools must install them as the first step. This ensures the tools are available on the agent.

2. PowerPlatform Publish Customizations

As a second step, we need to publish all customizations. In this step, you only need to choose your connection.

Please refer to my previous blog or video to understand how to create the service connection for the Power Platform Build Tools.

Blog post Reference: https://d365dotblog.com/2020/06/01/alm-how-to-create-service-connection-for-the-powerapps-build-tools/

Video Reference: https://d365dotblog.com/2020/06/16/alm-power-platform-tips-tricks-create-service-connection-for-the-powerapps-build-tools-in-azure-devops-vsts/

3. PowerPlatform Export Solution

As a next step, we need to export the solution from the source Power Platform instance. We use the “Export Solution” task to do this.

Note: Please update the solution name and solution output file in the Power Platform Export Solution task.

Solution Name: $(SolutionName)

Solution Output File: $(Build.ArtifactStagingDirectory)\$(SolutionName).zip

4. PowerPlatform Unpack Solution

Next, we unpack the solution using the Power Platform Unpack Solution task as below:

Solution Input File should be the same as the output file of the last step. In our case:

$(Build.ArtifactStagingDirectory)\$(SolutionName).zip

Target Folder to Unpack Solution should be the folder where you would like to store your unpacked solution in the repo. In our case, we will have a folder in the root of the repo named after the solution:

$(Build.SourcesDirectory)\$(SolutionName)
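
As an aside, you can reproduce the export and unpack steps locally with the Power Platform CLI, which is handy when debugging the pipeline. This is only a sketch; the environment URL and solution name are placeholders for your own values:

# Sketch: authenticate against the source environment
pac auth create --url https://yourorg.crm.dynamics.com

# Export the solution to a zip file
pac solution export --name MySolution --path .\MySolution.zip

# Unpack the zip into a source-control-friendly folder structure
pac solution unpack --zipfile .\MySolution.zip --folder .\MySolution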

5. Commit solution to repo

The final step is to add the extracted solution to our repo. To do this, we will add a standard “Command Line” step and add the following code to the “Script” field:

echo commit all changes
git config user.email "<email>"
git config user.name "Automatic Build"
git checkout master
git add --all
git commit -m "solution init"
echo push code to new repo
git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin master

You must replace “<email>” with the email of the user you would like to use to push your changes.

6. General Pipeline configuration

Below are some of the general configurations you need to enable in this pipeline.

For the agent, we need to allow scripts to use the OAuth token. If this is not configured, our command-line script will not be able to connect to the repo and push our solution. The configuration should look like this:

In our PowerPlatform Export Solution step, we used a variable called “SolutionName”. We need to make sure we update the solution name before running the VSTS build pipeline. Now you can test the pipeline by running it, either via “Queue” if you are still in edit mode or by using the “Run pipeline” button.

In my next blog, we will see how to pack the solution and deploy it to the target instance using the Power Platform Build Tools.

If you are interested in this topic and would like to do some further self-study, I encourage you to check out my blog on this.

ALM for Power Platform/Dynamics 365 Projects in Azure DevOps/VSTS using Power Platform Build Tools – PART 1

A couple of months back, I wrote a post on how to implement CI/CD for Dynamics 365 CE. That post can be found here and is still relevant.

In this series of blogs, we will explore building out DevOps processes for Dynamics 365 Customer Engagement (CE) and Power Platform projects by utilizing the Power Platform Build Tools.

Problem Statement

I started working on ALM when I was assigned as a platform engineer on a Dynamics 365 CE and Power Platform implementation project. At that time, I had a few key challenges with deployments, which I have listed below:

  • Solution files were manually extracted and imported into the target as the deployment process
  • Different deployment processes were followed between release environments. For example, in the Dev and SIT environments the solution was migrated manually, while in the UAT, Pre-Prod, and Prod environments a DB compare was applied to promote changes
  • Master data was manually entered in each environment
  • Multiple developers working in the same organization were overwriting each other’s changes

To solve the above challenges, I started working on an ALM process. Before we start solving the problem, let us take a moment to look at some of the prerequisites required for building the ALM process.

Pre-Requisites

How to Set Up Azure DevOps

You will need an Azure DevOps environment. Navigate to https://azure.microsoft.com/en-us/services/devops/ and click on the Start free option.

You will need to sign in with your Microsoft account. After logging in, it will take you to https://dev.azure.com/dharani1743/

Next, we can start by creating a new project in Azure DevOps. This will contain your pipeline and your source repository.

After creating the project, you will be able to see the below screen

Install PowerPlatform Build Tools

Next, we will see how to install the Power Platform Build Tools into your Azure DevOps instance.

Choose your specific Azure DevOps organization (in case you have many)

Select Install and click Download. Now we have the Power Platform Build Tools installed in your Azure DevOps instance.

In my next blog, we will see how to set up the repository and how to create VSTS build and release definitions to save a Power Platform solution to source control and import the solution into the target instance.

If you are interested in this topic and would like to do some further self-study, I encourage you to check out my blog on this.

Resource Scheduling Optimization (RSO)- Initial configuration steps – Part 1

In my previous blog, I explained “How to configure the RSO in a Dynamics 365 instance”.

In this blog, I will explain how to start with the Initial configuration steps of RSO.

Once the RSO solution has been deployed, you will be able to see the RSO Application as follows:

After installing RSO, some initial configuration needs to be done inside your Dynamics 365 instance. We need to enable the functionality in your Dynamics 365 environment and define what should be optimized by the solution.

By default, RSO is not enabled (it will be in a turned-off state), so it will need to be turned on.

This can be enabled from the Resource Scheduling Optimization app by selecting Administration > Resource Scheduling Parameters.

When RSO is installed, it adds a Resource Scheduling Optimization tab. From that tab, you can enable RSO by setting the Enable Resource Scheduling Optimization field to Yes.

You can define a default goal for the organization if needed. You can select the default goal, or you can create your own goal from the Optimization goals tab.

How to configure Bing Maps in RSO?

RSO mainly uses the map functionality to locate the closest resource to work on an item, so it is important that the organization has enabled the map functionality.

By default, the Connect to Maps field will be set to no in the scheduling parameters. You will need to set this to yes to ensure the schedule board and schedule assistant will use maps to schedule items.

Navigate to Administration in the RSO Application, open the scheduling parameters and update the Connect to Maps field to Yes.

By default, it will use Bing Maps, but you can configure it to work with any map provider by providing a specific Map API Key value for the mapping provider you want to use in the Map API Key field.

In the next blog, I will show you how to prepare data for optimization.

I hope this blog post helps you. If you are interested in knowing more about this topic, I encourage you to check out my blog on this.

How to configure the Resource Scheduling Optimization (RSO) in Dynamics 365 instance

In this blog post, I will explain how to configure the Resource Scheduling Optimization (RSO) to your Dynamics 365 CE instance.

Organizations that require automated scheduling and optimization can purchase the Resource Scheduling Optimization (RSO) add-on solution.

  • Purchase Field Service. Go to the Microsoft 365 (Office 365) Admin Center > Billing > Purchase Services, and then purchase Resource Scheduling Optimization (RSO) from Add-ons.

Once purchased, the solution can be deployed to a specific Dynamics 365 instance from the Power Platform admin center by navigating to https://admin.powerplatform.microsoft.com/resources/applications.

The RSO application will be listed in the available applications list. Once located, it can be configured for a specific instance by selecting the Manage button.

Once you click Manage, it will navigate you to https://rsomanagement.dynamics.com/

When deploying the solution, you will need to provide the organization it should be deployed to and agree to the licensing agreement. When it is deployed, RSO creates a Microsoft hosted Azure instance that hosts the optimization engine and service.

This instance is managed and maintained by Microsoft and is used only for the RSO deployment. After the RSO solution has been deployed, it can be managed from this same area moving forward.

After it is deployed, the RSO instance management screen provides the following capabilities:

Open CRM Organization: Allows you to access the Dynamics 365 organization that is associated with the RSO instance.

Delete current Deployment: This will delete the RSO Azure resources. The RSO solution will remain in your Dynamics 365 environment. It does not impact anything inside the Dynamics 365 organization.

It will take at least 15 minutes to deploy RSO into the Dynamics 365 CE instance. Once it is deployed, it will display as configured on the Resources page in the Power Platform admin center as follows:

After the solution has been deployed, it will need to be configured inside your Dynamics 365 instance. In the next blog I will show you how to do the configuration in Dynamics 365 CE instance.

I hope this blog post helps you. If you are interested in knowing more about this topic, I encourage you to check out my blog on this.

How to Integrate Dynamics 365 with Azure Service Bus

Here is the next blog, on how to integrate Dynamics 365 with Azure Service Bus without writing any code.

Azure Service Bus and Dynamics 365

Azure Service Bus provides native integration with Dynamics 365. This means we can send messages to Azure Service Bus from CRM when an event occurs; these messages can then be used to integrate with several downstream applications.

SAS (Shared Access Signature) authentication is used to access Service Bus resources. SAS authentication involves configuring a cryptographic key with associated rights on a Service Bus resource. Clients such as Dynamics CRM can get access to that resource by presenting a SAS token.

Let us build a CRM to Azure Service Bus integration.

Pre-Requisites:

  1. Azure Subscription
  2. Dynamics 365 CE instance
  3. Dynamics 365 Plugin Registration Tool

Here is an overview of how to integrate Dynamics 365 CE with Azure Service Bus without writing any code:

  1. Create an Azure Service Bus in the Azure Portal
  2. Create a Queue on the Bus
  3. Create a Shared Access Signature (SAS) policy for writing messages to the Bus
  4. Copy the Connection String
  5. Register a Service Endpoint in Dynamics with the Plugin Registration Tool
  6. Register a Step under the Service Endpoint for the Entity we want to send
  7. Perform the action in the Dynamics 365 CE and check the results with Service Bus Explorer/Azure Portal.
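
Steps 1 through 4 can also be scripted with the Azure CLI instead of the portal. A minimal sketch, assuming az is installed and you are logged in; all resource names below are placeholders:

# Sketch: create a resource group and a Service Bus namespace
az group create --name rg-d365-demo --location eastus
az servicebus namespace create --resource-group rg-d365-demo --name sb-d365-demo

# Create the queue that will receive the CRM messages
az servicebus queue create --resource-group rg-d365-demo --namespace-name sb-d365-demo --name crm-events

# Create a send-only SAS rule and print its connection string
az servicebus queue authorization-rule create --resource-group rg-d365-demo --namespace-name sb-d365-demo --queue-name crm-events --name SendRule --rights Send
az servicebus queue authorization-rule keys list --resource-group rg-d365-demo --namespace-name sb-d365-demo --queue-name crm-events --name SendRule --query primaryConnectionString -o tsv

The portal walk-through of the same steps follows below.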

Step 1: Configure Azure Service Bus and obtain a connection string

You will need an active Microsoft Azure account. Browse to the Azure Portal and click Service Bus. We will start by creating a new namespace.

Next, create a new Service Bus instance by clicking the Add button.

Next, create a new Service Bus Messaging Entity such as Queue.

Finally, obtain the connection string by browsing to the newly created Shared Access Policy, or create your own Shared Access Policy.

After the SAS policy has been created, click the copy button next to the Primary Connection String. We will provide this connection string to Dynamics 365 CE in the Plugin Registration Tool.

Step 2: Use CRM plugin tool to make a connection to CRM and Azure Service Bus

Open the Dynamics 365 CE Plugin Registration Tool, create a new connection, and log in to your Dynamics 365 CE instance. Next, register a new Service Endpoint.

Copy and paste the connection string obtained in Step 1 into the highlighted textbox below.

The new Service Endpoint now appears in the list of registered plugins.

Step 3: Define events which will post messages to Azure Service Bus

Right-click the Service Endpoint and add a new step. Enter a trigger action such as create, delete, etc. in the ‘Message’ box. Enter the primary entity on which the action will occur. Also, mark the execution mode as ‘Asynchronous’.

Now that we have configured the endpoint in the Plugin Registration Tool, it is time to test it out. Log in to your Dynamics 365 CE instance, go to Cases, and create a new case. This should trigger an action on the Service Endpoint.

Verify that a message is queued on the Azure Service Bus. Go to Queues -> Overview, where you can see all the messages in the queue.

And that’s how you can easily configure Azure Service Bus for Dynamics 365 CE without writing any code.

I hope this blog post helps you. If you are interested in knowing more about this topic, I encourage you to check out my blog on this.

How to enable Microsoft Teams integration in Dynamics 365 CE

Here is the next blog, on “How to enable Microsoft Teams integration in Dynamics 365 CE”.

  1. Sign in as a System Administrator to Common Data Service.
  2. Go to Settings > Administration > System Settings > General tab.
  3. To enable the basic collaboration experience, select Yes for Enable Basic Microsoft Teams Integration.
  4. To enable the enhanced collaboration experience, select Yes for Enable Enhanced Microsoft Teams Integration. When you select Yes, two consent permission pop-up boxes will display. If you have a pop-up blocker and don’t see the second consent dialog, you will need to disable the pop-up blocker in your browser.
  5. On the second consent dialog box, select the checkbox for Consent on behalf of organization and then select Accept.

 Note

If you don’t select the Consent on behalf of organization option, then when another user tries to pin an entity record or view to Microsoft Teams and shares the tab with another user, they will get this error message: “The admin has not consented to use user sync feature, you can add them manually.”

  6. After the second consent is accepted, select Finish and then select OK on the System Settings screen. If you don’t select OK on the System Settings screen, you will lose your changes.

Once it is enabled, you need to install the Microsoft Teams app and set up the Microsoft Teams collaboration channel tab. In my upcoming post, I will show you how to install and set up Dynamics 365 in Microsoft Teams.

I hope this blog post helps you. If you are interested in knowing more about this topic, I encourage you to check out my blog on this.
