How to detect a car engine anomaly by analyzing engine noise?

Following my article on “Starting your exciting journey of Connected Field Service and Azure IoT Hub“, I started working on a practical scenario about measuring noise in your surroundings and generating alerts in #PowerPlatform. In this article I want to summarize all the resources required to implement such a scenario, along with my learnings. I hope this will add to the resources available in the community so you can use it as a walkthrough to implement a practical scenario.

In this article, you will see what architectural components are required to implement this simple scenario using Azure IoT and Connected Field Service. This article focuses on what is happening under the hood of the platform. Luckily, with the Connected Field Service application, everything is managed behind the scenes and you don’t need to worry much, but this walkthrough enables you to understand what options you have in such scenarios.


The scenario is about connecting an MXChip IoT DevKit to your car (or any noisy place) and analyzing the noise level by recording the noise and sending it, in the form of a wave stream, to an Azure IoT Hub. The Azure IoT Hub sends the data to an #Azurefunction which calculates the noise level using a simple formula, and the function calls a #MicrosoftFlow to create alerts in #PowerPlatform. This can lead to an endless number of scenarios.

  • The function for calculating the noise level from a wave file is extremely simple as well. There is plenty of scientific background material, which you can read here, here, here and here.
  • Calculating the noise level properly is not an easy task. There are many considerations involved, and if you want a real working model, you will need to work on analyzing audio files, which is beyond the scope of this demo.
  • It is possible, and desirable, to calculate the noise level on the device itself and send only the alerts to Azure IoT. This reduces the traffic and the load on your Azure services. However, for the sake of the experiment, I am sending all the noise data to Azure and calculating the noise level in the Azure function.
  • In this demo, I am not listening to the noise all the time. I start recording on the press of button A and send the noise data to Azure on the press of button B. I made this change to the scenario to demonstrate working with buttons on the MXChip and also to reduce the traffic to Azure.
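To make the “simple formula” idea concrete, here is a minimal sketch (my own illustration, not the exact function used later in this article) that treats a recording as 16-bit little-endian PCM samples, computes the RMS amplitude, and converts it to decibels relative to full scale (dBFS):

```python
import math
import struct

def noise_level_db(pcm_bytes: bytes) -> float:
    """Approximate noise level of 16-bit little-endian PCM audio in dBFS."""
    # Unpack the byte buffer into signed 16-bit samples.
    count = len(pcm_bytes) // 2
    samples = struct.unpack("<%dh" % count, pcm_bytes[:count * 2])
    # Root-mean-square amplitude, normalised to the 16-bit full scale.
    rms = math.sqrt(sum(s * s for s in samples) / count) / 32768.0
    # Convert to decibels relative to full scale (0 dBFS = maximum).
    return 20 * math.log10(max(rms, 1e-10))

# A full-scale square wave sits at about 0 dBFS.
loud = struct.pack("<4h", 32767, -32767, 32767, -32767)
print(round(noise_level_db(loud)))  # → 0
```

A near-silent buffer yields a strongly negative dBFS value while a full-scale signal sits near 0 dBFS, so a simple threshold on this number is enough to drive alerts.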


The architecture of this sample is very simple. I am using an IoT Hub and an Azure function to calculate and propagate the IoT events to the #PowerPlatform. On the device side, there is an Arduino application running which listens for noise and sends the recording to the IoT Hub.

A very comprehensive architecture of a connected field service is shown in the diagram below, which can simply be implemented using the #ConnectedFieldService application. However, I just wanted to implement it in a simpler way. Full details of the #ConnectedFieldService architecture can be seen in this documentation.


The logical diagram of components is demonstrated below:

Arduino App

This component is a very simple program which reads the input from the audio sensor, Button A and Button B of the device and does the following:

  1. On startup, it initializes the device and gets ready to listen to surrounding noise. It also checks the connectivity to Azure.
  2. On the press of Button A, it records the surrounding noise and stores the stream in a buffer.
  3. On press of ButtonB, it sends the stream in the buffer to Azure.

To implement this part of the application, you will need to take the following actions:

  1. Set up your MXChip device. Please refer to this link to start.
  2. Set up your Visual Studio environment. Please refer to this link.
  3. You will need to learn how to deploy your code to the MXChip device. The simple way to upload your code is to bring the MXChip into configuration mode: every time you want to upload updated code, press button A (and keep pressing), press reset (while still pressing A), release reset (while still pressing A), and then release A. Now you are ready to upload your code.
  4. If you want to debug your code in the device, you can refer to this link.

Here is my sample code:

#include "AZ3166WiFi.h"
#include "DevKitMQTTClient.h"
#include "AudioClassV2.h"
#include "stm32412g_discovery_audio.h"

// Constants and variables - Start //
enum AppState
{
  APPSTATE_Init
};
// variables will change:
static AppState appstate;
static int buttonStateA = 0;
static int buttonStateB = 0;
static bool hasWifi = false;
static bool hasIoTHub = false;
AudioClass &Audio = AudioClass::getInstance();
const int AUDIO_SIZE = 32000 * 3 + 45; // ~3 seconds of 8 kHz, 16-bit audio plus WAV header
char *audioBuffer;
int totalSize;
int monoSize;
static char emptyAudio[AUDIO_CHUNK_SIZE];
RingBuffer ringBuffer(AUDIO_SIZE);
char readBuffer[AUDIO_CHUNK_SIZE];
bool startPlay = false;

// Send message to Azure
void SendMessage(char *message)
{
  if (hasIoTHub && hasWifi)
  {
    char buff[512];
    // replace the following line with your data sent to Azure IoT Hub
    snprintf(buff, 512, "%s", message);
    if (DevKitMQTTClient_SendEvent(buff))
    {
      Screen.print(1, "Sent...");
    }
    else
    {
      Screen.print(1, "Failure...");
    }
  }
}

void setup()
{
  // put your setup code here, to run once:
  memset(emptyAudio, 0x0, AUDIO_CHUNK_SIZE);
  if (WiFi.begin() == WL_CONNECTED)
  {
    hasWifi = true;
    Screen.print(1, "Running!!!");
    if (!DevKitMQTTClient_Init(false, true))
    {
      hasIoTHub = false;
      return;
    }
    hasIoTHub = true;
    // initialize the pushbutton pins as inputs:
    pinMode(USER_BUTTON_A, INPUT);
    pinMode(USER_BUTTON_B, INPUT);
    appstate = APPSTATE_Init;
  }
  else
  {
    hasWifi = false;
    Screen.print(1, "No Wi-Fi");
  }
}

void loop()
{
  // put your main code here, to run repeatedly:
  // read the state of the pushbutton values:
  buttonStateA = digitalRead(USER_BUTTON_A);
  buttonStateB = digitalRead(USER_BUTTON_B);
  if (buttonStateA == LOW && buttonStateB == LOW)
  {
    //SendMessage("A + B");
  }
  else if (buttonStateA == LOW && buttonStateB == HIGH)
  {
    // Button A: record the surrounding noise into the ring buffer.
    Screen.print(0, "start recording");
    record();
    while (digitalRead(USER_BUTTON_A) == LOW && ringBuffer.available() > 0)
    {
      delay(10);
    }
    if (Audio.getAudioState() == AUDIO_STATE_RECORDING)
    {
      Audio.stop();
    }
    startPlay = true;
  }
  else if (buttonStateA == HIGH && buttonStateB == LOW)
  {
    // Button B: drain the buffer and send the audio to Azure.
    if (startPlay == true)
    {
      Screen.print(0, "start playing");
      play();
      while (ringBuffer.use() >= AUDIO_CHUNK_SIZE)
      {
        delay(10);
      }
      Audio.stop();
      startPlay = false;
    }
  }
  else if (buttonStateA == HIGH && buttonStateB == HIGH)
  {
    Screen.print("NO BUTTON DETECTED");
  }
  // wait before polling the buttons again:
  delay(100);
}

void record()
{
  Serial.println("start recording");
  Audio.format(8000, 16);
  Audio.startRecord(recordCallback);
}

void play()
{
  Serial.println("start playing");
  Audio.format(8000, 16);
  Audio.startPlay(playCallback);
}

void playCallback(void)
{
  if (ringBuffer.use() < AUDIO_CHUNK_SIZE)
  {
    Audio.writeToPlayBuffer(emptyAudio, AUDIO_CHUNK_SIZE);
    return;
  }
  int length = ringBuffer.get((uint8_t *)readBuffer, AUDIO_CHUNK_SIZE);
  Audio.writeToPlayBuffer(readBuffer, length);
}

void recordCallback(void)
{
  int length = Audio.readFromRecordBuffer(readBuffer, AUDIO_CHUNK_SIZE);
  ringBuffer.put((uint8_t *)readBuffer, length);
}

Azure function

This is the simplest part of all. All you have to do is receive the stream and calculate the noise level. This calculation can be very sophisticated, but that is out of the scope of this article.

using IoTHubTrigger = Microsoft.Azure.WebJobs.EventHubTriggerAttribute;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.EventHubs;
using System.Text;
using System.Net.Http;
using Microsoft.Extensions.Logging;
using System;
using System.Linq;

namespace IoTWorkbench
{
    public static class IoTHubTrigger1
    {
        private static HttpClient client = new HttpClient();

        [FunctionName("IoTHubTrigger1")]
        public static void Run([IoTHubTrigger("%eventHubConnectionPath%", Connection = "eventHubConnectionString")] EventData message, ILogger log)
        {
            log.LogInformation($"C# IoT Hub trigger function processed a message: {Encoding.UTF8.GetString(message.Body.Array)}");

            // Deliberately naive noise-level estimate: take the first 16-bit sample and
            // convert it to decibels (a real implementation would use the RMS of all samples).
            byte[] buffer = message.Body.ToArray();
            short sample16Bit = BitConverter.ToInt16(buffer, 0);
            double volume = Math.Abs(sample16Bit / 32768.0);
            double decibels = 20 * Math.Log10(volume);
            log.LogInformation($"Noise level: {decibels} dB");
        }
    }
}


In order for the device to send messages to Azure, the device must know the endpoint to which it should send the data. You can follow the steps in this link to register your device; it is all done using the Azure IoT Workbench.


Photo by Steinar Engeland on Unsplash

How to resolve “error Executing the api /eventhubs” in #MicrosoftFlow?

While trying to connect Microsoft Flow to Azure Event Hubs, you cannot retrieve the event hub name and instead you get the “Error Executing the api /eventhubs” error. The Event Hubs connector in Flow allows you to connect to an event hub using connection strings and to get notified as soon as a new event is available in the hub. However, there are certain things you will need to know.

Event Hub Namespace vs. Event Hub

An Event Hubs namespace provides a unique scoping container, referenced by its fully qualified domain name, in which you create one or more event hubs. So event hubs live inside an Event Hubs namespace. Both the namespace and each event hub have their own connection string, which can be used to access the resource. However, it is important to know that the Microsoft Flow connector for Event Hubs accepts the Event Hubs namespace’s connection string rather than an individual event hub’s connection string.

Error “Executing the api /eventhubs”

You will see the below error while trying to use an event hub entity’s connection string.

The solution is to use Event Hub Namespace’s connection string.

To confirm whether your connection string is associated with your Event Hubs namespace or with a specific event hub, make sure the connection string doesn’t have the EntityPath parameter. If you find this parameter, the connection string is for a specific event hub “entity” and is not the correct string to use with your flow.
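A quick way to check this programmatically is to parse the connection string and look for the EntityPath key. This snippet is just an illustration (the endpoint, key and hub names are made up):

```python
def is_namespace_connection_string(conn: str) -> bool:
    """True when the string scopes a whole Event Hubs namespace.

    A connection string containing an EntityPath pair is scoped to a single
    event hub "entity" and will fail with the Flow connector.
    """
    pairs = dict(part.split("=", 1) for part in conn.split(";") if "=" in part)
    return "EntityPath" not in pairs

namespace_cs = ("Endpoint=sb://contoso.servicebus.windows.net/;"
                "SharedAccessKeyName=RootManageSharedAccessKey;"
                "SharedAccessKey=abc123")
entity_cs = namespace_cs + ";EntityPath=myhub"
print(is_namespace_connection_string(namespace_cs))  # → True
print(is_namespace_connection_string(entity_cs))     # → False
```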


How to associate #Azure Subscriptions with #PowerPlatform linked #AzureAD?

Registration of #PowerPlatform applications in #Azure is becoming inevitable in most projects. The most basic requirement is to expose #PowerPlatform services to the outside world. When you provision a #PowerPlatform subscription, Microsoft by default associates an #AzureAD with your subscription. This #AzureAD is only a directory, without an #Azure subscription linked to it. To use #Azure services, you will need an #Azure subscription. You may have an existing #Azure subscription with lots of services in it, and most often you will wish to link that existing #Azure subscription with the new #AzureAD provisioned for the #PowerPlatform subscription. In this post, I list the steps you will need to associate your existing #Azure subscription with the newly provisioned #AzureAD linked with your #PowerPlatform.

  1. Go to your newly created #AzureAD by browsing:
  2. Sign in with your #PowerPlatform Credentials.
  3. You will have no subscriptions, as per the picture below (you can see your subscriptions by searching for Subscription in the search textbox).
  4. Go to Azure Active Directory. You will find the Azure Active Directory link on the left panel.
  5. Go to Users and Click on the New User.
  6. Click on INVITE User. Fill in the form and invite the user associated with the #Azure subscription.
  7. Give Administrator access to the invited user so the user can create applications and register endpoints.
  8. The contact associated with the #Azure subscription will receive an email to accept the invitation.
  9. The next step is to login to the #Azure account with the subscription and access the subscription (by searching the word subscription on the search box).
  10. Click on the subscription to see the subscription details.
  11. From the command bar, click on “Change Directory”. You will see the #AzureAD which invited you; move your subscription to it.





How To Start Your Exciting Journey Of “Connected Field Service” And “Azure IoT Hub”?

After my last article I spent some time exploring Azure IoT and Connected Field Service. I have been watching presentations and demos on YouTube, and I saw many good demonstrations of the capabilities of Azure IoT Hub and Connected Field Service. However, I wanted to explore more and see how things really work under the hood of the platform. So, in this and a few upcoming posts, I am summarising my key learnings for the benefit of fellow readers who want to start.

Note: I structured this blog series based on my learning journey. It is designed around the questions of CRM consultants who want to start their Connected Field Service journey.

Let’s start with some key definitions which you will hear a lot in your journey and with which you’d better be familiar from the start:

IoT Device

An IoT device is a piece of hardware with a small circuit (called a microcontroller), capable of connecting to the internet using Wi-Fi or mobile networks. The purpose of the microcontroller is to capture data from the surroundings and send it to consuming services over the internet. In other words, an IoT device is a minicomputer capable of sending (and sometimes processing) data over the internet.

Raspberry Pi / Windows IoT

The IoT device mentioned above can be a small circuit that reacts to an event like switch on/switch off, or it can be powerful enough to process images on the device itself. When we build very powerful devices capable of processing data and applying complex logic, we need a program to run the logic or process the data. To enable this processing, we need a lightweight operating system to host our application, such as Raspberry Pi OS or Windows 10 IoT Core.

Azure Internet of Things (IoT)

Azure IoT is a series of managed services combined to connect, monitor and control smart, IoT-enabled devices. Basically, the device’s sensors detect events from the surrounding environment and pass the events and data, via the microcontroller, to the Azure IoT service. Azure IoT receives the data from millions of devices and sends it on to further processing services that give the data meaning.

Azure IoT Hub

IoT Hub is a central message hub for bi-directional communication between devices and Azure IoT. So, imagine Azure IoT Hub as the heart of the Azure IoT service.

Azure IoT Edge

As we move towards the future, we will have more IoT devices connected to the Azure IoT Hub. Each of these devices will send raw messages to the IoT Hub for processing. This is fine, since the IoT Hub is a scalable service, but wouldn’t it be more efficient to use the powerful device hardware to pre-process data before sending it to the IoT Hub? Azure IoT Edge enables us to write programs that process data inside the device before sending it to Azure, and/or send only the necessary data and events. This helps offload processing from Azure and reduce traffic. If you have a powerful tool, why not use it?

IoT Device Twin

A device twin is a logical representation of the device’s status in Azure. For example, you have a smart bulb which is installed in Branch A. The device twin can contain the location, so you can query twins and identify the devices installed in Branch A. Or you can store the firmware version in device twins and, at the time of a firmware upgrade, target only the devices with old firmware versions.
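IoT Hub exposes a SQL-like query language over device twins. As a sketch of the two examples above (the tag and property names here are illustrative, not a fixed schema):

```sql
-- Devices whose twin tags place them in Branch A
SELECT deviceId FROM devices
WHERE tags.location.branch = 'Branch A'

-- Devices still reporting an old firmware version
SELECT deviceId FROM devices
WHERE properties.reported.firmwareVersion = '1.0'
```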

Connected Field Service

The Azure IoT gives you the technology enablement to interact with devices. However, the real value of IoT emerges when you give a business context to it. The connected field service gives the business context to your IoT data. The connected field service comes with an accelerator. When you install the connected field service from the AppSource, you will get all entities and required components in your Dynamics environment. It gives you the ability to configure Dynamics to interact with devices, listen to them and be notified when they don’t feel right.

I have seen IoT Central. What is it, and how does it differ from Azure IoT Hub?

According to the Microsoft documentation, Azure IoT Central is a software as a service (SaaS) solution that uses a model-based approach to help you to build enterprise-grade IoT solutions without requiring expertise in cloud-solution development. However, Azure IoT Hub is the core Azure PaaS that both Azure IoT Central and Azure IoT solution accelerators use. IoT Hub supports reliable and secure bidirectional communications between millions of IoT devices and a cloud solution.

So, in summary: if you have Azure skills and you want the flexibility to expand and manage your solution, go for IoT Hub. But if you want an accelerator with no requirement for extension, then IoT Central is the way forward.

Where do I start my journey?

Where to begin – Learning Path

If you want to start directly from Dynamics, you can simply follow the instructions here.

Otherwise, follow these steps:

  1. Provision your Azure IoT Service (Link)
  2. Create a simulated device (Link)
  3. Send and receive messages (Link)

I have a smart door lock. I want to be notified when it is unlocked. Where do I start?

Azure IoT Hub integrates with Azure Event Grid so that you can send event notifications to other services and trigger downstream processes. To implement this scenario, you will need to register your smart door lock in Azure IoT as mentioned above. Once the device is registered, you will need to define an Unlock event in the IoT service so that the service can receive Unlock events. By default, Azure IoT provides four event types:

  • Device Created
  • Device Deleted
  • Device Connected
  • Device Disconnected

However, for our scenario we will need to create a custom Unlock event. To define the custom event, you can follow this link.
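For reference, an Event Grid notification for one of these device events is a small JSON envelope, roughly shaped like the sketch below (the values are illustrative; check the Event Grid schema documentation for the exact fields):

```json
{
  "subject": "devices/mydoorlock01",
  "eventType": "Microsoft.Devices.DeviceConnected",
  "eventTime": "2019-06-01T10:00:00Z",
  "data": {
    "hubName": "my-iot-hub",
    "deviceId": "mydoorlock01"
  },
  "dataVersion": "1"
}
```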

How do I route incoming message to Dynamics?

Once you have defined your events as above, you will need to create a routing channel for the desired events. Once the Unlock event is detected in the IoT Hub, your routing can send the message to an endpoint. You can configure your routing as per this link.
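As a sketch, a route that forwards only Unlock messages could use a routing query over the message body like the one below. The eventType property is my own illustration, and note that body queries require the message to be sent with a JSON content type and UTF-8 content encoding:

```
$body.eventType = 'Unlock'
```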

How to detect if my device disconnects from IoT?

It is very simple: as mentioned above, Device Disconnected is a default event which is triggered once your device is disconnected. All you must do is configure a service to react to the disconnection event.

How to orchestrate events from IoT to Dynamics?

If you are not using the Connected Field Service solution from AppSource, you can configure a Flow to send the messages across. There are IoT Hub and Event Grid triggers which you can use.

IoT Flow Template

What kind of messages can be transmitted between device and IoT?

Are you a Passionate #PowerPlatform developer, looking for your next big challenge?

Looking for your next big thing?
What I love about the #PowerPlatform is that we get tons of new features every 6 months, and if you are an avid learner, it gives you no reason to be stagnant and not learn new things every day. You have numerous options, from creating #powerapps or #pcf components to just exploring new enhancements in the #dynamics365 applications.
After spending some time exploring some core capabilities of the #PowerPlatform and playing with the Field Service application, I have been thinking about the next big thing I should focus on. I was looking for something challenging which has an impact on us. To decide, I took a step back and looked at the options I have. I looked in technology blogs, read books and listened to many podcasts, which helped me clear my mind and shortlist some major places to start. I would like to share what I did in the hope that it may help other folks who are also looking for their next challenge:

Industry Reports & Microsoft investments

Like it or not, business needs drive product development. It is the business, the industry and eventually profits which drive the investment into the products we work on implementing or extending. I think one of the best sources for finding your own next best challenge is industry or economic reports. I always have an eye on #Gartner reports. I see people share #Gartner reports back and forth on LinkedIn to show how a product is doing against the competition. I look at it from a different angle: I always read the #Gartner report, but looking at the weaknesses. Weaknesses in the competition drive demand for innovations and solutions. By reading reports, you will understand what the weaknesses in a product or platform are and how these weaknesses matter to end customers. The next big challenge can be addressing some of those weaknesses!
Another indicator is Microsoft’s investments. When I see Microsoft investing heavily in a platform or product, it gives me a good idea of what the next big thing is. Of course, Microsoft has a big team of industry experts and awesome people familiar with various businesses. Speaking for myself, I do trust the investment numbers of the big players, because that is going to create demand for my innovative mind!
What do you think about this one?

What about cross-domain ideas?

Our field is fluid. It changes every day. It is not only the #PowerPlatform; it is the nature of our field. There was a time when implementing a CRM was the goal of organisations, but now CRM is only one piece of the big picture! Have you ever thought of creating something spanning multiple technologies? How about combining #Azure and #PowerPlatform? Cross-domain work is a topic I am very interested in. I have explored Azure Search, Azure Speech, Azure Bots, Azure Documents and Azure IoT Hub alongside the #PowerPlatform. The synergy between #Azure and #PowerPlatform creates a system which really serves customers better. If you are out of ideas in the #PowerPlatform, then start looking in #Azure. I bet you will have a lovely time learning new things and exploring how #Azure can help you do better implementations!

Working on community Ideas

There is a long list (and it is growing day by day) of product ideas in the IDEAS PORTAL. The #PowerPlatform product team does a good job of assessing, shortlisting and working on the great ideas to release in future waves. However, it is not possible to implement all of the ideas, for various reasons such as criticality, demand, priorities, etc. Passionate people in the community can assess these ideas and see which ones they like. Once you have chosen one, you can work on the idea and release your source to GitHub to spread its benefit. Publishing your source helps the idea get bigger, and your code quality will be better, too!

XrmToolBox plugins

I think XrmToolBox is the single best thing that the Dynamics community has come up with. I use it in every project. It inspires me when I see friends out there develop such lovely plugins that work like a charm. (Big shoutout to Tanguy Touzard, Jonas Rapp and Aiden Kaskela, whom I follow when it comes to the XrmToolBox.) I have never had a major problem using the tool, which tells me how many good coders are out there. Your next best thing can be an XrmToolBox plugin or an extension of an existing one. If you think your idea is small or bad, you’d better think again. There is no small or bad idea. Once you float your idea, it will become a snowball; other awesome people will contribute to it, and you will see how your small idea becomes a big one helping others do better. Even if you have no idea of your own, you can extend the functionality of an existing plugin. You always have an option to choose from 🙂 but remember one thing: being stagnant is NO option!
The above list is not everything I explored, but it shortlists some of the things you can relate to. I would love to hear from you about your experience in finding your next big thing.

P.S.: A friendly reminder by @rappen on how to type it: XrmToolBox

Image by Joanna Kosinska on Unsplash

How to call #webapi from #PowerPlatform #Portals

In almost all of my portal projects, I get a question from my clients: “How do we call an external API from #portals?” This question has been so common that I decided to write about my experience on this topic, which might be helpful for the community. This post will focus on two main areas:

  1. The available options to integrate #portals with external APIs
  2. A step-by-step guide to one of the least discussed options, which is using the OAuth Implicit Grant flow, and how I created a simple demo for one of my customers


I would like to give a business context to this scenario. Any enterprise solution requires the integration and interaction of multiple systems, of which #portals could be one. Imagine a scenario where a customer is looking for a quote on a product in the company portal. In this case the #portal is required to bring quote details from a CPQ (Configure, Price, Quote) system to the portal. In another scenario, a #portal is required to integrate with a core banking system to get the customer’s latest balances. In these scenarios and similar ones, we require the #portal to integrate with an external API to get information.

In order to enable such integrations, the #portal must be able to make calls in a secure way as most of the internal systems require authentication before anything can happen. So what are the options available?


Since #powerplatform #portals are tightly integrated with the #powerplatform, in most cases the integration is done through the #powerplatform itself. However, the integration through the #powerplatform comes in three flavors.

  1. The first one is creating actions in the platform which communicate with the external API and manage the requests and responses, then calling the actions through a workflow, where the workflow is triggered using Entity Form or Entity List events.
Portal Integration with Web API using Actions


  • The second option is to use #MicrosoftFlow to encapsulate the workflow and action part in a Flow. The benefit of this solution is that you won’t need to write code (in most cases, though not guaranteed) to call the #webapi.

    Portal Integration using Flow
  • The above two options use the #PowerPlatform to facilitate the integration, and all calls are routed through the platform. However, going through the server is not always feasible. There are situations in which you would like to make client-side calls from JavaScript, using Ajax from #portals, to the external API. The main concern in these scenarios is authentication, and the solution provided by the platform is the “OAuth Implicit Grant Flow”. If you would like to learn more about the OAuth Implicit Grant Flow beyond the #PowerPlatform, you can read more here.


There are concerns over the OAuth Implicit Grant flow, and the recommendation is to use the “OAuth code grant flow”. According to the OAuth working group, “It is generally not recommended to use the implicit flow (and some servers prohibit this flow entirely). In the time since the spec was originally written, the industry best practice has changed to recommend that public clients should use the authorization code flow with the PKCE extension instead.” Microsoft is aware of this restriction; however, it is believed the OAuth implicit grant flow is still OK to use.

I have proposed implementing the OAuth code grant flow in this IDEA. Please vote for it.

Now, getting back to the topic: how to integrate.

Portal Integration with OAuth Implicit Grant Flow

In this scenario, no server-side calls are required. Complete documentation is available here. However, the documentation is not very helpful if you want to do things quickly, since there is a learning cycle involved. The OAuth 2.0 implicit grant flow supports endpoints that a client can call to get an ID token. Two endpoints are used for this purpose: authorize and token. I will not go into the details of these calls, and I assume you already know what they are.

So here is what you will have to do:

  1. Create your web API. You can download the sample API from this GitHub project. This website is no different from any MVC website, so you can create your own with Web APIs.
  2. Next, register your application in Azure Active Directory. This is a free service which you can use to provide authentication for your web API. Step-by-step details of the registration process are in this link. The REDIRECT URL must be the direct link to the redirect page you will create on the portal. You will need to note the following after this step:

    – Client ID
    – Redirect URL

  3. Let’s say you have a Quote page in your portal, and you would like to place a button on the portal page to get quotations from your internal website. You will have to put custom HTML in the “Content Page” (not the main page) of the portal. This custom HTML will be used to add a QUOTE button to the portal and also to retrieve the quotation using custom JavaScript code.
<h2>The QUOTE BUTTON</h2>

<button type="button" onclick="callAuthorizeEndpoint()">Give me a Quote!</button>

<script>
function callAuthorizeEndpoint() {
    // Used for state validation; remove the cookie to avoid state validation
    var useStateValidation = $.cookie("useStateValidation");
    var appStateKey = 'p07T@lst@T3';
    var sampleAppState = {id: 500, name: "logic"};

    // Replace with the Client ID registered on CRM
    var clientId = "CLIENT ID";
    // Replace with the Redirect URL registered on CRM
    var redirectUri = encodeURIComponent("");

    // Authorize endpoint
    var redirectLocation = `/_services/auth/authorize?client_id=${clientId}&redirect_uri=${redirectUri}`;

    // Save state in a cookie if state validation is enabled
    if (useStateValidation) {
        $.cookie(appStateKey, JSON.stringify(sampleAppState));
        redirectLocation = redirectLocation + `&state=${appStateKey}`;
        console.log("Added State Parameter");
    }

    window.location = redirectLocation;
}
</script>

  1. Modify the source code of the web API website to use the Client ID and Redirect URL in its startup page.
public virtual Task ValidateIdentity(OAuthValidateIdentityContext context)
{
    try
    {
        if (!context.Request.Headers.ContainsKey("Authorization"))
            return Task.FromResult<object>(null);

        // Retrieve the JWT token in the Authorization header
        var jwt = context.Request.Headers["Authorization"].Replace("Bearer ", string.Empty);
        var handler = new JwtSecurityTokenHandler();
        var token = new JwtSecurityToken(jwt);
        var claimIdentity = new ClaimsIdentity(token.Claims, DefaultAuthenticationTypes.ExternalBearer);

        var param = new TokenValidationParameters
        {
            ValidateAudience = false, // Make this false if the token was generated without a clientId
            ValidAudience = "CLIENT ID", // Replace with the Client ID registered on CRM. The token should have been fetched with the same clientId.
            ValidateIssuer = true,
            IssuerSigningKey = _signingKey,
            IssuerValidator = (issuer, securityToken, parameters) =>
            {
                var allowed = GetAllowedPortal().Trim().ToLowerInvariant();
                if (issuer.ToLowerInvariant().Equals(allowed))
                    return issuer;
                throw new Exception("Token Issuer is not a known Portal");
            }
        };

        SecurityToken validatedToken = null;
        handler.ValidateToken(token.RawData, param, out validatedToken);
        var claimPrincipal = new ClaimsPrincipal(claimIdentity);
        context.Response.Context.Authentication.User = claimPrincipal;
    }
    catch (Exception exception)
    {
        return null;
    }
    return Task.FromResult<object>(null);
}

  1. The next step is to use custom HTML on the redirect page so that you can call the Web API with the token obtained in the previous step.
function getResultInUrlFragment(hash) {
    if (hash) {
        var result = {};
        hash.substring(1).split('&').forEach(function (keyValuePair) {
            var arr = keyValuePair.split('=');
            // Add to result only the keys with values
            arr[1] && (result[arr[0]] = arr[1]);
        });
        return result;
    }
    return null;
}

// Validate the state parameter
// Returns true for a valid state and false otherwise
function validateState(stateInUrlFragment) {
    if (!stateInUrlFragment) {
        console.error("State Validation Failed. State parameter not found in URL fragment");
        return false;
    }
    // State parameter in URL fragment doesn't have a corresponding cookie.
    if (!$.cookie(stateInUrlFragment)) {
        console.error("State Validation Failed. Invalid state parameter");
        return false;
    }
    return true;
}

var useStateValidation = $.cookie("useStateValidation");
var appState = null;

// Fetch the parameters in the URL fragment
var authorizeEndpointResult = getResultInUrlFragment(window.location.hash);

// Validate state
if (useStateValidation && !validateState(authorizeEndpointResult && authorizeEndpointResult.state)) {
    authorizeEndpointResult = null;
}

if (authorizeEndpointResult) {
    appState = $.cookie(authorizeEndpointResult.state);
    console.log("State: " + appState);

    // Display the token and call the Web API with it
    var data = authorizeEndpointResult.token;
    console.log("Token:" + data);
    $.ajax({
        type: "GET",
        url: "",
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        headers: {
            Accept: "text/plain; charset=utf-8",
            "Authorization": "Bearer " + data
        },
        success: function (data) {
        }, // End of AJAX success function
        failure: function (data) {
        }, // End of AJAX failure function
        error: function (data) {
        } // End of AJAX error function
    });
}

I hope this post helps you a bit to make your portals connect to the outside world!

Email Sentiment Analysis in Power Platform to improve customer service

What I love about my life as a #consultant is having the opportunity to hear customer problems and responding with something of value which improves their business in their own industry and market. What I love about being a #Microsoft #Technology #consultant is working on a technology which not only cares about end users but also makes it easy for me (or any citizen developer) to come up with solutions that are simple to implement; with #PowerPlatform and #msflow, many solutions don’t even require opening my #visualstudio (which I love and open every day even if I am not coding – sounds crazy, nah! :D). Let’s get back on track now!


I had a request from a customer whose support department was getting bombarded with case emails. The customer asked me to find a solution to prioritize emails based on urgency and the probability of the customer defecting.

My initial thought was: “How do I quantify whether a customer is going to defect because they are not satisfied?” After pondering a few solutions, I came up with the idea of “email sentiment” as a KPI for customer defection. If a customer is not satisfied with a service, their first reaction is to send an angry email to the company (at least this is what I do) before going to social media. So I took the initial complaining email as a sign of a customer at risk of leaving. The next question was how to implement the idea. This is how I did it:


  1. The basis of the solution was to use the Azure Text Analytics service to detect the sentiment of the email message.
  2. The next thing was to customize the email message entity to hold the sentiment value and potentially trigger a notification to manager or just sort emails based on their negative sentiment value.
  3. The last thing was to connect Power Platform to the Azure Text Sentiment Analysis service and get the sentiment value of email message from azure. I had two ways to implement this:
    • The first solution was to write a custom action to call the service and pass the email text to the Azure endpoint. On receiving the response of the analysis service, the action would return the sentiment as its output. Finally, the action would be called from a workflow triggered on the creation of the Email Activity!
    • The second solution was to use #MicrosoftFlow and do everything without writing a single line of code. Obviously, I used this technique.

The solution is extremely easy because #MicrosoftFlow provides an out-of-the-box connector to the text analytics service, and all you need to do is provide the service key and endpoint. Below is what my #MicrosoftFlow looks like:

Microsoft Flow Sentiment Analysis

Azure returns the sentiment score along with its analysis as Positive, Negative and Neutral. The API returns a numeric score between 0 and 1. Scores close to 1 indicate positive sentiment, while scores close to 0 indicate negative sentiment. A score of 0.5 indicates the lack of sentiment (e.g. a factoid statement).
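To make that mapping concrete, here is a small Python sketch that buckets the numeric score into the three labels. The 0.4/0.6 cut-offs are my own illustration, not values defined by the API:

```python
def label_sentiment(score: float) -> str:
    """Bucket a Text Analytics sentiment score (0..1) into a label.

    The 0.4 / 0.6 thresholds are illustrative assumptions, not part of
    the API contract: the service itself returns only the numeric score.
    """
    if score < 0.4:
        return "Negative"
    if score > 0.6:
        return "Positive"
    return "Neutral"  # scores around 0.5 indicate lack of sentiment
```

For example, label_sentiment(0.9) returns "Positive" and label_sentiment(0.5) returns "Neutral".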

In my solution, I stored the sentiment value as a Whole Number, so I had to convert the decimal value between 0 and 1 to a number between 0 and 100. To do this, I used an Operation step to multiply the sentiment score by 100 and cast it to an integer value. So I used the below formula:


Note: #MicrosoftFlow does not have a round function, so I had to convert the value to a string and take the substring up to the decimal point.
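The whole-number conversion and the round workaround can be sketched in Python (the function name is mine; in Flow the same logic is built from string functions):

```python
def scale_sentiment(score: float) -> int:
    """Scale a 0..1 sentiment score to a 0..100 whole number.

    Mirrors the Flow workaround: multiply by 100, render the result as
    text, and keep only the digits before the decimal point (truncation,
    since Flow has no round function).
    """
    text = str(score * 100)
    return int(text.split(".")[0])
```

For example, scale_sentiment(0.75) gives 75.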

Key Points:

  1. All of the Text Analytics API endpoints accept raw text data. The current limit is 5,120 characters for each document; if you need to analyze larger documents, you can break them up into smaller chunks.
  2. Your rate limit will vary with your pricing tier.
  3. The Text Analytics API uses Unicode encoding for text representation and character count calculations. Requests can be submitted in both UTF-8 and UTF-16 with no measurable differences in the character count.
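Key point 1 above implies larger documents must be split before calling the API. A naive character-based split might look like the sketch below; a production version should cut at sentence boundaries instead:

```python
def chunk_text(text: str, limit: int = 5120) -> list:
    """Split text into chunks no longer than the Text Analytics
    per-document limit (5,120 characters). Naive character split;
    sentence-aware splitting would be better in practice."""
    return [text[i:i + limit] for i in range(0, len(text), limit)]
```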

Improve efficiency of Call centers using Dynamics 365 and Azure cognitive services

Photo by Hrayr Movsisyan on Unsplash

I am fascinated by the sophistication of Azure services and how they help us improve our solutions and extend the ways we can solve customer problems. Recently I had a requirement to implement a Dynamics 365 solution to enable a call center to capture cases while their operators are offline.

One solution was to provide a self-service portal to customers to log the cases when Call center operators are offline. But in this case the customer was looking for something very quick to implement and having the ability to link incoming cases with their call center channel and derive some reporting based on it.


I started looking at Azure services to see how I could use Azure cognitive services and speech recognition to solve this requirement, and as always, Azure did not disappoint me. In this post I would like to share my experience with you and take you through the steps needed to create such a solution. Of course, the possibilities are endless; however, this post will give you a starting point to begin your journey.

I have seen solutions where telephony systems send voice recordings of callers as email attachments to a queue in CRM. The CRM then converts each queue item to a case and attaches the voice recording as a note on the case. The challenge with this solution is that call center operators have to open attachments manually and write the case description after listening to the audio file. This means their time is spent on inefficient activities when it could be utilized in better ways.

Another problem with this approach is the size of the attachments. As time goes by, audio attachments increase the database size, impacting the maintenance of the solution.


Our scenario is based on the fact that call center agents are not working 24 hours a day.

While agents are offline, customers should still be able to contact the call center and record voice messages that will be turned into cases.

We will use the following components:

  1. Azure Blob to receive recorded audio files from telephony system.
  2. Azure cognitive services to listen to recorded audio files and translate the content to a text message. The audio file will be saved in  Azure blob (which is cheaper than CRM database storage).
  3. Azure function (with Azure Blob binding) to recognize the text from the audio file and extract the case description.
  4. Dynamics 365 Web API to create a case in CRM using the description extracted from Azure Cognitive services.  We can also add blob metadata like filename, etc. to case properties.
Solution Architecture

The full source code is available at GitHub

However, the main code snippet to perform conversion is below:

public static async Task<string> RecognitionWithPullAudioStreamAsync(string key, string region, Stream myBlob, ILogger log)
{
    // Creates an instance of a speech config with the specified subscription key and service region.
    // Replace with your own subscription key and service region (e.g., "westus").
    var config = SpeechConfig.FromSubscription(key, region);

    string finalText = string.Empty;
    var stopRecognition = new TaskCompletionSource<int>();

    // Create an audio stream from a wav file.
    using (var audioInput = Helper.OpenWavFile(myBlob))
    {
        // Creates a speech recognizer using audio stream input.
        using (var recognizer = new SpeechRecognizer(config, audioInput))
        {
            // Subscribes to events.
            recognizer.Recognizing += (s, e) =>
            {
            };

            recognizer.Recognized += (s, e) =>
            {
                if (e.Result.Reason == ResultReason.RecognizedSpeech)
                {
                    finalText += e.Result.Text + " ";
                }
                else if (e.Result.Reason == ResultReason.NoMatch)
                {
                    log.LogInformation($"NOMATCH: Speech could not be recognized.");
                }
            };

            recognizer.Canceled += (s, e) =>
            {
                log.LogInformation($"CANCELED: Reason={e.Reason}");
                if (e.Reason == CancellationReason.Error)
                {
                    log.LogInformation($"CANCELED: ErrorCode={e.ErrorCode}");
                    log.LogInformation($"CANCELED: ErrorDetails={e.ErrorDetails}");
                    log.LogInformation($"CANCELED: Did you update the subscription info?");
                }
                stopRecognition.TrySetResult(0);
            };

            recognizer.SessionStarted += (s, e) =>
            {
                log.LogInformation("\nSession started event.");
            };

            recognizer.SessionStopped += (s, e) =>
            {
                log.LogInformation("\nSession stopped event.");
                log.LogInformation("\nStop recognition.");
                stopRecognition.TrySetResult(0);
            };

            // Starts continuous recognition. Use StopContinuousRecognitionAsync() to stop recognition.
            await recognizer.StartContinuousRecognitionAsync().ConfigureAwait(false);

            // Waits for completion. Use Task.WaitAny to keep the task rooted.
            Task.WaitAny(new[] { stopRecognition.Task });

            // Stops recognition.
            await recognizer.StopContinuousRecognitionAsync().ConfigureAwait(false);

            return finalText;
        }
    }
}

Important considerations:

  1. [This point is optional, if you use the Web API to create cases in CRM] You will need to use a multi-tenant configuration if your Azure Function tenant and the tenant in which your CRM API is registered are different. If they are the same tenant, you can use a single-tenant configuration.
  2. The input file from the telephony system to Azure Blob must be in a specific format. The required format specification is:
Property         Value
File Format      RIFF (WAV)
Sampling Rate    8000 Hz or 16000 Hz
Channels         1 (mono)
Sample Format    PCM, 16-bit integers
File Duration    0.1 seconds < duration < 60 seconds
Silence Collar   > 0.1 seconds


3. You can use the ffmpeg tool to convert your recording to this specific format. For your testing, you can download and use the tool as below:
Download ffmpeg from this link.
Use the command: ffmpeg -i "<source>.mp3" -acodec pcm_s16le -ac 1 -ar 16000 "<output>.wav"
4. My sample in GitHub covers input in one single chunk of audio. However, if you wish to have continuous streaming, you will need to adapt the StartContinuousRecognitionAsync method.
5. The Azure function should be configured with a Blob trigger.
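Before wiring up the telephony feed, it can help to sanity-check a sample file against the format table in point 2. A sketch using Python's standard wave module (the helper name is mine):

```python
import wave

def meets_speech_format(source) -> bool:
    """Check a WAV file (path or file-like object) against the required
    format: mono, 16-bit PCM samples, 8 kHz or 16 kHz sampling rate."""
    with wave.open(source, "rb") as w:
        return (w.getnchannels() == 1          # mono
                and w.getsampwidth() == 2      # 16-bit samples
                and w.getcomptype() == "NONE"  # uncompressed PCM
                and w.getframerate() in (8000, 16000))
```

A file that fails this check can then be run through the ffmpeg command above before uploading to the blob container.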

Open entity records from Power BI dashboard

In my earlier post, I discussed how to show CRM entities on Power BI visual map control. The usage of Power BI dashboard on Dynamics CRM dashboards is not limited to displaying multiple entities on maps. We usually want to do more and since dashboards have little information on them, we would love to see entities in tabular format and navigate to CRM records when needed. In this post, I will discuss how we can open CRM records from a Power BI dashboard.


Users should be able to see multiple entity types on the Power BI map, and to see record details in a table under the map control with the ability to open CRM records using a hyperlink. I will focus on displaying records in a table with a direct link to CRM entity records. After configuring the visual map control, we will need to do the following:

Note that all the required information (i.e. name, etc.) and complementary information (i.e. entity logical name, entity ID) are available in our temporary table. Refer to the previous post.

  1. Drag and drop a Table control underneath of our visual map control.
  2. Drag and drop the fields we would like to display on table columns.

  3. The next step is adding a custom column to the table to hold the hyperlink to CRM entity records and configuring its type as WEB LINK.
  4. You can do this by selecting “NEW COLUMN” from the “Modeling” tab. Remember you will need the following three components to construct the link.
    1. CRM Base URL (This will be known to you from your org URL).
    2. Entity logical name (This is what we captured in the previous post as a custom column in our temporary table).
    3. Entity GUID (This was selected also as part of entity retrieve query in the previous post).
  5. The formula for the column is:
    Link = "https://[CRM BASE URL]?pagetype=entityrecord&etn=" & 'ENTITY_LOGICAL_NAME' & "&id=" & 'ENTITY_ID'
  6. You will need to set the field type as WEB LINK.
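Outside DAX, the same link can be assembled and tested in Python (the function name and sample values below are mine; replace the base URL with your own org URL):

```python
def crm_record_link(base_url: str, entity_logical_name: str, record_id: str) -> str:
    """Build a Dynamics 365 deep link to a record: base URL plus the
    pagetype, etn (entity logical name) and id query parameters."""
    return (f"{base_url}?pagetype=entityrecord"
            f"&etn={entity_logical_name}&id={record_id}")
```

For example, crm_record_link("https://yourorg.crm.dynamics.com/main.aspx", "incident", "<GUID>") yields the record URL for a case.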

Display multiple entities on Power BI map control


Photo by Susannah Burleson on Unsplash

Recently I had to display the location of multiple entities on a CRM dashboard. The requirement was to display all Workorders, Projects, Resources and Bookings on a map control so the project scheduler / field service dispatcher could see where each of them is located. The Bing map control works fine for individual entities that are enabled for geolocation; however, in this scenario I had to plot all the different entities on a single map.

My thoughts were that I could choose from one of the following methods:

  1. Use bing map control on a dashboard. Use a webresource to retrieve all entities in Workorders, Projects, Resources and Bookings. And then use a draw function to place each entity location on the bing map.
  2. The second approach was to use Power BI and its Visual Map control to plot all entities on a map. Then host the Power BI control on my dashboard. I decided to use this approach to display entities on a map control.

Power BI Map control to show multiple entities

The map control in Power BI uses one source table with longitude and latitude information to display table rows on the map. The challenge with this approach is that the visual map control supports only one entity’s longitude and latitude, so we can only use one entity as the source of the map data. In my scenario I had multiple entity types, i.e. Workorders, Projects, Resources and Bookings. Each of these entities has its own longitude and latitude, and we cannot use them all together as a source for our Power BI map.

The way I overcame this challenge was to use a temporary table, union the data from all Workorders, Projects, Resources and Bookings into it, and use this temporary table as the source of the Power BI map control. This is how I did it:

  1. Connect to the CRM Bookings table. This will bring all columns of the table to the Power BI.
  2. Remove unwanted columns in the Query Editor (optional).
    = Table.SelectColumns(Source,{"name", "msdyn_longitude", "msdyn_worklocation", "bookableresourcebookingid", "msdyn_latitude"})
  3. Reorder remaining columns in a way that you like to see your data (optional).
    = Table.ReorderColumns(#"Removed Other Columns",{"name", "msdyn_longitude", "msdyn_worklocation", "msdyn_latitude", "bookableresourcebookingid"})
  4. Rename column headings (optional).
    = Table.RenameColumns(#"Reordered Columns1",{{"bookableresourcebookingid", "id"}})
  5. Filter rows that you want to exclude from map (optional).
    = Table.SelectRows(#"Renamed Columns", each [latitude] <> null)
  6. Add a custom column to the query as TABLE Identifier/Category so you can identify workorder rows in the union table.
    = Table.AddColumn(#"Filtered Rows", "category", each Text.Upper("Bookable Resource Booking"))
  7. Change the column types (optional).
    = Table.TransformColumnTypes(#"Reordered Columns",{{"category", type text}, {"longitude", type number}, {"latitude", type number}})

If you have more than one entity, repeat the above steps for each table in your query editor.

The next step is to create a temporary table and union all the above tables data using DAX query into this temporary table.

  1. Go to the Modeling tab.
  2. Click on New Table. Use the below query to fill the table (Alter table names based on your scenario),
    TempTable = UNION('Bookable Resource Booking','Bookable Resources','Work Orders','Project Sites')
  3. Drag a Map Visualisation control to the Power BI.
  4. Select “Category” or Entity Name from the TempTable as Legend. This will show your entities in different colors.
    Drag the Longitude and Latitude fields to the X and Y axes.
  5. Note: By default, Power BI adds a SUM function to summarize longitude and latitude. Summarized columns don’t work in maps; you must remove the summarization by choosing “Don’t summarize”.
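Conceptually, the UNION step just stacks rows from tables that share the same columns. A small Python sketch of the same idea (entity names and coordinates are illustrative):

```python
def union_tables(*tables):
    """Stack rows from several per-entity tables into one map source,
    mimicking DAX UNION. Each table is a list of dicts sharing the
    same column names (name, longitude, latitude, category)."""
    combined = []
    for rows in tables:
        combined.extend(rows)
    return combined

# Illustrative rows, one per entity type
bookings = [{"name": "Booking 1", "longitude": 144.96, "latitude": -37.81,
             "category": "BOOKABLE RESOURCE BOOKING"}]
workorders = [{"name": "WO-0001", "longitude": 151.21, "latitude": -33.87,
               "category": "WORK ORDER"}]
temp_table = union_tables(bookings, workorders)
```

The category column added in step 6 of the query-editor section is what lets the map legend color each entity type differently.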

Opening Global Search from a Webresource

Following my post here about opening a CRM form from a webresource by passing a query string, I got another idea to enhance the customer experience. How about opening Global Search from my webresource?

The idea is to search for a contact record based on a query string. If the contact record is not found in CRM, then I would want to open the global search screen and pass the query string value to it. This will give the option of finding my records among Contacts, Leads and Accounts.

Global Search
Invoking Global Search

To do that, you will need to open the following URL in your webresource.


You can pass the following optional parameters as Encoded text as well:

sitemappath=SFA|MyWork|nav_dashboards (note that the | character must be replaced with %7c)


There is also an XML Payload in which you can specify the Entity Filter.

<?xml version="1.0" encoding="utf-8" ?><soap:Envelope xmlns:soap="" xmlns:xsi="" xmlns:xsd=""><soap:Body><GetQuickFindColumnCollectionForEntityArray xmlns=""><entityCodes>4</entityCodes></GetQuickFindColumnCollectionForEntityArray></soap:Body></soap:Envelope>

Impact of deprecation of VoC on the Exam MB-230: Microsoft Dynamics 365 for Customer Service

For people aiming for: Exam MB-230: Microsoft Dynamics 365 for Customer Service

In October, Voice of the Customer skills and exam questions will be replaced with Forms Pro skills and questions. The exact date of that change and the associated changes to the Skills Measured will be communicated in August, 2019. Please prepare for your exam accordingly.