How To Start Your Exciting Journey Of “Connected Field Service” And “Azure IoT Hub”?

After my last article I spent some time exploring Azure IoT and Connected Field Service. I have been watching presentations and demos on YouTube, and I saw many good demonstrations of the capabilities of Azure IoT Hub and Connected Field Service. However, I wanted to explore more and see how things really work under the hood of the platform. So, in this and a few upcoming posts I am summarising my key learnings for the benefit of fellow professionals who want to start.

Note: I structured this blog series based on my learning journey. It is designed around the questions of CRM consultants who want to start their Connected Field Service journey.

Let’s start with some key definitions which you will hear a lot on your journey and should be familiar with from the start:

IoT Device

An IoT device is a piece of hardware with a small circuit (called a microcontroller), capable of connecting to the internet using Wi-Fi or mobile networks. The purpose of the microcontroller is to capture data from its surroundings and send it to consuming services over the internet. In other words, an IoT device is a minicomputer capable of sending (and sometimes processing) data over the internet.

Raspberry Pi / Windows IoT

The IoT device mentioned above can be a small circuit that reacts to an event like switch on/switch off, or it can be powerful enough to process images on the device itself. When we develop powerful devices capable of processing data and applying complex logic, we need a program to run that logic or process the data. To enable this processing, we need a lightweight operating system to host our application, such as Raspberry Pi OS on Raspberry Pi hardware, or Windows 10 IoT Core.

Azure Internet of Things (IoT)

Azure IoT is a series of managed services combined to connect, monitor and control smart IoT-enabled devices. Basically, a device’s sensors detect events in the surrounding environment and pass the events and data through the microcontroller to the Azure IoT service. Azure IoT receives data from millions of devices and passes it on to further processing services that give the data meaning.

Azure IoT Hub

IoT Hub is a central message hub for bi-directional communication between devices and Azure IoT. So, imagine Azure IoT Hub as the heart of the Azure IoT service.

Azure IoT Edge

As we move towards the future, we will have more IoT devices connected to the Azure IoT Hub. Each of these devices will send raw messages to the IoT Hub for processing. This is fine, since IoT Hub is a scalable service, but wouldn’t it be more efficient to use the powerful hardware to pre-process data on the device before sending it to the IoT Hub? Azure IoT Edge enables us to write programs that process data on the device before sending it to Azure, and/or send only the necessary data/events. This helps offload processing from Azure and reduces traffic. If you have a powerful tool, why not use it?
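To make this concrete, below is a minimal sketch of an Edge module that pre-filters messages on the device, written against the Microsoft.Azure.Devices.Client SDK. The input/output route names ("input1"/"output1") and the idea of forwarding only payloads containing an "alert" marker are my own illustrative assumptions, not part of any official sample.

using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

class FilterModule
{
    static async Task Main()
    {
        // Connect using the environment IoT Edge provides to the module container.
        var moduleClient = await ModuleClient.CreateFromEnvironmentAsync(TransportType.Mqtt_Tcp_Only);
        await moduleClient.OpenAsync();

        // Inspect every message arriving on "input1" and forward only the interesting ones.
        await moduleClient.SetInputMessageHandlerAsync("input1", async (message, userContext) =>
        {
            var body = Encoding.UTF8.GetString(message.GetBytes());

            // Assumption: only events flagged as alerts matter downstream.
            if (body.Contains("\"alert\""))
            {
                await moduleClient.SendEventAsync("output1", new Message(Encoding.UTF8.GetBytes(body)));
            }
            return MessageResponse.Completed;
        }, moduleClient);

        await Task.Delay(-1); // keep the module alive
    }
}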

IoT Device Twin

A device twin is a logical representation of the device’s state in Azure. For example, suppose you have a smart bulb installed in Branch A. The device twin can contain the location, so you can query twins and identify the devices installed in Branch A. Or you can store the firmware version in the device twin and, at firmware-upgrade time, upgrade only the devices on old firmware versions.
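As a rough illustration of querying twins, here is a sketch using the service-side SDK (Microsoft.Azure.Devices). The tag structure (tags.location.branch) and the reported firmwareVersion property are assumptions for the example; your twin schema is whatever your devices and back end agree on.

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Devices;

class TwinQuerySample
{
    static async Task Main()
    {
        var registryManager = RegistryManager.CreateFromConnectionString("<iothub-connection-string>");

        // Find all device twins tagged as installed in Branch A.
        var query = registryManager.CreateQuery(
            "SELECT * FROM devices WHERE tags.location.branch = 'A'", 100);

        while (query.HasMoreResults)
        {
            foreach (var twin in await query.GetNextAsTwinAsync())
            {
                // firmwareVersion is an assumed reported property.
                var firmware = twin.Properties.Reported.Contains("firmwareVersion")
                    ? (string)twin.Properties.Reported["firmwareVersion"]
                    : "unknown";
                Console.WriteLine($"{twin.DeviceId}: firmware {firmware}");
            }
        }
    }
}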

Connected Field Service

Azure IoT gives you the technology enablement to interact with devices. However, the real value of IoT emerges when you give it a business context, and Connected Field Service gives that business context to your IoT data. Connected Field Service comes as an accelerator: when you install it from AppSource, you get all the entities and required components in your Dynamics environment. It gives you the ability to configure Dynamics to interact with devices, listen to them, and be notified when they don’t feel right.

I have seen IoT Central. What is it, and how is it different from Azure IoT Hub?

According to the Microsoft documentation, Azure IoT Central is a software as a service (SaaS) solution that uses a model-based approach to help you to build enterprise-grade IoT solutions without requiring expertise in cloud-solution development. However, Azure IoT Hub is the core Azure PaaS that both Azure IoT Central and Azure IoT solution accelerators use. IoT Hub supports reliable and secure bidirectional communications between millions of IoT devices and a cloud solution.

So, in summary: if you have Azure skills and you want the flexibility to expand and manage your solution, go for IoT Hub. But if you want an accelerator without the need for extension, then IoT Central is the way forward.

Where do I start my journey?

Where to begin – Learning Path

If you want to start directly from Dynamics, you can simply follow the instructions here.

Otherwise, follow these steps:

  1. Provision your Azure IoT Service (Link)
  2. Create a simulated device (Link)
  3. Send and receive messages (Link)

I have a smart door lock. I want to be notified when it is unlocked. Where do I start?

Azure IoT Hub integrates with Azure Event Grid so that you can send event notifications to other services and trigger downstream processes. To implement this scenario, you will need to register your smart door lock in Azure IoT as mentioned above. Once the device is registered, you will need to define an Unlock event in the IoT service so that the service can receive Unlock events. By default, Azure IoT provides four event types:

  • Device Created
  • Device Deleted
  • Device Connected
  • Device Disconnected

However, for our scenario we will need to create a custom Unlock event. To define the custom event, you can follow this link.
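For illustration, here is roughly what the device side could look like with the device SDK (Microsoft.Azure.Devices.Client): the lock reports an unlock as an ordinary device-to-cloud message and tags it with an application property that message routing (next section) can filter on. The property name, payload shape and connection string are assumptions for the sketch.

using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

class DoorLockDevice
{
    static async Task Main()
    {
        var deviceClient = DeviceClient.CreateFromConnectionString("<device-connection-string>");

        // Assumed payload shape for an unlock event.
        var payload = "{\"deviceId\":\"smart-door-lock-01\",\"eventTime\":\"" + DateTime.UtcNow.ToString("o") + "\"}";
        using var message = new Message(Encoding.UTF8.GetBytes(payload));

        // Tag the message so a routing rule can pick out Unlock events.
        message.Properties.Add("eventType", "Unlock");

        await deviceClient.SendEventAsync(message);
    }
}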

How do I route incoming message to Dynamics?

Once you have defined your events as above, you will need to create a routing channel for the desired events. Once the Unlock event is detected in IoT Hub, your route sends the message to an endpoint. A route condition can filter on a message application property, for example eventType = 'Unlock'. You can configure your routing as per this link.

How to detect if my device disconnects from IoT?

It is very simple: as mentioned above, Device Disconnected is a default event type which is triggered once your device disconnects. All you must do is configure a service to react to the disconnection event.

How to orchestrate events from IoT to Dynamics?

If you are not using the Connected Field Service solution from AppSource, you can configure a Flow to send messages across. There are triggers for IoT Hub and its events which you can use.

IoT Flow Template

What kind of messages can be transmitted between device and IoT?

In short: device-to-cloud messages (telemetry and events, like the Unlock example above), cloud-to-device messages (commands and notifications), direct method calls, device twin property updates, and file uploads for larger payloads.

Are you a Passionate #PowerPlatform developer, looking for your next big challenge?

Looking for your next big thing?
What I love about the #PowerPlatform is that we get tons of new features every 6 months, and if you are an avid learner, it gives you no reason to be stagnant and not learn new things every day. You have numerous options, from creating #powerapps and #pcf components to just exploring new enhancements in the #dynamics365 applications.
After spending some time exploring some core capabilities of the #PowerPlatform and playing with the Field Service application, I have been thinking about what the next big thing I focus on should be. I was looking for something challenging which has an impact on us. To decide, I took a step back and looked at the options in front of me. I looked at technology blogs, read books and listened to many podcasts, which helped me clear my mind and shortlist some major places I could start. I would like to share what I did in the hope that it may help other folks who are also looking for their next challenge:

Industry Reports & Microsoft investments

Like it or not, business needs drive product development. It is business, industry and eventually profit which drive the investment into the products we work on implementing or extending. I think one of the best sources for finding your own next best challenge is industry or economic reports. I always have an eye on #Gartner reports. I see people share #Gartner reports back and forth on LinkedIn to show how a product is doing against the competition. I look at them from a different angle: I always read the #Gartner report, but looking at the weaknesses. Weaknesses in the competition drive demand for innovations and solutions. By reading reports, you will understand the weaknesses in a product or platform and how these weaknesses matter to end customers. The next big challenge can be addressing some of those weaknesses!
Another indicator is Microsoft’s investments. When I see Microsoft investing heavily in a platform or product, it gives me a good idea of what the next big thing is. Of course, Microsoft has a big team of industry experts and awesome people familiar with various businesses. Speaking for myself, I do trust the investment numbers of big players, because that is going to create demand for my innovative mind!
What do you think about this one?

What about cross-domain ideas?

Our field is fluid. It changes every day, and not only in the #PowerPlatform; it is the nature of our field. There was a time when implementing a CRM was the goal of organisations, but now CRM is only one piece of the big picture! Have you ever thought of creating something that spans multiple technologies? How about combining #Azure and #PowerPlatform? Cross-domain work is a topic I am very interested in. I have explored Azure Search, Azure Speech, Azure Bots, Azure Documents and Azure IoT Hub alongside the #PowerPlatform. The synergy between #Azure and #PowerPlatform creates a system which really serves customers better. If you are out of ideas in #PowerPlatform, then start looking into #Azure. I bet you will have a lovely time learning new things and exploring how #Azure can help you do better implementations!

Working on community Ideas

There is a long list (and it is growing day by day) of product ideas in the IDEAS PORTAL. The #PowerPlatform product team does a good job of assessing, shortlisting and working on the great ideas to release in future waves. However, it is not possible to implement all of the ideas, for various reasons such as criticality, demand and priorities. Passionate people in the community can assess these ideas and see if they like one. Once chosen, you can work on the idea and release your own source to GitHub to spread its benefit. Publishing your source helps the idea get bigger, and your code quality will be better, too!

XrmToolBox plugins

I think XrmToolBox is the single best thing that the Dynamics community has come up with. I use it in every project. It inspires me when I see friends out there develop such lovely plugins that work like a charm. (Big shoutout to Tanguy Touzard, Jonas Rapp and Aiden Kaskela, whom I follow when it comes to the XrmToolBox.) I never had a major problem using the tool, which tells me how many good coders are out there. Your next best thing can be an XrmToolBox plugin or an extension of an existing one. If you think your idea is small or bad, you had better think again. There is no small or bad idea. Once you float your idea, it will become a snowball: other awesome people will contribute to it, and you will see how your small idea becomes a big one helping others do better. Even if you have no idea of your own, you can extend the functionality of an existing plugin. You always have an option to choose from 🙂 but remember one thing: being stagnant is NO option!
The above list is not everything I explored, but it shortlists some of the things you can relate to. I would love to hear from you about your experience in finding your next big thing.

P.S: A friendly reminder by @rappen on how to type: XrmToolBox (https://twitter.com/rappen/status/951009993989459969)

Image by Joanna Kosinska

Licensing Guide for Dynamics 365 Online – October 2019 Update

Overview

With the October release right around the corner, Microsoft will be moving away from the plan-based structure to a base-and-attach licensing model, allowing customers to mix and match applications depending on their consumption. The new licensing model will allow customers to purchase the applications they need when they need them, or as some would like to call it, an à la carte sales model.

But, as close as we are to the fall update, I am still hearing some concerns and questions regarding the new Dynamics 365 licensing. It may seem a bit complex and might require some math afterwards, but it is intended to give customers more control of their subscriptions and therefore allow them to “pay only for what they use”, as Alysa Taylor, Corporate Vice President of Business Applications and Global Industry, assures us:

In October, Microsoft will be moving from “one-size-fits-all” Microsoft Dynamics 365 licensing plans to focus on providing customers with the specific Dynamics 365 applications that meet the unique needs of their organization. Customers need software that aligns to their functional roles and scenarios, and they require the ability to add or remove applications as their company grows and changes over time. The new licensing model will allow customers to purchase the applications they need, when they need them. Each application is extensible and applications can be easily mixed and matched to configure integrated solutions that align to a customer’s unique business requirements.

How it works

Every user starts with a base license. The base license is the first application purchased for individual use within an organization. Attach licenses can be purchased on top of the base license as incremental applications at a heavy discount.


Both Base and Attach licenses provide access to the same application functionality.


This new model allows each organization to use the right layering for each individual user depending on their needs. But depending on your consumption, this new model might still affect your monthly payment. Payment is based on usage per user per month. I suggest reading the full pricing guide and checking Microsoft updates.

Restrictions and things to consider…

  • The Dynamics 365 for Marketing app will remain tenant-based and can only be purchased at the standard price of $1500 for new customers. Attach licenses will NOT apply.


  • Additionally, Dynamics 365 for Project Service Automation and Talent are NOT available as attach licenses.
  • The minimum of 20 seats for Finance and Operations still applies. At least 20 licenses must be purchased for any one of the applications.
  • Common Data Service will still be allocated with Unified Operations apps without additional licensing, but multiplexing rules will still apply.
  • Storage SKUs will remain unchanged and can be purchased and allocated as needed.
  • The base license must be the higher-priced app. For users that require apps from both Customer Engagement and Unified Operations, Unified Operations must always be the base license. The same applies between CE Enterprise and CE Professional licenses: Enterprise must always be the base license.


Customer Engagement

See below for the placemat view pricing guide for Dynamics 365 applications under Customer Engagement.

Some entities are accessible as read-only and are categorized as unrestricted. Updates to restricted and unrestricted entities are also being made, with the goal of enabling more users to access data directly through a PowerApps application or Flow workflow without requiring a Dynamics license, but more on this in a separate post.

But if you need to use all the Customer Engagement apps, you can purchase Project Service Automation as the base license and add the rest as attach licenses.

[Placemat view: Customer Engagement pricing guide]

Unified Operations

One of the key changes coming this October is the separation of the Finance & Operations module into two separate applications: Supply Chain Management and Finance.

Both of these apps can be purchased as base and attach licenses of one another, depending on your needs, and still require the 20-seat minimum.


The functions are broken down into these two applications…

[Function breakdown: Supply Chain Management and Finance]

Below is the placemat view of the pricing guide for Unified Operations applications.

[Placemat view: Unified Operations pricing guide]

Impact

New customers can no longer purchase the Dynamics 365 CE and UO plans. The new base-and-attach license model applies from the 1st of October, so it is important for potential and existing customers to be made aware of these changes.

For existing customers, the changes will only take effect after their renewal date. So, for a one-year subscription purchased on January 1, 2018, the changes will take effect on the same date the following year, and the subscription must be converted before then. For those who still have concerns, make sure to keep an eye out for more updates.

These changes are easy to understand but may still have a substantial impact on your existing subscriptions. Depending on your organization’s needs, it is important to seek the advice of a consultant/partner on this matter.

How to call #webapi from #PowerPlatform #Portals

In almost all of my portal projects I get a question from my clients: “How do we call an external API from #portals?” This question has been so common that I decided to write about my experience on this topic, which might be helpful for the community. This post will focus on two main areas:

  1. The available options to integrate #portals with external systems
  2. A step-by-step guide to one of the least discussed options, using the OAuth Implicit Grant flow, and how I created a simple demo for one of my customers

Scenario

I would like to give a business context to this scenario. Any enterprise solution requires the integration and interaction of multiple systems, of which #portals could be one. Imagine a scenario where a customer is looking for a quote on a product in the company portal, and the #portal is required to bring quote details from a CPQ (Configure, Price, Quote) system. In another scenario, a #portal is required to integrate with a core banking system to get the customer’s latest balances. In these scenarios and similar ones, we require the #portal to integrate with an external API to get information.

In order to enable such integrations, the #portal must be able to make calls in a secure way, as most internal systems require authentication before anything can happen. So what are the options available?

Solutions

Since #powerplatform #portals are tightly integrated with the #powerplatform, in most cases the integration is done through the platform itself. However, the integration through the #powerplatform comes in three flavors.

  1. The first is creating actions in the platform which communicate with the external API and manage the requests and responses, then calling the actions through a workflow, where the workflow is triggered using Entity Form or Entity List events.
Portal Integration with Web Api using Actions

 

  2. The second option is to use #MicrosoftFlow to encapsulate the Workflow and Action part in a Flow. The benefit of this solution is that you won’t need to write code (in most cases, but not guaranteed) to call the #webapi.

    Portal Integration using Flow

  3. The above two options use the #PowerPlatform to facilitate the integration, and all calls are routed through the platform. However, going through the server is not always feasible. There are situations in which you would like to make client-side calls from JavaScript, using Ajax from #portals, to call an external API. The main concern in these scenarios is authentication, and the solution provided by the platform is the “OAuth Implicit Grant Flow”. If you would like to learn more about the OAuth Implicit Grant flow beyond the #PowerPlatform, you can read more here.

 

There are concerns over the OAuth Implicit Grant flow, and the recommendation is to use the OAuth code grant flow instead. According to the OAuth working group, “It is generally not recommended to use the implicit flow (and some servers prohibit this flow entirely). In the time since the spec was originally written, the industry best practice has changed to recommend that public clients should use the authorization code flow with the PKCE extension instead.” Microsoft is aware of this restriction; however, it is believed the OAuth implicit grant flow is still OK to use.

I have proposed implementing the OAuth code grant flow in this IDEA. Please vote for it.

Now, getting back to the topic: how to integrate.

Portal Integration with OAuth Implicit Grant Flow

In this scenario, no server-side calls are required. Complete documentation is available here. However, the documentation is not very helpful if you want to do things quickly, since there is a learning cycle involved. The OAuth 2.0 implicit grant flow supports endpoints that a client can call to get an ID token. Two endpoints are used for this purpose: authorize and token. I will not go into the details of these calls, and I assume you already know what they are.
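As a quick illustration of the token endpoint, here is a sketch that fetches an ID token silently instead of going through the authorize redirect. It assumes jQuery is available on the portal page and that your application is registered as described below; treat the exact call shape as an assumption to verify against the documentation.

//Assumed sketch: request an ID token from the portal token endpoint.
var clientId = "CLIENT ID OBTAINED FROM AZURE ACTIVE DIRECTORY";
$.get("/_services/auth/token?client_id=" + clientId, function (token) {
    console.log("Token: " + token);
    //The token can now be sent to your Web API in an Authorization header.
});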

So here is what you will have to do:

  1. Create your web API. You can download the sample API from this GitHub project. This website is no different from any MVC website, so you can create your own with Web APIs.
  2. Next, register your application in Azure Active Directory. This is a free service which you can use to provide authentication for your web API. Step-by-step details of the registration process are at this link. The REDIRECT URL must be the direct link to the portal page that will receive the token (created in step 5 below). You will need to note the following after this step:

    – Client ID
    – Redirect URL

  3. Let’s say you have a Quote page in your portal, and you would like to place a button on the portal page to get quotations from your internal website. You will have to put custom HTML in the “Content Page” (not the main page) of the portal. This custom HTML adds a QUOTE button to the portal and retrieves the quotation using custom JavaScript code.
<h2>The QUOTE BUTTON</h2>

<button type="button" onclick="callAuthorizeEndpoint()">Give me a Quote!</button>

<script>
//Remove this line to avoid State validation
$.cookie("useStateValidation", 1);

function callAuthorizeEndpoint() {
    //Used for State validation
    var useStateValidation = $.cookie("useStateValidation");
    var appStateKey = 'p07T@lst@T3';
    var sampleAppState = { id: 500, name: "logic" };
    //Replace with Client Id registered on CRM
    var clientId = "CLIENT ID OBTAINED FROM AZURE ACTIVE DIRECTORY";
    //Replace with Redirect URL registered on CRM
    var redirectUri = encodeURIComponent("https://MYPORTAL.powerappsportals.com/REDIRECT_PAGE/");
    //Authorize endpoint
    var redirectLocation = `/_services/auth/authorize?client_id=${clientId}&redirect_uri=${redirectUri}`;
    //Save state in a cookie if State validation is enabled
    if (useStateValidation) {
        $.cookie(appStateKey, JSON.stringify(sampleAppState));
        redirectLocation = redirectLocation + `&state=${appStateKey}`;
        console.log("Added State Parameter");
    }

    //Redirect
    window.location = redirectLocation;
}
</script>


  4. Modify the source code of the web API website to use the Client ID and Redirect URL in its startup page.
public virtual Task ValidateIdentity(OAuthValidateIdentityContext context)
{
    try
    {
        if (!context.Request.Headers.ContainsKey("Authorization"))
        {
            return Task.FromResult<object>(null);
        }

        // Retrieve the JWT token in the Authorization header
        var jwt = context.Request.Headers["Authorization"].Replace("Bearer ", string.Empty);
        var handler = new JwtSecurityTokenHandler();
        var token = new JwtSecurityToken(jwt);
        var claimIdentity = new ClaimsIdentity(token.Claims, DefaultAuthenticationTypes.ExternalBearer);
        var param = new TokenValidationParameters
        {
            ValidateAudience = false, // Make this false if the token was generated without a clientId
            ValidAudience = "CLIENT ID", // Replace with the Client Id registered on CRM. The token should have been fetched with the same clientId.
            ValidateIssuer = true,
            IssuerSigningKey = _signingKey,
            IssuerValidator = (issuer, securityToken, parameters) =>
            {
                var allowed = GetAllowedPortal().Trim().ToLowerInvariant();
                if (issuer.ToLowerInvariant().Equals(allowed))
                {
                    return issuer;
                }
                throw new Exception("Token Issuer is not a known Portal");
            }
        };

        SecurityToken validatedToken = null;
        handler.ValidateToken(token.RawData, param, out validatedToken);
        var claimPrincipal = new ClaimsPrincipal(claimIdentity);
        context.Response.Context.Authentication.User = claimPrincipal;
        context.Validated(claimIdentity);
    }
    catch (Exception exception)
    {
        System.Diagnostics.Debug.WriteLine(exception);
        return null;
    }
    return Task.FromResult<object>(null);
}
  5. The next step is to add custom HTML to the Redirect page so that you can call the Web API with the token obtained in this step.
function getResultInUrlFragment(hash) {
    if (hash) {
        var result = {};
        hash.substring(1).split('&').forEach(function (keyValuePair) {
            var arr = keyValuePair.split('=');
            //Add to result only the keys with values
            arr[1] && (result[arr[0]] = arr[1]);
        });
        return result;
    }
    else {
        return null;
    }
}

//Validate State parameter
//Returns true for valid state and false otherwise
function validateState(stateInUrlFragment) {
    if (!stateInUrlFragment) {
        console.error("State Validation Failed. State parameter not found in URL fragment");
        return false;
    }

    //State parameter in URL fragment doesn't have a corresponding cookie.
    if (!$.cookie(stateInUrlFragment)) {
        console.error("State Validation Failed. Invalid state parameter");
        return false;
    }
    return true;
}

var useStateValidation = $.cookie("useStateValidation");
var appState = null;

//Fetch the parameters in the URL fragment
var authorizeEndpointResult = getResultInUrlFragment(window.location.hash);

//Validate State
if (useStateValidation) {
    if (!validateState(authorizeEndpointResult.state)) {
        authorizeEndpointResult = null;
    }
    else {
        appState = $.cookie(authorizeEndpointResult.state);
        console.log("State: " + appState);
    }
}

//Display the token and call the Web API
if (authorizeEndpointResult) {
    var data = authorizeEndpointResult.token;
    console.log("Token:" + data);
    $.ajax({
        type: "GET",
        url: "https://URL_TO_THE_WEB_API.azurewebsites.net/api/external/ping",
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        headers: {
            Accept: "text/plain; charset=utf-8",
            "Authorization": "Bearer " + data
        },
        success: function (data) {
            alert(JSON.stringify(data));
            console.log(data);
        }, //End of AJAX success function
        failure: function (data) {
            alert(data.responseText);
        }, //End of AJAX failure function
        error: function (data) {
            alert(data.responseText);
        } //End of AJAX error function
    });
}

I hope this post helps you a bit to make your portals connect to the outside world!

Email Sentiment Analysis in Power Platform to improve customer service

What I love about my life as a #consultant is having the opportunity to hear customer problems and respond with something of value which improves their business in their own industry and market. What I love about being a #Microsoft #Technology #consultant is working on a technology which not only cares about end users but also makes it easy for me (or any citizen developer) to come up with solutions that are easy to implement; with the #powerplatform and #msflow, many things don’t even need me to open my #visualstudio (which I love and open every day even if I am not coding – sounds crazy, nah! :D). Let’s get back on track now!

Scenario

I had a request from a customer whose support department was getting bombarded with case emails. The customer asked me to find a solution to prioritise emails based on urgency and the probability of the customer defecting.

My initial thought was: how do I quantify whether a customer is going to defect because they are not satisfied? After pondering a few solutions, I came up with the idea of “email sentiment” as a KPI for customer defection. If a customer is not satisfied with a service, their first reaction is to send an angry email to the company (at least that is what I do) before going to social media. So I took the initial complaint email as a sign of losing the customer. The next question was how to implement the idea, and this is how I did it:

Solutions

  1. The basis of the solution was to use the Azure Text Analytics service to detect the email message sentiment, via its sentiment analysis endpoint.
  2. The next step was to customise the email message entity to hold the sentiment value, and potentially trigger a notification to a manager or simply sort emails by their negative sentiment value.
  3. The last step was to connect the Power Platform to the Azure sentiment analysis service and get the sentiment value of the email message from Azure. I had two ways to implement this:
    • The first was to write a custom action to call the service and pass the email text to the Azure endpoint. On receiving the response from the analysis service, the action would return the sentiment as its output; finally, the action would be called from a workflow which triggers on the creation of an Email activity.
    • The second was to use #MicrosoftFlow and do everything without writing a single line of code. Obviously, I used this technique.

The solution is extremely easy, because #MicrosoftFlow provides an out-of-the-box connector to the Text Analytics service; all you need to do is provide the service key and service endpoint. Below is what my #MicrosoftFlow looks like:

Microsoft Flow Sentiment Analysis

Azure returns the sentiment score along with its analysis as Positive, Negative or Neutral. The API returns a numeric score between 0 and 1. Scores close to 1 indicate positive sentiment, while scores close to 0 indicate negative sentiment. A score of 0.5 indicates a lack of sentiment (e.g. a factoid statement).

In my solution I stored the sentiment value as a Whole Number, so I had to cast the decimal value between 0 and 1 to a number between 0 and 100. To do this, I used an operation step to multiply the sentiment score by 100 and cast it to an integer value (for example, a score of 0.87 becomes 87), using the formula below:

int(substring(string(mul(body('Detect_Sentiment')?['score'],100)),0,indexof(string(mul(body('Detect_Sentiment')?['score'],100)),'.')))

Note: #MicrosoftFlow does not have a round function, so I had to convert the value to a string and take the substring up to the decimal point.

Key Points:

  1. All of the Text Analytics API endpoints accept raw text data. The current limit is 5,120 characters for each document; if you need to analyze larger documents, you can break them up into smaller chunks, as sketched below.
  2. Your rate limit will vary with your pricing tier.
  3. The Text Analytics API uses Unicode encoding for text representation and character-count calculations. Requests can be submitted in both UTF-8 and UTF-16 with no measurable differences in the character count.
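Because the 5,120-character limit comes up often with long email threads, here is a minimal sketch of splitting a long body into compliant chunks before scoring each piece; the helper name and approach are mine, not part of the Text Analytics API.

using System;
using System.Collections.Generic;

static class TextChunker
{
    // Splits text into pieces no longer than maxChars (5,120 is the current API document limit).
    public static IEnumerable<string> Split(string text, int maxChars = 5120)
    {
        for (int i = 0; i < text.Length; i += maxChars)
        {
            yield return text.Substring(i, Math.Min(maxChars, text.Length - i));
        }
    }
}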

Improve efficiency of Call centers using Dynamics 365 and Azure cognitive services

Photo by Hrayr Movsisyan on Unsplash

I am fascinated by the sophistication of Azure services and how they help us improve our solutions and extend the ways we can solve customer problems. Recently I had a requirement to implement a Dynamics 365 solution to enable a call center to capture cases while their operators are offline.

One solution was to provide a self-service portal for customers to log cases when call center operators are offline. But in this case the customer was looking for something very quick to implement, with the ability to link incoming cases to their call center channel and derive some reporting from it.

Approach

I started looking at Azure services to see how I could use Azure Cognitive Services and speech recognition to solve this requirement, and as always, Azure did not disappoint me. In this post I would like to share my experience with you and take you through the steps you would need to create such a solution. Of course, the possibilities are endless; however, this post will give you a starting point to begin your journey.

I have seen solutions where telephony systems send voice recordings of callers as an email attachment to a queue in CRM. The CRM then converts that queue item to a case and attaches the voice recording as a note on the case. The challenge with this solution is that the call center operators have to open attachments manually and write the case description after listening to the audio file. This means their time is spent on inefficient activities, whereas it could be utilised in better ways.

Another problem with this approach is the size of attachments. As time goes by, audio attachments will increase the database size, impacting the maintenance of the solution.

Scenario

Our scenario is based on the fact that call center agents are not working 24 hours a day.

While agents are offline, customers should still be able to contact the call center and record voice messages to create cases.

We will use the following components:

  1. Azure Blob Storage to receive recorded audio files from the telephony system.
  2. Azure Cognitive Services to listen to the recorded audio files and translate the content to a text message. The audio file is kept in Azure Blob Storage (which is cheaper than CRM database storage).
  3. An Azure Function (with an Azure Blob binding) to recognize the text from the audio file and extract the case description.
  4. The Dynamics 365 Web API to create a case in CRM using the description extracted by Azure Cognitive Services. We can also add blob metadata, like the filename, to the case properties.
Solution Architecture

The full source code is available on GitHub.

However, the main code snippet to perform the conversion is below:

public static async Task<string> RecognitionWithPullAudioStreamAsync(string key, string region, Stream myBlob, ILogger log)
{
    // Creates an instance of a speech config with the specified subscription key and service region.
    // Replace with your own subscription key and service region (e.g., "westus").
    var config = SpeechConfig.FromSubscription(key, region);

    string finalText = string.Empty;
    var stopRecognition = new TaskCompletionSource<int>();

    // Create an audio stream from a wav file.
    using (var audioInput = Helper.OpenWavFile(myBlob))
    {
        // Creates a speech recognizer using audio stream input.
        using (var recognizer = new SpeechRecognizer(config, audioInput))
        {
            // Subscribes to events.
            recognizer.Recognizing += (s, e) =>
            {
            };

            recognizer.Recognized += (s, e) =>
            {
                if (e.Result.Reason == ResultReason.RecognizedSpeech)
                {
                    finalText += e.Result.Text + " ";
                }
                else if (e.Result.Reason == ResultReason.NoMatch)
                {
                    log.LogInformation($"NOMATCH: Speech could not be recognized.");
                }
            };

            recognizer.Canceled += (s, e) =>
            {
                log.LogInformation($"CANCELED: Reason={e.Reason}");

                if (e.Reason == CancellationReason.Error)
                {
                    log.LogInformation($"CANCELED: ErrorCode={e.ErrorCode}");
                    log.LogInformation($"CANCELED: ErrorDetails={e.ErrorDetails}");
                    log.LogInformation($"CANCELED: Did you update the subscription info?");
                }

                stopRecognition.TrySetResult(0);
            };

            recognizer.SessionStarted += (s, e) =>
            {
                log.LogInformation("\nSession started event.");
            };

            recognizer.SessionStopped += (s, e) =>
            {
                log.LogInformation("\nSession stopped event.");
                log.LogInformation("\nStop recognition.");
                stopRecognition.TrySetResult(0);
            };

            // Starts continuous recognition. Uses StopContinuousRecognitionAsync() to stop recognition.
            await recognizer.StartContinuousRecognitionAsync().ConfigureAwait(false);

            // Waits for completion. Use Task.WaitAny to keep the task rooted.
            Task.WaitAny(new[] { stopRecognition.Task });

            // Stops recognition.
            await recognizer.StopContinuousRecognitionAsync().ConfigureAwait(false);

            return finalText;
        }
    }
}

Important considerations:

  1. [This point is optional if you use the Web API to create cases in CRM.] You will need to use a multi-tenant configuration if your Azure Function tenant and the tenant in which your CRM API is registered are different. If they are the same tenant, you can use a single-tenant configuration.
  2. The input file from the telephony system to Azure Blob Storage must be in a specific format. The required format specification is:

    • File format: RIFF (WAV)
    • Sampling rate: 8000 Hz or 16000 Hz
    • Channels: 1 (mono)
    • Sample format: PCM, 16-bit integers
    • File duration: 0.1 seconds < duration < 60 seconds
    • Silence collar: > 0.1 seconds

 

  3. You can use the ffmpeg tool to convert your recording to this specific format. For your testing, you can download and use the tool as below:
    Download ffmpeg from this link.
    Use the command: ffmpeg -i "<source>.mp3" -acodec pcm_s16le -ac 1 -ar 16000 "<output>.wav"
  4. My sample on GitHub covers input in one single chunk of audio. However, if you wish to have continuous streaming, you will need to adapt the StartContinuousRecognitionAsync usage for streaming input.
  5. The Azure Function should be configured with a Blob trigger, as sketched below.
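To tie point 5 together with the conversion method above, here is a rough sketch of the function wiring. The container name "recordings", the app-setting names and the assumption that RecognitionWithPullAudioStreamAsync lives in the same class are all illustrative.

using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static partial class TranscribeRecording
{
    [FunctionName("TranscribeRecording")]
    public static async Task Run(
        [BlobTrigger("recordings/{name}")] Stream myBlob, // fires when the telephony system drops a new file
        string name,
        ILogger log)
    {
        // Assumed app settings holding the Cognitive Services key and region.
        var key = System.Environment.GetEnvironmentVariable("SpeechKey");
        var region = System.Environment.GetEnvironmentVariable("SpeechRegion");

        var description = await RecognitionWithPullAudioStreamAsync(key, region, myBlob, log);
        log.LogInformation($"Transcribed {name}: {description}");
        // Next: create the case via the Dynamics 365 Web API using this description.
    }
}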

Open entity records from Power BI dashboard

In my earlier post, I discussed how to show CRM entities on the Power BI visual map control. The usage of Power BI on Dynamics CRM dashboards is not limited to displaying multiple entities on maps. We usually want to do more, and since dashboards hold little information, we would love to see entities in tabular format and navigate to CRM records when needed. In this post, I will discuss how we can open CRM records from a Power BI dashboard.

Scenario

Users should be able to see multiple entity types on the Power BI map, and see record details in a table under the map control with the ability to open CRM records using a hyperlink. I will focus on displaying records in a table with a direct link to the CRM entity records. After configuring the visual map control, we will need to do the following:

Note that all the required information, i.e. name, etc., and the complementary information, i.e. entity logical name and entity ID, are available in our temporary table. Refer to the previous post.

  1. Drag and drop a Table control underneath our visual map control.
  2. Drag and drop the fields we would like to display into the table columns.

  3. Add one custom column to the table to hold the hyperlink to CRM entity records. You can do this by selecting “New Column” from the “Modeling” tab. You will need the following three components to construct the link:
    1. CRM base URL (known to you from your org URL).
    2. Entity logical name (captured in the previous post as a custom column in our temporary table).
    3. Entity GUID (also selected as part of the entity retrieve query in the previous post).
  4. The formula for the column is:
    Link = "https://[CRM BASE URL]?pagetype=entityrecord&etn=" & 'ENTITY_LOGICAL_NAME' & "&id=" & 'ENTITY_ID'
  5. Set the column's data category to Web URL so it renders as a hyperlink.

Display multiple entities on Power BI map control

 

Photo by Susannah Burleson on Unsplash

Recently I had to display the location of multiple entities on a CRM dashboard. The requirement was to display all Work Orders, Projects, Resources and Bookings on a map control so the project scheduler / field service dispatcher could see where each of them is located. The Bing map control works fine for individual entities which are enabled for geolocation; however, in this scenario I had to plot all the different entities on a single map.

My thought was that I could choose one of the following methods:

  1. Use the Bing map control on a dashboard: use a web resource to retrieve all Work Orders, Projects, Resources and Bookings, and then use a draw function to place each entity's location on the Bing map.
  2. Use Power BI and its visual map control to plot all entities on a map, then host the Power BI control on my dashboard. I decided to use this approach to display the entities on a map control.

Power BI Map control to show multiple entities

The map control in Power BI uses one source table with longitude and latitude information to display table rows on the map. The challenge with this approach is that the visual map control supports only one entity's longitude and latitude, and therefore we can only use one entity as the source of the map data. In my scenario I had multiple entity types, i.e. Work Orders, Projects, Resources and Bookings. Each of these entities has its own longitude and latitude, so we cannot use all of them together as the source for our Power BI map.

The way I overcame this challenge was to use a temporary table, union the data from all Work Orders, Projects, Resources and Bookings into it, and use this temporary table as the source of the Power BI map control. This is how I did it:

  1. Connect to the CRM Bookings table. This will bring all columns of the table into Power BI.
  2. Remove unwanted columns in the Query Editor (optional).
    = Table.SelectColumns(Source,{"name", "msdyn_longitude", "msdyn_worklocation", "bookableresourcebookingid", "msdyn_latitude"})
  3. Reorder the remaining columns in a way that you like to see your data (optional).
    = Table.ReorderColumns(#"Removed Other Columns",{"name", "msdyn_longitude", "msdyn_worklocation", "msdyn_latitude", "bookableresourcebookingid"})
  4. Rename column headings (optional).
    = Table.RenameColumns(#"Reordered Columns1",{{"bookableresourcebookingid", "id"}})
  5. Filter out rows that you want to exclude from the map (optional).
    = Table.SelectRows(#"Renamed Columns", each [latitude] <> null)
  6. Add a custom column to the query as a table identifier/category so you can identify Booking rows in the union table.
    = Table.AddColumn(#"Filtered Rows", "category", each Text.Upper("Bookable Resource Booking"))
  7. Change the column types (optional).
    = Table.TransformColumnTypes(#"Reordered Columns",{{"category", type text}, {"longitude", type number}, {"latitude", type number}})

If you have more than one entity, repeat the above steps for each table in your query editor.

The next step is to create a temporary table and union all the above tables' data into it using a DAX query.

  1. Go to the Modeling tab.
  2. Click on New Table. Use the query below to fill the table (alter the table names based on your scenario).
    TempTable = UNION('Bookable Resource Booking','Bookable Resources','Work Orders','Project Sites')
  3. Drag a map visualisation control onto the Power BI canvas.
  4. Select “category” (the entity name) from TempTable as the Legend; this ensures your entities are shown in different colors. Drag the longitude and latitude fields to the X and Y axes.
  5. Note: by default, Power BI adds a SUM function to summarise longitude and latitude. Summarised columns don't work on maps; remove the summarisation by choosing “Don't summarize”.


Remove Flow Ribbon Button

Hey D365’ers! Welcome to {{ quirk.works }}, a blog series where we try to solve Dynamics 365 or Power Platform problems with unconventional solutions. For the first part of the series, let’s try to remove the Flow ribbon button.

So you want to remove the Flow ribbon button?

Yes sir, we don’t want our users to see it, as per our requirements document.

Tried removing it with Ribbon Workbench?

Ribbon buttons are easy to hide/show using the Ribbon Workbench. Just select the ribbon button, right click, and click Hide.

Hide also the rest of the ribbon button sub-menu.

And there you go. What? It’s not hidden, is it?

Here’s a {{ quirk.works }}

There is a quirk with how the Flow ribbon button can be removed.

Click the cog icon on the upper right side of the screen and click Advanced Settings.

Go to Settings > Administration

Go to System Settings.

Open the Customization tab. There you have it: select No for the option Show Microsoft Flow on forms and in the site map. Click the OK button and we’re all set.

Conclusion

Not every ribbon button can be hidden using the Ribbon Workbench; sometimes you have to look outside the XrmToolBox. Just playing with some puns, no offense meant for these great tools, as I admire the awesome work from Scott and Tanguy. Until our next {{ quirk.works }}, stay tuned, D365’ers. I’ll keep you posted.

 

PowerApp Admin Tools

I’ve been working on XrmToolBox Tools for a bit now, both speaking and posting on the huge number of cool tools and how we can build our own. I’m still working on a few XrmToolBox-related projects, but when I started diving into Power Apps, I immediately wondered if I could replicate some tools as Canvas or Model-driven Apps. Wouldn’t it be cool if we could build out a suite of administrator or developer tools as Power Apps?

In my post Building XrmToolBox Tools, Part 2, we built an example tool that allows admins to view the list of Users with a Security Role assignment. It’s a relatively simple tool, but it can be pretty useful if you need a quick check on a Role for migrations, troubleshooting, etc. This seemed like as good a candidate as any for a new ‘Admin Tool’ Power App.

Security Role Member Manager!

The proposed functionality for the tool is pretty simple: provide a list of Security Roles in the system and, when the user selects a Role, show the list of assignments. The original was meant to be written in about an hour, so that was the extent of its capabilities. Since we have a bit more time, we can add some features: how about we display the list of Teams that have the selected Role assigned, and allow adding or removing a User or Team for the selected Role?

These requirements are a pretty good candidate for a Canvas App. We can build this using the following components:

  • Common Data Service (CDS) connector – provides the list of Security Roles
  • Office 365 Users Connector – provides the user picture
  • Gallery – bound to the Security Roles list
  • Gallery – bound to the list of Users related to the Security Role
  • Gallery – bound to the list of Teams related to the Security Role

The main screen layout and functionality are also fairly simple. Bind the main grid to the Security Roles list and, on the select event, bind the secondary lists to the related Users and Teams. No real code-behind, just some simple data binding to galleries.

Here is the initial main screen for the Power App:

Security Role Member Manager

Some challenges, some solutions

With the XrmToolBox tool, I wrote some code to retrieve Security Roles and User Security Role assignments using the standard SDK Query Expression methods. The list of Security Roles was a simple RetrieveMultiple, while the User Role assignment is a many-to-many relationship.

The many-to-many is where I stumbled a bit. When I select my Security Role, I want to retrieve the Users and Teams, which are both many-to-many relations to Security Roles. The CDS connector does not list the join table as an entity, so I couldn’t simply add a new data source for User or Team and filter by the selected Role. Fortunately, support has been added for many-to-many relationships in the CDS connector. Here is an excellent blog post on the feature by Greg Lindhorst, Principal Program Manager at Microsoft: Relate records in Many-to-Many relationships.

So to render the lists of Users and Teams, I can bind the galleries using a simple formula. The main screen gallery from which you select a Security Role is named ‘Security Roles List’, so the User and Team gallery Items properties can be set using these simple formulas respectively:

'Security Roles List'.Selected.Users
'Security Roles List'.Selected.Teams

As you can see in Greg’s post, adding and removing Users and Teams is fairly easy too. To remove a User from the selected Security Role, we need a single-line formula on our gallery button:

Unrelate('Security Roles List'.Selected.Users, ThisItem)

That statement passes the selected Security Role and the currently selected User to the Unrelate function, and we’re done!
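Adding works the same way with the companion Relate function. Here is a sketch, assuming a hypothetical control named 'User Picker' supplies the User to add:

Relate('Security Roles List'.Selected.Users, 'User Picker'.Selected)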

Next Steps

I plan a follow-up post with a bit more functionality. For example, I like the inline model for selecting a User shown in Greg’s post above, but I think selecting multiple Users and Teams works better. Another nice feature will be to distinguish between Business Units; right now, this pulls all Security Roles for the entire organization.

This sounds like an obvious one, but I also plan on adding a confirmation dialog before removing a User or Team from a Security Role. This was a bit more complicated than I had expected, so I will write up more detail on how this will be implemented.

As I was working on this sample Power App, I came across a great post, User Admin PowerApp (Part 4) by d365Cooky, that proposes a similar tool but for managing Security Roles for a selected User. I like the idea of embedding this into Dynamics 365 CE. By the way, I saw the link via Guido Preite’s dynamicsweekly.com newsletter. If you have not already signed up for it, get to it!

I’ll post notes on all these updates, with more detail on how it was built, including the full solution for download, in a follow-up post.

Powerful stuff!

I feel like I have said that a LOT recently! In a relatively short time, I built a functioning Power App that allows administrators to manage Security Role Users and Teams. I put this together in a few hours, including some reading on the CDS connector capabilities and designing a few screens. All of this was done using the existing connectors and no custom code outside of the standard Canvas App formulas.

This is not as complex a tool as you may find in the XrmToolBox, and I am definitely going to continue contributing what I can to the XrmToolBox! But I think this once again demonstrates how the Power Platform allows us to provide low-code/no-code solutions to our users.

In the meantime, as always, any comments, suggestions, or questions are appreciated!

Dynamics 365-CE Approval Dialogues Using Canvas-apps

There are scenarios where we need to configure approvals in Dynamics 365; for example, mark an account as a premium customer after approval, or qualify leads after approval. We used dialog controls to capture approval requests and comments, but dialog controls are now deprecated and not advised for new projects.

As per Microsoft’s initial announcement

Dialogs are deprecated and are replaced by mobile task flows (available as of the December 2016 update), and business process flows. Both task flows and business process flows will continue to evolve to make the transition easier.

But neither task flows nor business process flows were a perfect replacement for dialogs. Knowing this pain from users, Microsoft has now modified the announcement:

Dialogs are deprecated, and should be replaced by business process flows or canvas apps

Even though I knew canvas apps can now be embedded in model-driven apps, I hadn’t thought of this option until I came across this new announcement. So I tried replicating my approval dialogues with a canvas app, and it works fine. Pheww!!!! 🙂

For testing purposes, I replicated the dialogue for creating an approval request for the Account entity.

1. I created a canvas app to create an approval request. This sample app changes the account status to Pending Verification and captures the comments in a custom field.

2. Now we need to call this app from the account form; obtain the App ID from the app details section.

Select app details to get the App GUID.

3. I need to call this canvas app as a popup when the user clicks a button. I created a custom button for the Account entity, added JavaScript as the button action to call an HTML web resource, and embedded my canvas app in this HTML iframe.

<html><head>
<title>Approval</title>
<meta http-equiv="content-type" content="text/html; charset=UTF-8">
</head>
<body style="padding: 0px; font-family: arial; overflow-wrap: break-word;" onload="LoadPowerApp()">
<center>
<iframe id="Approval" width="800" height="600"></iframe>
</center>
<script id="myScript">
function LoadPowerApp()
{
    var AccountID = window.parent.opener.Xrm.Page.data.entity.getId().slice(1, -1);
    var url = "https://web.powerapps.com/webplayer/iframeapp?source=iframe&appId=/providers/Microsoft.PowerApps/apps/56123673-f45c-4b96-b9e6-ece1b0a8069a&ID=" + AccountID;
    document.getElementById("Approval").src = url;
}
</script>
</body>
</html>

I know you have many questions now. The key is the URL assigned to the iframe: "https://web.powerapps.com/webplayer/iframeapp?source=iframe&appId=/providers/Microsoft.PowerApps/apps/56123673-f45c-4b96-b9e6-ece1b0a8069a&ID="+AccountID;

I will break it down into parts: "https://web.powerapps.com/webplayer/iframeapp?source=iframe&appId=/providers/Microsoft.PowerApps/apps/APP GUID&CUSTOM PARAMETER NAME="+PARAMETER VALUE

The App GUID was explained in step 2. As for the custom parameter, I deliberately didn’t mention it when we discussed the app creation and kept it for this section. When we open this canvas app from an account form (like starting a dialogue), the app needs the record GUID to update the account status.

I used a form control in the canvas app and filtered the item using the ID parameter.
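For reference, the form's Item property can be wired to that parameter with a lookup along these lines; a sketch assuming the Accounts data source from the CDS connector and the custom ID parameter passed on the URL above:

LookUp(Accounts, accountid = GUID(Param("ID")))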

4. Now try your button and you can see the magic.

You can download the sample APP from TDG Power Apps bank.

Please note this is a basic app I built for testing purposes, and it needs many improvements before use in a live project. You are always welcome to discuss this app with me.

Hope this helps….. 🙂

D365UG Bristol

Man, oh MAN! I am just on the train back from Bristol after attending their first 365 User Group, and it was AWESOME!

First up, big props to the organisers @lubo @dbarber @xrmjoel @robpeledie @leembaker for doing such an amazing job. The venue was great, the food was too (including vegan options :-)) and the swag bags were epic.

But more importantly, the speakers were amazing.

First up, Mark Smith did a fantastic presentation explaining the (seemingly unexplainable) Common Data Model, which sits on the CDS and exists to make sense of your data so that it is usable and, more importantly, means you can build apps faster. Considering he knew nothing about the CDM 5 days ago, this was a great presentation, which goes to show how quickly you can assimilate information when the data is available in a usable format (see what I did there *winks*).

Second up was @scott-durowdevelop1-net, showing us how much the Power Platform has changed in such a short space of time, and why that’s such a good thing: it enables access for a much wider audience and gives rise to things like Citizen Developers, who can effect great change in an organisation, and we all need to embrace change and be adaptable.

And finally, @themarkchristie gave an entertaining presentation on how he bought headphones at the Microsoft Store and @lubo busted them 5 minutes later; but not to worry, Virtual Agent was able to help him raise a support ticket through Forms Pro, which could then be assigned to someone in Field Service, leading to the headphones being fixed (whilst being worn by Mark Smith).

I learned a lot about how to better use the tech we have openly available to us in more inventive ways, and how to work with the community to find answers to the things you are struggling with.

I love these community events and I hope to attend each and every one that I can.

Again, big thanks and well done to the organisers.

Love to all

Alison

(Me when I won a Flic button from @themarkchristie)