Dynamics 365 for Customer Service

Email Sentiment Analysis in Power Platform to improve customer service

What I love about my life as a #consultant is having the opportunity to hear customer problems and respond with something of value that improves their business in their own industry and market. What I love about being a #Microsoft #Technology #consultant is working on a technology that not only cares about end users but also makes it easy for me (or any citizen developer) to come up with solutions that are quick to implement; with the #powerplatform and #msflow, many things don't even require opening #visualstudio (which I love and open every day even if I am not coding. Sounds crazy, nah! :D). Let's get back on track now!

Scenario

I had a request from a customer whose support department was getting bombarded with case emails. The customer asked me to find a solution to prioritize emails based on urgency and the probability of the customer defecting.

My initial thought was: how do I quantify whether a customer is going to defect because they are not satisfied? After pondering a few options, I came up with the idea of using “Email Sentiment” as a KPI for customer defection. If a customer is not satisfied with a service, their first reaction is to send an angry email to the company (at least that is what I do) before going to social media. So I took the initial complaint email as a sign of losing the customer. The next question was how to implement the idea, and this is how I did it:

Solution

  1. The basis of the solution was to detect the sentiment of the email message using the Azure Text Sentiment Analysis service.
  2. The next step was to customize the email message entity to hold the sentiment value, so we could trigger a notification to a manager or simply sort emails by their negative sentiment value.
  3. The last step was to connect the Power Platform to the Azure Text Sentiment Analysis service and get the sentiment value of the email message from Azure. I had two ways to implement this:
    • The first option was to write a custom action that calls the service, passing the email text to the Azure endpoint and returning the sentiment from the analysis response, and then to call that action from a workflow triggered on the creation of an Email Activity.
    • The second option was to use #MicrosoftFlow and do everything without writing a single line of code. Obviously, I used this technique.

The solution is extremely easy because #MicrosoftFlow provides an out-of-the-box connector to the Text Analytics service; all you need to do is provide the service key and service endpoint. Below is what my #MicrosoftFlow looks like:

Microsoft Flow Sentiment Analysis

Azure returns the sentiment score along with its analysis as Positive, Negative and Neutral. The API returns a numeric score between 0 and 1. Scores close to 1 indicate positive sentiment, while scores close to 0 indicate negative sentiment. A score of 0.5 indicates the lack of sentiment (e.g. a factoid statement).

In my solution, I stored the sentiment value as a Whole Number, so I had to convert the decimal value between 0 and 1 to a number between 0 and 100. To do this, I used an Operation step to multiply the sentiment score by 100 and cast the result to an integer, using the formula below:

int(substring(string(mul(body('Detect_Sentiment')?['score'],100)),0,indexof(string(mul(body('Detect_Sentiment')?['score'],100)),'.')))

Note: #MicrosoftFlow does not have a round function, so I had to convert the value to a string and take the substring up to the decimal point.
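One caveat with this workaround: if the multiplied score happens to be a whole number such as 50, the string contains no decimal point, indexof returns -1, and the substring call fails. A guarded variant of the same expression could look like this:

if(contains(string(mul(body('Detect_Sentiment')?['score'],100)),'.'),int(substring(string(mul(body('Detect_Sentiment')?['score'],100)),0,indexof(string(mul(body('Detect_Sentiment')?['score'],100)),'.'))),int(string(mul(body('Detect_Sentiment')?['score'],100))))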

Key Points:

  1. All of the Text Analytics API endpoints accept raw text data. The current limit is 5,120 characters for each document; if you need to analyze larger documents, you can break them up into smaller chunks.
  2. Your rate limit will vary with your pricing tier.
  3. The Text Analytics API uses Unicode encoding for text representation and character count calculations. Requests can be submitted in both UTF-8 and UTF-16 with no measurable differences in the character count.

Improve the efficiency of call centers using Dynamics 365 and Azure Cognitive Services

Photo by Hrayr Movsisyan on Unsplash

I am fascinated by the sophistication of Azure services and how they help us improve our solutions and extend the ways we can solve customer problems. Recently I had a requirement to implement a Dynamics 365 solution to enable a call center to capture cases while their operators are offline.

One option was to provide a self-service portal where customers could log cases when call center operators are offline. But in this case the customer was looking for something very quick to implement, with the ability to link incoming cases to their call center channel and derive some reporting from it.

Approach

I started looking at Azure services to see how Azure Cognitive Services and speech recognition could help me solve this requirement, and as always Azure did not disappoint. In this post I would like to share my experience with you and take you through the steps you would need to create such a solution. Of course the possibilities are endless, but this post will give you a starting point to begin your journey.

I have seen solutions where telephony systems send voice recordings of callers as email attachments to a queue in CRM. The CRM then converts the queue item to a case and attaches the voice recording as a note on the case. The challenge with this approach is that the call center operators have to open the attachments manually and write the case description after listening to the audio file. Their time is spent on inefficient activities when it could be utilized in better ways.

Another problem with this approach is the size of the attachments. As time goes by, audio attachments increase the database size, impacting the maintenance of the solution.

Scenario

Our scenario is based on the fact that call center agents are not working 24 hours a day. While agents are offline, customers should still be able to contact the call center and record voice messages that create cases.

We will use the following components:

  1. Azure Blob Storage to receive recorded audio files from the telephony system.
  2. Azure Cognitive Services to listen to the recorded audio files and translate the content to text. The audio file stays in Azure Blob Storage (which is cheaper than CRM database storage).
  3. An Azure Function (with an Azure Blob binding) to recognize the text from the audio file and extract the case description.
  4. The Dynamics 365 Web API to create a case in CRM using the description extracted by Azure Cognitive Services. We can also map blob metadata, such as the file name, to case properties.
Solution Architecture

The full source code is available on GitHub.

However, the main code snippet to perform conversion is below:

// Requires: using System.IO; using System.Threading.Tasks;
// using Microsoft.CognitiveServices.Speech; using Microsoft.Extensions.Logging;
public static async Task<string> RecognitionWithPullAudioStreamAsync(string key, string region, Stream myBlob, ILogger log)
{
    // Creates a speech config with the specified subscription key and service region (e.g., "westus").
    var config = SpeechConfig.FromSubscription(key, region);

    string finalText = string.Empty;
    var stopRecognition = new TaskCompletionSource<int>();

    // Creates an audio stream from the WAV blob passed in by the caller.
    // (Helper.OpenWavFile is the helper class from the Speech SDK samples.)
    using (var audioInput = Helper.OpenWavFile(myBlob))
    {
        // Creates a speech recognizer using the audio stream input.
        using (var recognizer = new SpeechRecognizer(config, audioInput))
        {
            // Recognizing fires for intermediate results; we only collect final results below.
            recognizer.Recognizing += (s, e) =>
            {
            };

            // Appends each recognized phrase to the final transcript.
            recognizer.Recognized += (s, e) =>
            {
                if (e.Result.Reason == ResultReason.RecognizedSpeech)
                {
                    finalText += e.Result.Text + " ";
                }
                else if (e.Result.Reason == ResultReason.NoMatch)
                {
                    log.LogInformation("NOMATCH: Speech could not be recognized.");
                }
            };

            recognizer.Canceled += (s, e) =>
            {
                log.LogInformation($"CANCELED: Reason={e.Reason}");

                if (e.Reason == CancellationReason.Error)
                {
                    log.LogInformation($"CANCELED: ErrorCode={e.ErrorCode}");
                    log.LogInformation($"CANCELED: ErrorDetails={e.ErrorDetails}");
                    log.LogInformation("CANCELED: Did you update the subscription info?");
                }

                stopRecognition.TrySetResult(0);
            };

            recognizer.SessionStarted += (s, e) =>
            {
                log.LogInformation("\nSession started event.");
            };

            recognizer.SessionStopped += (s, e) =>
            {
                log.LogInformation("\nSession stopped event.");
                log.LogInformation("\nStop recognition.");
                stopRecognition.TrySetResult(0);
            };

            // Starts continuous recognition; StopContinuousRecognitionAsync() stops it below.
            await recognizer.StartContinuousRecognitionAsync().ConfigureAwait(false);

            // Waits for completion. Task.WaitAny keeps the task rooted.
            Task.WaitAny(new[] { stopRecognition.Task });

            // Stops recognition and returns the accumulated transcript.
            await recognizer.StopContinuousRecognitionAsync().ConfigureAwait(false);

            return finalText;
        }
    }
}

Important considerations:

  1. [This point applies only if you use the Web API to create cases in CRM] You will need to use a multi-tenant configuration if your Azure Function tenant and the tenant in which your CRM API is registered are different. If they are the same tenant, you can use a single-tenant configuration.
  2. The input file from the telephony system to Azure Blob Storage must be in a specific format. The required format specification is:
    File format: RIFF (WAV)
    Sampling rate: 8000 Hz or 16000 Hz
    Channels: 1 (mono)
    Sample format: PCM, 16-bit integers
    File duration: between 0.1 and 60 seconds
    Silence collar: more than 0.1 seconds

  3. You can use the ffmpeg tool to convert your recordings to this specific format. For your testing, you can download and use the tool as below:
    Download ffmpeg from this link.
    Use the command: ffmpeg -i "<source>.mp3" -acodec pcm_s16le -ac 1 -ar 16000 "<output>.wav"
  4. My sample in GitHub covers input as one single chunk of audio. However, if you wish to have continuous streaming, you will need to implement the StartContinuousRecognitionAsync method accordingly.
  5. The Azure Function should be configured with a Blob trigger, along the lines of the sketch below.
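For context, here is a minimal sketch of how a blob-triggered function could wrap the recognition method above. The function name, container name, and app setting names are assumptions for illustration, not the exact code from the GitHub sample:

[FunctionName("ProcessCallRecording")]
public static async Task Run(
    [BlobTrigger("recordings/{name}")] Stream myBlob, // "recordings" is a hypothetical container name
    string name,
    ILogger log)
{
    // Speech key and region are read from app settings (setting names are assumptions).
    string key = Environment.GetEnvironmentVariable("SpeechServiceKey");
    string region = Environment.GetEnvironmentVariable("SpeechServiceRegion");

    // Transcribe the recording using the method shown earlier.
    string description = await RecognitionWithPullAudioStreamAsync(key, region, myBlob, log);
    log.LogInformation($"Transcribed {name}: {description}");

    // The next step would be to POST the description (plus blob metadata such as
    // the file name) to the Dynamics 365 Web API to create the case record.
}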

Open entity records from Power BI dashboard

In my earlier post, I discussed how to show CRM entities on the Power BI visual map control. The use of Power BI on Dynamics CRM dashboards is not limited to displaying multiple entities on maps, though. We usually want to do more, and since dashboards have little information on them, we would love to see entities in tabular format and navigate to CRM records when needed. In this post, I will discuss how we can open CRM records from a Power BI dashboard.

Scenario

Users should be able to see multiple entity types on the Power BI map, and to see record details in a table under the map control, with the ability to open CRM records using a hyperlink. I will focus on displaying records in a table with a direct link to the CRM entity records. After configuring the visual map control, we will need to do the following:

Note that all the required information (i.e. name, etc.) and the complementary information (i.e. entity logical name and entity ID) are available in our temporary table; refer to the previous post.

  1. Drag and drop a Table control underneath our visual map control.
  2. Drag and drop the fields we would like to display on table columns.

  3. The next step is adding a custom column to the table to hold the hyperlink to the CRM entity records, and configuring its type as WEB LINK.
  4. You can do this by selecting “NEW COLUMN” from the “Modeling Tab”. Remember, you will need the following three components to construct the link:
    1. CRM Base URL (This will be known to you from your org URL).
    2. Entity logical name (This is what we captured in the previous post as a custom column in our temporary table).
    3. Entity GUID (This was selected also as part of entity retrieve query in the previous post).
  5. The formula for the column is shown below (a concrete example follows after these steps):
    Link = "https://[CRM BASE URL]?pagetype=entityrecord&etn=" & [ENTITY_LOGICAL_NAME] & "&id=" & [ENTITY_ID]
  6. You will need to set the field type as WEB LINK.
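As a concrete (hypothetical) illustration, assuming the two custom columns from the previous post live on a table called TempTable, the column could look like this; the org URL and column names are placeholders you would replace with your own:

Link = "https://yourorg.crm.dynamics.com/main.aspx?pagetype=entityrecord&etn=" & TempTable[entitylogicalname] & "&id=" & TempTable[entityid]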

Display multiple entities on Power BI map control

 

Photo by Susannah Burleson on Unsplash

Recently I had to display the location of multiple entities on a CRM dashboard. The requirement was to display all Work Orders, Projects, Resources and Bookings on a map control so the project scheduler / field service dispatcher could see the location of each of them on the map. The Bing map control works fine for individual entities that are enabled for geolocation; however, in this scenario I had to plot all the different entities on a single map.

My thoughts were that I could choose from one of the following methods:

  1. Use the Bing map control on a dashboard: use a web resource to retrieve all Work Orders, Projects, Resources and Bookings, and then use a draw function to place each entity's location on the Bing map.
  2. Use Power BI and its visual map control to plot all entities on a map, then host the Power BI control on my dashboard. I decided to use this second approach.

Power BI Map control to show multiple entities

The map control in Power BI uses one source table with longitude and latitude information to display table rows on the map. The challenge with this approach is that the visual map control supports only one table's longitude and latitude, and therefore we can only use one entity as the source of the map data. In my scenario I had multiple entity types, i.e. Work Orders, Projects, Resources and Bookings. Each of these entities has its own longitude and latitude, and we cannot use all of them together as the source for our Power BI map.

The way I overcame this challenge was to use a temporary table, union the data from all the Work Orders, Projects, Resources and Bookings into it, and use that temporary table as the source of the Power BI map control. This is how I did it:

  1. Connect to the CRM Bookings table. This will bring all columns of the table to the Power BI.
  2. Remove unwanted columns in the Query Editor (optional).
    = Table.SelectColumns(Source,{"name", "msdyn_longitude", "msdyn_worklocation", "bookableresourcebookingid", "msdyn_latitude"})
  3. Reorder remaining columns in a way that you like to see your data (optional).
    = Table.ReorderColumns(#"Removed Other Columns",{"name", "msdyn_longitude", "msdyn_worklocation", "msdyn_latitude", "bookableresourcebookingid"})
  4. Rename column headings (optional).
    = Table.RenameColumns(#"Reordered Columns1",{{"bookableresourcebookingid", "id"}})
  5. Filter rows that you want to exclude from map (optional).
    = Table.SelectRows(#"Renamed Columns", each [latitude] <> null)
  6. Add a custom column to the query as TABLE Identifier/Category so you can identify workorder rows in the union table.
    = Table.AddColumn(#"Filtered Rows", "category", each Text.Upper("Bookable Resource Booking"))
     
  7. Change the column types (optional).
    = Table.TransformColumnTypes(#"Reordered Columns",{{"category", type text}, {"longitude", type number}, {"latitude", type number}})

If you have more than one entity, repeat the above steps for each table in your Query Editor. Note that the column names and generated step references (e.g. #"Filtered Rows") in each formula must match your own query's previous steps.

The next step is to create a temporary table and union all the above tables' data into it using a DAX query. Note that UNION requires each table to have the same number of columns in the same order, which is why we aligned and renamed the columns above.

  1. Go to the Modeling tab.
  2. Click on New Table. Use the below query to fill the table (alter the table names based on your scenario):
    TempTable = UNION('Bookable Resource Booking','Bookable Resources','Work Orders','Project Sites')
  3. Drag a Map visualization control onto the Power BI canvas.
  4. Select “Category” (the entity name) from the TempTable as the Legend. This will show your entities in different colors.
    Drag the longitude and latitude fields to the map's Longitude and Latitude wells.
  5. Note: by default, Power BI adds a SUM function to summarize longitude and latitude. Summarized columns don't work on maps, so you must remove the summarization from both by choosing “Don't summarize”.


Impact of deprecation of VoC on the Exam MB-230: Microsoft Dynamics 365 for Customer Service

For people aiming for: Exam MB-230: Microsoft Dynamics 365 for Customer Service

In October, Voice of the Customer skills and exam questions will be replaced with Forms Pro skills and questions. The exact date of that change and the associated changes to the Skills Measured will be communicated in August 2019. Please prepare for your exam accordingly.

How to fix Ribbon Button issues in Unified Client Interface (UCI)

A while ago I was working on creating a ribbon button for the Contact form. I used the fabulous Ribbon Workbench to add the button to my classic form. My requirement was to run a workflow from my custom ribbon button. The post I referred to was from Scott Durow on his website.

When I changed my form from the classic web form to UCI, I saw two strange behaviors.

The button was no longer showing on UCI

  • Somewhere along the way in the Ribbon Workbench, the Enable Rule had been messed up. The button was showing on the classic form but not on the UCI form. I tried creating new solutions; I even tried isolating the button. None of these worked. Then I examined my “Customization.xml” file, used the Microsoft documentation to write the enable rules again (as sketched below), and published my solution. Once I published it, the button worked fine. The bottom line is that in case of any issues with your ribbon (when you have tried all the usual options), make sure to examine Customization.xml to ensure your configuration is correct. Customization.xml is the ultimate source of truth.
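For orientation, an enable rule in the RibbonDiffXml section of Customization.xml looks roughly like this; the Id and the FormStateRule used here are illustrative, not the exact rule from my button:

<RuleDefinitions>
  <EnableRules>
    <EnableRule Id="new.contact.Button.EnableRule">
      <FormStateRule State="Existing" />
    </EnableRule>
  </EnableRules>
</RuleDefinitions>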

Pressing the ribbon button was not doing anything

  • The ribbon button was not doing anything because its action pointed to a static .js library (/_static/_common/scripts/RibbonActions.js). Actions in UCI must come from web resources; static files are no longer supported in UCI.

5 ways to insert images in Dynamics 365 email templates

 

Disclaimer: Some of these methods are unsupported, so please check Microsoft documentation for updates.

1. The old school copy paste.

1. You need an image that is hosted on a public-facing website. Simply go to that image, right click, and select Copy Image. This works in IE, Chrome, Edge, and Firefox. The image must be rendered in the browser view.

2. Open a new email template window and hit ‘Ctrl + V’ to paste the image. Your image should now be visible.

2. Upload your image to an online file repository (OneDrive/Dropbox/Google Drive)

Another way is to upload your image to your preferred file repository, make the file public, and embed it in your email template.

  1. Simply get the direct link to the image you have uploaded.
  2. Open the image in browser view, right click and select Copy Image.

3. Open a new email template window, hit ‘Ctrl + V’ to paste the image. Your image should now be visible.

3. Base64

If you do not want to upload your image to a site, you can encode your image using Base64.

1. Use an Image to Base64 converter. I personally use this website, but it's up to you; you can use MS Flow if you want 🙂

2. After you've converted the file, copy the Base64 code and place it in the src attribute of an <img> tag as a data URI (see the example after these steps). Select the markup, then copy and paste it into your email template.

3. When you insert the template into your email, the image should render properly.
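For reference, the pasted markup looks roughly like this; the Base64 string here is a truncated placeholder, not a real image:

<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAA..." />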

4. Clickable images

If you want your image to point somewhere on the web, then you would want to make use of a few friendly HTML tags.

Example:

<a href="https://dynamics.microsoft.com/en-us/">
<img border="0" src="https://mspoweruser.com/wp-content/uploads/2016/10/Microsoft-Dynamics-365-logo.jpg"></a>

  1. Just copy the snippet above and replace the href value with whatever URL you want the image to direct to.
  2. Open a new Email Template window and paste the HTML snippet. Select Save & Close.

3. When you try your new email template, the clickable image should work properly.

5. Image slices

This goes a bit beyond the scope of this post, but it is a common requirement, especially if you want to send out marketing emails, so I would just like to share what I know.

  1. Open your image in Photoshop, make your desired slices using the slice tool.

2. Once your slices are ready, right click on a selected slice, then select Edit Slice Options.

3. Enter URL/target depending on where you want the slice to direct to.

4. Once you’re set, select Save for Web and Devices and then select Preview.

5. Copy the generated HTML script and replace each img src value with the direct link to the corresponding image.

6. Paste it on your new email template. Select Save & Close.

7. When you try your new email template, the image slices should be rendered properly. 🙂

Opening a Dynamics CRM Entity Form by passing a Query String

Photo by Luca Bravo on Unsplash
One of the awesome features of the Power Platform is its extension capabilities. We often talk about integrating the Power Platform using web services, Azure services or plugins, but we overlook the platform's client-side capabilities. The Dynamics platform allows interacting with resources using addressable elements. URL addressable elements enable you to include links to Dynamics 365 for Customer Engagement apps forms, views, dialogs, and reports in other applications. In this manner, you can easily extend other applications, reports, or websites so that users can view information and perform actions without switching applications.

Requirement

I had a requirement to open an Account entity form based on the account's telephone number. The Dynamics platform allows opening an entity record in edit mode only by passing the entity ID; my requirement, however, was to open the entity form based on the telephone number.

Considerations

  • Opening a form in edit mode is possible only if we know the ID (GUID) of the record. If you pass any other query string, such as telephone or employeeno, you will receive a 500 internal server error.
  • You will need an HTML web resource as an intermediate component to resolve your query string (in my case, the telephone number) to the entity ID, and then open the form in edit mode by passing that ID.
  • The only query string name you can pass to the organization URL is “data”. Using any other query string name, such as employeeId or contactid, will lead you to a 500 internal server error.
  • You will need to use the GlobalContext by calling the getGlobalContext method in your web resource. The getQueryStringParameters method is deprecated, so you will need another way to get the value of your query string; I used Andrew Butenko's post to extract the query string. A big shout out to Andrew Butenko, and at the same time a big shout out to Jason Lattimer for his great CRMRestBuilder.

Solution

I used an HTML web resource with a JavaScript function that extracts and resolves the query string, and then calls the OpenForm function to open the form.
<!DOCTYPE html>
<html lang="en" xmlns="http://www.w3.org/1999/xhtml">
<head>
    <meta charset="utf-8" />
    <title>Web Resource</title>
    <script src="ClientGlobalContext.js.aspx" type="text/javascript"></script>
    <script src="https://code.jquery.com/jquery-3.4.1.min.js"> </script>
    <script>
        function Onload() {
            var queryString = location.search.substring(1);
            var params = {};
            var queryStringParts = queryString.split("&");
            for (var i = 0; i < queryStringParts.length; i++) {
                var pieces = queryStringParts[i].split("=");
                params[pieces[0]] = pieces.length == 1 ? null : decodeURIComponent(pieces[1]);
            }

            var phone = params["data"];
            var globalContext = GetGlobalContext(); // provided by ClientGlobalContext.js.aspx
            var req = new XMLHttpRequest();
            req.open("GET", globalContext.getClientUrl() + "/api/data/v9.1/accounts?$select=accountid&$filter=telephone1 eq '" + phone + "'", true);
            req.setRequestHeader("OData-MaxVersion", "4.0");
            req.setRequestHeader("OData-Version", "4.0");
            req.setRequestHeader("Accept", "application/json");
            req.setRequestHeader("Content-Type", "application/json; charset=utf-8");
            req.setRequestHeader("Prefer", "odata.include-annotations=\"*\",odata.maxpagesize=1");
            req.onreadystatechange = function () {
                if (this.readyState === 4) {
                    req.onreadystatechange = null;
                    if (this.status === 200) {
                        var results = JSON.parse(this.response);
                        for (var i = 0; i < results.value.length; i++) {
                            var accountid = results.value[i]["accountid"];
                            OpenForm("account", accountid);
                        }
                    } else {
                        Xrm.Utility.alertDialog(this.statusText);
                    }
                }
            };
            req.send();           
        }

        function OpenForm(entity, id) {
            var entityFormOptions = {};
            entityFormOptions["entityName"] = entity;
            entityFormOptions["openInNewWindow"] = true;
            entityFormOptions["entityId"] = id;
            Xrm.Navigation.openForm(entityFormOptions).then(
                function (success) {
                    console.log(success);
                },
                function (error) {
                    console.log(error);
                });
        }
</script>
</head>
<body>
<script>Onload();</script>
</body>
</html>
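To test the web resource, append the phone number as the data parameter on the web resource URL, along these lines (the org and web resource names here are placeholders):

https://yourorg.crm.dynamics.com/WebResources/new_openaccountbyphone.html?data=0123456789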

Review the Power Platform release plan with MVPs

In case you missed it, the 2019 wave 2 release plan for Dynamics 365 and the Power Platform was released today. You can read James Phillips' blog summary at https://cloudblogs.microsoft.com/dynamics365/bdm/2019/06/10/announcing-new-features-growing-demand-for-dynamics-365-and-power-platform/ and you can download the full release notes from https://docs.microsoft.com/en-us/dynamics365-release-plan/2019wave2/.

This afternoon, I was joined by MVPs Megan Walker, Ulrik Carlsson, and Andrew Bibby to review the release plan. Watch the video below.

 

Flow, HTTP Actions, and Files

I am working on a new presentation sample project and I wanted to test invoking an HTTP request from a Flow. Specifically, I want to invoke a Function App from a Flow using an HTTP Flow Action. In my sample, I will kick this off when a new Note is created with an Attachment.

To quickly test calling the HTTP Action, I used an existing Function App sample that I had worked on a few weeks ago: a small Function App that I put together to test populating a PDF template using CRM data.

Poking around with Function Apps

This sample is creatively named CRMToPDF because it retrieves a record from CRM and populates a fillable PDF form from the CRM record using iText, returning the updated PDF for download. Pretty simple in terms of code, but it was a nice proof of concept testing the iText libraries (more on that in another post!).

Since this Function App returns the PDF file as the response, I was curious as to how Flow would handle it.  Could I “download” a file from a URL in Flow and attach it to an email using the Outlook email Action?

You bet I can!

With a few short steps, I was able to grab the resulting file and attach it to an email. This isn’t a huge surprise since so many Flow connectors already deal with moving files around. But it surprised me how simple it was to accomplish what I wanted to do.

So the Flow I created is really simple:

  • Trigger on a new Note
  • Invoke an HTTP Flow Action
  • Email the resulting PDF to myself

Here are the initial Flow steps, a Trigger and HTTP Action:

Trigger on new Note record, call HTTP GET

The trigger is on ALL Notes, so this would definitely change in the real world. And the HTTP GET only includes my Function App authorization key. In a real example, I would pass in additional parameters, such as the Note ID or Object ID, as an additional query parameter or as part of the request body.

The Outlook Email Action looks like this:

New Email using the HTTP Response

You can see that this Action is pretty straightforward. It's just an email to myself from the Owner of the Note. In Dynamics 365 CE, this means that the System User had to enable sending emails on their behalf, which is just a value under Personalized Settings. I just filled in a few bits of other info, like the Body and Subject.

The important part for me here is setting up the Attachment: set the Attachment Name to “CRM2PDF.pdf” and the Attachment Content to the Body of the HTTP Response.

That’s it. Yep. That’s all!

I first started looking at Flow a bit last year and wrote a short post about moving a document from Dynamics 365 CE to SharePoint, Flow Examples: Note attachment to SharePoint. This turned out to be relatively straightforward and a really cool Flow but had a few quirks, like converting the Note document body using the base64ToBinary function.
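For reference, that conversion was a one-line expression along these lines; the exact property path depends on the connector version, so treat this as a sketch:

base64ToBinary(triggerBody()?['documentbody'])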

When I started looking at this sample, I expected some similar required steps, but setting the Body as the Attachment Content just worked. I put this entire Flow together in about 15 minutes, and it worked on the first try! (As a developer, I NEVER expect it to work on the first try!)

This tells me that the Flow engine is aware of the content type being returned by the HTTP GET and can handle it properly when moving between the Actions: they know how to work with the files between the source HTTP Action and the next Outlook email Action.

That sounds like another obvious comment, but it makes me happy as a developer not having to do any kind of manipulation or parsing or other coding magic! For an idea of what is being returned from the HTTP action, we can look at the Flow Test logs for the HTTP GET Action and open the Outputs:
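The payload looks roughly like this; the header list is trimmed and the body values are truncated placeholders, since the exact content depends on your Function App:

{
  "statusCode": 200,
  "headers": {
    "Content-Type": "application/pdf"
  },
  "body": {
    "$content-type": "application/pdf",
    "$content": "JVBERi0xLjcNJe..."
  }
}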

This isn't super complex JSON for most developers: HTTP response code, several headers, the filename, etc. But for non-developers, this could present an impossible roadblock. With the Flow designer and this huge library of existing Actions, a non-developer can point their Flow at a service endpoint and move files around without a single line of code.

That’s some powerful stuff.

Solution Layering

I had recently noticed the Solution Layers button but knew next to nothing about its functionality. It was added to my ever-growing list of "OK, I need to check that out when I have some time!" While on a call this past week, the Solution Layers feature came up. After a brief overview on the call and some poking around afterwards, it looks to be a useful feature for developers, business analysts, and administrators.

What are Solution Layers?

Solution Layers is not some hidden, mystery feature.  Microsoft has done a great job recently with their online documentation and the article titled View solution layers includes a nice quick explanation of Solution layers:

  • Let you see the order in which a solution changed a component.
  • Let you view all properties of a component within a specific solution, including the changes to the component.
  • Can be used to troubleshoot dependency or solution-layering issues by displaying change details for a component that was introduced by a solution change.

So the Solution Layers tool offers insight into system components and their relationships to Solution deployments. The significant bit here to me is that it shows changes to the component and when the installation or updates were introduced.

Where do I find Solution Layers?

When you select a Solution component, such as an Entity, Process, or Web Resource, or sub component such as an Entity Form or Attribute, you will now see a button labeled Solution Layers.

For example, I opened the Power Apps Checker solution in a recently provisioned demo environment.  Expanding the Entities, we can see the button on the Analysis Result Detail Entity. Drilling into the Forms list, we see the tool button available with the Information main Form.  

Solution Layers for the Analysis Result Detail Entity
Solution Layers for the Analysis Result Detail Entity Information Form

If we open the Solution Layers dialog for the Analysis Result Detail Entity, we see a one-item list of Solutions. This is a list of the Solutions to which this Entity is related.

Entity level Solution Layers

Select the Solution listed and you can view the Analysis Result Detail Entity details that are related to the Solution.

Analysis Result Detail Entity Solution Layer Details

This view provides the list of the changed properties for the Entity when the Solution was imported on the Changed Properties tab, and the full list of Entity properties on the All Properties tab. If we open the Information Form for this Entity, we see very similar information: a single Solution, and the detailed changes of the selected Entity Form for that Solution import.

We only see one item at both the Entity and Entity Form levels because this Entity and all of its components are unique to this Solution. We can also see that the list of Changed Properties is the same as the list of All Properties. This tells us that the Analysis Result Detail Entity was installed with the Power Apps Checker solution and has not been affected by any other Solution installs.

That is some nice information, but not especially useful. The Solution Layers component really shines when we look at Entities that can be impacted by other solution imports. For example, a system Entity like Contact can be impacted by many different Solutions on your system. Or you may have a custom Entity being deployed as part of a product or an ongoing project that will see regular changes, whether through major Solution releases or hotfix style solution deployments.

Contact is a popular Entity

Opening a different solution that contains the Contact Entity shows the real power behind this tool. If we open the solution named Sales Navigator for Dynamics 365 Unified Interface that comes with my demo environment and view the Contact Entity Solution Layers, we see some immediate differences.

Contact Solution Layers Detail - lots of changes!

The Contact Entity has been changed by 21 separate Solutions. The first, at the bottom of the list, is System, while at the top we see Active as the latest. This means that the Entity, or one or more of its sub components, was updated with each of these 21 Solution imports. So how do we see more detail on all of these Entity changes?

Deltas!

If we dig deeper into the Solution components, we can see more granular detail of the changes. We can drill into the Contact Forms list for this Solution and open the Contact Form Solution Layers dialog.

In this view, we can see that the Contact Form has been updated by 11 different Solution Imports. But what has been changed? Open up a solution from the list to find out:

Contact Form Solution Layers Detail

In this view, under Changed Properties, we can see the detailed changes that were made with the Solution import. In this example, we see that the underlying Form JSON value was updated, and if you scroll a bit, you will also see the Form XML. With other value types, such as numbers or boolean values, it's easy to see the changed value.

For more complex types like Form JSON or XML, you can compare the differences to the previous Solution Layer value. Simply open the previous Solution Layer from the list and view the property value under the All Properties view using a standard text diff tool such as WinDiff or Visual Studio.

Why is this a big deal?

Dynamics 365 CE and the Power Platform with CDS now have a built-in method for change tracking across the layers of solution components. I include the Power Platform here because when you view an Entity from a model-driven Power App, you have the option of switching to Classic View. In Classic View, you can view the Solution Layers exactly as if you were working within a Dynamics 365 CE solution.

This can be incredibly useful when troubleshooting issues or just managing your own deployments. With solid DevOps practices in place, you should be able to view content like this using source code control tools. But if you are working on a project for which those practices were not well established, I can see this feature as a huge help for developers, business analysts, or system administrators.

I recommend reviewing the article listed above and playing around with the feature. For example, check out changes to solution components like Workflows where you can view the changes to the underlying XAML that contains the workflow logic.

I will be looking into it in more detail myself because I can see the possibility for some nice tools built around this capability!

My Quest to D365 Saturday Stockholm

Recently I attended the Dynamics 365 Saturday event in Stockholm and I have to say, what an excellent event. I had never been to Stockholm, so I was already massively excited about this. I also got to meet a load of new people for the first time, which was AMAZING!

These events are so important for the community because they are often the only opportunities some community members really have to interact with other customers, partners, ISVs and Microsoft employees. I love running into people who have encountered the same issues as I have; that way I know I'm not going bonkers, and we can work on a solution together.

The crowd was great! There were many enthusiastic people in the audience who were really getting involved in the sessions, looking for information and really testing all of the speakers' knowledge. You can find the list of sessions and speakers HERE.

A big reason I really enjoyed this event was the different layers and levels of content being shared across the sessions. The sessions were split into three tracks: Applications (Dynamics 365 CE), Dev (Dynamics 365 CE), and Business & Project Management. This gave participants the opportunity to stick to a single themed track or weave between tracks, which is pretty much how my experience went. I went to at least one session from each track, as I wanted to get a flavour of everything. I also got to see some wizardry from folks like Julie Yack, Gus Gonzalez, Nick Doelman and Gustaf Westerlund.

Other presenters and panellists included Sara Lagerquist, Jonas Rapp, Fredrik Niederud, Katherine Hogseth, Mark Smith and Antti Pajunen. Each delivering some amazing content based on their experiences with Dynamics 365 and Power Platform.

There was a plethora of information and content being shared between speakers and passionate attendees. Everything from Microsoft Portals and Social Engagement to developing your own XrmToolBox tools (Careful with the spelling here….HAHA) was being talked about. I personally got involved in a number of Power Platform conversations, which suited me just fine because that’s kinda what I’m doing at the moment.

I had the pleasure of running two sessions, one in the Application track and one in the Dev track (I am no developer... don't judge me). The two sessions were:

  1. Power Platform – How it all fits together (Download Here)
  2. Building your first Canvas App with CDS, and Azure (Download Here)

Apparently people don't like the first two rows... great crowd though! Thanks to everyone who attended. Try to work out what Mark is doing in the background there! HAHA!

The first presentation focused on the different elements of the Power Platform and the way it all works together. Many Dynamics 365 users often worry a bit about this because it seems so large and complicated, but it really isn't once you have wrapped your head around the different technologies. To highlight the way the different elements of the technology work together, I included a Roadside Assist demonstration that was created during the PowerApps & CDS hackathon that Those Dynamics Guys and Hitachi Solutions Europe hosted together.

My second presentation covered some of the “Do's and Don'ts” of building your first Canvas App with your customer. I followed the presentation with the following:

  1. Adding several fields to a custom entity in the Common Data Service (CDS)
  2. Importing some data
  3. Creating a new canvas app
  4. Connecting the Canvas app to the CDS
  5. Adding in the Azure translation service to the app
  6. Publishing the app

The actual canvas app I created, along with the little model-driven app solution including data, is available HERE.

The below pic gives the impression that I am about to start having a conversation with my own hand, like an invisible Muppet. Maybe a great trick for my next demo 😀

One of the BEST sessions I have been in was the “CAGE MATCH” moderated by the one and only Julie Yack. This was EPIC fun! We were split into teams of five and given problems by the audience to resolve. It was a little daunting being in the presence of some of these long-time MVPs, BUT THE SHOW MUST GO ON, so we got stuck right in. Unfortunately, the team I was in didn't take the win 🙁 It's cool, I am preparing my battle cards for the next one!

All in all, it was a fantastic event and a great opportunity to network with this amazing Dynamics and Power Platform community that we have all grown to know and learn from. A MASSIVE thank you to the sponsors of the event!

Also, a big thanks to all of the folks who hung out after the event and enjoyed several beverages with me. It was a great time and I'd love to do it again 🙂

Here are some more delightful images from the day 🙂 My camera skills aren't great, so I had to grab a few from social. Thanks to those who grabbed pics in my session! I hope this encourages more people to attend these events, because I genuinely gain so much from being there.

Nick Doelman Smashing his Portals Presentation

One of my favourite Finns – Antti

Julie Yack doing her presentation on Social Engagement

MORE of the awesome Julie Yack

WHAT?? ANOTHER ONE of my favourite Finns – KIMMO!

JOOONNNAASS RAPP!!! The Legend!

We were all so excited! Mark, Jonas and me 🙂