D365UG Bristol

Man, oh MAN! I am just on the train back from Bristol after attending their first 365 User Group, and it was AWESOME!

First up, big props to the organisers @lubo @dbarber @xrmjoel @robpeledie @leembaker for doing such an amazing job. The venue was great, the food was too (including vegan options :-)) and the swag bags were epic.

But more importantly, the speakers were amazing.

First up, Mark Smith did a fantastic presentation to explain the (seemingly unexplainable) Common Data Model, which is something that sits on top of the CDS and exists to make sense of your data so that it is usable and, more importantly, means you can build apps faster. Considering he knew nothing about the CDM five days ago, this was a great presentation, which goes to show how quickly you can assimilate information when the data is available in a usable format (see what I did there *winks*).

Second up was @scott-durowdevelop1-net showing us how much the Power Platform has changed in such a short space of time, and why that’s such a good thing, enabling access to a much wider audience and giving rise to things like the Citizen Developers who can effect such great change in an organisation, and how we all need to embrace change and be adaptable.

And finally, @themarkchristie gave an entertaining presentation on how he bought headphones at the Microsoft Store and @lubo busted them 5 minutes later – but not to worry, Virtual Agent was able to help him raise a support ticket through Forms Pro, which could then be assigned to someone in Field Service, leading to the headphones being fixed (whilst being worn by Mark Smith).

I learned a lot about how to better use the tech we have openly available to us in a more inventive way, and how to work with the community to find answers to stuff that you are struggling with.

I love these community events and I hope to attend each and every one that I can.

Again, big thanks and well done to the organisers.

Love to all

Alison

(Me when I won a Flic button from @themarkchristie)

How to set SharePoint document location in a field dynamically in Dynamics 365

So, I recently had a requirement to put the SharePoint document location in a field and show it on the Contact form. And I had to do it the OOTB way. After some time playing around with D365 workflows, I found a way to do it. Yay! 🙂

First, make sure to

  1. Enable server-side SharePoint integration.
  2. Enable document management for your desired entity.
  3. Take note of the absolute URL of your SharePoint site. It looks something like this: https://contoso.sharepoint.com

Once you have the SharePoint side set up, let's do some config! 🙂

  1. Create a field where you would like the link to appear. In this case, I created a single line of text field with the URL format and put it on the form.

  2. Create a workflow. The important thing to note here is that the workflow should run on the Document Location entity.

  3. Create a step to update the record of whichever entity you enabled document management on. In my case, it is Contact.

  4. Inside the workflow designer, select the field you created in step 1, paste the URL of your SharePoint site and add a "/" character.

  5. Once you've done that, select the appropriate Document Location value from the Form Assistant and add it to the field after the "/".

  6. Once you've mapped the value from step 5, add another "/" character.

  7. Select the next Document Location value from the Form Assistant and add it after the second "/" character.

  8. Select Save & Close and activate your workflow.

  9. Now the link will be added dynamically on the form 🙂
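
To make the concatenation concrete, here is a hypothetical example of the value such a workflow composes for a contact, assuming the default folder structure the SharePoint integration creates (an entity folder, then a record folder named after the record plus part of its GUID):

https://contoso.sharepoint.com/contact/Joe Bloggs_A1B2C3D4E5F64B2A9C8D7E6F5A4B3C2D

The two dynamic values appended after the slashes come from the Document Location record: the parent location's relative URL (the entity folder) and the record's own Relative URL (the record folder).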

 

5 ways to insert images in Dynamics 365 email templates

 

Disclaimer: Some of these methods are unsupported, so please check Microsoft documentation for updates.

1. The old school copy paste

  1. You need an image that is hosted on a public-facing website. Simply go to that image, right-click, and select Copy Image. This works in IE, Chrome, Edge, and Firefox. The image must be rendered in browser view.

  2. Open a new email template window and hit 'Ctrl + V' to paste the image. Your image should now be visible.

2. Upload your image to a file repository online (OneDrive/Dropbox/Google Drive)

Another secure way is to upload your image to your preferred file repository, make the file public, and embed it in your email template.

  1. Simply get the direct link to the image you have uploaded.
  2. Open the image in browser view, right-click and select Copy Image.
  3. Open a new email template window and hit 'Ctrl + V' to paste the image. Your image should now be visible.

3. Base64

If you do not want to upload your image to a site, you can encode your image using Base64.

  1. Use an Image to Base64 converter. I personally use this website, but it's up to you – you can even use MS Flow if you want 🙂

  2. After you've converted the file, copy the Base64 code and enclose it in an <img> tag. Select the text, then copy and paste it into your email template.

  3. When you insert the template into your email, the image should render properly.
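
As an illustration, the pasted tag ends up looking something like this (the Base64 string is truncated here, and real ones get very long); note that the src needs the data:image/...;base64, prefix, which some converters include for you and others leave off:

<img src="data:image/jpeg;base64,/9j/4AAQSkZJRgABAQEAYABgAAD..." alt="My image" />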

4. Clickable images

If you want your image to point somewhere on the web, then you would want to make use of a few friendly HTML tags.

Example:

<a href="https://dynamics.microsoft.com/en-us/">
<img border="0" src="https://mspoweruser.com/wp-content/uploads/2016/10/Microsoft-Dynamics-365-logo.jpg"></a>

  1. Just copy the snippet above and replace the href value with whatever URL you want the image to direct to.
  2. Open a new Email Template window and paste the HTML snippet. Select Save & Close.
  3. When you try your new email template, the clickable image should work properly.

5. Image slices

This is a bit beyond the scope of this post, but it is a common need, especially if you want to send out marketing emails. I would just like to share what I know.

  1. Open your image in Photoshop and make your desired slices using the Slice tool.

  2. Once your slices are ready, right-click on a selected slice, then select Edit Slice Options.

  3. Enter the URL/target depending on where you want the slice to direct to.

  4. Once you're set, select Save for Web and Devices and then select Preview.

  5. Copy the generated HTML script and replace each img src value with the direct link of the image.

  6. Paste it into your new email template. Select Save & Close.

  7. When you try your new email template, the image slices should be rendered properly. 🙂
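
To give a rough idea of what steps 5 and 6 produce, the generated markup is shaped something like the following (all URLs and file names here are hypothetical); each slice becomes an image cell linking to its own target, and the table reassembles the pieces into one picture:

<table border="0" cellpadding="0" cellspacing="0">
  <tr>
    <td><a href="https://contoso.com/offer"><img src="https://contoso.com/images/banner_01.jpg" border="0" alt="" /></a></td>
    <td><a href="https://contoso.com/pricing"><img src="https://contoso.com/images/banner_02.jpg" border="0" alt="" /></a></td>
  </tr>
</table>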

Opening Dynamics CRM Entity Form by passing Query String

Photo by Luca Bravo on Unsplash
One of the awesome features of the Power Platform is its extension capabilities. We often talk about integrating the Power Platform using web services, Azure services or plugins, but we overlook the platform's client-side capabilities. The Dynamics platform allows interacting with resources using URL addressable elements, which enable you to include links to Dynamics 365 for Customer Engagement apps forms, views, dialogs, and reports in other applications. In this manner, you can easily extend other applications, reports, or websites so that users can view information and perform actions without switching applications.

Requirement

I had a requirement to open an Account entity form based on the account's telephone number. The Dynamics platform only allows opening an entity record in edit mode by passing the entity ID. However, my requirement was to open the entity form based on the telephone number.

Considerations

  • Opening a form in edit mode is possible only if we know the ID (or GUID) of the record. If you pass any other query string, like a telephone number or employee number, you will receive a 500 internal server error.
  • You will need an HTML web resource as an intermediate component to resolve your query string (in my case, the telephone number) to the entity ID and then open the form in edit mode by passing that ID.
  • The only query string name you can pass to the organization URL is "data". Using any other query string name, such as employeeId or contactid, will lead you to a 500 Internal Server Error (see the example below).
  • You will need to use the GlobalContext by calling the getGlobalContext method in your web resource. The getQueryStringParameters method is deprecated, so you will need another way to get the value of your query string. I used Andrew Butenko's post to extract the query string – a big shout out to Andrew Butenko. At the same time, a big shout out to Jason Lattimer for his great CRMRestBuilder.
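
As an illustration, the web resource built below would be opened with a URL along these lines (the organization and web resource names are hypothetical):

https://contoso.crm.dynamics.com/WebResources/new_openaccountbyphone.html?data=0412345678

Whatever you place after data= is what the web resource's Onload function reads back out and resolves to a record ID.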

Solution

I used an HTML web resource with a JavaScript function to extract and resolve the query string, and then call the OpenForm function to open the form.
<!DOCTYPE html>
<html lang="en" xmlns="http://www.w3.org/1999/xhtml">
<head>
    <meta charset="utf-8" />
    <title>Web Resource</title>
    <script src="ClientGlobalContext.js.aspx" type="text/javascript"></script>
    <script src="https://code.jquery.com/jquery-3.4.1.min.js"> </script>
    <script>
        function Onload() {
            var queryString = location.search.substring(1);
            var params = {};
            var queryStringParts = queryString.split("&");
            for (var i = 0; i < queryStringParts.length; i++) {
                var pieces = queryStringParts[i].split("=");
                params[pieces[0]] = pieces.length == 1 ? null : decodeURIComponent(pieces[1]);
            }

            var phone = params["data"]; // the value passed in via the "data" parameter
            var globalContext = GetGlobalContext(); // provided by ClientGlobalContext.js.aspx
            var req = new XMLHttpRequest();
            req.open("GET", globalContext.getClientUrl() + "/api/data/v9.1/accounts?$select=accountid&$filter=telephone1 eq '" + phone + "'", true);
            req.setRequestHeader("OData-MaxVersion", "4.0");
            req.setRequestHeader("OData-Version", "4.0");
            req.setRequestHeader("Accept", "application/json");
            req.setRequestHeader("Content-Type", "application/json; charset=utf-8");
            req.setRequestHeader("Prefer", "odata.include-annotations=\"*\",odata.maxpagesize=1");
            req.onreadystatechange = function () {
                if (this.readyState === 4) {
                    req.onreadystatechange = null;
                    if (this.status === 200) {
                        var results = JSON.parse(this.response);
                        for (var i = 0; i < results.value.length; i++) {
                            var accountid = results.value[i]["accountid"];
                            OpenForm("account", accountid);
                        }
                    } else {
                        Xrm.Utility.alertDialog(this.statusText);
                    }
                }
            };
            req.send();           
        }

        function OpenForm(entity, id) {
            var entityFormOptions = {};
            entityFormOptions["entityName"] = entity;
            entityFormOptions["openInNewWindow"] = true;
            entityFormOptions["entityId"] = id;
            Xrm.Navigation.openForm(entityFormOptions).then(
                function (success) {
                    console.log(success);
                },
                function (error) {
                    console.log(error);
                });
        }
</script>
</head>
<body>
<script>Onload();</script>
</body>
</html>

Under the Hood of the Dynamics 365 Portal Profile Page

Photo by Emanuel Villalobos on Unsplash
In part 1 of this series, I shed light on some of the planning and analysis activities involved in a Portal project. From this part on, I will be writing about some common portal project requirements and the way I have addressed them.

Site Settings

One of the most basic portal configurations happens in Site Settings. Site Settings contain global configuration values used by the portal framework. The complete list of portal site settings can be found here. However, some settings in that list are deprecated (since the link comes from the ADXStudio product), and there are some settings you will not find in the given link. I am listing those settings in the table below:
Search/Enabled – A true or false value. If set to false, the search functionality is disabled on the portal, so you will not see the magnifier icon on the primary navigation.
DateTime/DateFormat – The format in which you want to show dates on your portal. For example, for the British date format I use d/M/yyyy.
profile/forcesignup – A true or false value. If set to true, the portal forces users to update their profile after signup by redirecting them to the Profile page after a successful signup.
Authentication/Registration/TermsAgreementEnabled – A true or false value. If set to true, the portal displays the terms and conditions of the site. Users must agree to the terms and conditions before they are considered authenticated and can use the site.
Authentication/Registration/ProfileRedirectEnabled – A true or false value. If set to false and the profile page is disabled on the portal, the Redeem Invitation workflow doesn't work properly and keeps redirecting the user to the same page in place of the home page.
Authentication/Registration/LoginButtonAuthenticationType – If a portal only requires a single external identity provider, this allows the Sign-In link on the primary navigation to link directly to the sign-in page of that provider.
Authentication/Registration/OpenRegistrationEnabled – A true or false value. If set to true, any anonymous visitor to the portal can create a new user account.
Profile/ShowMarketingOptionsPanel – A true or false value. If set to false, the marketing preferences area in the contact profile is hidden.

Profile page

The Profile page is a custom .aspx page which displays the Contact entity's "Profile Web Form". If you want to change the fields on this page, you must modify the "Profile Web Form" on the Contact entity. In addition to the fields on the "Profile Web Form", the profile page shows Marketing Preferences, which can be enabled or disabled using the Profile/ShowMarketingOptionsPanel site setting.

Customising Profile Page

The Profile page is a special page which you cannot customise using Entity Forms and Metadata. An ordinary web page, like a case form in the portal, relies on "Entity Forms". The Profile page is different: it does not rely on Entity Forms, but re-writes the request to "Profile.aspx". So, if you want to change the form behavior or add validation to fields on the screen, you will need to come up with a different approach.

Scenario:

  1. The mobile phone on the profile should be in the format xxxx-xxx-xxx
  2. The profile page should be editable if the contact has NO active cases
  3. The profile page should be read-only if the contact has active cases

Implementation:

  1. Deactivate the existing Profile web page
  2. Create a new Entity Form called "Editable Profile Entity Form" interfacing "Profile Web Form" in Edit mode
  3. Create a new Entity Form called "Read-only Profile Entity Form" interfacing "Profile Web Form" in Read-only mode
  4. Create a new Web Template called "Profile Web Template" – we will talk about this template in detail later
  5. Create a new Page Template named "Profile"
  6. Open the Profile Page Template and set the following values:
    1. Type = Web Template
    2. Entity Name = Web Page (adx_webpage)
    3. Web Template = Profile Web Template (created in step 4)
  7. Create a new Web Page named "Profile":
    1. The profile page's partial URL must be "Profile"
    2. Set the parent page to "Home"
    3. Set the Page Template to the Profile page template (created in step 5)
  8. Open the Profile Web Template and add the following Liquid templates:

For adding the breadcrumbs, add the following Liquid:

{% block breadcrumbs %}
{% include 'Breadcrumbs' %}
{% endblock %}

For adding the title to the page, add the following Liquid:

{% block title %}
{% include 'Page Header' %}
{% endblock %}

For adding side navigation, add the following Liquid:

{% block aside %}
{%include "side_navigation_portal" %}
{% endblock %}

The main form will be in the Main Block

<div class="col-sm-8 col-lg-8 left-column">
{% block main %}
{% endblock %}
</div>

Now, the code below is the magic behind meeting the requirement:

Use FetchXML to check if there are any active cases related to the contact:

{% fetchxml my_query %}
<fetch version="1.0" output-format="xml-platform" mapping="logical" returntotalrecordcount="true" distinct="false">
  <entity name="incident">
    <attribute name="title" />
    <attribute name="statecode" />
    <attribute name="createdon" />
    <filter type="and">
      <condition attribute="customerid" operator="eq" value="{{ user.id }}" />
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>
{% endfetchxml %}
  1. The FetchXML block must be enclosed in {% fetchxml my_query %} ... {% endfetchxml %}, where my_query holds the result of the query.
  2. If you want to check the total number of records returned by the query, you must set returntotalrecordcount="true", otherwise you will always get -1 as the count of your records.
  3. The total count of the query result can be accessed through my_query.results.total_record_count.

The final piece of code in the page template will be the following

{% if my_query.results.total_record_count > 0 %}
{% entityform name: 'Read-only Profile Entity Form' %}
{% else %}
{% entityform name: 'Editable Profile Entity Form' %}
{% endif %}

With this simple if/else statement you can control the behavior of the profile page.

Validating Mobile phone

Since we are using Entity Forms to show the profile information, we can use Entity Form Metadata to control the behavior of fields on the form. To ensure our mobile number is always in the xxxx-xxx-xxx format, do the following:

  1. Open the "Editable Profile Entity Form"
  2. From related records, go to "Entity Form Metadata"
  3. Add a new metadata record with the following properties:
    1. Type: Attribute
    2. Attribute Logical Name: Mobile Phone (mobilephone)
  4. Find the Validation section down the form and add a Validation Error Message
  5. Use ^\d{4}-\d{3}-\d{3}$ as the regular expression to ensure the mobile phone is in the xxxx-xxx-xxx format
  6. You can tick "Is Field Required" to make the field required on the screen

 

Converting Dynamics Geolocation To SQL Geolocation Using Microsoft Flow And Azure Function

Background

One of the awesome features of the Azure Search service is the ability to search information based on location. Azure Search processes, filters, and displays geographic locations. It enables users to explore data based on the proximity of a search result to a physical location. This feature is powered by the SQL Server Geolocation data type. Since SQL Server 2008, developers have been able to store geospatial data in SQL Server using Geolocation fields, which allow querying data with location-based queries. To enable the Azure Search service to search within CRM accounts and contacts, I had to push my account and contact searchable information to a SQL Server hosted in Azure. To copy information from Dynamics to Azure SQL Server, I used Microsoft Flow. Everything worked well except for copying the CRM longitude and latitude to SQL Server.

The problem

The problem with copying longitude and latitude to a SQL Server Geolocation field is compatibility: when you try to insert plain longitude and latitude values into a Geolocation field, you encounter a casting error.

The solution

The solution I used to tackle this problem makes use of an Azure Function to convert the longitude and latitude to the Geolocation type and return the response before the Insert action in the Flow. See the steps below:

  1. Step 1 is self-explanatory.
  2. The "CC Contact" step extracts the Contact name (or any lookup name property) from a lookup.
  3. The "Http" step calls the Azure Function that converts the CRM longitude and latitude to a SQL Geolocation value.
  4. The "Insert Row" step inserts our data into a SQL Server row.
Microsoft Flow

The Azure Function

The Azure Function itself is very simple. You will need to import the Microsoft.SqlServer.Types NuGet package and use the below code:
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.SqlServer.Types;
using Newtonsoft.Json;

public static async Task<IActionResult> Run(HttpRequest req)
{
    // Read the longitude/latitude pair posted by the Flow
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    Coordinates data = JsonConvert.DeserializeObject<Coordinates>(requestBody);

    // Convert to a geography point and return its WKT representation
    SqlGeography point = data.GetGeography();
    return (ActionResult)new OkObjectResult($"{point}");
}

public class Coordinates
{
    public double Longitude { get; set; }
    public double Latitude { get; set; }

    public SqlGeography GetGeography()
    {
        try
        {
            // 4326 is the SRID of the WGS 84 coordinate system
            return SqlGeography.Point(Latitude, Longitude, 4326);
        }
        catch (Exception ex)
        {
            // Log ex and handle the exception as appropriate
            throw;
        }
    }
}
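
For reference, the Flow's HTTP step posts a body like the following to the function and gets back the WKT text of the point, roughly as shown (the function URL and coordinates are made up):

POST https://contoso-functions.azurewebsites.net/api/ConvertToGeography
{
  "Longitude": 153.02,
  "Latitude": -27.47
}

Response: POINT (153.02 -27.47)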

 

 

Implementing Enterprise Search In Power Platform

Photo by Anthony Martino on Unsplash
Providing good search capabilities is a key feature in modern business applications to support usability and end-user satisfaction. We have seen the search capabilities of the Dynamics platform evolve from "Quick Find" and "Advanced Find" to "Relevance Search". The goal of the platform's search features has been to help users find the relevant information they need in the quickest and easiest form. These search features are out-of-the-box and easy to enable, configure and use. As the platform progresses to offer richer features to users and enable them to search better, the demand for richer and better search techniques grows, and we see instances where the platform cannot meet user demands with its out-of-the-box capabilities. Before going further into advanced search scenarios, you can read about the platform's out-of-the-box search capabilities in this official documentation. In this article I share why we may decide to implement a search solution for our Dynamics solution using the Azure Search service.
In enterprise implementations, business applications are not the only systems used in the organization. We often see call center agents and sales representatives needing to obtain information from various systems to service customers. Requiring users to search in every system is a cumbersome job which may cause setbacks in end-user adoption. Integrating Dynamics with Azure Search offers consolidation of search operations in one specialized search service with the ability to connect to various data sources and apply modern search techniques to find the most relevant data. A practical example of this scenario can be seen in one of my recent experiences, where the organization's users had to search for user information in CRM, SharePoint, Sybase and a pool of CSV files.

Customized Search experience

To facilitate more user adoption, using customized search techniques is highly favorable. In all modern search engines we see "autocomplete", "suggestions" and "highlighting" features, which can be added to your Dynamics solution's search experience. Displaying search results with support for "document preview", "document opening in a customized container", "facets", "filtering" and "sorting" are examples that enhance your Dynamics solution's capabilities.

Customized Search Behavior

The true power of search is demonstrated when different pieces of information are linked together to make sense of a bigger picture. Extracting words and sentences from documents (including images and PDF files), and extracting key phrases, people names, location names, languages and other custom entities with the help of AI, is another unique feature that you can add to your Dynamics search capabilities. Another amazing capability you can have in your Dynamics implementation is the ability to search based on geolocation information, i.e. you can search for your whole partner network from CRM or get the location of your field service force. The beauty of implementing your own enterprise search lies in the fact that you can search information in your data stores and link it using AI to generate knowledge and better insight into your data.

Customized Search Result

Another need for customized search in your Dynamics solution is the ability to refine your search result profile. When you use AI in your search, the system gives you the power to see how relevant search results are to your search keywords. Knowing this, you can refine your search profiles to generate different results for the same keywords. This way you train the AI engine to work better for you and enable users to get more accurate search results.
Architecture

Dynamics can be integrated with the Azure Search service in the following patterns:

 

  1. Integration through web resources: these host a web application acting as a client to the search service. The web resource can be an HTML file or an iFrame hosted on forms. The important point in this approach is to ensure correct cross-origin settings in the client application and to write your HTML securely and according to best practices.
  2. Integration through custom Power Platform controls: you may build your own custom control which sends REST requests to Azure Search and displays results by consuming the REST responses. The custom control can call the Azure Search service using Actions or direct REST calls (see the sketch after this list).
  3. Azure Search works based on indexes, so your first step is to push your CRM searchable data to Azure Search indexes. This can be done using Microsoft Flow, Azure Logic Apps, custom solutions or Azure Data Factory. I have used all these tools in my implementations, and you can opt for any of them based on your requirements.
  4. Once the data is in your data store, you can create your indexes in Azure Search. You can go for separate indexes for each data source or combine multiple data sources in one index. Each approach has its own requirements, which will need to be met either in your client web application or in a separate Azure compute resource. Once indexing is done, you can use the Azure Search REST API directly, or use Azure API Management to expose your search service to your Dynamics solution.
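
As a minimal sketch of what the client side of options 1 and 2 could look like, the snippet below queries an Azure Search index over REST. The service name, index name and query key are placeholders, and in production the key should sit behind Azure API Management rather than in client code:

// Query an Azure Search index for documents matching the user's search text
function searchAccounts(searchText, onResults) {
    var url = "https://contoso-search.search.windows.net/indexes/accounts-index/docs" +
        "?api-version=2019-05-06" +
        "&search=" + encodeURIComponent(searchText) +
        "&$top=10";
    var req = new XMLHttpRequest();
    req.open("GET", url, true);
    req.setRequestHeader("api-key", "<query-key>"); // a read-only query key, never the admin key
    req.setRequestHeader("Accept", "application/json");
    req.onreadystatechange = function () {
        if (this.readyState === 4 && this.status === 200) {
            onResults(JSON.parse(this.response).value); // value holds the matching documents
        }
    };
    req.send();
}

A web resource would render the returned documents in its own markup, while a custom control would do the same from within its rendering logic.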
Summing this all up: as business application products get more sophisticated and organizations move from data to big data, engineers must look for innovative approaches to implementing Dynamics solutions. Microsoft Azure, along with the Dynamics platform, offers solution architects the necessary tools to design such solutions.

Virtual Entities 0x80040203 Invalid Argument Error

I stumbled upon this issue after creating a custom virtual entity data provider. Ivan Ficko has a great tutorial on this here.

The subgrid displayed records perfectly fine in the old web client, but in UCI I received the error message "0x80040203 Invalid Argument". After some searching, and finding only a single post regarding this with no answers, I decided to take matters into my own hands! Digging through my browser's console, I managed to find additional information regarding this error. Inspecting the exception, I found the message "entity name is invalid".


Custom views on lookup won't work without name field

Quick tip here, when setting a custom view on a lookup in the form designer:


Always ensure that the name attribute is on the view too, even if it won’t be displayed.


I had noticed that my search results were not working at all when I typed in the lookup field. After an hour of troubleshooting, I added the name field to the view and voila! Search, and the control in general, started behaving.

Solution Layering

I've recently noticed the Solution Layers button but knew next to nothing about its functionality. It was added to my ever-growing list of "OK, I need to check that out when I have some time!" While on a call this past week, the Solution Layers feature came up. After a brief overview on the call and some poking around afterwards, it looks to be a useful feature for developers, business analysts, and administrators.

What are Solution Layers?

Solution Layers is not some hidden mystery feature. Microsoft has done a great job recently with their online documentation, and the article titled View solution layers includes a nice, quick explanation of Solution Layers:

  • Let you see the order in which a solution changed a component.
  • Let you view all properties of a component within a specific solution, including the changes to the component.
  • Can be used to troubleshoot dependency or solution-layering issues by displaying change details for a component that was introduced by a solution change.

So the Solution Layers tool offers insight into system components and their relationships to Solution deployments. The significant bit here, to me, is that it shows changes to the component and when the installations or updates were introduced.

Where do I find Solution Layers?

When you select a Solution component, such as an Entity, Process, or Web Resource, or a sub-component such as an Entity Form or Attribute, you will now see a button labeled Solution Layers.

For example, I opened the Power Apps Checker solution in a recently provisioned demo environment. Expanding the Entities, we can see the button on the Analysis Result Detail Entity. Drilling into the Forms list, we see the tool button available with the Information main Form.

Solution Layers for the Analysis Result Detail Entity
Solution Layers for the Analysis Result Detail Entity Information Form

If you open the Solution Layers dialog for the Analysis Result Detail Entity, you can see a one-item list of Solutions. This is a list of the Solutions to which this Entity is related.

Entity level Solution Layers

Select the Solution listed and you can view the Analysis Result Detail Entity details that are related to the Solution.

Analysis Result Detail Entity Solution Layer Details

This view provides the list of the changed properties for the Entity when the Solution was imported in the first Changed Properties 'tab', and the full list of Entity properties in the All Properties tab. If we open the Information Form for this Entity, we see very similar information: a single Solution and the detailed changes of the selected Entity Form for that Solution import.

We only see one item at both the Entity and Entity Form levels because this Entity and all of its components are unique to this Solution. We can also see that the list of Changed Properties is the same as the list of All Properties. This tells us that the Analysis Result Detail Entity was installed with the Power Apps Checker solution and has not been affected by any other Solution installs.

That is some nice information, but not especially useful. The Solution Layers component really shines when we look at Entities that can be impacted by other solution imports. For example, a system Entity like Contact can be impacted by many different Solutions on your system. Or you may have a custom Entity being deployed as part of a product or an ongoing project that will see regular changes, whether through major Solution releases or hotfix-style solution deployments.

Contact is a popular Entity

If we open a different solution that contains the Contact Entity, we see the real power behind this tool. If we open the solution named Sales Navigator for Dynamics 365 Unified Interface that comes with my demo environment and view the Contact Entity Solution Layers, we see some immediate differences.

Contact Solution Layers Detail – lots of changes!

The Contact Entity has been changed by 21 separate Solutions. The first, at the bottom of the list, is System, but at the top we see Active as the latest. This means that the Entity, or one or more of its sub-components, was updated with each of these 21 Solution imports. So, how do we see more detail on all of these Entity changes?

Deltas!

If we dig deeper into the Solution components, we can see more granular detail of the changes. We can drill into the Contact Forms list for this Solution and open the Contact Form Solution Layers dialog.

In this view, we can see that the Contact Form has been updated by 11 different Solution imports. But what has changed? Open up a solution from the list to find out:

Contact Form Solution Layers Detail

In this view, under Changed Properties, we can see the detailed changes that were made with the Solution import. In this example, we see the underlying Form JSON value was updated, and if you scroll a bit, you will see that the Form XML was too. With other value types, such as numbers or boolean values, it's easy to see the changed value.

For more complex types like Form JSON or XML, you can compare the differences to the previous Solution Layer value. Simply open the previous Solution Layer from the list and view the property value under the All Properties view using a standard text diff tool such as WinDiff or Visual Studio.

Why is this a big deal?

Dynamics 365 CE and the Power Platform with CDS now have a built-in method for change tracking across the layers of solution components. I include the Power Platform here because when you view an Entity from a Model-Driven Power App, you have the option of switching to Classic View. In Classic View, you can view the Solution Layers exactly as if you were working within a Dynamics 365 CE solution.

This can be incredibly useful when troubleshooting issues or just managing your own deployments. With solid DevOps practices in place, you should be able to view content like this using source code control tools. But if you are working on a project for which those practices were not well established, I can see this feature as a huge help for developers, business analysts, or system administrators.

I recommend reviewing the article listed above and playing around with the feature. For example, check out changes to solution components like Workflows where you can view the changes to the underlying XAML that contains the workflow logic.

I will be looking into it in more detail myself because I can see the possibility for some nice tools built around this capability!

Streaming Data Sets into Dynamics 365 Customer Engagement

In this post, we are going to look at the challenge of how to display streaming data sets directly on a Dynamics 365 Customer Engagement form. While there already exists a way to embed Power BI dashboards and reports within Dynamics 365 Customer Engagement, these are not at a form level. To see how to do this currently, have a look here. When followed, you should observe results similar to the following, where a dashboard is initially displayed and then you can click through to the underlying report(s):
 
 
What you'll notice from this is that these are personal dashboards that lack the ability to be contextually filtered. To resolve this, we are going to create a Web Resource that has the ability to display a contextual (and streaming) dashboard on a Dynamics 365 Customer Engagement form!
 
To get started, let's have a look at what this will look like architecturally:
 
 
From the architecture, you should notice that we need to create a custom HTML Web Resource that will serve as a placeholder for the Power BI dashboard. When the form loads, we are going to use JavaScript to process the incoming parameters, which can include both configurations and contextual data based on the record (form) that the Web Resource is being rendered on. The JavaScript will then call a reusable Dynamics 365 Action that will consume the incoming parameters before calling a Dynamics 365 Plugin. This plugin is necessary as it will help us execute a token exchange with the Azure Key Vault based on the currently logged-in user. This token is then used to retrieve a specific secret which contains the configurations necessary to render the Power BI report contextually, and in an authenticated state, back on the Dynamics 365 Customer Engagement form.
 
Simultaneously, the Power BI dashboard will be receiving a continuous stream of data from an MX Chip (IoT device) that is connected to an Azure IoT Hub. This stream of data is provided through the Stream Analytics service, which continually processes the incoming data and is able to send it as an output directly to Power BI before it is visualised. For reference, the Stream Analytics job should look something similar to this:
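
A minimal sketch of such a job's query, assuming an IoT Hub input alias of [iothub-input], a Power BI output alias of [powerbi-output] and a device ID of 'mxchip-01' (all hypothetical names), might look like:

SELECT
    messageId,
    temperature,
    humidity,
    EventEnqueuedUtcTime AS ReadingTime
INTO
    [powerbi-output]
FROM
    [iothub-input]
WHERE
    IoTHub.ConnectionDeviceId = 'mxchip-01'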
 
 
You will notice that there is a dedicated Power BI output in the above and that we have limited the Stream Analytics job just to look for our MX Chip device. We also need to include a bit of DAX to format the incoming IoTAlert data to make it more readable. Examples of the incoming data, the DAX, and the Power BI configs are below:
 
 
As a result of this, we should now be able to see the streaming data set on the Dynamics 365 Customer Engagement form after a bit of Power BI visualisation magic as follows:
 
 
As we have parameterised the initial Web Resource on the form, this Dashboard is able to pre-filter visuals should we wish, and can also easily be embedded on the form and record type of your choosing! The following video demonstrates the complete pattern in action:

 

DYNAMICS CE WORKFLOWS SCHEDULING USING AZURE FUNCTION APP WITH TIMERS

A 'making-the-Dynamics-guy's-life-easy' solution to schedule your Dynamics CE out-of-the-box workflows to run on particular frequencies is finally here!

System workflows are the best when it comes to doing a simple task without having to write a hell of a lot of code. However, the real pain comes into the scene when you want to schedule them as per your requirements. Well, if you're wondering how you could make this work in a simple way, here's the good news – this is totally achievable using the winning combo of an Azure Function app with a timer trigger. If you want to read more about how Azure Functions work, you can use this link – https://docs.microsoft.com/en-us/azure/azure-functions/functions-overview

Now, if you want to dive right in, you’re in the right place.

 

ADVANTAGES:

 

  1. Unlike other solutions, Azure Functions allow you to enjoy the benefits of a serverless setup. They are perfectly suited to run without a server and to integrate with and monitor jobs that run within CE.
  2. Connection to CE can be made by referencing the core SDK libraries using NuGet.
  3. It consumes fewer resources, without having to use custom entities in CE to configure the scheduler.
  4. Easy management of the functions that are set up. You can enable or disable them as and when required, just by a button click.
  5. Detailed logging of the successes and failures of the workflows that are executed on their frequencies.
  6. Handles bulk jobs with a function timeout of 10 minutes. (How cool is that!)

 

PRE-REQUISITES:

 

This list is surprisingly not long. All you need for this to be set up successfully is an Azure subscription, or a free Azure trial account, to give it a go.

 

STEPS:

 

  1. Log in to your Azure account at https://portal.azure.com. You will see your Dashboard on the home screen.
  2. Click on the 'Create a resource' option, located in the upper left-hand corner of the page.
  3. Type 'Function App' in the search box that appears, enter all the required values and click on Create. Once the function app starts deploying, wait for the Deployment Succeeded message to appear in your notifications.

  4. Open the app that you just created and create a new function for the app. Make sure you select the type as 'Timer Trigger' while you create it, as shown below.

  5. Set a schedule timer using a CRON expression, which is displayed under the Integrate section of the function. The format of this expression is {second} {minute} {hour} {day} {month} {day-of-week}.

I have set the timer expression to 0 */5 * * * *, which means that the workflow will run every 5 minutes. To know more about different timer settings, refer to this link – https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-timer.
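
For reference, here are a few more NCRONTAB expressions in the same {second} {minute} {hour} {day} {month} {day-of-week} format:

0 */5 * * * *   -> every 5 minutes
0 0 * * * *     -> once at the top of every hour
0 30 9 * * 1-5  -> at 9:30 AM, Monday to Friday
0 0 0 * * 0     -> at midnight every Sunday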

 

  6. Connect to Dynamics CE by referencing the core SDK assemblies using NuGet. Go to the Platform features tab of the function and click on App Service Editor. This will open up all the files in the folder in a new window. Create a new file called 'project.json' within the same function folder. Use the following code snippet to reference the CE SDK assemblies.

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.CrmSdk.CoreAssemblies": "8.2.0.2"
      }
    }
  }
}

 

 

  7. We will then add configuration parameters in the Application settings of the function, for the C# code to use. These parameters include the CRM instance URL that you are connecting to, the appropriate credentials for the connection, the name of the workflow that needs to run on the scheduled time, and the FetchXML query that selects the records to run it against; a hypothetical set of values is sketched below.
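
For illustration, the settings might look like the following (all values are made up; note that ServiceConfigurationFactory.CreateManagement in the code below expects the organization service endpoint URL):

CRMinstance    = https://contoso.api.crm.dynamics.com/XRMServices/2011/Organization.svc
CRMusername    = workflow.service@contoso.onmicrosoft.com
CRMpassword    = <the service account password>
CRMworkflow    = Send overdue case reminder
CRMFetchString = <fetch><entity name="incident"><filter><condition attribute="statecode" operator="eq" value="0" /></filter></entity></fetch>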

  8. Now, we add the following piece of code, which triggers the workflow specified in the configuration parameters using the credentials mentioned in the step above.

using System;
using System.Net;
using System.Configuration;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Query;

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    // Build the service configuration from the organization service URL
    IServiceManagement<IOrganizationService> orgServiceManagement =
        ServiceConfigurationFactory.CreateManagement<IOrganizationService>(
            new Uri(ConfigurationManager.AppSettings["CRMinstance"]));

    // Connect to the CRM instance
    AuthenticationCredentials authCredentials = new AuthenticationCredentials();
    authCredentials.ClientCredentials.UserName.UserName = ConfigurationManager.AppSettings["CRMusername"];
    authCredentials.ClientCredentials.UserName.Password = ConfigurationManager.AppSettings["CRMpassword"];
    AuthenticationCredentials tokenCredentials = orgServiceManagement.Authenticate(authCredentials);

    // Retrieve the service
    IOrganizationService service = new OrganizationServiceProxy(orgServiceManagement, tokenCredentials.SecurityTokenResponse);

    // Get the workflow GUID to run from the workflow name
    QueryExpression objQueryExpression = new QueryExpression("workflow");
    objQueryExpression.ColumnSet = new ColumnSet(true);
    objQueryExpression.Criteria.AddCondition(new ConditionExpression("name", ConditionOperator.Equal, ConfigurationManager.AppSettings["CRMworkflow"]));
    objQueryExpression.Criteria.AddCondition(new ConditionExpression("parentworkflowid", ConditionOperator.Null));
    EntityCollection entColWorkflows = service.RetrieveMultiple(objQueryExpression);

    if (entColWorkflows != null && entColWorkflows.Entities.Count > 0)
    {
        Guid workflowGuid = entColWorkflows.Entities[0].Id;
        if (workflowGuid != Guid.Empty)
        {
            // Get the FetchXML string from configuration
            string entitySetting = ConfigurationManager.AppSettings["CRMFetchString"];
            FetchExpression fetchRecords = new FetchExpression(entitySetting);

            EntityCollection recordsCollection = service.RetrieveMultiple(fetchRecords);
            if (recordsCollection.Entities.Count > 0)
            {
                log.Info($"Records fetched : {recordsCollection.Entities.Count} at {DateTime.Now}");
                foreach (Entity e in recordsCollection.Entities)
                {
                    ExecuteWorkflowRequest request = new ExecuteWorkflowRequest()
                    {
                        WorkflowId = workflowGuid,
                        EntityId = e.Id
                    };

                    // Execute the workflow.
                    service.Execute(request);
                    log.Info($"Executed workflow successfully : {DateTime.Now}");
                }
            }
        }
    }

    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
}

 

  9. You can test run the C# code you added in the step above to make sure there are no errors.

 

  10. The function is enabled by default, and it can be disabled anytime you want by clicking on the Enabled/Disabled toggle button under the Manage option of the function. (I have disabled my function, and that's the reason why '(disabled)' is prefixed to my function name.)

 

  11. The 'Monitor' option of the function allows you to check for successes and failures of the function, including the detailed logs included in the code.

 

 

And that is all! Your Azure Function will keep running the specified workflow until you disable it.