Dynamics 365 for Field Services

How To Start Your Exciting Journey Of “Connected Field Service” And “Azure IoT Hub”?

After my last article I spent some time exploring Azure IoT and Connected Field Service. I have been watching presentations and demos on YouTube, and I saw many good demonstrations of the capabilities of Azure IoT Hub and Connected Field Service. However, I wanted to explore more and see how things really work under the hood of the platform. So, in this and a few upcoming posts I am summarising my key learnings for the benefit of fellow consultants who want to get started.

Note: I structured this blog series based on my learning journey. It is designed around the questions CRM consultants ask when they want to start their Connected Field Service journey.

Let’s start with some key definitions that you will hear a lot on your journey and should be familiar with from the start:

IoT Device

An IoT device is a piece of hardware with a small circuit (called a microcontroller), capable of connecting to the internet using Wi-Fi or mobile networks. The purpose of the microcontroller is to capture data from its surroundings and send it to consuming services over the internet. In other words, an IoT device is a mini computer capable of sending (and sometimes processing) data over the internet.

Raspberry PI / Windows IoT

The IoT device mentioned above can be a small circuit that simply reacts to an event, like a switch turning on or off, or it can be powerful enough to process images on the device itself. When we develop powerful devices capable of processing data and applying complex logic, we need a program to run that logic, and a lightweight operating system to host our application. In practice these more powerful devices are often single-board computers such as the Raspberry Pi, running a lightweight operating system such as Linux or Windows 10 IoT Core.

Azure Internet of Things (IoT)

Azure IoT is a set of managed services combined to connect, monitor and control smart, IoT-enabled devices. So basically, a device’s sensors detect events in the surrounding environment and pass the event and its data, via the microcontroller, to the Azure IoT service. Azure IoT receives data from millions of devices and sends it on to further processing services that give the data meaning.

Azure IoT Hub

IoT Hub is a central message hub for bi-directional communication between devices and Azure IoT. So, imagine Azure IoT Hub as the heart of the Azure IoT service.

Azure IoT Edge

As we move towards the future, we will have more and more IoT devices connected to Azure IoT Hub, each sending raw messages to the hub for processing. This is fine, since Azure IoT Hub is a scalable service, but wouldn’t it be more efficient to use the device’s own (often powerful) hardware to pre-process data before sending it to Azure IoT Hub? Azure IoT Edge enables us to write programs that process data on the device before sending it to Azure, and/or send only the necessary data and events. This offloads processing from Azure and reduces traffic. If you have a powerful tool, why not use it?
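For illustration, this is roughly what a filtering module looks like with the Node.js IoT Edge module pattern. It is a minimal sketch, not production code: the input/output names ('input1', 'output1'), the temperature threshold and the message shape are all assumptions you would replace with your own.

'use strict';
// Minimal IoT Edge module sketch: read messages from an input endpoint and
// forward only the interesting ones to an output that is routed on to IoT Hub.
const { ModuleClient, Message } = require('azure-iot-device');
const { Mqtt } = require('azure-iot-device-mqtt');

const TEMPERATURE_THRESHOLD = 30; // assumption: only readings above this are worth sending

ModuleClient.fromEnvironment(Mqtt, (err, client) => {
    if (err) throw err;

    client.open((openErr) => {
        if (openErr) throw openErr;

        // 'input1' and 'output1' are the endpoints referenced by the Edge route definition
        client.on('inputMessage', (inputName, msg) => {
            client.complete(msg, () => { });
            if (inputName !== 'input1') return;

            const reading = JSON.parse(msg.getBytes().toString('utf8'));
            if (reading.temperature > TEMPERATURE_THRESHOLD) {
                // Forward only readings that exceed the threshold
                client.sendOutputEvent('output1', new Message(JSON.stringify(reading)), () => { });
            }
        });
    });
});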

IoT Device Twin

A device twin is a logical representation of a device’s status in Azure. For example, suppose you have a smart bulb installed in Branch A. The device twin can contain the location, so you can query the twins and identify the devices installed in Branch A. Or you can store the firmware version in the device twins and, at firmware upgrade time, upgrade only the devices running old firmware versions.
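To make that concrete, a device twin is just a JSON document that IoT Hub keeps per device. The sketch below shows what the smart bulb’s twin might look like, together with the SQL-like twin queries that answer the two questions above. The tag and property names (location, firmwareVersion) are my own choice, not a fixed schema.

// Sketch of a device twin for the smart bulb in Branch A.
const exampleTwin = {
    deviceId: "smart-bulb-042",
    tags: {
        location: { branch: "Branch A", building: "HQ" }  // set by the back end, not visible to the device
    },
    properties: {
        desired: { firmwareVersion: "2.1.0" },   // what we want the device to run
        reported: { firmwareVersion: "1.9.3" }   // what the device says it is currently running
    }
};

// Twins can be queried with IoT Hub's SQL-like query language, for example:
const devicesInBranchA = "SELECT * FROM devices WHERE tags.location.branch = 'Branch A'";
const devicesOnOldFirmware = "SELECT * FROM devices WHERE properties.reported.firmwareVersion != '2.1.0'";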

Connected Field Service

Azure IoT gives you the technology to interact with devices. However, the real value of IoT emerges when you give it a business context, and Connected Field Service gives that business context to your IoT data. Connected Field Service comes as an accelerator: when you install it from AppSource, you get all the entities and components required in your Dynamics environment. It gives you the ability to configure Dynamics to interact with devices, listen to them and be notified when they don’t feel right.

I have seen IoT Central. What is it, and how is it different from Azure IoT Hub?

According to the Microsoft documentation, Azure IoT Central is a software as a service (SaaS) solution that uses a model-based approach to help you to build enterprise-grade IoT solutions without requiring expertise in cloud-solution development. However, Azure IoT Hub is the core Azure PaaS that both Azure IoT Central and Azure IoT solution accelerators use. IoT Hub supports reliable and secure bidirectional communications between millions of IoT devices and a cloud solution.

So, in summary: if you have Azure skills and you want the flexibility to extend and manage your solution, go for IoT Hub. But if you want an accelerator without the need to extend it, then IoT Central is the way forward.

Where do I start my journey?

Where to begin – Learning Path

If you want to start directly from Dynamics, you can simply follow the instructions here.

Otherwise, follow these steps (a minimal device-side sketch follows the list):

  1. Provision your Azure IoT Service (Link)
  2. Create a simulated device (Link)
  3. Send and receive messages (Link)
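If you prefer to see what steps 2 and 3 boil down to in code, here is a rough sketch using the Node.js SDKs (azure-iothub for the registry, azure-iot-device and azure-iot-device-mqtt for the device side). The connection strings and device ID are placeholders, and error handling is kept to a minimum.

// Rough sketch of steps 2 and 3 with the Node.js SDKs.
const { Registry } = require('azure-iothub');
const { Client, Message } = require('azure-iot-device');
const { Mqtt } = require('azure-iot-device-mqtt');

// Step 2: register a (simulated) device in the IoT Hub identity registry
const registry = Registry.fromConnectionString('<iot-hub-owner-connection-string>');
registry.create({ deviceId: 'simulated-device-01' }, (createErr) => {
    if (createErr) {
        // The device may already exist; log and carry on for this sketch
        console.warn('Device registration failed:', createErr.message);
    }

    // Step 3: connect as that device and send a telemetry message
    const client = Client.fromConnectionString('<device-connection-string>', Mqtt);
    const message = new Message(JSON.stringify({ temperature: 22.5, humidity: 48 }));

    client.sendEvent(message, (sendErr) => {
        if (sendErr) throw sendErr;
        console.log('Telemetry sent to IoT Hub');
        client.close(() => { });
    });
});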

I have a smart door lock. I want to be notified when it is unlocked. Where do I start?

Azure IoT Hub integrates with Azure Event Grid so that you can send event notifications to other services and trigger downstream processes. To implement this scenario, you will need to register your smart door lock in Azure IoT as mentioned above. Once the device is registered, you will need to define an Unlock event so that the service can receive unlock notifications. By default, Azure IoT provides four event types:

  • Device Created
  • Device Deleted
  • Device Connected
  • Device Disconnected

However, for our scenario we will need to create a custom Unlock event. To define the custom event, you can follow this link.
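As an illustration of the device side, the smart lock can report the unlock as an ordinary device-to-cloud message and stamp it with an application property that message routing (next question) and downstream services can filter on. This is a minimal sketch; the property name alertType and the message shape are my own conventions, and the connection string is a placeholder.

// Sketch: the smart lock reports an unlock as a device-to-cloud message.
const { Client, Message } = require('azure-iot-device');
const { Mqtt } = require('azure-iot-device-mqtt');

const client = Client.fromConnectionString('<device-connection-string>', Mqtt);

function reportUnlock() {
    const message = new Message(JSON.stringify({
        deviceId: 'smart-door-lock-01',
        eventTime: new Date().toISOString(),
        state: 'unlocked'
    }));
    // Application property used later by IoT Hub message routing
    message.properties.add('alertType', 'unlock');

    client.sendEvent(message, (err) => {
        if (err) console.error('Failed to send unlock event:', err.message);
        else console.log('Unlock event sent');
    });
}

reportUnlock();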

How do I route incoming messages to Dynamics?

Once you have defined your events as above, you will need to create a route for the desired events. When an Unlock event is detected by IoT Hub, the route sends the message to an endpoint of your choice. You can configure your routing as per this link.
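For illustration only, a message route in the hub's routing configuration has roughly the shape below. The condition matches the alertType application property set by the device in the previous sketch; the route and endpoint names are assumptions, and the custom endpoint (for example a Service Bus queue that your integration listens to) has to be created on the hub separately.

// Rough shape of an IoT Hub message route (as seen in the hub's routing configuration).
const unlockRoute = {
    name: 'UnlockRoute',
    source: 'DeviceMessages',             // route device-to-cloud telemetry
    condition: "alertType = 'unlock'",    // routing query over the message's application properties
    endpointNames: ['unlock-events'],     // custom endpoint defined on the hub (assumed name)
    isEnabled: true
};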

How to detect if my device disconnects from IoT?

It is very simple: as mentioned above, Device Disconnected is a default event type which is raised once your device disconnects. All you must do is configure a service to react to the disconnection event.
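One lightweight option is an Azure Function with an Event Grid trigger subscribed to the hub's device events. A minimal JavaScript sketch, assuming a standard Event Grid-triggered function, might look like this; what you do inside the handler (create an IoT Alert in Dynamics, notify a technician, call a Flow, and so on) is up to you.

// Azure Function (Event Grid trigger) reacting to IoT Hub device lifecycle events.
module.exports = async function (context, eventGridEvent) {
    if (eventGridEvent.eventType === 'Microsoft.Devices.DeviceDisconnected') {
        const deviceId = eventGridEvent.data.deviceId;
        context.log(`Device ${deviceId} disconnected at ${eventGridEvent.eventTime}`);
        // React here: create an IoT Alert / case in Dynamics, post to Teams, etc.
    }
};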

How to orchestrate events from IoT to Dynamics?

If you are not using the Connected Field Service solution from AppSource, you can configure a Flow to send messages across. There are triggers for IoT Hub and its events which you can use.

IoT Flow Template

What kind of messages can be transmitted between a device and IoT Hub?

In short, IoT Hub supports device-to-cloud telemetry, cloud-to-device messages, direct methods, device twin property updates and file upload notifications; which one you use depends on whether the device is reporting data, being commanded, or synchronising state.

How to call #webapi from #PowerPlatform #Portals

In almost all of my portal projects I get a question from my clients: “How do we call an external API from #portals?” This question has been so common that I decided to write about my experience on this topic, which might be helpful for the community. This post will focus on two main areas:

  1. The available options to integrate #portals with external APIs
  2. A step-by-step guide to one of the least discussed options, using the OAuth implicit grant flow, and how I created a simple demo for one of my customers

Scenario

I would like to give this scenario a business context. Any enterprise solution requires the integration and interaction of multiple systems, and #portals could be one of them. Imagine a scenario where a customer is looking for a quote on a product in the company portal. In this case the #portal needs to bring quote details from a CPQ (Configure Price Quote) system into the portal. In another scenario, a #portal needs to integrate with a core banking system to get the customer’s latest balances. In these and similar scenarios, the #portal must integrate with an external API to get information.

In order to enable such integrations, the #portal must be able to make calls in a secure way, as most internal systems require authentication before anything can happen. So what are the options available?

Solutions

Since #powerplatform #portals are tightly integrated with the #powerplatform, in most cases the integration is done through the #powerplatform itself. However, integration through the #powerplatform comes in three flavours.

  1. The first option is to create Actions in the platform which communicate with the external API and manage the requests and responses, and then call those Actions through a workflow triggered from Entity Form or Entity List events.
Portal Integration with Web Api using Actions

 

  2. The second option is to use #MicrosoftFlow to encapsulate the Workflow and Action parts in a Flow. The benefit of this option is that you won’t need to write code (in most cases, but not guaranteed) to call the #webapi.

    Portal Integration using Flow
  3. The third option is for situations where routing everything through the server is not feasible. The above two options use the #PowerPlatform to facilitate the integration, and all calls are routed through the platform; however, there are cases where you would like to make client-side calls from JavaScript (using Ajax) in #portals to an external API. The main concern in these scenarios is authentication, and the solution provided by the platform is the “OAuth implicit grant flow”. If you would like to learn more about the OAuth implicit grant flow beyond the #PowerPlatform, you can read more here.

 

There are concerns over the OAuth implicit grant flow, and the general recommendation is to use the “OAuth code grant flow” instead. According to the OAuth working group, “It is generally not recommended to use the implicit flow (and some servers prohibit this flow entirely). In the time since the spec was originally written, the industry best practice has changed to recommend that public clients should use the authorization code flow with the PKCE extension instead.” Microsoft is aware of this; however, it believes the OAuth implicit grant flow is still OK to use here.

I have proposed an idea to implement the OAuth code grant flow in this IDEA. Please vote for it.

Now, getting back to the topic: how to integrate.

Portal Integration with Oauth Implicit Grant Flow

In this scenario, no server-side calls are required. Complete documentation is available here. However, the documentation is not very helpful if you want to do things quickly, since there is a learning curve involved. The OAuth 2.0 implicit grant flow supports endpoints that a client can call to get an ID token; two endpoints are used for this purpose: authorize and token. I will not go into the details of these calls, and I assume you already know what they are.

So here is what you will have to do:

  1. Create your Web API. You can download the sample API from this GitHub project. This website is no different from any MVC website, so you can create your own with Web APIs.
  2. Next, register your application in Azure Active Directory. This is a free service which you can use to provide authentication for your Web API. Step-by-step details of the registration process are in this link. The redirect URL must be the direct link to the portal page that will receive the token (the redirect page used in step 5 below). You will need to note the following after this step:

    – Client ID
    – Redirect URL

  3. Let’s say you have a Quote page in your portal and you would like to place a button on that page to get quotations from your internal website. You will have to put custom HTML in the “Content Page” (not the main page) of the portal. This custom HTML adds a QUOTE button to the portal and retrieves the quotation using custom JavaScript code:
<h2>The QUOTE BUTTON</h2>

<button type="button" onclick="callAuthorizeEndpoint()">Give me a Quote!</button>

<script>
    //Remove this line to avoid State validation
    $.cookie("useStateValidation", 1);

    function callAuthorizeEndpoint() {
        //Used for State validation
        var useStateValidation = $.cookie("useStateValidation");
        var appStateKey = 'p07T@lst@T3';
        var sampleAppState = { id: 500, name: "logic" };
        //Replace with Client Id Registered on CRM
        var clientId = "CLIENT ID OBTAINED FROM AZURE ACTIVE DIRECTORY";
        //Replace with Redirect URL registered on CRM
        var redirectUri = encodeURIComponent("https://MYPORTAL.powerappsportals.com/REDIRECT_PAGE/");
        //Authorize Endpoint
        var redirectLocation = `/_services/auth/authorize?client_id=${clientId}&redirect_uri=${redirectUri}`;
        //Save state in a cookie if State validation is enabled
        if (useStateValidation) {
            $.cookie(appStateKey, JSON.stringify(sampleAppState));
            redirectLocation = redirectLocation + `&state=${appStateKey}`;
            console.log("Added State Parameter");
        }

        //Redirect
        window.location = redirectLocation;
    }
</script>


  4. Modify the source code in the Web API website to use the Client ID and Redirect URL in its startup page:
public virtual Task ValidateIdentity(OAuthValidateIdentityContext context)
{
    try
    {
        if (!context.Request.Headers.ContainsKey("Authorization"))
        {
            return Task.FromResult<object>(null);
        }

        // Retrieve the JWT token from the Authorization header
        var jwt = context.Request.Headers["Authorization"].Replace("Bearer ", string.Empty);
        var handler = new JwtSecurityTokenHandler();
        var token = new JwtSecurityToken(jwt);
        var claimIdentity = new ClaimsIdentity(token.Claims, DefaultAuthenticationTypes.ExternalBearer);
        var param = new TokenValidationParameters
        {
            ValidateAudience = false, // Make this false if the token was generated without a clientId
            ValidAudience = "CLIENT ID", // Replace with the Client Id registered on CRM. The token should have been fetched with the same clientId.
            ValidateIssuer = true,
            IssuerSigningKey = _signingKey,
            IssuerValidator = (issuer, securityToken, parameters) =>
            {
                var allowed = GetAllowedPortal().Trim().ToLowerInvariant();

                if (issuer.ToLowerInvariant().Equals(allowed))
                {
                    return issuer;
                }
                throw new Exception("Token Issuer is not a known Portal");
            }
        };

        SecurityToken validatedToken = null;
        handler.ValidateToken(token.RawData, param, out validatedToken);
        var claimPrincipal = new ClaimsPrincipal(claimIdentity);
        context.Response.Context.Authentication.User = claimPrincipal;
        context.Validated(claimIdentity);
    }
    catch (Exception exception)
    {
        System.Diagnostics.Debug.WriteLine(exception);
        return Task.FromResult<object>(null); // Return a completed task rather than null
    }
    return Task.FromResult<object>(null);
}
  5. The next step is to use custom HTML on the redirect page so that you can call the Web API with the token obtained from the authorize endpoint:
function getResultInUrlFragment(hash){
    if(hash){
        var result = {};
        hash.substring(1).split('&').forEach(function(keyValuePair){
            var arr = keyValuePair.split('=');
//  Add to result, only the keys with values
            arr[1] && (result[arr[0]] = arr[1]);
        });
return result;
    }
else{
return null;
    }
}
//Validate State parameter
//Returns true for valid state and false otherwise
function validateState(stateInUrlFragment){
if(!stateInUrlFragment){
console.error("State Validation Failed. State parameter not found in URL fragment");
return false;
    }

// State parameter in URL Fragment doesn't have a corresponding cookie.
if(!$.cookie(stateInUrlFragment)){
console.error("State Validation Failed. Invalid state parameter");
return false;
    }
return true;
}

var useStateValidation = $.cookie("useStateValidation");
var appState = null;

//Fetch the parameters in Url fragment
var authorizeEndpointResult = getResultInUrlFragment(window.location.hash);

//Validate State
if(useStateValidation){
if(!validateState(authorizeEndpointResult.state)){
authorizeEndpointResult = null;
    }
else{
appState = $.cookie(authorizeEndpointResult.state);        
console.log("State: "+appState);
    }
}

//Display token
if(authorizeEndpointResult){
    var data = authorizeEndpointResult.token;
console.log("Token:" + data);
   $.ajax({
type: "GET",
url: "https://URL_TO_THE_WEB_API.azurewebsites.net/api/external/ping",
contentType: "application/json; charset=utf-8",
dataType: "json",
headers: {
Accept:"text/plain; charset=utf-8",
        "Authorization": "Bearer "+data
},
success: function (data) {
alert(JSON.stringify(data));
console.log(data);
}, //End of AJAX Success function
error: function (data) {
alert(data.responseText);
} //End of AJAX error function
});
}

I hope this post helps you a bit to make your portals connect to the outside world!

Open entity records from Power BI dashboard

In my earlier post, I discussed how to show CRM entities on the Power BI visual map control. The usage of Power BI dashboards on Dynamics CRM dashboards is not limited to displaying multiple entities on maps. We usually want to do more, and since dashboards carry little detail, we would love to see entities in a tabular format and navigate to the CRM records when needed. In this post, I will discuss how we can open CRM records from a Power BI dashboard.

Scenario

Users should be able to see multiple entity types on the Power BI map, and to see record details in a table under the map control with the ability to open CRM records using a hyperlink. I will focus on displaying records in a table with a direct link to the CRM entity records. After configuring the visual map control, we will need to do the following:

Note that all the required information (i.e. name, etc.) and complementary information (i.e. entity logical name and entity ID) are available in our temporary table; refer to the previous post.

  1. Drag and drop a Table control underneath our visual map control.
  2. Drag and drop the fields we would like to display as table columns.

  3. Add one custom column to the table to hold the hyperlink to the CRM entity record, and configure its data category as Web URL.
  4. You can do this by selecting “New Column” from the “Modeling” tab. Remember, you will need the following three components to construct the link:
    1. CRM base URL (known to you from your org URL).
    2. Entity logical name (captured in the previous post as a custom column in our temporary table).
    3. Entity GUID (also selected as part of the entity retrieve query in the previous post).
  5. The formula for the column is:
    Link = "https://[CRM BASE URL]?pagetype=entityrecord&etn=" & 'ENTITY_LOGICAL_NAME' & "&id=" & 'ENTITY_ID'
  6. You will need to set the column’s data category to Web URL.

D365UG Bristol

Man, oh MAN! I am just on the train back from Bristol after attending their first 365 User Group, and it was AWESOME!

First up, big props to the organisers @lubo @dbarber @xrmjoel @robpeledie @leembaker for doing such an amazing job. The venue was great, the food was too (including vegan options :-)) and the swag bags were epic.

But more importantly, the speakers were amazing.

First up, Mark Smith did a fantastic presentation to explain the (seemingly unexplainable) Common Data Model, which is something that sits on the CDS and exists to make sense of your data so that it is usable and, more importantly, means you can build apps faster. Considering he knew nothing about the CDM 5 days ago, this was a great presentation which goes to show how quickly you can assimilate information when the data is available in a usable format (see what I did there *winks*)

Second up was @scott-durowdevelop1-net showing us how much the Power Platform has changed in such a short space of time, and why that’s such a good thing, enabling access to a much wider audience and giving rise to things like the Citizen Developers who can effect such great change in an organisation, and how we all need to embrace change and be adaptable.

And finally, @themarkchristie with an entertaining presentation on how he bought headphones at the Microsoft store and @lubo busted them 5 minutes later – but not to worry, Virtual agent was able to help him raise a support ticket through Forms Pro, which then could be assigned to someone in Field Service, leading to the headphones being fixed (whilst being worn by Mark Smith).

I learned a lot about how to better use the tech we have openly available to us in a more inventive way, and how to work with the community to find answers to stuff that you are struggling with.

I love these community events and I hope to attend each and every one that I can.

Again, big thanks and well done to the organisers.

Love to all

Alison

(Me when I won a Flic button from @themarkchristie)

Wave 2 Release Plan with the MVPs Part 2

Part 2 of our MVP review of the PowerPlatform 2019 Wave 2 Release Plan. In this episode I am joined by Mark Christie, Iain Connolly, Andrew Bibby, and Shawn Tabor to discuss:

  • Field Service
  • Unified Interface
  • Canvas Apps
  • Microsoft Flow
  • Common Data Service
  • The death of Task Flows

And in case you missed it, catch part 1 here.

Opening a Dynamics CRM Entity Form by Passing a Query String

Photo by Luca Bravo on Unsplash
One of the awesome features of the Power Platform is its extension capabilities. We often talk about integrating the Power Platform using web services, Azure services or plugins, but we overlook the platform’s client-side capabilities. The Dynamics platform allows you to interact with resources using addressable elements: URL-addressable elements enable you to include links to Dynamics 365 for Customer Engagement apps forms, views, dialogs, and reports in other applications. In this manner, you can easily extend other applications, reports, or websites so that users can view information and perform actions without switching applications.

Requirement

I had a requirement to open an Account entity form based on the account’s telephone number. Out of the box, the Dynamics platform only allows opening an entity record in edit mode by passing the entity ID. However, my requirement was to open the entity form based on the telephone number.

Considerations

  • Opening a form in edit mode is possible only if we know the ID (GUID) of the record. If you pass any other query string, such as a telephone number or employee number, you will receive a 500 Internal Server Error.
  • You will need an HTML web resource as an intermediate component to resolve your query string (in my case the telephone number) to the entity ID, and then open the form in edit mode by passing that ID.
  • The only query string parameter name you can pass to the organisation URL is “data”. Using any other name, such as employeeId or contactid, will lead to the 500 Internal Server Error.
  • You will need to use the GlobalContext (via the getGlobalContext method) in your web resource. The getQueryStringParameters method is deprecated, so you will need another way to get the value of your query string. I used Andrew Butenko’s post to extract the query string. A big shout out to Andrew Butenko, and also to Jason Lattimer for his great CRMRestBuilder.

Solution

I used an HTML web resource with a JavaScript function to extract and resolve the query string, and then call the OpenForm function to open the form:
<!DOCTYPE html>
<html lang="en" xmlns="http://www.w3.org/1999/xhtml">
<head>
    <meta charset="utf-8" />
    <title>Web Resource</title>
    <script src="ClientGlobalContext.js.aspx" type="text/javascript"></script>
    <script src="https://code.jquery.com/jquery-3.4.1.min.js"> </script>
    <script>
        function Onload() {
            var queryString = location.search.substring(1);
            var params = {};
            var queryStringParts = queryString.split("&");
            for (var i = 0; i < queryStringParts.length; i++) {
                var pieces = queryStringParts[i].split("=");
                params[pieces[0]] = pieces.length == 1 ? null : decodeURIComponent(pieces[1]);
            }

            var phone = params["data"];//formContext.data.attributes["data"];
            var req = new XMLHttpRequest();
            // Use the global context from ClientGlobalContext.js.aspx instead of the deprecated Xrm.Page
            var clientUrl = GetGlobalContext().getClientUrl();
            req.open("GET", clientUrl + "/api/data/v9.1/accounts?$select=accountid&$filter=telephone1 eq '" + phone + "'", true);
            req.setRequestHeader("OData-MaxVersion", "4.0");
            req.setRequestHeader("OData-Version", "4.0");
            req.setRequestHeader("Accept", "application/json");
            req.setRequestHeader("Content-Type", "application/json; charset=utf-8");
            req.setRequestHeader("Prefer", "odata.include-annotations=\"*\",odata.maxpagesize=1");
            req.onreadystatechange = function () {
                if (this.readyState === 4) {
                    req.onreadystatechange = null;
                    if (this.status === 200) {
                        var results = JSON.parse(this.response);
                        for (var i = 0; i < results.value.length; i++) {
                            var accountid = results.value[i]["accountid"];
                            OpenForm("account", accountid);
                        }
                    } else {
                        Xrm.Utility.alertDialog(this.statusText);
                    }
                }
            };
            req.send();           
        }

        function OpenForm(entity, id) {
            var entityFormOptions = {};
            entityFormOptions["entityName"] = entity;
            entityFormOptions["openInNewWindow"] = true;
            entityFormOptions["entityId"] = id;
            Xrm.Navigation.openForm(entityFormOptions).then(
                function (success) {
                    console.log(success);
                },
                function (error) {
                    console.log(error);
                });
        }
</script>
</head>
<body>
<script>Onload();</script>
</body>
</html>

Streaming Data Sets into Dynamics 365 Customer Engagement

In this post, we are going to look at the challenge of how to display streaming data sets directly onto a Dynamics 365 Customer Engagement form. While there already exists a way to embed Power BI dashboards and reports within Dynamics 365 Customer Engagement, these are not at the form level. To see how to do this currently, have a look here. When followed, you should observe results similar to the following, where a dashboard is initially displayed and then you can click through to the underlying report(s):
 
 
What you’ll notice from this is that these are personal dashboards that lack the ability to be contextually filtered. So to resolve this, we are going to create a Web Resource that has the ability to display a contextual (and streaming) dashboard on a Dynamics 365 Customer Engagement form!
 
To get started, let’s have a look at what this will look like architecturally:
 
 
From the architecture, you should notice that we need to create a custom HTML Web Resource that will serve as a placeholder for the Power BI dashboard. When the form loads, we are going to use JavaScript to process the incoming parameters which can include both configurations and contextual data based on the record (form) that the Web Resource is being rendered on. The JavaScript will then call a reusable Dynamics 365 Action that will consume the incoming parameters before calling a Dynamics 365 Plugin. This plugin is necessary as it will help us execute a token exchange with the Azure Key Vault based on the currently logged in user. This token is then used in retrieving a specific secret which contains the required configurations necessary to render the Power BI report contextually and in an authenticated state back on the Dynamics 365 Customer Engagement form.
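To make the web resource part of that flow a little more tangible, here is a minimal sketch of the kind of JavaScript it can contain. The action name (new_GetPowerBIEmbedConfig), the response field names and the container element are assumptions for illustration only; it presumes the page includes ClientGlobalContext.js.aspx and the powerbi-client script, and that the Action/plugin returns the embed URL and token retrieved via Key Vault.

// Sketch of the web resource logic: read the context passed in "data",
// ask a custom Action for the embed configuration, then embed the report.
(function () {
    var recordId = new URLSearchParams(window.location.search).get("data");
    var clientUrl = GetGlobalContext().getClientUrl();

    fetch(clientUrl + "/api/data/v9.1/new_GetPowerBIEmbedConfig", {
        method: "POST",
        headers: {
            "Content-Type": "application/json",
            "Accept": "application/json",
            "OData-MaxVersion": "4.0",
            "OData-Version": "4.0"
        },
        body: JSON.stringify({ RecordId: recordId })
    })
    .then(function (response) { return response.json(); })
    .then(function (config) {
        // "powerbi" is the global object exposed by the powerbi-client script
        powerbi.embed(document.getElementById("reportContainer"), {
            type: "report",
            id: config.ReportId,
            embedUrl: config.EmbedUrl,
            accessToken: config.EmbedToken,
            tokenType: 1 // models.TokenType.Embed
        });
    });
})();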
 
Simultaneously, the Power BI dashboard will be receiving a continuous stream of data from an MX Chip (IoT device) that is connected to an Azure IoT Hub. This stream of data is provided through the Stream Analytics service, which continually processes the incoming data and is able to send it as an output directly to Power BI before it is visualised. For reference, the Stream Analytics job should look something like this:
 
 
You will notice that there is a dedicated Power BI output in the above and that we have limited the Stream Analytics job just to look for our MX Chip device. We also need to include a bit of DAX to format the incoming IoTAlert data to be a bit more readable. Examples of the incoming data, the DAX, and the Power BI configs are below:
 
 
As a result of this, we should now be able to see the streaming data set on the Dynamics 365 Customer Engagement form after a bit of Power BI visualisation magic as follows:
 
 
As we have parameterised the initial Web Resource on the form, this Dashboard is able to pre-filter visuals should we wish, and can also easily be embedded on the form and record type of your choosing! The following video demonstrates the complete pattern in action:

 

My Quest to D365 Saturday Stockholm

Recently I attended the Dynamics 365 Saturday event in Stockholm and I have to say, what an excellent event. I have never been to Stockholm, so I was already massively excited about this. I also got to meet a load of new people for the first time which was AMAZING!

These events are so important for the community because they are often the only opportunities some of the community members really have to interact with other customers, partners ISVs and Microsoft employees. I love running into people that have encountered the same issues as I have, that way I know I’m not going bonkers and we can work on a solution together.

The crowd was great! There were many enthusiastic people in the audience who were really getting involved in the sessions, looking for information and really testing all of the speakers’ knowledge. You can find the list of sessions and speakers HERE.

A big reason I really enjoyed this event was the different layers and levels of content being shared across the sessions. The sessions were split into three tracks, these being:

Applications (Dynamics 365 CE), Dev (Dynamics 365 CE) and Business & Project Management. This gave participants the opportunity to stick to a single, themed, track or weave between tracks. This is pretty much how my experience went. I went to at least 1 session from each track. I wanted to get a flavour for everything. I also got to see some wizardry from folks like Julie Yack, Gus Gonzalez, Nick Doelman and Gustaf Westerlund.

Other presenters and panellists included Sara Lagerquist, Jonas Rapp, Fredrik Niederud, Katherine Hogseth, Mark Smith and Antti Pajunen. Each delivering some amazing content based on their experiences with Dynamics 365 and Power Platform.

There was a plethora of information and content being shared between speakers and passionate attendees. Everything from Microsoft Portals and Social Engagement to developing your own XrmToolBox tools (Careful with the spelling here….HAHA) was being talked about. I personally got involved in a number of Power Platform conversations, which suited me just fine because that’s kinda what I’m doing at the moment.

I had the pleasure of running 2 sessions. One in the Application track and one in the Dev track (I am no developer… Don’t judge me). The 2 sessions were:

  1. Power Platform – How it all fits together (Download Here)
  2. Building your first Canvas App with CDS, and Azure (Download Here)

Apparently people don’t like the first 2 rows.. great crowd though! Thanks to everyone that attended. Try work out what Mark is doing in the background there! HAHA!

The first presentation focused on the different elements of the Power Platform and the way it all works together. Many Dynamics 365 users often worry a bit about this because it seems so large and complicated, but it really isn’t once you have wrapped your head around the different technologies. To highlight the way the different elements of the technology work together, I included a Roadside Assist demonstration that was created during the PowerApps & CDS Hackathon that Those Dynamics Guys and Hitachi Solutions Europe hosted together.

My second presentation covered some of the “Do’s and Don’ts” around building your first Canvas App with your customer. I followed the presentation with the following:

  1. Adding several fields to a custom entity in the Common Data Service (CDS)
  2. Importing some data
  3. Creating a new canvas app
  4. Connecting the Canvas app to the CDS
  5. Adding in the Azure translation service to the app
  6. Publishing the app

The actual canvas app I created, along with the little model-driven app solution including data, is available HERE.

The below pic gives off the impression that I am about to start having a conversation with my own hand, like an invisible Muppet. May be a great trick for my next demo 😀

One of the BEST sessions that I have been in was the “CAGE MATCH” moderated by the one and only Julie Yack. This was EPIC fun! We were split into teams of 5 and given problems by the audience to resolve. It was a little daunting being in the presence of some of these long-time MVPs, BUT, THE SHOW MUST GO ON, so we got stuck right in. Unfortunately, the team I was in didn’t take the win 🙁 It’s cool, I am preparing my battle cards for the next one!

All in all, it was a fantastic event and a great opportunity to network with this amazing Dynamics and Power Platform community that we have all grown to know and learn from. A MASSIVE thank you to the sponsors of the event!

Also, a big thanks to all of the folks that hung out after the event and enjoyed several beverages with me. Was a great time and I’d love to do it again 🙂

Here are some more delightful images from the day 🙂 My camera skills aren’t great so I had to grab a few from social. Thanks to those that grabbed pics in my session! I hope that this encourages more people to attend these events because I genuinely gain so much from being there.

Nick Doelman Smashing his Portals Presentation

One of my favourite Finns – Antti

Julie Yack doing her presentation on Social Engagement

MORE of the awesome Julie Yack

WHAT?? ANOTHER ONE of my favourite Finns – KIMMO!

JOOONNNAASS RAPP!!! The Legend!

We were all so excited! Mark, Jonas and me 🙂

 

 

Dynamics 365 Saturday – Dublin 2018

The Dynamics 365 Saturday 2018 took place at Microsoft Dublin (One Microsoft Place) and was hosted by Janet Robb and her team. Congratulations to Janet for her great work organising such a well-structured event.

Lots of cool sessions were provided, as well as PowerApps and CRM workshops for those who were interested.

 

Even though I’m not a Dynamics consultant, to me as a SharePoint/Office 365 consultant it was very cool to see how the same stack of products (Power Platform + Azure) that we use on top of SharePoint/Office 365 is being adopted in the Dynamics community as well. As mentioned in the keynote, for all D365 specialists it is constant work to keep up to date with the tons of new features being released (I can assure you the same is true for Office 365), and an event like this helps to connect the community and to get a summarised overview of what’s out there in the market.

Special thanks to our Dynamics Guys who came from the UK and smashed in their presentations:

Chris Huntingford – Dynamics 365 for Marketing

Kylie Hill – My Favorite new D365 Features

It was a pleasure to meet those legends in person.

Also, the interaction from the community posting and sharing content and pictures on social media (hashtag #365SatDub) was really nice; it was cool to see such engagement and nice pictures from everyone on LinkedIn and Twitter.

Looking forward to the next #365SatDub!!

Can’t Associate Existing Contacts to an Account in the D365 UCI? – WHERE IS THE DAMN BUTTON??

Don’t feel like the emotional post? 😀 😀 FIND THE SOLUTION HERE!!

Recently I was doing some configuration within Dynamics 365 for a customer that was keen on using the UCI (Unified Client Interface). I’ve done a load of demos for this type of thing but never actually helped configure it for a live implementation. To be deadly honest, I had to change my mindset a bit about how I designed things.

After a few hours of build, I ended up setting up a little training solution that gave users the ability to associate existing contacts with an account: a pretty straightforward N:N relationship on the account form showing the associated contacts that had been trained and the internal users that had actually done the training.

I was happy with the setup but when it came down to testing, something weird happened… There was no button available on the sub grid that allowed me to associate an existing record. The Add Existing User button was there, but no “Add Existing Contact” button 🙁 

I checked if it was working in the web refresh… Lo and behold, it worked just fine! No issues with adding existing contacts to an account.

I tested out different views and still no luck. I even tried a 1:N relationship to Contact, and it only allowed me to create a NEW contact rather than add an existing one.

After a bit of searching… I found the one solution that worked! THANK YOU to Joel and the TIP OF THE DAY team for this post! You saved me a TON of time. This could have been a rabbit hole. Also Scott, your Ribbon Workbench comes to the rescue again! Legends!

FIND THE SOLUTION HERE!!

Skills for the PowerApp developer

It seems to me that PowerApps will prove to be a game-changer by enabling businesses and organisations to build bespoke apps designed to meet very specific needs.

The curious thing about PowerApps is that the product is pretty well where it needs to be as of today, but the major blocker to a global revolution is that there simply aren’t enough people out there with anything like the skills they would need to build robust business products.

So let’s assume that you can’t buy in PowerApps skills or maybe that you have some but need to grow some more.  What would you be looking for in an individual to identify them as a potential PowerApps developer?  In this article I’m going to outline some of the skills that I would be looking for when identifying candidates with the potential to be high performers.

Excel

It feels kind of strange to start with a product rather than personal skills, but in this case the Excel formula structure and Excel cell construction are so closely related to those of PowerApps that it is the number 1 skill.

The Excel and PowerApps teams work closely together so that any new formulas created are aligned to each other. With this in mind, this also opens the door to the many years of formulas created by individuals within Excel that can theoretically be borrowed or referenced to enhance a PowerApp.

Maths

PowerApps is the most mathematical product I’ve ever seen. Variables can easily be created and used in an algebraic fashion. X and Y properties exist to govern the position of everything that you see on a screen.

Empathy

You need to be able to consider your products from the perspective of other users.  If people don’t like your product they won’t use it, and you should be aiming for them to love them.

Logic

To my mind there are 2 kinds of logic. Firstly, the more mathematical/Excel-based ‘if X=1 do this, otherwise do that’, and secondly there is the type of logic where you put yourself in the shoes of a user or administrator and are able to see whether or not a form (as in the business world this is the most likely application) has a logical flow.

Design

Having an eye for design is a skill people have to a greater or lesser extent.  To be fair it doesn’t come naturally to me, but I do know that from making lots of products some designs work better than others and it is a skill that can be learned.

Initiative

PowerApps won’t go to you, you have to go to PowerApps.  If you create a blank app it will stay blank until you do something.  Additionally, many technical solutions may require techniques that feel like work-arounds mainly because the problems you have been facing haven’t been faced before, so looking them up on Google won’t necessarily yield any results.

Lifelong Learning

The product is ready, but it is changing all the time. Good sources for keeping up to date are Twitter and the PowerApps blog, but even the latter tends to focus on ‘big’ updates, whereas some of the smaller adjustments can still be game changers. Keep up with it. I’ll be honest in saying that there are times when staying at the cutting edge can be difficult to manage versus traditional tools and skills that stay very similar over long periods of time.

Perseverance

Some problems will require high levels of determination to find the exact syntax you need to solve your problem, and when those attempts fail you may need to pull back, stick your head up and work out if there are ways around the problem or if the item is even needed (i.e. do you really need to collect the data at all, or is it a nice-to-have).

Patience

Not necessarily my forte, but patience may be required of yourself, as you may not have the skills that you need at your fingertips, and similarly the same may be true of those around you. You’ll also need a level of patience on the part of your sponsors or users. I can almost guarantee that unless they are very familiar with digital form building, you may produce a product to the exact specification requested but nevertheless you, and they, know that it’s not quite right and that adjustments, sometimes significant ones, may be required.

Use the community

The PowerApps community at this stage is populated by a relatively small but passionate group of individuals building highly imaginative solutions designed to stretch the product. Make use of these by joining https://powerusers.microsoft.com, and follow people on Twitter posting using the #powerapps or @PowerApps references. By all means take a look at my YouTube channel www.youtube.com/dataspinners, and don’t forget to join https://dynamics365society.uk which contains a PowerApps bank that you can make use of once you sign up (for free).

Go on a course

Personally, I’m quite happy with an online course, as typically these are completed over a longer period of time, which for me is a better way of allowing the learning to sink in. The best single free resource is https://courses.edx.org, where DAT207x is a very useful starting point. You can search out courses run in a classroom setting, but you will need to ensure that you invest time in utilising the product once you’ve completed the course, as otherwise you’ll lose your knowledge quite quickly.

Concluding remarks

This article was intended for hirers or individuals trying to make some sense of the range of skills needed for PowerApps excellence. Hopefully, it isn’t too daunting. Whatever you do, start small but build usable solutions. This particular tool offers us astonishing scope to run our businesses in efficient ways, and the more we can grow as a community, the closer we will all get to a culture of efficiency whose impact may be widespread: more efficient operations, better scheduling and less wastage, all by putting the right information in front of the right people at the right time.

 

 

PSA & Field Service Schedule Board Error

Recently I ran into this little issue when trying to access the D365 schedule board for both PSA and Field Service: “Resource not found for segment ‘msdyn_FpsAction’”. There were also a bunch of other general SQL / unexpected errors when I was trying to access projects within PSA. This happened immediately after I installed the Microsoft Partner Portal solution with the additional PSA extension.

Basically every blog I went to pointed me to the same link, which didn’t seem to work 🙁

So, for those of you that run into this error in the future, it has to do with certain processes being in Draft mode. I also discovered that a load of these processes didn’t even have an owner.

Firstly, I assigned the processes to myself.

 

Then, activate the processes that had been deactivated for no apparent reason. The schedule board will then show.

Voila… Your schedule board should now work, AND you shouldn’t get any random errors when trying to save projects. It seems the Partner Portal installation, with the additional PSA extension, deactivated a ton of Actions and workflows and didn’t re-activate them.