Dynamics 365 for Customer Engagement

Smart Grid PCF for Dynamics 365

PCF controls are one of the hottest new additions to Model-Driven and Canvas apps, so I wanted to try something new with custom controls.

Thank you, Alex Shlega, for your Checkbox List PCF, and Danish Naglekar for the PCF Builder XrmToolBox plugin. My new subgrid PCF, “SmartGrid”, will be released to the TDG Power Platform Bank soon, after final testing :).

 

How to resolve “error Executing the api /eventhubs” in #MicrosoftFlow?

While trying to connect Microsoft Flow to Azure Event Hubs, you cannot retrieve the Event Hub name and instead get the “Error Executing the api /eventhubs” error. The Event Hubs connector in Flow allows you to connect to an event hub using a connection string and get notified as soon as a new event is available in the hub. However, there are a few things you need to know.

Event Hub Namespace vs. Event Hub

An Event Hubs namespace provides a unique scoping container, referenced by its fully qualified domain name, in which you create one or more event hubs. In other words, event hubs live inside an Event Hubs namespace. Both the namespace and each event hub have their own connection string, and either can be used to access the resource. However, it is important to know that the Microsoft Flow connector for Event Hubs accepts the Event Hubs namespace’s connection string, not the connection string of an individual event hub.

Error “Executing the api /eventhubs”

You will see the error below when you try to use the event hub resource’s connection string.

The solution is to use the Event Hubs namespace’s connection string.

To confirm whether your connection string is associated with your Event Hubs namespace or with a specific event hub, make sure the connection string doesn’t have the EntityPath parameter. If you find this parameter, the connection string is for a specific Event Hub “entity” and is not the correct string to use with your flow (or logic app).
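For illustration, a namespace-level connection string (the kind the connector expects) and an entity-level one look roughly like the following; the names and keys are placeholders, and the second string is identifiable by its EntityPath parameter:

Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<key>
Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=myhubpolicy;SharedAccessKey=<key>;EntityPath=myeventhub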

Reference

https://docs.microsoft.com/en-us/azure/connectors/connectors-create-api-azure-event-hubs

How To Start Your Exciting Journey Of “Connected Field Service” And “Azure IoT Hub”?

After my last article I spent some time exploring Azure IoT and Connected Field Service. I have been watching presentations and demos on YouTube, and I saw many good demonstrations of the capabilities of Azure IoT Hub and Connected Field Service. However, I wanted to explore more and see how things really work under the hood of the platform. So, in this and a few upcoming posts I am summarising my key learnings for the benefit of those who want to get started.

Note: I structured this blog series around my own learning journey. It is designed around the questions of CRM consultants who want to start their Connected Field Service journey.

Let’s start with some key definitions which you will hear a lot in your journey and which are better to be familiar with from the start:

IoT Device

An IoT device is a piece of hardware with a small circuit (called a microcontroller), capable of connecting to the internet using Wi-Fi or mobile services. The purpose of the microcontroller is to capture data from its surroundings and send it to consuming services over the internet. In other words, an IoT device is a minicomputer capable of sending (and sometimes processing) data over the internet.

Raspberry PI / Windows IoT

The IoT device mentioned above can be a small circuit that reacts to an event like switching on/off, or it can be powerful enough to process images on the device itself. When we build these more powerful devices, capable of processing data and applying complex logic, we need a program to run that logic, and a lightweight operating system to host the application. Typical choices are a small board such as a Raspberry Pi running a lightweight Linux distribution, or Windows 10 IoT.

Azure Internet of Things (IoT)

Azure IoT is a series of managed services combined to connect, monitor and control smart, IoT-enabled devices. Basically, the device’s sensors detect events in the surrounding environment and the microcontroller passes the event and its data to the Azure IoT service. Azure IoT receives the data from millions of devices and hands it to further processing services that give the data meaning.

Azure IoT Hub

IoT Hub is a central message hub for bi-directional communication between devices and Azure IoT. Think of Azure IoT Hub as the heart of the Azure IoT service.

Azure IoT Edge

As we move towards the future, we will have more and more IoT devices connected to Azure IoT Hub, each sending raw messages for processing. That is fine, since IoT Hub is a scalable service, but wouldn’t it be more efficient to use the device’s own powerful hardware to pre-process data before sending it to the hub? Azure IoT Edge enables us to run programs that process data on the device itself and send only the necessary data/events to Azure. This helps offload processing from Azure and reduces traffic. If you have a powerful tool, why not use it?

IoT Device Twin

A device twin is a logical representation of a device’s state in Azure. For example, say you have a smart bulb installed in Branch A. The device twin can hold the location, so you can query twins and identify all devices installed in Branch A. Or you can store the firmware version in the device twin and, when rolling out a firmware upgrade, target only the devices still on old versions.
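As a rough sketch of that kind of query (this assumes the Microsoft.Azure.Devices service SDK; the connection string, tag names and branch value are illustrative only), finding the twins of devices installed in Branch A could look like this:

using System;
using Microsoft.Azure.Devices;

// Service-side connection string for your IoT hub (placeholder value).
var registryManager = RegistryManager.CreateFromConnectionString("<iot-hub-service-connection-string>");

// Query device twins by a custom tag, e.g. the branch where the device is installed.
var query = registryManager.CreateQuery("SELECT * FROM devices WHERE tags.location.branch = 'Branch A'");

while (query.HasMoreResults)
{
    // Runs inside an async method; each page returns a set of matching twins.
    foreach (var twin in await query.GetNextAsTwinAsync())
    {
        Console.WriteLine($"{twin.DeviceId} - firmware {twin.Tags["firmwareVersion"]}");
    }
}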

Connected Field Service

Azure IoT gives you the technology to interact with devices. However, the real value of IoT emerges when you give it a business context, and Connected Field Service provides that business context for your IoT data. Connected Field Service ships as an accelerator: when you install it from AppSource, you get all the entities and required components in your Dynamics environment. It gives you the ability to configure Dynamics to interact with devices, listen to them, and be notified when they don’t feel right.

I have seen IoT Central. What is it and how is it different from Azure IoT Hub?

According to the Microsoft documentation, Azure IoT Central is a software as a service (SaaS) solution that uses a model-based approach to help you to build enterprise-grade IoT solutions without requiring expertise in cloud-solution development. However, Azure IoT Hub is the core Azure PaaS that both Azure IoT Central and Azure IoT solution accelerators use. IoT Hub supports reliable and secure bidirectional communications between millions of IoT devices and a cloud solution.

So, in summary: if you have Azure skills and you want the flexibility to extend and manage your solution, go for IoT Hub. But if you want an accelerator and don’t need to extend it, then IoT Central is the way forward.

Where do I start my journey?

Where to begin – Learning Path

If you want to start directly from Dynamics, you can simply follow the instructions here.

Otherwise, follow these steps:

  1. Provision your Azure IoT Service (Link)
  2. Create a simulated device (Link)
  3. Send and receive messages (Link)

I have a smart door lock. I want to be notified when it is unlocked. Where do I start?

Azure IoT Hub integrates with Azure Event Grid so that you can send event notifications to other services and trigger downstream processes. To implement this scenario, you will need to register your smart door lock in Azure IoT as mentioned above. Once the device is registered, you will need to define an Unlock event in the IoT service so it can receive Unlock events. By default, Azure IoT provides four event types:

  • Device Created
  • Device Deleted
  • Device Connected
  • Device Disconnected

However, for our scenario we need to create a custom Unlock event. To define the custom event, you can follow this link.

How do I route incoming messages to Dynamics?

Once you have defined your events above, you will need to create a route for the desired events. When the Unlock event is detected in IoT Hub, the route sends the message to an endpoint of your choice. You can configure your routing as per this link.
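As a sketch of how the two pieces fit together (using the Microsoft.Azure.Devices.Client device SDK; the property name, payload and connection string are only examples), the lock could stamp each message with an application property such as eventType, and your IoT Hub route would then use a query like eventType = 'Unlock' to forward only those messages to the chosen endpoint:

using System.Text;
using Microsoft.Azure.Devices.Client;

// Device connection string issued when the smart lock was registered in IoT Hub (placeholder).
var deviceClient = DeviceClient.CreateFromConnectionString("<device-connection-string>");

// Example payload; the route filters on the application property, not the body.
var body = Encoding.UTF8.GetBytes("{\"lockId\":\"front-door\",\"status\":\"unlocked\"}");
var message = new Message(body);

// Application property that the routing query (eventType = 'Unlock') matches on.
message.Properties.Add("eventType", "Unlock");

// Runs inside an async method.
await deviceClient.SendEventAsync(message);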

How to detect if my device disconnects from IoT?

It is very simple: as mentioned above, Device Disconnected is a default event which is triggered once your device disconnects. All you must do is configure a service to react to that event.

How to orchestrate events from IoT to Dynamics?

If you are not using the Connected Field Service solution from AppSource, you can configure a Flow to send messages across. There are triggers for IoT Hub events which you can use.

IoT Flow Template

What kind of messages can be transmitted between device and IoT?

Power Platform 24 Live!

We recently completed the first-ever 24-hour event exclusively focused on the Power Platform – Power Platform 24. The Dynamics and Power Platform community has fantastic events literally across the globe. As amazing as these in-person events are, not everyone can attend as either a speaker or attendee. We wanted to remove geography as an obstacle!

Moving to a virtual format allows the team to include both speakers and attendees who may otherwise miss the opportunity to share with and learn from the community. For organizers, the added benefit is lower overhead because you don’t need to secure a venue, coordinate speaker travel, provide prizes, or feed anyone. All said, lots of pros for this virtual event format.

Our first event went off without a hitch! So maybe we had a very minor glitch or two, but the sessions were fantastic and the event ran like a well-oiled machine, all because of amazing speakers and a group of dedicated organizers.

I personally learned much from the experience, so I wanted to share some thoughts for those that might want to organize a similar event. This post covers a bit about the approach to organizing the event and some tools used to run the show.

Organizing the Event

The event came together fairly quickly after the idea was thrown out in a group conversation: anyone interested in hosting a virtual Power Platform event? The response was of course, “Heck yeah!” The volunteers immediately began throwing around ideas. We had a lot to discuss but most of it boiled down to these main topics:

  • How do we choose speakers?
  • How do we host each presentation?
  • How do we register attendees?

Most of the organizers have some experience running in-person events, virtual events, or both, so we started with some best practices in mind. We also brought experience with a variety of tools from past events. This gave us a nice head start; we then simply needed to choose what works best for a virtual event spanning a full 24 hours!

Choosing speakers

The team chose sessionize.com as the platform for the call for speakers and for vetting the submissions. If you have not used the platform, definitely check it out. Sessionize offers excellent tools for both organizers and speakers to organize event submissions and manage sessions across multiple events. Another huge plus is that the service is free for community events. Sessionize features alone could take up a full post!

Once the call for speakers closed, we used Sessionize to categorize, review, and rate submissions. The organizing team reviewed each of the more than 70 submissions, ranking them based on the information provided by the speaker. This was honestly one of the hardest parts of the process because we received so many excellent submissions.

We considered multiple tracks because of the number of submissions, meaning we could run two or three concurrent hour-long sessions. This was our first 24-hour event, so we chose a single track of 24 one-hour sessions, starting at 8:30 AM EST and running through 8:30 AM the next day.

Hosting the event

This was the big decision: What platform do we use to host the event? We can all list a dozen virtual event platforms in just a few minutes, but that doesn’t actually make things easier. We ended up choosing Teams and a Teams Live Event. This makes sense as this is a Microsoft Power Platform event, but here is an excellent article that Purvin Patel shared which helped make our decision: Produce a live event using Teams. This article outlines how to set up a Teams Live Event and details around producer roles.

The Teams Live Event setup means assigning users to a producer role where they can monitor and control the live stream and Q&A channel, manage the event notes, and start/stop the event. Another important feature is the ability to record each session. This may not be a requirement for other events, but we wanted to provide recordings for both attendees and speakers. It is an excellent feature, but the limitation of 4 hours per recording is something to keep in mind if you choose this platform; we had to account for it when scheduling the sessions and producers.

Using a Teams Live event requires Office 365 and Teams licenses. Fortunately, the XrmVirtual crew is already delivering live events using Teams, so they offered to run the event for us. We now had a chosen platform, so we needed to decide how to run the sessions.

Delivering the Sessions

We broke the 24 hours into six blocks of 4 hours and took volunteers as producers for each segment, which worked perfectly with our 4-hour cap on recording. A producer was logged in with the speaker during each session to handle connection issues, answer or raise questions, and manage transitions between speakers.

This meant that we posted six 4-hour Teams Live Events that ran in sequence. Once these were established, individual invites were sent to each speaker with a link for the correct block of time. This was all handled by the XrmVirtual team, and I felt it worked out great as both a speaker and a producer. It was easy for me, but I know it took a lot of time to set up!

At the start of each session, a producer logged in to Teams with the correct account, shared any slides that were required, kicked off the session, and began recording. The speaker could then share their screen and deliver the session. Once the session was complete, the producer shut down the event to end the recording while the next producer was already up and running with the next speaker.

Registering event attendees

Registering event attendees seems pretty important, so why is it last in the list?

Well our solution for registering users for the event was pretty simple: we didn’t register users. Fortunately, the Teams Live event platform allows users to connect without a prior registration and post questions anonymously. We had no need to track any user info, manage cancellations, etc. Attendees could jump on to catch a session and disconnect when done.

This could be an issue with different virtual delivery platforms, but it did not seem to be an issue for us. I believe we averaged about 100 attendees per session, which is a pretty nice number. I’ve had in-person sessions with only 5 people, so 100 is pretty nice! We had some excellent questions from attendees, which really added to the delivery. And of course, attendees who missed the live session can jump online and view the recorded sessions on demand!

Testing, 1…2…

One practice that made this event run so smoothly was… practice! The week prior to the event, the XrmVirtual team set up test sessions to ensure speakers could connect without issue. Each speaker jumped on to the Teams event, shared their screen, and tested their audio. It sounds simple, and it was, but it saved us from potential issues on the day of the event.

We also made sure that producers understood how the Teams setup operates. The XrmVirtual team provided a new account from their Office organization for each 4-hour block, and each account was granted producer rights on its respective Teams Live Event. Having enough accounts is another item to consider if you choose a Teams Live Event as a platform.

I was not the one that set up the Teams Live event for all of the sessions, but from an end user perspective, I found this event went smoothly and the Teams platform is fairly easy to use.

Thanks once again!

I will call out the organizing team here in case you want to reach out and say thanks! For me, I wanted to say thanks once again to the organizing team for gathering and vetting the speakers, setting up the infrastructure, communicating with speakers and attendees, taking time to act as producers (at really crazy hours!), processing all of the recorded videos, and advertising the event.

Thanks for simply giving up a chunk of your free time to make this event happen.

Julie Yack
David Yack
Joel Lindstrom
Aiden Kaskela
Beth Burrell
Michael Ochs
Sarah Jelinek

Everyone on the team pitched in, but I think a few special shout-outs are in order – thanks to Julie for owning the meetings and the technical bits of the producer setup and being online for 16 or so hours monitoring the event in real time. And thanks to David Yack for spending his weekend breaking down all of the videos and hosting them for our viewing pleasure.

And thanks to all of the speakers that gave up their time to plan and provide some excellent sessions for the community! Check out the full list of speakers at the Power Platform 24 site! You can check out the recorded events now that they have been posted here!

I am looking forward to another Power Platform 24 event… Keep an eye out for the next event announcement!

Are you a Passionate #PowerPlatform developer, looking for your next big challenge?

Looking for your next big thing?
What I love about the #PowerPlatform is that we get tons of new features every 6 months, and if you are an avid learner there is no excuse to stay stagnant and not learn something new every day. You have numerous options, from creating #powerapps or #pcf components to simply exploring the new enhancements in the #dynamics365 applications.
After spending some time exploring some core capabilities of the #PowerPlatform and playing with the Field Service application, I have been thinking about the next big thing I should focus on. I was looking for something challenging that would have a real impact. To decide, I took a step back and looked at the options in front of me: I read technology blogs and books and listened to many podcasts, which helped me clear my mind and shortlist some places I could start. I would like to share what I did in the hope that it may help other folks who are also looking for their next challenge:

Industry Reports & Microsoft investments

Like it or not, business needs drive product development. It is the business, the industry and eventually profit which drive investment into the products we implement or extend. I think one of the best sources for finding your own next challenge is industry or economic reports. I always have an eye on #Gartner reports. I see people share #Gartner reports back and forth on LinkedIn to show how a product is doing against the competition, but I look at them from a different angle: I read the #Gartner report looking for the weaknesses. Weaknesses in the competition drive demand for innovation and solutions. By reading reports, you will understand what the weaknesses are in a product or platform and why those weaknesses matter to end customers. Your next big challenge can be addressing some of them!
Another indicator is Microsoft’s investments. When I see Microsoft investing heavily in a platform or product, it gives me a good idea of what the next big thing is. Of course, Microsoft has a big team of industry experts and awesome people familiar with various businesses. Speaking for myself, I do trust the investment numbers of big players, because that investment is going to create demand for my innovative mind!
What do you think about this one?

What about cross-domain ideas?

Our field is fluid; it changes every day. It is not only the #PowerPlatform, it is the nature of our field. There was a time when implementing a CRM was the goal of organisations, but now CRM is only one piece of the big picture! Have you ever thought of creating something that spans multiple technologies? How about combining #Azure and #PowerPlatform? Cross-domain work is a topic I am very interested in. I have explored Azure Search, Azure Speech, Azure Bots, Azure Documents and Azure IoT Hub alongside the #PowerPlatform. The synergy between #Azure and #PowerPlatform creates a system which really serves customers better. If you are out of ideas in #PowerPlatform, then start looking in #Azure. I bet you will have a lovely time learning new things and exploring how #Azure can help you do better implementations!

Working on community Ideas

There is a long list (and it is growing day by day) of product ideas in the IDEAS PORTAL. The #PowerPlatform product team does a good job of assessing, shortlisting and working on the best ideas to release in future waves. However, it is not possible to implement all of the ideas, for various reasons such as criticality, demand and priorities. Passionate people in the community can assess these ideas and see which ones they like. Once you have chosen one, you can work on it and release your source to GitHub to spread its benefit. Publishing your source helps the idea get bigger, and your code quality will be better, too!

XrmToolBox plugins

I think XrmToolBox is the single best thing the Dynamics community has come up with. I use it in every project. It inspires me to see friends out there develop such lovely plugins that work like a charm. (Big shoutout to Tanguy Touzard, Jonas Rapp and Aiden Kaskela, whom I follow when it comes to the XrmToolBox.) I have never had a major problem using the tool, which tells me how many good coders are out there. Your next big thing can be an XrmToolBox plugin or an extension of an existing one. If you think your idea is small or bad, you had better think again: there is no small or bad idea. Once you float your idea, it becomes a snowball; other awesome people contribute to it and you will watch your small idea become a big one that helps others do better. Even if you have no idea of your own, you can extend the functionality of an existing plugin. You always have an option to choose from 🙂 but remember one thing: being stagnant is NOT an option!
The above list is not everything I explored, but it shortlists some of the things you may relate to. I would love to hear from you: what is your experience in finding your next big thing?

P.S: A friendly reminder by @rappen on how to type: XrmToolBox (https://twitter.com/rappen/status/951009993989459969)

Image by Joanna Kosinska

Licensing Guide for Dynamics 365 Online – October 2019 Update

Overview

With the October release right around the corner, Microsoft will be moving away from the plan-based structure to a base-and-attach licensing model, allowing customers to mix and match applications depending on their consumption. The new licensing model will let customers purchase the applications they need when they need them, or as some would like to call it, an à la carte sales model.

But as close as we are to the fall update, I am still hearing concerns and questions regarding the new Dynamics 365 licensing. It may seem a bit complex, and it might require some math, but it is intended to give customers more control of their subscriptions and allow them to “pay only for what they use”, as Alysa Taylor, Corporate Vice President of Business Applications and Global Industry, assures us:

In October, Microsoft will be moving from “one-size-fits-all” Microsoft Dynamics 365 licensing plans to focus on providing customers with the specific Dynamics 365 applications that meet the unique needs of their organization. Customers need software that aligns to their functional roles and scenarios, and they require the ability to add or remove applications as their company grows and changes over time. The new licensing model will allow customers to purchase the applications they need, when they need them. Each application is extensible and applications can be easily mixed and matched to configure integrated solutions that align to a customer’s unique business requirements.

How it works

Every user starts with a Base license, which is the first application purchased for that user within the organization. Attach licenses can then be purchased on top of the Base license as incremental applications at a heavy discount.


Both Base and Attach licenses provide access to the same application functionality.


This new model allows each organization to use the right layering for each individual user depending on their needs. But depending on your consumption, the new model might still affect your monthly payment, since pricing is per user per month. I suggest reading the full pricing guide and checking Microsoft’s updates.

Restrictions and things to consider…

  • The Dynamics 365 for Marketing app will remain tenant-based and can only be purchased at the standard price of $1500 for new customers. Attach licenses will NOT apply.


  • Additionally, Dynamics 365 for Project Service Automation and Talent are NOT available as Attach licenses.
  • The 20-seat minimum for Finance and Operations still applies: at least 20 licenses must be purchased for any one of those applications.
  • Common Data Service will still be allocated with Unified Operations apps without additional licensing, but multiplexing issues will still apply.
  • Storage SKUs will remain unchanged and can be purchased and allocated as needed.
  • The Base license must be the higher-priced app. For users that require apps from both Customer Engagement and Unified Operations, a Unified Operations app must always be the Base license. The same applies between CE Enterprise and CE Professional licenses: Enterprise must always be the Base license.


Customer Engagement

See below for the placemat view pricing guide for Dynamics 365 applications under Customer Engagement.

Some entities are accessible as read-only and are categorized as unrestricted. Updates to the restricted and unrestricted entity lists are also being made, with the goal of enabling more users to access data directly through a PowerApps application or Flow workflow without requiring a Dynamics license, but more on this in a separate post.

But if you need to use all the Customer Engagement apps, you can purchase Project Service Automation as the Base license and add the rest as Attach licenses.


Unified Operations

One of the key changes also coming this October is the separation of the Finance & Operations module into two separate applications: Supply Chain Management and Finance.

Both of these apps can be purchased as Base and Attach licenses of one another, depending on your needs, and still require the 20-seat minimum.


The functions are broken down into these two applications…


Below is the placemat view of the pricing guide for Unified Operations applications.


Impact

New customers can no longer purchase the Dynamics 365 CE and UO plans. The new Base and Attach license model will apply from the 1st of October. So, it is important for potential and existing customers to be made aware of these changes.

For existing customers, the changes will only take effect after their renewal date. So, for a one-year subscription purchased on January 1, 2018, the changes will take effect on the same date the following year, and the subscription must be converted before then. For those that still have concerns, make sure to keep an eye out for more updates.

These changes are easy to understand but may still have a substantial impact on your existing subscriptions. Depending on your organization’s needs, it is important to seek the advice of a consultant/partner regarding this matter.

How to call #webapi from #PowerPlatform #Portals

In almost all of my portal projects I get a question from my clients: “How do we call an external API from #portals?” This question has been so common that I decided to write about my experience on this topic, which might be helpful for the community. This post will focus on two main areas:

  1. The available options to integrate #portals with external APIs
  2. A step-by-step guide to one of the least discussed options, the OAuth implicit grant flow, and how I created a simple demo for one of my customers

Scenario

I would like to give this scenario a business context. Any enterprise solution requires the integration and interaction of multiple systems, and a #portal can be one of them. Imagine a scenario where a customer is looking for a quote on a product in the company portal; in this case the #portal needs to bring quote details from a CPQ (Configure Price Quote) system into the portal. In another scenario, a #portal needs to integrate with a core banking system to get the customer’s latest balances. In these scenarios and similar ones, we need the #portal to integrate with an external API to get information.

In order to enable such integrations, the #portal must be able to make calls in a secure way, as most internal systems require authentication before anything can happen. So what options are available?

Solutions

Since #powerplatform #portals are tightly integrated with the #powerplatform, in most cases the integration is done through the platform itself. This integration comes in three flavours:

  1. The first option is to create actions in the platform which communicate with the external API and manage the requests and responses, and then call those actions from a workflow triggered by Entity Form or Entity List events.

Portal Integration with Web Api using Actions

  2. The second option is to use #MicrosoftFlow to encapsulate the workflow and action part in a Flow. The benefit of this solution is that you won’t need to write code (in most cases, but not guaranteed) to call the #webapi.

Portal Integration using Flow

  3. The above two options use the #PowerPlatform to facilitate the integration, and all calls are routed through the platform. However, going through the server is not always feasible. There are situations in which you would like to make client-side calls from JavaScript (Ajax) in #portals to an external API. The main concern in these scenarios is authentication, and the solution provided by the platform is the “OAuth Implicit Grant Flow”. If you would like to learn more about the OAuth implicit grant flow beyond the #PowerPlatform, you can read more here.

 

There are concerns over the OAuth implicit grant flow, and the general recommendation is to use the “OAuth authorization code grant flow”. According to the OAuth working group, “It is generally not recommended to use the implicit flow (and some servers prohibit this flow entirely). In the time since the spec was originally written, the industry best practice has changed to recommend that public clients should use the authorization code flow with the PKCE extension instead.” Microsoft is aware of this; however, the implicit grant flow is still considered OK to use here.

I have proposed implementing the OAuth authorization code grant flow in this IDEA. Please vote for it.

Now, getting back to the topic: how to integrate.

Portal Integration with Oauth Implicit Grant Flow

In this scenario, no server-side calls are required. Complete documentation is available here; however, it is not very helpful if you want to do things quickly, since there is a learning curve involved. The OAuth 2.0 implicit grant flow support in portals exposes endpoints that a client can call to get an ID token. Two endpoints are used for this purpose: authorize and token. I will not go into the details of these calls and assume you already know what they are.

So here is what you will have to do:

  1. Create your web API. You can download the sample API from this GitHub project. This website is no different from any MVC website, so you can create your own with Web API controllers.
  2. Next, register your application in Azure Active Directory. This is a free service which you can use to provide authentication for your web API. Step-by-step details of the registration process are in this link. The REDIRECT URL must be the direct link to the redirect page you will create on your portal (see step 5 below). You will need to note the following after this step:

    – Client ID
    – Redirect URL

  3. Let’s say you have a Quote page in your portal and you would like to place a button on it to get quotations from your internal website. You will have to put custom HTML in the “Content Page” (not the main page) of the portal. This custom HTML adds a QUOTE button to the page and retrieves the quotation using custom JavaScript code.
<h2>The QUOTE BUTTON</h2>

<button type="button" onclick="callAuthorizeEndpoint()">Give me a Quote!</button>

<script>
  //Remove this line to avoid State validation
  $.cookie("useStateValidation", 1);

  function callAuthorizeEndpoint(){
    //Used for State validation
    var useStateValidation = $.cookie("useStateValidation");
    var appStateKey = 'p07T@lst@T3';
    var sampleAppState = {id: 500, name: "logic"};
    //Replace with Client Id registered on CRM
    var clientId = "CLIENT ID OBTAINED FROM AZURE ACTIVE DIRECTORY";
    //Replace with Redirect URL registered on CRM
    var redirectUri = encodeURIComponent("https://MYPORTAL.powerappsportals.com/REDIRECT_PAGE/");
    //Authorize Endpoint
    var redirectLocation = `/_services/auth/authorize?client_id=${clientId}&redirect_uri=${redirectUri}`;
    //Save state in a cookie if State validation is enabled
    if(useStateValidation){
      $.cookie(appStateKey, JSON.stringify(sampleAppState));
      redirectLocation = redirectLocation + `&state=${appStateKey}`;
      console.log("Added State Parameter");
    }

    //Redirect
    window.location = redirectLocation;
  }
</script>


  4. Modify the source code in the web API website to use the Client ID and Redirect URL in its startup page.
public virtual Task ValidateIdentity(OAuthValidateIdentityContext context)
{
    try
    {
        if (!context.Request.Headers.ContainsKey("Authorization"))
        {
            return Task.FromResult<object>(null);
        }

        // Retrieve the JWT token in the Authorization header
        var jwt = context.Request.Headers["Authorization"].Replace("Bearer ", string.Empty);
        var handler = new JwtSecurityTokenHandler();
        var token = new JwtSecurityToken(jwt);
        var claimIdentity = new ClaimsIdentity(token.Claims, DefaultAuthenticationTypes.ExternalBearer);
        var param = new TokenValidationParameters
        {
            ValidateAudience = false, // Make this false if the token was generated without a clientId
            ValidAudience = "CLIENT ID", // Replace with the Client Id registered on CRM. The token should have been fetched with the same clientId.
            ValidateIssuer = true,
            IssuerSigningKey = _signingKey,
            IssuerValidator = (issuer, securityToken, parameters) =>
            {
                var allowed = GetAllowedPortal().Trim().ToLowerInvariant();

                if (issuer.ToLowerInvariant().Equals(allowed))
                {
                    return issuer;
                }
                throw new Exception("Token Issuer is not a known Portal");
            }
        };

        SecurityToken validatedToken = null;
        handler.ValidateToken(token.RawData, param, out validatedToken);
        var claimPrincipal = new ClaimsPrincipal(claimIdentity);
        context.Response.Context.Authentication.User = claimPrincipal;
        context.Validated(claimIdentity);
    }
    catch (Exception exception)
    {
        System.Diagnostics.Debug.WriteLine(exception);
        return null;
    }
    return Task.FromResult<object>(null);
}
  5. The next step is to use custom HTML on the redirect page so that you can call the Web API using the token returned to that page.
function getResultInUrlFragment(hash){
    if(hash){
        var result = {};
        hash.substring(1).split('&').forEach(function(keyValuePair){
            var arr = keyValuePair.split('=');
            // Add to result only the keys with values
            arr[1] && (result[arr[0]] = arr[1]);
        });
        return result;
    }
    else{
        return null;
    }
}

//Validate State parameter
//Returns true for valid state and false otherwise
function validateState(stateInUrlFragment){
    if(!stateInUrlFragment){
        console.error("State Validation Failed. State parameter not found in URL fragment");
        return false;
    }

    // State parameter in URL fragment doesn't have a corresponding cookie.
    if(!$.cookie(stateInUrlFragment)){
        console.error("State Validation Failed. Invalid state parameter");
        return false;
    }
    return true;
}

var useStateValidation = $.cookie("useStateValidation");
var appState = null;

//Fetch the parameters in the URL fragment
var authorizeEndpointResult = getResultInUrlFragment(window.location.hash);

//Validate State
if(useStateValidation){
    if(!validateState(authorizeEndpointResult.state)){
        authorizeEndpointResult = null;
    }
    else{
        appState = $.cookie(authorizeEndpointResult.state);
        console.log("State: " + appState);
    }
}

//Display token and call the Web API
if(authorizeEndpointResult){
    var data = authorizeEndpointResult.token;
    console.log("Token:" + data);
    $.ajax({
        type: "GET",
        url: "https://URL_TO_THE_WEB_API.azurewebsites.net/api/external/ping",
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        headers: {
            Accept: "text/plain; charset=utf-8",
            "Authorization": "Bearer " + data
        },
        success: function (data) {
            alert(JSON.stringify(data));
            console.log(data);
        }, //End of AJAX success function
        failure: function (data) {
            alert(data.responseText);
        }, //End of AJAX failure function
        error: function (data) {
            alert(data.responseText);
        } //End of AJAX error function
    });
}

I hope this post helps you a bit to make your portals connect to the outside world!

Liquid Templates in Dynamics Portals – Part 2

Liquid Templates in Dynamics Portals – Part 1 is a high level overview and a bit of history of Liquid Templates. Now we can take a closer look at Liquid Templates support in Dynamics Portals. While this capability offers flexibility to Portal developers, it can be overwhelming when getting started.

The Microsoft Documentation on Liquid is fairly solid but I think it skips many Liquid fundamentals offered by the Shopify Liquid online documentation. This is understandable since I assume they want to focus on Portals specific extensions, but the foundation is important for newcomers.

In this post, we will cover language features not outlined in the Microsoft Portals documentation:

  • Liquid Templates fundamentals – a quick overview of the main capabilities of Liquid Templates
  • Dynamics specific extensions – what extensions has Microsoft added to Liquid for Portals

Once we cover the fundamentals, we can dig deeper into:

  • Liquid Template usage in Dynamics Portals – where can we use Liquid Templates
  • Common usage patterns – examples of when and how to leverage Liquid Templates in Portals
  • Complex examples – more extensive customization using Liquid

We won’t cover all items in this post but we can get things off to a good start.

Liquid Templates fundamentals

Liquid Templates (or Liquid) is a ‘templating language’ but this phrase can be a bit confusing. When I think of templates, Email templates, Document templates, or even reports come to mind. I see language, I think Java, C#, C++, or Visual Basic with which I can build standalone applications for the desktop, mobile, or the web.

Liquid offers features traditionally associated with templates. With Email or Document Templates in Dynamics 365 CE (CRM), we enter placeholders or slugs for data fields inline with our content. The system fills in the slugs with the selected CRM record data when generating the Email or Document. We can do the same with Liquid and Power Portals – placeholders for Common Data Service (CDS) data that render when the page loads.

Liquid goes beyond templates with capabilities similar to programming languages like JavaScript. For example, you can implement conditional logic using control flow operators like the if tag, capture variable values using the assign tag, or iterate over collections of items using the for tag.

So with Liquid, we inject CDS data elements into our Portal content (Templates) and provide some logical control over presentation (Language). These two capabilities make Liquid a powerful means of extending a Portals solution. Let’s take a look at some of the specifics of the Liquid language.

Liquid Template Building Blocks

We know the basics of JavaScript, so what are the basics of Liquid? Right on the Liquid documentation Introduction:

Liquid code can be categorized into objects, tags, and filters.


I won’t repeat the documentation since it’s well done and pretty extensive. But I think it’s important to understand these three categories:

  • Objects grant access to data and structures from CDS and Portals
  • Tags provide logic and flow to the Liquid Template
  • Filters allow us to process or transform data available in objects

We can access objects using double curly braces. The example below tells Liquid to render the value of the name object:

{{ name }}

Filters extend how we access the object, transforming or processing the data returned with the object. Adding to the example above, we can convert the value of name to upper case with the upcase filter:

{{ name | upcase }}

We access tags using a combination of curly brace and percent sign. Here, we can show some conditional logic using the if tag. This snippet will check to see if the name object is null, and if not, display the text within:

{% if name %}
  Hello {{ name }}!
{% endif %}

Liquid recognizes the elements between the curly braces as our instructions, executes them on the server, and returns the results as our rendered Portal content. This means that objects, tags, and filters are pretty important building blocks for the Liquid Template language!

Liquid Objects and Types

Objects mean data, but what kind of data can we expect? In the standard Liquid documentation, objects have a variety of Types

  • String
  • Number
  • Boolean
  • Nil
  • Array

An object in Liquid can be one of these Types. String, Number, and Boolean should be familiar. Nil is the equivalent of Null and can be important: null values mean something very different than empty strings in CDS. An Array allows working with collections or lists of values, such as a list of strings or numbers. You can iterate on Arrays or access individual items using their position or index.
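For example (the object name here is purely illustrative), iterating over an array and reading a single item by its index look like this:

{% for color in colors %}
  {{ color }}
{% endfor %}

{{ colors[0] }}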

Right away, we see extensions to the Liquid language by the Power Portals framework. Power Portals supports a few additional Types, as described in the online documentation:

  • Dictionary
  • DateTime

A Dictionary is similar to an Array as it holds collections of values, but items in a Dictionary can be accessed using a string key, not just the index. This will be important when retrieving CDS data. For example, if you have a list of Contacts, you can access a contact by their Id directly:

{{ contact[Id] }}

Also note that an object can be comprised of several of these Types. If we think back to CDS objects, we have Entities with Attributes. For example, a Contact has Attributes like First Name, Last Name, and Created On. An object that represents a Contact and these attributes would then be a combination of String and DateTime types. We could represent these Contact Entity Attributes in Liquid with something similar to the following:

  • {{ contact.firstname }}
  • {{ contact.lastname }}
  • {{ contact.createdon }}

Here we can access a Contact Entity record and Attributes through a Liquid object named contact with String and DateTime properties in a fashion very similar to what we see in JavaScript.

This is another example of CDS extensions to the Liquid language, but standard Liquid objects work in a similar manner. For example, Shopify provides an object called page and from this object, you can access the title like so:

{{ page.title }}

With the basics of accessing data down, how can we work with this data in our Portal?

Operators and Conditional Tags!

Operators allow us to compare or evaluate objects, and the Power Portals support for operators includes a few additions to the Shopify foundation. The standard operators should be familiar to both developers and non-developers alike from math class: Equals (==), Not Equals (!=), Less Than (<), etc.

Operators are used when coupled with conditional logic tags, such as the if tag we have seen in the snippet above. We can update the snippet to use the Equals operator. For example, if we had a Boolean flag indicating whether someone is logged in, we can say hello to the user:

{% if logged_in == true %}
  Hello {{ user.fullname }}!
{% endif %}

More interesting operators worth noting are Condition And (and) and Condition Or (or). These two operators allow us to check for multiple conditions in one if statement. What if we want to be sure their full name has a value?

{% if logged_in == true and user.fullname != nil %}
  Hello {{ user.fullname }}!
{% endif %}

A few more conditionals that allow us to evaluate strings are contains, startswith, and endswith. We can also check for the page, only showing the Hello message on the Home page:

{% if logged_in == true and user.fullname != nil and page.title startswith 'Home' %}
  Hello {{ user.fullname }}!
{% endif %}

Another conditional tag is unless, which is the opposite of if. We could rework this last snippet using unless:

{% unless page.title startswith 'Profile' %} 
   {% if logged_in == true and user.fullname != nil %}
     Hello {{ user.fullname }}! 
   {% endif %}
{% endunless %}

Now, we will show the Hello for all pages except their Profile page. This also shows how we can nest these statements. Here, the if conditionals will never be evaluated until the unless conditional is met.

These are simple examples, but we can easily add some conditional logic inline that is rendered on the server, rather than building a lot of JavaScript that has to execute this logic on the client.

Filters for transformation

Now that we know how to display data using objects and decide when to display it using conditional tags, we can transform or manipulate the data using filters. We add filters to our object display syntax using a pipe “|” delimiter. Filters can be added by platforms extending Liquid, but the default filters we get with the base Liquid implementation are already pretty extensive.

Many filters offer simple formatting, such as converting the case of a string. Continuing our previous snippet, maybe we want to show how excited we are and make the name upper case by adding | upcase as a filter:

{% unless page.title startswith 'Profile' %} 
   {% if logged_in == true and user.fullname != nil %}
     Hello {{ user.fullname | upcase }}! 
   {% endif %}
{% endunless %}

Another common example is a format mask for dates. Let’s tell the user what day and time it was when they logged into the site:

{% unless page.title startswith 'Profile' %} 
   {% if logged_in == true and user.fullname != nil %}
     Hello {{ user.fullname | upcase }}! Today is {{ "now" | date: "%Y-%m-%d %H:%M" }}.
   {% endif %}
{% endunless %}

Here we use a special object called now that will return the current date and time, and then we use the date filter that will format the date according to the mask provided. In this example the final result with the date would look something like:

Hello JIM! Today is 2019-08-06 23:24

Liquid provides many other filters for dealing with numbers and collections, such as adding numbers (| plus). These filters are more useful in complex situations than in simply formatting dates or strings: you might be capturing variables while iterating in a loop, or filtering items out of a collection for display on a custom page.

We will dive into some of these more complex filters when we cover collections and additional Portal specific extensions in more detail. We will use some more realistic examples to demonstrate how filters can be extremely useful!

Up next…

This was a fairly long post, but we covered objects and data types, conditional tags, operators, and some powerful filters. Understanding these topics is critical because they are used throughout the more complex scenarios offered with Liquid Templates.

Next, we will take a look at collections and their related control flow tags in more detail, and how they relate to more of the Power Portals-specific extensions.

As always, comments, questions, and corrections are all welcome!

Email Sentiment Analysis in Power Platform to improve customer service

What I love about my life as a #consultant is having the opportunity to hear customer problems and respond to them with something of value which improves their business in their own industry and market. What I love about being a #Microsoft #Technology #consultant is working on a technology which not only cares about end users but also makes it easy for me (or any citizen developer) to come up with solutions that are easy to implement; with the #powerplatform and #msflow, many things don’t even require me to open my #visualstudio (which I love and open every day even if I am not coding – sounds crazy, nah! :D). Let’s get back on track now!

Scenario

I had a request from a customer whose support department was getting bombarded with case emails. The customer asked me to find a solution to prioritize emails based on urgency and the probability of the customer defecting.

My initial thought was: “How do I quantify whether a customer is going to defect because they are not satisfied?” After pondering a few options, I came up with the idea of “email sentiment” as a KPI for customer defection. If a customer is not satisfied with a service, their first reaction is to send an angry email to the company (at least that is what I do) before going to social media. So I took the initial complaint email as a sign of a customer about to be lost. The next question was how to implement the idea, and this is how I did it:

Solutions

  1. The basis of the solution was to use the Azure Text Analytics service to detect the email message sentiment. The underlying service being used was the Text Analytics sentiment analysis API.
  2. The next step was to customize the email message entity to hold the sentiment value, and potentially trigger a notification to a manager or simply sort emails by their negative sentiment value.
  3. The last step was to connect the Power Platform to the sentiment analysis service and get the sentiment value of the email message back from Azure. I had two ways to implement this:
    • The first option was to write a custom action that calls the service, passing the email text to the Azure endpoint. On receiving the response from the analysis service, the action would return the sentiment as its output parameter. Finally, the action would be called from a workflow triggered on creation of the Email activity (a rough sketch of this call is shown after this list).
    • The second option was to use #MicrosoftFlow and do everything without writing a single line of code. Obviously, I used this technique.
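For completeness, a minimal sketch of the call the custom action in the first option would make is below; the endpoint, API version and key are placeholders, and it assumes the Text Analytics v2.1 sentiment endpoint, which returns a score between 0 and 1:

using System.Net.Http;
using System.Text;

// Call the Text Analytics sentiment endpoint with the email body (placeholders throughout).
var client = new HttpClient();
client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<text-analytics-key>");

var payload = "{\"documents\":[{\"id\":\"1\",\"language\":\"en\",\"text\":\"I am very unhappy with your service.\"}]}";
var content = new StringContent(payload, Encoding.UTF8, "application/json");

// The response contains a documents array with a sentiment score between 0 and 1.
// (Runs inside an async method.)
var response = await client.PostAsync(
    "https://<region>.api.cognitive.microsoft.com/text/analytics/v2.1/sentiment", content);
var json = await response.Content.ReadAsStringAsync();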

The solution is extremely easy because #MicrosoftFlow provides an out-of-the-box connector to the Text Analytics service, and all you need to do is provide the service key and endpoint. Below is what my #MicrosoftFlow looks like:

Microsoft Flow Sentiment Analysis

Azure returns the sentiment score along with its analysis as Positive, Negative and Neutral. The API returns a numeric score between 0 and 1. Scores close to 1 indicate positive sentiment, while scores close to 0 indicate negative sentiment. A score of 0.5 indicates the lack of sentiment (e.g. a factoid statement).

In my solution, I stored the sentiment value as a Whole Number, so I had to convert the decimal value between 0 and 1 to a number between 0 and 100. To do this, I used an operation step to multiply the sentiment score by 100 and cast it to an integer value, using the formula below:

int(substring(string(mul(body('Detect_Sentiment')?['score'],100)),0,indexof(string(mul(body('Detect_Sentiment')?['score'],100)),'.')))

Note: #MicrosoftFlow does not have a round function, so I had to convert the value to a string and take the substring up to the decimal point (for example, a score of 0.87 becomes 87).

Key Points:

  1. All of the Text Analytics API endpoints accept raw text data. The current limit is 5,120 characters for each document; if you need to analyze larger documents, you can break them up into smaller chunks.
  2. Your rate limit will vary with your pricing tier.
  3. The Text Analytics API uses Unicode encoding for text representation and character count calculations. Requests can be submitted in both UTF-8 and UTF-16 with no measurable differences in the character count.

Improve efficiency of Call centers using Dynamics 365 and Azure cognitive services

Photo by Hrayr Movsisyan on Unsplash

I am fascinated by the sophistication of Azure services and how they help us improve our solutions and extend the ways we can solve customer problems. Recently I had a requirement to implement a Dynamics 365 solution to enable a call center to capture cases while their operators are offline.

One solution was to provide customers with a self-service portal to log cases when call center operators are offline. But in this case the customer was looking for something very quick to implement, with the ability to link incoming cases to their call center channel and derive some reporting from it.

Approach

I started looking at Azure services to see how I could use Azure Cognitive Services and speech recognition to solve this requirement, and as always, Azure did not disappoint me. In this post I would like to share my experience with you and take you through the steps you would need to create such a solution. Of course, the possibilities are endless, but this post will give you a starting point to begin your journey.

I have seen solutions where the telephony system sends the caller’s voice recording as an email attachment to a queue in CRM. The CRM then converts that queue item to a case and attaches the voice recording as a note on the case. The challenge with this solution is that call center operators have to open the attachments manually and write the case description after listening to the audio file. Their time is spent on inefficient activities when it could be utilized in better ways.

Another problem with this approach is the size of the attachments. As time goes by, audio attachments increase the database size, impacting the maintenance of the solution.

Scenario

Our scenario is based on the fact that call center agents are not working 24 hours a day.

While agents are offline, customers should still be able to contact the call center and record voice messages that are turned into cases.

We will use the following components:

  1. Azure Blob storage to receive recorded audio files from the telephony system.
  2. Azure Cognitive Services (Speech) to listen to the recorded audio files and translate the content to text. The audio file stays in Azure Blob storage (which is cheaper than CRM database storage).
  3. An Azure Function (with an Azure Blob trigger/binding) to recognize the text from the audio file and extract the case description.
  4. The Dynamics 365 Web API to create a case in CRM using the description extracted by Azure Cognitive Services. We can also add blob metadata, such as the file name, to case properties.

Solution Architecture

The full source code is available at GitHub

However, the main code snippet to perform conversion is below:

public static async Task<string> RecognitionWithPullAudioStreamAsync(string key, string region, Stream myBlob, ILogger log)
{
    // Creates an instance of a speech config with the specified subscription key and service region (e.g. "westus").
    var config = SpeechConfig.FromSubscription(key, region);

    string finalText = string.Empty;

    var stopRecognition = new TaskCompletionSource<int>();

    // Create an audio stream from the incoming wav blob.
    using (var audioInput = Helper.OpenWavFile(myBlob))
    {
        // Creates a speech recognizer using audio stream input.
        using (var recognizer = new SpeechRecognizer(config, audioInput))
        {
            // Subscribes to events.
            recognizer.Recognizing += (s, e) =>
            {
            };

            recognizer.Recognized += (s, e) =>
            {
                if (e.Result.Reason == ResultReason.RecognizedSpeech)
                {
                    finalText += e.Result.Text + " ";
                }
                else if (e.Result.Reason == ResultReason.NoMatch)
                {
                    log.LogInformation($"NOMATCH: Speech could not be recognized.");
                }
            };

            recognizer.Canceled += (s, e) =>
            {
                log.LogInformation($"CANCELED: Reason={e.Reason}");

                if (e.Reason == CancellationReason.Error)
                {
                    log.LogInformation($"CANCELED: ErrorCode={e.ErrorCode}");
                    log.LogInformation($"CANCELED: ErrorDetails={e.ErrorDetails}");
                    log.LogInformation($"CANCELED: Did you update the subscription info?");
                }

                stopRecognition.TrySetResult(0);
            };

            recognizer.SessionStarted += (s, e) =>
            {
                log.LogInformation("\nSession started event.");
            };

            recognizer.SessionStopped += (s, e) =>
            {
                log.LogInformation("\nSession stopped event.");
                log.LogInformation("\nStop recognition.");
                stopRecognition.TrySetResult(0);
            };

            // Starts continuous recognition. Uses StopContinuousRecognitionAsync() to stop recognition.
            await recognizer.StartContinuousRecognitionAsync().ConfigureAwait(false);

            // Waits for completion. Use Task.WaitAny to keep the task rooted.
            Task.WaitAny(new[] { stopRecognition.Task });

            // Stops recognition.
            await recognizer.StopContinuousRecognitionAsync().ConfigureAwait(false);

            return finalText;
        }
    }
}
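For completeness, a minimal blob-triggered entry point that invokes the method above might look like the sketch below. This is not the exact code from the GitHub repository; the voicemails container name, the SpeechKey / SpeechRegion app settings and the SpeechHelper class name are assumptions for illustration.

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class VoicemailFunction
{
    // Fires whenever the telephony system drops a new .wav file into the (assumed) "voicemails" container.
    [FunctionName("VoicemailToCase")]
    public static async Task Run(
        [BlobTrigger("voicemails/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob,
        string name,
        ILogger log)
    {
        // Speech key and region read from the Function App settings (assumed setting names).
        var key = Environment.GetEnvironmentVariable("SpeechKey");
        var region = Environment.GetEnvironmentVariable("SpeechRegion");

        // Convert the recording to text using the recognition method shown above
        // (SpeechHelper is an assumed class name for wherever that method lives in your project).
        string description = await SpeechHelper.RecognitionWithPullAudioStreamAsync(key, region, myBlob, log);
        log.LogInformation($"Recognized text for {name}: {description}");

        // The description is then passed to the Web API call that creates the case
        // (see the sketch after the considerations below).
    }
}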

Important considerations:

  1. [This point applies only if you use the Web API to create cases in CRM] You will need a multi-tenant app registration if the tenant of your Azure Function and the tenant in which your CRM API is registered are different. If they are the same tenant, you can use a single-tenant configuration. A minimal Web API sketch follows this list.
  2. The input file sent from the telephony system to Azure Blob Storage must be in a specific format. The required format specification is:

    Property         Value
    File format      RIFF (WAV)
    Sampling rate    8000 Hz or 16000 Hz
    Channels         1 (mono)
    Sample format    PCM, 16-bit integers
    File duration    0.1 seconds < duration < 60 seconds
    Silence collar   > 0.1 seconds

  3. You can use the ffmpeg tool to convert your recording to this specific format. For your testing, you can download and use the tool as below:
    Download ffmpeg from this link.
    Use the command: ffmpeg -i "<source>.mp3" -acodec pcm_s16le -ac 1 -ar 16000 "<output>.wav"
  4. My sample in GitHub covers input as one single chunk of audio. However, if you wish to handle continuous streaming, you will need to build your input handling around the StartContinuousRecognitionAsync method.
  5. The Azure Function should be configured with a Blob trigger (as in the trigger sketch after the code snippet above).
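As a rough illustration of the last component (creating the case from the recognized text via the Web API), a minimal sketch is below. The organisation URL, the API version in the request path and the token acquisition are environment-specific; GetAccessTokenAsync is a hypothetical helper standing in for whichever OAuth client-credentials flow (single- or multi-tenant, per point 1) you use.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;

public static class CaseCreator
{
    // Creates an incident (case) in Dynamics 365 with the recognized text as its description.
    public static async Task CreateCaseAsync(string title, string description, ILogger log)
    {
        var orgUrl = Environment.GetEnvironmentVariable("CrmOrgUrl"); // e.g. https://yourorg.crm.dynamics.com (assumed setting)
        var token = await GetAccessTokenAsync();                      // hypothetical helper: OAuth client-credentials flow

        using (var client = new HttpClient { BaseAddress = new Uri(orgUrl) })
        {
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);
            client.DefaultRequestHeaders.Add("OData-MaxVersion", "4.0");
            client.DefaultRequestHeaders.Add("OData-Version", "4.0");

            // Title could come from blob metadata (the file name); description comes from the speech recognition result.
            // The JSON is built by hand only to keep the sketch short; use a JSON serializer in real code.
            var json = "{ \"title\": \"" + title + "\", \"description\": \"" + description.Replace("\"", "'") + "\" }";
            var content = new StringContent(json, Encoding.UTF8, "application/json");

            var response = await client.PostAsync("api/data/v9.1/incidents", content);
            log.LogInformation($"Case creation returned {(int)response.StatusCode} {response.StatusCode}");
        }
    }

    private static Task<string> GetAccessTokenAsync()
    {
        // Placeholder: acquire a token for the CRM API with your preferred library and app registration.
        throw new NotImplementedException();
    }
}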

Open entity records from Power BI dashboard

In my earlier post, I discussed how to show CRM entities on the Power BI visual map control. The use of Power BI on Dynamics CRM dashboards is not limited to displaying multiple entities on maps. We usually want to do more, and since dashboards carry little detail, we often want to see entities in a tabular format and navigate to the CRM records when needed. In this post, I discuss how to open CRM records from a Power BI dashboard.

Scenario

Users should be able to see multiple entity types on a Power BI map, and see record details in a table under the map control, with the ability to open the CRM records via a hyperlink. I will focus on displaying records in a table with a direct link to the CRM entity records. After configuring the visual map control, we need to do the following:

Note that all the required information (name, etc.) and the complementary information (entity logical name and entity ID) are available in our temporary table; refer to the previous post.

  1. Drag and drop a Table control underneath the visual map control.
  2. Drag and drop the fields you would like to display as table columns.

  3. Next, add a custom column to the table to hold the hyperlink to the CRM entity record, and configure its type as WEB LINK.
  4. You can do this by selecting “NEW COLUMN” from the Modeling tab. You will need the following three components to construct the link:
    1. CRM base URL (you will know this from your org URL).
    2. Entity logical name (captured in the previous post as a custom column in our temporary table).
    3. Entity GUID (also selected as part of the entity retrieve query in the previous post).
  5. The formula for the column is:
    Link = "https://[CRM BASE URL]?pagetype=entityrecord&etn=" & 'ENTITY_LOGICAL_NAME' & "&id=" & 'ENTITY_ID'
  6. Set the field type of the column as WEB LINK so it renders as a clickable link.

Display multiple entities on Power BI map control

 

Photo by Susannah Burleson on Unsplash

Recently I had to display the locations of multiple entities on a CRM dashboard. The requirement was to display all Workorders, Projects, Resources and Bookings in a map control so the project scheduler / field service dispatcher could see where each of them is located. The Bing map control works fine for an individual entity that is enabled for geolocation; in this scenario, however, I had to plot all the different entities on a single map.

I could choose one of the following approaches:

  1. Use the Bing map control on a dashboard, use a web resource to retrieve all Workorders, Projects, Resources and Bookings, and then use a draw function to place each entity's location on the map.
  2. Use Power BI and its visual map control to plot all entities on a map, then host the Power BI control on my dashboard. I decided to go with this approach.

Power BI Map control to show multiple entities

The map control in Power BI uses a single source table with longitude and latitude information to display rows on the map. The challenge is that the visual map control supports only one table's longitude and latitude, so we can only use one entity as the source of the map data. In my scenario I had multiple entity types (Workorders, Projects, Resources and Bookings), each with its own longitude and latitude, and we cannot use all of these entities together as the source for our Power BI map.

The way I overcame this challenge was to create a temporary table, union the data from all Workorders, Projects, Resources and Bookings into it, and use this temporary table as the source of the Power BI map control. This is how I did it:

  1. Connect to the CRM Bookings table. This brings all columns of the table into Power BI.
  2. Remove unwanted columns in the Query Editor (optional).
    = Table.SelectColumns(Source,{"name", "msdyn_longitude", "msdyn_worklocation", "bookableresourcebookingid", "msdyn_latitude"})
  3. Reorder the remaining columns in the way you would like to see your data (optional).
    = Table.ReorderColumns(#"Removed Other Columns",{"name", "msdyn_longitude", "msdyn_worklocation", "msdyn_latitude", "bookableresourcebookingid"})
  4. Rename column headings (optional).
    = Table.RenameColumns(#"Reordered Columns1",{{"bookableresourcebookingid", "id"}})
  5. Filter out the rows that you want to exclude from the map (optional).
    = Table.SelectRows(#"Renamed Columns", each [latitude] <> null)
  6. Add a custom column to the query as a table identifier/category so you can identify which entity each row came from in the union table.
    = Table.AddColumn(#"Filtered Rows", "category", each Text.Upper("Bookable Resource Booking"))
     
  7. Change the column types (optional).
    = Table.TransformColumnTypes(#"Reordered Columns",{{"category", type text}, {"longitude", type number}, {"latitude", type number}})

If you have more than one entity, repeat the above steps for each table in your query editor.

The next step is to create a temporary table and union all of the above tables' data into it using a DAX query.

  1. Go to the Modeling tab.
  2. Click New Table and use the query below to fill the table (alter the table names based on your scenario):
    TempTable = UNION('Bookable Resource Booking','Bookable Resources','Work Orders','Project Sites')
  3. Drag a Map visualisation control onto the report.
  4. Select “Category” (the entity name) from TempTable as the Legend. This ensures your entities are shown in different colours.
    Drag the longitude and latitude fields to the map's Longitude and Latitude wells.
  5. Note: by default, Power BI adds a SUM function to summarize longitude and latitude. Summarized columns do not work on maps, so remove the aggregation from both fields by choosing “Don’t summarize”.