Converting Dynamics’s Geolocation To SQL Geolocation Using Microsoft Flow And Azure Function

Background

One of the awesome features of the Azure Search service is the ability to search information based on location. Azure Search processes, filters, and displays geographic locations, enabling users to explore data based on the proximity of a search result to a physical location. This feature is powered by the SQL Server Geolocation data type. Since SQL Server 2008, developers have been able to store geospatial data in SQL Server using Geolocation fields, which allow data to be queried with location-based queries. To let the Azure Search service search within CRM accounts and contacts, I had to push my account and contact searchable information to a SQL Server database hosted in Azure. To copy information from Dynamics to Azure SQL Server, I used Microsoft Flow. Everything worked well except copying the CRM longitude and latitude values to SQL Server.

The problem

The problem with copying longitude and latitude values into a SQL Server Geolocation field is compatibility: when you try to insert the longitude and latitude fields into a Geolocation column, you encounter a casting error.

The solution

  1. The solution I used to tackle this problem makes use of an Azure Function: the function converts the longitude and latitude to the Geolocation type and returns the response before the Insert action runs in the flow. See the steps below:
  2. Step 1 is self-explanatory.
  3. The “CC Contact” step extracts the Contact name (or any lookup name property) from a lookup.
  4. The “Http” step calls the Azure Function to convert the CRM longitude and latitude to a SQL Geolocation value.
  5. The “Insert Row” step inserts our data into a SQL Server row.
Microsoft Flow

The Azure Function

The Azure function is a very simple one. You will need to import the Microsoft.SqlServer.Types NuGet package and use the code below:
// Namespaces needed at the top of the function (Microsoft.SqlServer.Types comes from the NuGet package)
using System;
using System.IO;
using Microsoft.AspNetCore.Mvc;
using Microsoft.SqlServer.Types;
using Newtonsoft.Json;

// Inside the function's Run method:
string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
Coordinates data = JsonConvert.DeserializeObject<Coordinates>(requestBody);
SqlGeography point = data.GetGeography();
return (ActionResult)new OkObjectResult($"{point}");

public class Coordinates
{
    public double Longitude { get; set; }
    public double Latitude { get; set; }

    public SqlGeography GetGeography()
    {
        try
        {
            // SRID 4326 = WGS 84, the coordinate system used by GPS
            return SqlGeography.Point(Latitude, Longitude, 4326);
        }
        catch (Exception ex)
        {
            // Log ex and handle the exception, then rethrow
            throw;
        }
    }
}
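
For reference, here is a hedged sketch (not from the original post) of what the Flow “Http” step effectively does: it POSTs the latitude/longitude JSON to the function and receives the geography point back as text (something like POINT (145.0431 -37.8773)), which the “Insert Row” step then writes to the Geolocation column. The function app URL, route and key below are placeholders.

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class GeoFunctionTestHarness
{
    public static async Task Main()
    {
        using (var http = new HttpClient())
        {
            // Same shape as the Coordinates class the function deserializes
            var payload = "{ \"Latitude\": -37.8773, \"Longitude\": 145.0431 }";

            // Placeholder function URL and key – substitute your own
            var response = await http.PostAsync(
                "https://<your-function-app>.azurewebsites.net/api/ConvertToGeography?code=<function-key>",
                new StringContent(payload, Encoding.UTF8, "application/json"));

            // Prints the geography text returned by the function
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}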

 

 

Implementing Enterprise Search In Power Platform

Photo by Anthony Martino on Unsplash
Providing good search capabilities is a key feature in modern business applications to support usability and end-user satisfaction. We have seen the search capabilities of the Dynamics platform evolve from “Quick Find” and “Advanced Find” to “Relevance Search”. The goal of the platform’s search features has been to help users find the relevant information they need in the quickest and easiest way. These search features are out of the box and easy to enable, configure and use. As the platform progresses to offer richer features and enable users to search better, the demand for richer and better search techniques grows, and we see instances where the platform cannot meet user demands with its out-of-the-box capabilities. Before going further into advanced search scenarios, you can read about the platform’s out-of-the-box search capabilities in the official documentation. In this article I share why we may decide to implement a search solution for our Dynamics solution using the Azure Search service.
In enterprise implementations, business applications are not the only systems used in the organization. We often see call center agents and sales representatives needing to obtain information from various systems to service customers. Requiring users to search in every system is a cumbersome job which may cause setbacks in end-user adoption. Integrating Dynamics with Azure Search consolidates search operations into one specialized search service with the ability to connect to various data sources and apply modern search techniques to find the most relevant data. A practical example of this scenario can be seen in one of my recent experiences, where the organization’s users had to search for user information in CRM, SharePoint, Sybase and a pool of CSV files.

Customized Search experience

To drive user adoption, customized search techniques are highly valuable. In all modern search engines we see “autocomplete”, “suggestions” and “highlighting” features, which can be added to your Dynamics solution’s search experience. Displaying search results with support for “document preview”, “document opening in a customized container”, “facets”, “filtering” and “sorting” are examples of enhancements to your Dynamics solution’s capabilities.

Customized Search Behavior

The true power of search is demonstrated when different pieces of information are linked together to make sense of a bigger picture. Extracting words and sentences from documents (including images and PDF files), and extracting key phrases, people names, location names, languages and other custom entities with the help of AI, is another unique capability you can add to your Dynamics search. Another amazing capability you can have in your Dynamics implementation is the ability to search based on geolocation information; for example, you can search your whole partner network from CRM or get the location of your field service force. The beauty of implementing your own enterprise search lies in the fact that you can search information in all your data stores and link it using AI to generate knowledge and better insight into your data.

Customized Search Result

Another reason for customized search in your Dynamics solution is the ability to refine your search result profile. When you use AI in your search, the system gives you the power to see how relevant search results are to your search keywords. Knowing this, you can refine your search profiles to generate a different result for the same keywords. This way you train the AI engine to work better for you and enable users to get more accurate search results.
Architecture

Dynamics can be integrated with the Azure Search service using the following patterns:

 

  1. Integration through web resources: these web resources host a web application acting as a client to the search service. The web resource can be an HTML file or an iFrame hosted on forms. The important point in this approach is to ensure correct cross-origin settings in the client application and to write your HTML securely and according to best practices.
  2. Integration through custom Power Platform controls: you may build your own custom control which sends REST requests to Azure Search and displays results by consuming the REST responses. The custom control can call the Azure Search service using Actions or direct REST calls (see the sketch after this list).
  3. Azure Search works based on indexes, so your first step is to push your CRM searchable data to Azure Search indexes. This can be done using Microsoft Flow, Azure Logic Apps, custom solutions or Azure Data Factory. I have used all of these tools in my implementations, and you can opt for any of them based on your requirements.
  4. Once the data is in your data store, you can create your indexes in Azure Search. You can go for separate indexes for each data source or combine multiple data sources in one index. Each approach has its own requirements, which will need to be met either in your client web application or in a separate Azure compute resource. Once indexing is done, you can use the Azure Search REST API directly, or put Azure API Management in front of it, to expose your search service to your Dynamics solution.
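
As a rough illustration of option 2, the sketch below (not from the article) shows a direct REST query against an Azure Search index using HttpClient. The service name, index name, field names and API key are all placeholder assumptions.

using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class AzureSearchClientSketch
{
    private static readonly HttpClient Http = new HttpClient();

    public static async Task<string> SearchAccountsAsync(string searchText)
    {
        // Placeholder service/index names and query key; api-version is one of the GA REST versions
        var url = "https://<your-search-service>.search.windows.net/indexes/accounts-index/docs" +
                  "?api-version=2019-05-06" +
                  "&search=" + Uri.EscapeDataString(searchText) +
                  "&highlight=name" +   // hit highlighting on an assumed "name" field
                  "&$top=10";

        using (var request = new HttpRequestMessage(HttpMethod.Get, url))
        {
            request.Headers.Add("api-key", "<your-query-key>");
            var response = await Http.SendAsync(request);
            response.EnsureSuccessStatusCode();

            // JSON payload containing the matching documents, highlights, facets, etc.
            return await response.Content.ReadAsStringAsync();
        }
    }
}

For the geolocation scenario described earlier, the same request can take a geo-distance filter such as $filter=geo.distance(location, geography'POINT(lon lat)') le 10, assuming a geography field named “location” in the index.
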
Summing this all up: as business application products get more sophisticated and organizations move from data to big data, engineers must look for innovative approaches to implementing Dynamics solutions. Microsoft Azure, along with the Dynamics platform, offers solution architects the tools necessary to design such solutions.

Virtual Entities 0x80040203 Invalid Argument Error

I stumbled upon this issue after creating a custom virtual entity data provider. Ivan Ficko has a great tutorial on this here.

The subgrid displayed records perfectly fine in the old web client, but in UCI I received the error message “0x80040203 Invalid Argument”. After some searching and only finding a single post regarding this with no answers, I decided to take matters into my own hands! Digging through my browser’s console I managed to find additional information regarding this error. Inspecting the exception, I found the message “entity name is invalid”.


Custom views on a lookup won’t work without the name field

Quick tip here, when setting a custom view on a lookup in the form designer:


Always ensure that the name attribute is on the view too, even if it won’t be displayed.


I had noticed that my search results were not working at all when I typed in the lookup field. After an hour of troubleshooting, I added the name field to the view and voila! Search, and the control in general, started behaving.

Solution Layering

I’ve recently noticed the Solution Layers button but knew next to nothing about its functionality. It was added to my ever-growing list of “OK, I need to check that out when I have some time!” While on a call this past week, the Solution Layers feature came up. After a brief overview on the call and some poking around afterwards, it looks to be a useful feature for developers, business analysts, and administrators.

What are Solution Layers?

Solution Layers is not some hidden, mystery feature. Microsoft has done a great job recently with their online documentation, and the article titled View solution layers includes a nice, quick explanation of solution layers:

  • Let you see the order in which a solution changed a component.
  • Let you view all properties of a component within a specific solution, including the changes to the component.
  • Can be used to troubleshoot dependency or solution-layering issues by displaying change details for a component that was introduced by a solution change.

So the Solution Layers tool offers insight into system components and their relationships to Solution deployments. The significant bit here to me is that it shows changes to the component and when the installation or updates were introduced.

Where do I find Solution Layers?

When you select a Solution component, such as an Entity, Process, or Web Resource, or sub component such as an Entity Form or Attribute, you will now see a button labeled Solution Layers.

For example, I opened the Power Apps Checker solution in a recently provisioned demo environment.  Expanding the Entities, we can see the button on the Analysis Result Detail Entity. Drilling into the Forms list, we see the tool button available with the Information main Form.  

Solution Layers for the Analysis Result Detail Entity
Solution Layers for the Analysis Result Detail Entity Information Form

If you open the Solution Layers dialog for the Analysis Result Detail Entity, we can see a one item list of Solutions.  This is a list of the Solutions to which this Entity is related.

Entity level Solution Layers

Select the Solution listed and you can view the Analysis Result Detail Entity details that are related to the Solution.

Analysis Result Detail Entity Solution Layer Details

This view provides the list of the changed properties for the Entity when the Solution was imported in the first Changed Properties ‘tab’, and the full list of Entity properties in the All Properties tab. If we open the Information Form for this Entity, we see very similar information: a single Solution and the detailed changes of the selected Entity Form for that Solution import.

We only see one item in both the Entity and Entity Form levels because this Entity and all of its components are unique to this Solution. We can also see the list of Changed Properties is the same as the list of All Properties. This tells us that the Analysis Result Detail Entity was installed with the Power Apps Checker solution and has not been affected by any other Solution installs.

That is some nice information, but not especially useful. The Solution Layers feature really shines when we look at Entities that can be impacted by other solution imports. For example, a system Entity like Contact can be impacted by many different Solutions on your system. Or you may have a custom Entity being deployed as part of a product or an ongoing project that will see regular changes, whether through major Solution releases or hotfix-style solution deployments.

Contact is a popular Entity

If we open a different solution that contains the Contact Entity, we see the real power behind this tool. If we open the solution named Sales Navigator for Dynamics 365 Unified Interface that comes with my demo environment, and view the Contact Entity Solution Layers, we see some immediate differences.

Contact Solution Layers Detail – lots of changes!

The Contact Entity has been changed by 21 separate Solutions. The first, at the bottom of the list, is System, while at the top we see Active as the latest. This means that the Entity or one or more of its sub-components were updated with each of these 21 Solution imports. So, how do we see more detail on all of these Entity changes?

Deltas!

If we dig deeper into the Solution components, we can see more granular detail of the changes. We can drill into the Contact Forms list for this Solution and open the Contact Form Solution Layers dialog.

In this view, we can see that the Contact Form has been updated by 11 different Solution Imports. But what has been changed? Open up a solution from the list to find out:

Contact Form Solution Layers Detail

In this view, under Changed Properties, we can see the detailed changes that were made with the Solution import. In this example, we see the underlying Form JSON value was updated, and if you scroll a bit, you will also see that the Form XML was updated. With other value types, such as numbers or boolean values, it’s easy to see the changed value.

For more complex types like Form JSON or XML, you can compare the differences to the previous Solution Layer value: simply open the previous Solution Layer from the list and compare the property value under the All Properties view using a standard text diff tool such as WinDiff or Visual Studio.

Why is this a big deal?

Dynamics 365 CE and the Power Platform with CDS now have a built-in method for change tracking across the various layers of solution components. I include the Power Platform here because when you view an Entity from a model-driven Power App, you have the option of switching to Classic View. In Classic View, you can view the Solution Layers exactly as if you were working within a Dynamics 365 CE solution.

This can be incredibly useful when troubleshooting issues or just managing your own deployments. With solid DevOps practices in place, you should be able to view content like this using source code control tools. But if you are working on a project for which those practices were not well established, I can see this feature as a huge help for developers, business analysts, or system administrators.

I recommend reviewing the article listed above and playing around with the feature. For example, check out changes to solution components like Workflows where you can view the changes to the underlying XAML that contains the workflow logic.

I will be looking into it in more detail myself because I can see the possibility for some nice tools built around this capability!

Streaming Data Sets into Dynamics 365 Customer Engagement

In this post, we are going to look at the challenge of displaying streaming data sets directly on a Dynamics 365 Customer Engagement form. While there already exists a way to embed Power BI dashboards and reports within Dynamics 365 Customer Engagement, these are not at the form level. To see how to do this currently, have a look here. When followed, you should observe results similar to the following, where a dashboard is initially displayed and then you can click through to the underlying report(s):
 
 
What you’ll notice from this is that these are personal dashboards that lack the ability to be contextually filtered. So, to resolve this, we are going to create a Web Resource that can display a contextual (and streaming) dashboard on a Dynamics 365 Customer Engagement form!
 
To get started, let’s have a look at what this will look like architecturally:
 
 
From the architecture, you should notice that we need to create a custom HTML Web Resource that will serve as a placeholder for the Power BI dashboard. When the form loads, we use JavaScript to process the incoming parameters, which can include both configuration and contextual data based on the record (form) that the Web Resource is being rendered on. The JavaScript then calls a reusable Dynamics 365 Action that consumes the incoming parameters before calling a Dynamics 365 Plugin. This plugin is necessary as it helps us execute a token exchange with Azure Key Vault based on the currently logged-in user. This token is then used to retrieve a specific secret which contains the configuration necessary to render the Power BI report contextually, and in an authenticated state, back on the Dynamics 365 Customer Engagement form.
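
To make the plugin part of this flow more concrete, here is a minimal sketch, and not the author’s implementation: a plugin step called from the custom Action that performs the token exchange over REST and hands the secret back to the caller. The tenant, client, vault, secret and output-parameter names are hypothetical, and the per-user token exchange described above is simplified here to a client-credentials flow.

using System;
using System.Collections.Generic;
using System.Net.Http;
using Microsoft.Xrm.Sdk;

public class KeyVaultConfigPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

        using (var http = new HttpClient())
        {
            // 1. Token exchange with Azure AD (placeholder tenant, client id and secret)
            var tokenResponse = http.PostAsync(
                "https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token",
                new FormUrlEncodedContent(new Dictionary<string, string>
                {
                    ["grant_type"] = "client_credentials",
                    ["client_id"] = "<client-id>",
                    ["client_secret"] = "<client-secret>",
                    ["scope"] = "https://vault.azure.net/.default"
                })).Result.Content.ReadAsStringAsync().Result;
            var accessToken = ExtractJsonValue(tokenResponse, "access_token");

            // 2. Retrieve the secret that holds the Power BI embed configuration (placeholder names)
            http.DefaultRequestHeaders.Add("Authorization", "Bearer " + accessToken);
            var secretResponse = http.GetAsync(
                "https://<vault-name>.vault.azure.net/secrets/<secret-name>?api-version=7.0")
                .Result.Content.ReadAsStringAsync().Result;

            // 3. Hand the configuration back on a hypothetical Action output parameter so the
            //    form JavaScript can render the embedded Power BI report
            context.OutputParameters["EmbedConfiguration"] = ExtractJsonValue(secretResponse, "value");
        }
    }

    private static string ExtractJsonValue(string json, string key)
    {
        // Crude, dependency-free JSON value extraction to keep the sketch short
        var marker = "\"" + key + "\":\"";
        var start = json.IndexOf(marker, StringComparison.Ordinal) + marker.Length;
        var end = json.IndexOf('"', start);
        return json.Substring(start, end - start);
    }
}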
 
Simultaneously, the Power BI dashboard will be receiving a continuous stream of data from an MX Chip (IoT device) that is connected to an Azure IoT Hub. This stream of data is provided through the Stream Analytics service, which continually processes the incoming data and can send it as an output directly to Power BI before it is visualised. For reference, the Stream Analytics job should look something like this:
 
 
You will notice that there is a dedicated Power BI output in the above and that we have limited the Stream Analytics job to look only for our MX Chip device. We also need to include a bit of DAX to format the incoming IoTAlert data to be a bit more readable. Examples of the incoming data, the DAX, and the Power BI configuration are below:
 
 
As a result of this, we should now be able to see the streaming data set on the Dynamics 365 Customer Engagement form after a bit of Power BI visualisation magic as follows:
 
 
As we have parameterised the initial Web Resource on the form, this Dashboard is able to pre-filter visuals should we wish, and can also easily be embedded on the form and record type of your choosing! The following video demonstrates the complete pattern in action:

 

DYNAMICS CE WORKFLOWS SCHEDULING USING AZURE FUNCTION APP WITH TIMERS

A ‘making the Dynamics guy’s life easy’ solution to schedule your Dynamics CE out-of-the-box workflows to run on particular frequencies is finally here!

System workflows are the best when it comes to doing a simple task without having to put our heads into writing a hell of a lot of code. However, the real pain comes when you want to schedule them to suit your requirements. Well, if you’re wondering how you could make this work in a simple way, here’s the good news – this is totally achievable using the winning combo of an Azure Function app with a timer associated with it. If you want to read more about how Azure Functions work, you can use this link – https://docs.microsoft.com/en-us/azure/azure-functions/functions-overview

Now, if you want to dive right in, you’re in the right place.

 

ADVANTAGES:

 

  1. Unlike other solutions, the use of Azure Functions allows you to enjoy the benefits of a serverless setup. They are perfectly designed to run without a server and to integrate with and monitor jobs that run within CE.
  2. Connection to CE can be made by referring to the core SDK libraries using NuGet.
  3. It consumes fewer resources to run, without having to use custom entities in CE to configure the scheduler.
  4. Easy management of the functions that are set up. You can enable or disable them as and when required just by a button click.
  5. Detailed logging of successes and failures of the workflows that are executed on their schedules.
  6. Handles bulk jobs with a function timeout of 10 minutes. (how cool is that!)

 

PRE-REQUISITES:

 

This list is surprisingly short. All you need to set this up successfully is an Azure subscription, or a free Azure trial account to give it a go.

 

STEPS:

 

  1. Log in to your Azure account at https://portal.azure.com. You will see your Dashboard on the home screen.
  2. Click on the ‘Create a resource’ option, located in the upper left-hand corner of the page.
  3. Type ‘Function App’ in the search box that appears, enter all the required values and click Create. Once the function app starts deploying, wait for the Deployment Succeeded message to appear in your notifications.

  4. Open the app that you just created and create a new function for the app. Make sure you select the type ‘Timer Trigger’ when you create it, as shown below.

  5. Set a schedule for the timer using a CRON expression, which is available under the Integrate section of the function. The format of this expression is {second} {minute} {hour} {day} {month} {day-of-week}.

I have set the timer expression to 0 */5 * * * *, which means the function will run every 5 minutes. To learn more about different timer settings, refer to this link – https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-timer.
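
As a side note – and not part of the original walkthrough – if you author the function as a precompiled C# project rather than in the portal, the same schedule can be expressed with the TimerTrigger attribute (a v1-style signature, matching the TraceWriter used in the code later in this post):

using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class ScheduledWorkflowFunction
{
    [FunctionName("ScheduledWorkflowFunction")]
    public static void Run([TimerTrigger("0 */5 * * * *")] TimerInfo myTimer, TraceWriter log)
    {
        // {second} {minute} {hour} {day} {month} {day-of-week} – fires at second 0 of every 5th minute
        log.Info($"Timer trigger fired at: {System.DateTime.Now}");
    }
}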

 

  6. Connect to Dynamics CE by referencing the core SDK assemblies using NuGet. Go to the Platform features tab of the function and click on App Service Editor. This opens all the files in the function folder in a new window. Create a new file called ‘project.json’ within the same function folder and use the following snippet to reference the CE SDK assemblies.

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.CrmSdk.CoreAssemblies": "8.2.0.2"
      }
    }
  }
}

 

 

  7. We will then add configuration parameters in the Application settings of the function for the C# code to use. These parameters include the CRM instance URL that you are connecting to, the credentials for the connection, and the name of the workflow that needs to run on the scheduled time.

  8. Now we add the following piece of code, which triggers the workflow specified in the configuration parameters using the credentials mentioned in the step above.

using System;
using System.Net;
using System.Configuration;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Crm.Sdk.Messages;

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    // Read the CRM instance URL from the function's Application settings
    IServiceManagement<IOrganizationService> orgServiceManagement =
        ServiceConfigurationFactory.CreateManagement<IOrganizationService>(
            new Uri(ConfigurationManager.AppSettings["CRMinstance"]));

    // Connect to the CRM instance
    AuthenticationCredentials authCredentials = new AuthenticationCredentials();
    authCredentials.ClientCredentials.UserName.UserName = ConfigurationManager.AppSettings["CRMusername"];
    authCredentials.ClientCredentials.UserName.Password = ConfigurationManager.AppSettings["CRMpassword"];
    AuthenticationCredentials tokenCredentials = orgServiceManagement.Authenticate(authCredentials);

    // Retrieve the organization service
    IOrganizationService service = new OrganizationServiceProxy(orgServiceManagement, tokenCredentials.SecurityTokenResponse);

    // Get the workflow GUID to run from the workflow name
    QueryExpression objQueryExpression = new QueryExpression("workflow");
    objQueryExpression.ColumnSet = new ColumnSet(true);
    objQueryExpression.Criteria.AddCondition(new ConditionExpression("name", ConditionOperator.Equal, ConfigurationManager.AppSettings["CRMworkflow"]));
    objQueryExpression.Criteria.AddCondition(new ConditionExpression("parentworkflowid", ConditionOperator.Null));
    EntityCollection entColWorkflows = service.RetrieveMultiple(objQueryExpression);

    if (entColWorkflows != null && entColWorkflows.Entities.Count > 0)
    {
        Guid workflowGuid = entColWorkflows.Entities[0].Id;
        if (workflowGuid != Guid.Empty)
        {
            // Get the FetchXML string that selects the records to run the workflow against
            string entitySetting = ConfigurationManager.AppSettings["CRMFetchString"];
            FetchExpression fetchRecords = new FetchExpression(entitySetting);

            EntityCollection recordsCollection = service.RetrieveMultiple(fetchRecords);
            if (recordsCollection.Entities.Count > 0)
            {
                log.Info($"Records fetched : {recordsCollection.Entities.Count} at {DateTime.Now}");
                foreach (Entity e in recordsCollection.Entities)
                {
                    ExecuteWorkflowRequest request = new ExecuteWorkflowRequest()
                    {
                        WorkflowId = workflowGuid,
                        EntityId = e.Id
                    };

                    // Execute the workflow against the current record
                    service.Execute(request);
                    log.Info($"Executed workflow successfully : {DateTime.Now}");
                }
            }
        }
    }

    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
}

 

  9. You can test run the C# code you added in the step above to make sure there are no errors.

 

  10. The function is enabled by default, and it can be disabled at any time by clicking on the enabled/disabled toggle button under the Manage option of the function. (I have disabled my function, which is why ‘(disabled)’ is prefixed to my function name.)

 

  11. The ‘Monitor’ option of the function allows you to check the successes and failures of the function, including the detailed logging included in the code.

 

 

And that is all! Your Azure function will keep running the specified workflow until you disable it.

How to embed a Canvas app into a CE form, pass the record id and update the CE record.

The requirement: allow a CE user to update marketing consent, and provide guidance and logic around the process; this app is the basis for the latter.

Solution: this could be achieved using a custom web page, or possibly a Dialog (deprecated soon), but the latest recommended approach is to use a canvas app embedded within CE, so here are the steps to achieve this:

Note: this screenshot/app was to prove the process works, so it has some random fields in it; ultimately there would be a lot more to it, with extra logic.

  1. Create the connection to D365
  2. Browse to Apps and Create a new blank Canvas app
  3. Insert a new Form (Edit)
  4. Select [Data Source] and Add new, select your connection to D365 from earlier
  5. Choose the appropriate D365 environment
  6. Select the [Accounts] table and [Connect]
    • This will add some fields to the Form for you and is where you can select the ones you want/don’t want, format them/rename them, change the colours etc.
    • The blue header/footers in my example is a [Label], white text, blue background, the colour code is #3B79B7 which matches the CE UI theme
    • Rename the Form in the left panel from Form1 to [AccountForm]

  7. Next, you need to update the Form to take an input parameter containing the ID of the CE record (which we will pass in further on). On the Form, select [Advanced], then [Item] and enter
    • LookUp(Accounts, accountid = Param("ID"))
  8. Next insert an Icon – [Check] – so that we can submit the changed data back to CE. Go into Advanced on the Icon and update the OnSelect to;
    • SubmitForm(AccountForm)
  9. Save and Publish the app
  10. It should now look something like this, depending on the field types you selected. I added a footer with the CE ID value displayed to check what was passed through.

  11. Browse to https://web.powerapps.com/environments/ then select [Apps] from the left menu.

  12. Click on the ellipsis of the App and make a note of the Web Link from the [Details] tab. It will look like this – https://web.powerapps.com/apps/<AppID> – make a note of this AppID for the steps below.
  13. Next are the CE components: add a new HTML web resource and paste in the following code, replacing the <AppId> with the one you recorded previously.
    • Set the width and height to your App sizes, taken from the App settings page.

  14. Open the [Account] entity form, add a new Tab, insert a new Web Resource onto it, select the web resource that you created in the previous step and set the following parameters;
    • Display Label on Form= false
    • Number of rows = adjust depending on the size of the App
    • Scrolling = As necessary
    • Display Border = false
  15. Save and Publish
  16. It should look like this once it’s finished.

 

Where does CE PSA fit if I have Finance and Operations?

Updated last: 23/12/2018

This is a live blog post that will be updated with changes that are applied to the application – I’ll also update it with input from the community too. 

Right, I thought it’d be best to write a quick post on this topic as it is a question I receive quite regularly, which goes along the lines of… “Hey Will, I see you’ve been working on Customer Engagement PSA – I don’t really understand how that would fit in with an organisation that has a Finance and Operations system, or at all.” Then I take a deep breath and I say something along these lines…

(There are a few versions of this response depending on what the business does.)

PSA flow:

What we must remember is that PSA is ultimately there to help the prospect-to-cash process. But hey, we hear and read “Prospect to Cash” thrown around a lot and it doesn’t help explain anything, so what I mean by this is as follows;

  1. the ability to turn someone you may have been in contact with into a Lead
  2. then qualify said Lead to an Opportunity
    1. During the opportunity process you will, hopefully, start creating a proposal, and to provide as precise a quote as possible it is best to create a project with a thorough work breakdown structure along with associated costs (expenses, role costs etc.), then to import this structure along with its costs into the contract to provide a quote.
  3. Submit the quote to the customer and hopefully mark it as won – or maybe you will have to create another until you ultimately, hopefully, win
  4. The quote then turns into an Order/Contract with an associated project, and all this richness can then be synced across to Finance and Operations – the contract will be pulled across along with the associated project details: project name, associated project contract, actual start date, work breakdown structure (if you’ve assigned resources then these can be brought across too) etc.

Where to place your personnel in a PSA & FinOps stack implementation:

Now the more interesting piece is: where do you ask your employees to enter their time and expenses, where do you ask the Project Manager to carry out their tasks, and where do you ask the Resourcing Manager to sit?

Now we must remember: PSA IS NOT A FINANCE SYSTEM, IT IS NOT TRYING TO BE A FINANCE SYSTEM, ITS PURPOSE IS NOT TO DEAL WITH ANYTHING RELATED TO ACCOUNTING AND FINANCE. Its purpose is to provide a buffer between account management and back-office functions such as the accounts department, and to provide more granularity to items such as quoting (remember this is from the perspective of an implementation where Finance & Operations exists).

However, what it does do well is provide the ability to price up quotes thoroughly thanks to its project creation functionality, and it also performs some project processes well that can then be handed over for further processing.

Now let’s take a quick dive into where to place the Project Managers, Employees and Resourcing Managers.

Employees – now, personally, as an employee I prefer the user interface in CE for entering timesheets and expenses rather than Finance and Operations – it is more aesthetically pleasing. However, there are limitations around expenses – there are no expense policies out of the box, so this would need to be provided via customisation.

Along with other workflow requirements – and let’s face it, expense workflows (from my experience implementing systems, especially global ones) can be incredibly complex – this is better suited to Finance and Operations, as PSA only allows one-level approval when in reality multi-level approvals and conditions are required.

PSA does have the ability to bring in the hours you entered last week, or the appointments/projects you’ve been assigned in the resource scheduler, but Finance and Operations allows this too.

What I’m getting at here is that it is best to stick with Finance and Operations, and if you wish to make the user interface kinder on the eyes then use the mobile application functionality or throw together a PowerApp.

Resourcing Manager – now this is where I lean towards PSA. As long as you sync proficiency models, skills, characteristics, roles, cost prices, sales prices etc. between Finance and Operations and CE PSA (or, if your company is using Talent, have a network of the three: Talent > PSA > FinOps), then I much prefer the scheduling board within PSA and the way you submit requests to be fulfilled. Look at the screenshot below and how glorious it is – colours, pictures, charts – PSA has it all (you can even use the map functionality – living the dream)!

Project Manager – now this depends on the organisation. PSA allows the PM to manage their project team, monitor cost absorption (effort tracking as well), look at project estimates and submit resourcing requests (all of which also exists within Finance and Operations) – but if you want your PM to also invoice clients or perform a more advanced level of WIP adjustments, then this role will suit Finance and Operations.

Also, the dashboards are not that brilliant in PSA – yes, you can use the Power BI embedded functionality, but Finance and Operations has brilliant out-of-the-box reports, as well as enhanced areas such as the Project Manager Workspace (which provides an overview of their project-related activities and allows them to initiate their most frequent tasks) as well as Power BI integration – soooooo…..

General finance points related to PSA functionality: PSA does let you push through flexible journals, you can export actuals (or integrate them), you can adjust actuals (and view adjustment histories), and you can invoice through funding sources and billing rules (not as advanced as Finance and Operations) set out on the project contract.

It is important to note that there is no out-of-the-box functionality to tie Purchase Orders to projects, so this is not wrapped up and summed into items such as cost consumption etc.; a journal can be used for this in the meantime, but creating the PO in FinOps and then pushing that across as a journal to keep track in PSA may be one route (dependent on whether your PMs sit there – if not, it really does not matter). Further to this, there is no commitment or encumbrance accounting to keep track of the financial health of a project with regards to Purchase Orders.

Another key part of project management is budget control. Unfortunately there is no budget control within PSA, only a cost consumption meter, so this will have to be validated/tracked through Finance & Operations – but the validation will only occur post-transaction if you choose to leave T&E within PSA (not a wise move).

Conclusion:

So let’s conclude – PSA DOES HAVE A FIT within the full suite of Dynamics 365, and for organisations that use both CE and Finance and Operations, provided it is used for its intended purpose, which in my eyes is to assist with quoting proposals and with some of the non-accounting project processes, to allow a smooth transition from sales to delivery.

And one more thing….. if the company DOES NOT have Finance and Operations but another accounting system that does not include project management, and they also require a sales system, then PSA is a great fit!!!!

 

Two-way Azure Plugin Walkthrough

My first blog, so go gentle 😀

Summary:

When you’re calling external integration services and you’re using an enterprise service bus, you need to send a message with your CRM data to the service bus (obvs!). But in order to use the built-in behaviour of a CRM plugin (rollback in particular), you’re going to need to run the plugin synchronously so that the user knows a problem has occurred. If you use the OOB plugin then it only works asynchronously. Luckily, service endpoints are baked into the CRM infrastructure, so we can get a leg up on passing the remote execution context to a listener. That listener can then call the integration service and, should it fail, return a value to the plugin so it can respond accordingly.
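
For illustration, here is a minimal sketch (not the walkthrough’s exact code, with the endpoint ID and response convention as assumptions) of a synchronous plugin posting the remote execution context to a registered two-way service endpoint via IServiceEndpointNotificationService, and rolling the transaction back if the listener reports a failure:

using System;
using Microsoft.Xrm.Sdk;

public class TwoWayServiceBusPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var notificationService = (IServiceEndpointNotificationService)serviceProvider.GetService(typeof(IServiceEndpointNotificationService));

        // Hypothetical ID of the service endpoint registered with the Plugin Registration Tool
        var serviceEndpoint = new EntityReference("serviceendpoint", new Guid("00000000-0000-0000-0000-000000000000"));

        // For a two-way endpoint the listener returns a string; the "Error" prefix convention
        // below is an assumption about the listener's contract, not a platform rule
        string response = notificationService.Execute(serviceEndpoint, context);

        if (string.IsNullOrEmpty(response) || response.StartsWith("Error", StringComparison.OrdinalIgnoreCase))
        {
            // Throwing here rolls back the synchronous transaction so the user sees the failure
            throw new InvalidPluginExecutionException("The integration service reported a failure: " + response);
        }
    }
}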

The blog is a technical walkthrough of the steps to accomplish it. Enjoy!

https://dynamicsjourney.wordpress.com/2018/09/21/two-way-azure-plugin-walkthrough/

D365 Social Analytics Solution

As promised! Demoed at our D365 Saturday Summer Boot Camp session on Replacing Dynamics workflows with Flow.

This solution gathers tweets matching a specified hashtag, saving them into a custom entity in Dynamics. A second flow then uses the Cognitive Services API to extract useful information from the tweets, such as sentiment and key phrases, and also translates the tweet if it’s not in English. This blog post contains the two flows as well as the solution used in Dynamics, with brief instructions on how to put everything back together.

 

Dynamics solution

Contains a Social Analytics custom entity with some magic in the background!

Unmanaged.zip

Managed.zip

Flows

GetTweets.zip (Gets tweets matching hashtags and creates records in the Social Analytics entity)

DynamicsSocialAnalyticsV2.zip (On create of a record in the Social Analytics entity, uses the Text API to get sentiment, a translation if not English, and key words, and updates them back into Dynamics)

Setup

Install the unmanaged or managed solution into your instance, whichever floats your boat 🙂.

Text Analytics API Key

You will need an Azure subscription with a Cognitive Services Text Analytics API resource. You can get a trial API key with 5,000 executions for 7 days (a free Azure subscription will not limit you to the 7 days). Go to https://azure.microsoft.com/en-gb/try/cognitive-services/?api=text-analytics

Make sure Text Analytics is selected and hit Get API Key – choose Guest and get started.

You should eventually end up with your API key and endpoint as shown in the image below; these will be needed later on in the flow.

Social Analytics Flow

Head to your Flow environment at https://flow.microsoft.com. Go to My Flows; you should see Import in the top right, so hit that. Upload and import the flow DynamicsSocialAnalyticsV2.zip. You will need to fix the connections to your Dynamics instance. For Text Analytics, select “Select during import”, create a new connection, search for Text Analytics, select it and enter one of your keys and your endpoint URL. Come back to the import screen, refresh the list and select your new Text Analytics connector. Do the same for the Translator connection, named “Microsoft Translator”; you shouldn’t need an API key for that. Once all the connectors have been fixed, import the flow.

Once complete, you should be able to see the Dynamics Social Analytics flow. Edit the flow and point both the Dynamics trigger at the start and the update action all the way at the bottom to your instance, by clearing out the org name, selecting yours and then selecting the Social Analytics entity provided in the installed solution.

Before

After

Get Tweets Flow

Import the GetTweets.zip flow. Fix the connections again by adding a Twitter connection and your Dynamics connection. After upload, you will need to fix the create Dynamics record action at the bottom of the flow, as before. Replace #D365Saturday with your favourite hashtag and Bob’s your uncle. You can duplicate this flow if you wish to track multiple hashtags.