Dynamics 365 for Customer Engagement

Utilization by role is blank – Dynamics 365 CE PSA

Does your “Utilization by Role” chart look as blank as mine (see image below)?! Well, coming from an F ‘n’ O background I wasn’t sure of the issue at first, but then I thought maybe the data wasn’t being gathered… then I was curious whether CE has the concept of batch jobs like we do in Finance and Ops… VOILA, it does!!!

By going to Project Service > Settings > Batch Jobs:

Then I created a new Batch Job named “Utilisation Charts”, selected its frequency, set it as active and saved. This allowed me to then select “Run Workflow”:

From there I could select the “UpdateRoleUtlization” workflow:

Then BOOM, sweet, sweet dataaaaaaaaa:

Resource Allocation mode: Hybrid or Central?

The smallest post I’ve ever produced:

What on earth does “Resource allocation mode” mean on the Customer Engagement Project Services Automation Parameters?

Well, it turns out that the two choices of “Hybrid” or “Central” are quite simple in meaning:

  • Central: Only resource managers can allocate resources to projects.
  • Hybrid: Resource managers, project managers, and account managers can allocate resources to projects.

A perfect way to control resource allocation within the environment.

Boom, I’m glad we cleared that up – now stop reading this and get back to work 😉

UnBoxing the Power Platform Model-Driven Form Designer Preview

I saw recently on Twitter that the WYSIWYG Model-Driven form editor is in preview and I thought, “HEY, why not do a video without ever opening and testing this functionality?”… So that’s exactly what I did. It’s a little bit all over the show, and as I find stuff I get more excited. I had to edit out a cough a few times, but other than that, this is raw footage of a totally new experience for me (gulp).

My opinion is that this is a great preview and I think it is really going to make config a lot easier when it’s complete. Yes, there are things missing, but it’s a PREVIEW. It was fun to test, and I’ll be doing my best to use it as often as I can so I can get used to it.

Streaming Data Sets into Dynamics 365 Customer Engagement

In this post, we are going to look at the challenge of displaying streaming data sets directly on a Dynamics 365 Customer Engagement form. While there is already a way to embed Power BI dashboards and reports within Dynamics 365 Customer Engagement, these are not at form level. To see how to do this currently, have a look here. When followed, you should observe results similar to the following, where a dashboard is initially displayed and you can then click through to the underlying report(s):

What you’ll notice from this is that these are personal dashboards that lack the ability to be contextually filtered. So to resolve this, we are going to create a Web Resource that can display a contextual (and streaming) dashboard on a Dynamics 365 Customer Engagement form!

To get started, let’s have a look at what this will look like architecturally:

From the architecture, you should notice that we need to create a custom HTML Web Resource that will serve as a placeholder for the Power BI dashboard. When the form loads, we are going to use JavaScript to process the incoming parameters which can include both configurations and contextual data based on the record (form) that the Web Resource is being rendered on. The JavaScript will then call a reusable Dynamics 365 Action that will consume the incoming parameters before calling a Dynamics 365 Plugin. This plugin is necessary as it will help us execute a token exchange with the Azure Key Vault based on the currently logged in user. This token is then used in retrieving a specific secret which contains the required configurations necessary to render the Power BI report contextually and in an authenticated state back on the Dynamics 365 Customer Engagement form.
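
To make that middle hop more concrete, here is a minimal sketch of what the plugin step could look like. To be clear, this is not the exact implementation from the post: the Action parameter names (“SecretName”, “EmbedConfig”), the tenant/vault placeholders, and the use of an app-registration client-credentials flow (rather than the per-user token exchange described above) are all my own simplifications.

// Hypothetical plugin step: exchanges a token with Azure AD, pulls a secret
// from Key Vault via its REST API, and hands it back to the calling Action.
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using Microsoft.Xrm.Sdk;

public class RetrieveEmbedConfigPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

        // Hypothetical input parameter passed in from the Action call.
        var secretName = (string)context.InputParameters["SecretName"];

        using (var http = new HttpClient())
        {
            // 1) Token exchange: request an access token scoped to Key Vault
            //    (simplified here to a client-credentials grant).
            var tokenJson = http.PostAsync(
                "https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token",
                new FormUrlEncodedContent(new Dictionary<string, string>
                {
                    ["grant_type"] = "client_credentials",
                    ["client_id"] = "<client-id>",
                    ["client_secret"] = "<client-secret>",
                    ["scope"] = "https://vault.azure.net/.default"
                })).Result.Content.ReadAsStringAsync().Result;

            // Crude token extraction to keep the sketch dependency-free;
            // use a proper JSON parser in real code.
            var token = tokenJson.Split(new[] { "\"access_token\":\"" }, StringSplitOptions.None)[1].Split('"')[0];

            // 2) Retrieve the secret holding the Power BI embed configuration.
            http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);
            var secretJson = http.GetAsync(
                $"https://<vault-name>.vault.azure.net/secrets/{secretName}?api-version=7.3")
                .Result.Content.ReadAsStringAsync().Result;

            // 3) Hand the configuration back via a (hypothetical) Action output
            //    parameter so the form's JavaScript can render the report.
            context.OutputParameters["EmbedConfig"] = secretJson;
        }
    }
}

The JavaScript on the form would then read the returned configuration from the Action response and use it to render the embedded Power BI report in an authenticated state.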
 
Simultaneously, the Power BI dashboard will be receiving a continuous stream of data from an MX Chip (IoT device) connected to an Azure IoT Hub. This stream of data is provided through the Stream Analytics service, which continually processes the incoming data and can send it as an output directly to Power BI, where it is then visualised. For reference, the Stream Analytics job should look something like this:
 
 
You will notice that there is a dedicated Power BI output in the above and that we have limited the Stream Analytics job just to look for our MX Chip device. We also need to include a bit of DAX to format the incoming IoTAlert data to be a bit more readable. Examples of the incoming data, the DAX, and the Power BI configs are below:
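
To give a feel for that device filter (this is illustrative, not the actual query from the job – the input/output aliases and the device id property will depend on your own setup), the Stream Analytics query would be shaped something like SELECT * INTO [PowerBIOutput] FROM [IoTHubInput] WHERE DeviceId = 'MyMXChip'.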
 
 
As a result of this, we should now be able to see the streaming data set on the Dynamics 365 Customer Engagement form after a bit of Power BI visualisation magic as follows:
 
 
As we have parameterised the initial Web Resource on the form, this Dashboard is able to pre-filter visuals should we wish, and can also easily be embedded on the form and record type of your choosing! The following video demonstrates the complete pattern in action:

 

DYNAMICS CE WORKFLOWS SCHEDULING USING AZURE FUNCTION APP WITH TIMERS

A ‘make a Dynamics guy’s life easy’ solution to schedule your Dynamics CE out-of-the-box workflows to run on particular frequencies is finally here!

System workflows are the best when it comes to doing a simple task without having to write a hell of a lot of code. However, the real pain comes into play when you want to schedule them as per your requirements. Well, if you’re wondering how you could make this work in a simple way, here’s the good news – this is totally achievable using the winning combo of an Azure Function App with a timer associated with it. If you want to read more about how Azure Functions work, you can use this link – https://docs.microsoft.com/en-us/azure/azure-functions/functions-overview

Now, if you want to dive right in, you’re in the right place.

 

ADVANTAGES:

 

  1. Unlike other solutions, using Azure Functions lets you enjoy the benefits of a serverless setup. Functions are perfectly designed to run without you managing a server, and to integrate with and monitor jobs that run within CE.
  2. Connection to CE can be made by referencing the core SDK libraries using NuGet.
  3. It consumes fewer resources to run, without having to use custom entities in CE to configure the scheduler.
  4. Easy management of the functions you set up. You can enable or disable them as and when required with a single click.
  5. Detailed logging of the successes and failures of the workflows being executed on a schedule.
  6. Handles bulk jobs with a function timeout of 10 minutes. (How cool is that!)
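
(For context on that last point: on the Consumption plan the default timeout is 5 minutes, and you can raise it to the 10-minute maximum by setting "functionTimeout": "00:10:00" in the function app’s host.json.)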

 

PRE-REQUISITES:

 

This list is surprisingly not long. All you need to set this up successfully is an Azure subscription, or a free Azure trial account to give it a go.

 

STEPS:

 

  1. Log in to your Azure account at https://portal.azure.com. You will see your Dashboard on the home screen.
  2. Click on the ‘Create a resource’ option, located in the upper left-hand corner of the page.
  3. Type ‘Function App’ into the search box that appears, enter all the required values and click on Create. Once the function app starts deploying, wait for the Deployment Succeeded message to appear in your notifications.

  4. Open the app that you just created and create a new function within it. Make sure you select the ‘Timer Trigger’ type while you create it, as shown below.

  5. Set a schedule for the timer using a CRON expression, which is displayed under the Integrate section of the function. The format of this expression is {second} {minute} {hour} {day} {month} {day-of-week}.

I have set the timer expression to 0 */5 * * * *, which means the function will run every 5 minutes. To learn more about the different timer settings, refer to this link – https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-timer.
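
As further examples of the same six-field format, 0 0 8 * * 1-5 would fire at 08:00 on weekdays only, and 0 0 */2 * * * would fire at the top of every second hour.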

 

  6. Connect to Dynamics CE by referencing the core SDK assemblies using NuGet. Go to the Platform features tab of the function app and click on App Service Editor. This will open up all the files in the folder in a new window. Create a new file called ‘project.json’ within the same function folder, and use the following code snippet to pull in the CE SDK assemblies.

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.CrmSdk.CoreAssemblies": "8.2.0.2"
      }
    }
  }
}

 

 

  7. We will then add configuration parameters in the Application settings of the function app, for the C# code to use. These parameters include the CRM instance URL you are connecting to, the appropriate credentials for the connection, and the actual name of the workflow that needs to run on the schedule.
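
For the snippet below, that means five Application settings read via ConfigurationManager.AppSettings: CRMinstance, CRMusername, CRMpassword, CRMworkflow and CRMFetchString – the last one holding the FetchXML that selects which records the workflow should be executed against.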

  8. Now, we add the following piece of code, which triggers the workflow specified in the configuration parameters using the credentials mentioned in the step above.

using System.Net;
using System.Configuration;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Query;

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    // Build the service configuration from the instance URL in Application settings
    IServiceManagement<IOrganizationService> orgServiceManagement =
        ServiceConfigurationFactory.CreateManagement<IOrganizationService>(
            new Uri(ConfigurationManager.AppSettings["CRMinstance"]));

    // Connect to the CRM instance
    AuthenticationCredentials authCredentials = new AuthenticationCredentials();
    authCredentials.ClientCredentials.UserName.UserName = ConfigurationManager.AppSettings["CRMusername"];
    authCredentials.ClientCredentials.UserName.Password = ConfigurationManager.AppSettings["CRMpassword"];
    AuthenticationCredentials tokenCredentials = orgServiceManagement.Authenticate(authCredentials);

    // Retrieve the service
    IOrganizationService service = new OrganizationServiceProxy(orgServiceManagement, tokenCredentials.SecurityTokenResponse);

    // Get the GUID of the workflow to run from the workflow name
    QueryExpression objQueryExpression = new QueryExpression("workflow");
    objQueryExpression.ColumnSet = new ColumnSet(true);
    objQueryExpression.Criteria.AddCondition(new ConditionExpression("name", ConditionOperator.Equal, ConfigurationManager.AppSettings["CRMworkflow"]));
    objQueryExpression.Criteria.AddCondition(new ConditionExpression("parentworkflowid", ConditionOperator.Null));
    EntityCollection entColWorkflows = service.RetrieveMultiple(objQueryExpression);

    if (entColWorkflows != null && entColWorkflows.Entities.Count > 0)
    {
        Guid workflowGuid = entColWorkflows.Entities[0].Id;
        if (workflowGuid != Guid.Empty)
        {
            // Get the FetchXML string from configuration
            string entitySetting = ConfigurationManager.AppSettings["CRMFetchString"];
            FetchExpression fetchRecords = new FetchExpression(entitySetting);

            EntityCollection recordsCollection = service.RetrieveMultiple(fetchRecords);
            if (recordsCollection.Entities.Count > 0)
            {
                log.Info($"Records fetched: {recordsCollection.Entities.Count} at {DateTime.Now}");
                foreach (Entity e in recordsCollection.Entities)
                {
                    ExecuteWorkflowRequest request = new ExecuteWorkflowRequest()
                    {
                        WorkflowId = workflowGuid,
                        EntityId = e.Id
                    };

                    // Execute the workflow against this record
                    service.Execute(request);
                    log.Info($"Executed workflow successfully: {DateTime.Now}");
                }
            }
        }
    }

    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
}

 

  9. You can test run the C# code you added in the step above to make sure there are no errors.

 

  10. The function is enabled by default, and it can be disabled at any time by clicking the enabled/disabled toggle button under the Manage option of the function. (I have disabled my function, which is why ‘(disabled)’ is prefixed to my function name.)

 

  11. The ‘Monitor’ option of the function allows you to check the successes and failures of the function, including the detailed logging included in the code.

 

 

And that is all! Your Azure Function will keep running the specified workflow until you disable it.

My Quest to D365 Saturday Stockholm

Recently I attended the Dynamics 365 Saturday event in Stockholm and I have to say, what an excellent event. I had never been to Stockholm, so I was already massively excited about this. I also got to meet a load of new people for the first time, which was AMAZING!

These events are so important for the community because they are often the only opportunity some community members really have to interact with other customers, partners, ISVs and Microsoft employees. I love running into people who have encountered the same issues as I have; that way I know I’m not going bonkers and we can work on a solution together.

The crowd was great! There were many enthusiastic people in the audience who were really getting involved in the sessions, looking for information and really testing all of the speakers’ knowledge. You can find the list of sessions and speakers HERE.

A big reason I really enjoyed this event was the different layers and levels of content being shared across the sessions. The sessions were split into three tracks, these being:

Applications (Dynamics 365 CE), Dev (Dynamics 365 CE) and Business & Project Management. This gave participants the opportunity to stick to a single, themed track or weave between tracks, which is pretty much how my experience went. I went to at least one session from each track, as I wanted to get a flavour of everything. I also got to see some wizardry from folks like Julie Yack, Gus Gonzalez, Nick Doelman and Gustaf Westerlund.

Other presenters and panellists included Sara Lagerquist, Jonas Rapp, Fredrik Niederud, Katherine Hogseth, Mark Smith and Antti Pajunen. Each delivering some amazing content based on their experiences with Dynamics 365 and Power Platform.

There was a plethora of information and content being shared between speakers and passionate attendees. Everything from Microsoft Portals and Social Engagement to developing your own XrmToolBox tools (Careful with the spelling here….HAHA) was being talked about. I personally got involved in a number of Power Platform conversations, which suited me just fine because that’s kinda what I’m doing at the moment.

I had the pleasure of running 2 sessions: one in the Application track and one in the Dev track (I am no developer… don’t judge me). The 2 sessions were:

  1. Power Platform – How it all fits together (Download Here)
  2. Building your first Canvas App with CDS, and Azure (Download Here)

Apparently people don’t like the first 2 rows… great crowd though! Thanks to everyone who attended. Try to work out what Mark is doing in the background there! HAHA!

The first presentation focused on the different elements of the Power Platform and the way it all works together. Many Dynamics 365 users worry a bit about this because it seems so large and complicated, but it really isn’t once you have wrapped your head around the different technologies. To highlight the way the different elements of the technology work together, I included a Roadside Assist demonstration that was created during the PowerApps & CDS Hackathon that Those Dynamics Guys and Hitachi Solutions Europe hosted together.

My second presentation covered some of the “Do’s and Don’ts” around building your first Canvas App with your customer. I followed the presentation with the following:

  1. Adding several fields to a custom entity in the Common Data Service (CDS)
  2. Importing some data
  3. Creating a new canvas app
  4. Connecting the Canvas app to the CDS
  5. Adding in the Azure translation service to the app
  6. Publishing the app

The actual canvas app I created, along with the little model-driven app solution (including data), is available HERE.

The pic below gives the impression that I am about to start having a conversation with my own hand, like an invisible Muppet. Maybe a great trick for my next demo 😀

One of the BEST sessions that I have been in was the “CAGE MATCH” moderated by the one and only Julie Yack. This was EPIC fun! We were split into teams of 5 and given problems by the audience to resolve. It was a little daunting being in the presence of some of these long-time MVPs, BUT, THE SHOW MUST GO ON, so we got stuck right in. Unfortunately, the team I was in didn’t take the win 🙁 It’s cool, I am preparing my battle cards for the next one!

All in all, it was a fantastic event and a great opportunity to network with this amazing Dynamics and Power Platform community that we have all grown to know and learn from. A MASSIVE thank you to the sponsors of the event!

Also, a big thanks to all of the folks that hung out after the event and enjoyed several beverages with me. It was a great time and I’d love to do it again 🙂

Here are some more delightful images from the day 🙂 My camera skills aren’t great so I had to grab a few from social. Thanks to those who grabbed pics in my session! I hope this encourages more people to attend these events, because I genuinely gain so much from being there.

Nick Doelman Smashing his Portals Presentation

One of my favourite Finns – Antti

Julie Yack doing her presentation on Social Engagement

MORE of the awesome Julie Yack

WHAT?? ANOTHER ONE of my favourite Finns – KIMMO!

JOOONNNAASS RAPP!!! The Legend!

We were all so excited! Mark, Jonas and me 🙂

 

 

How to embed a Canvas app into a CE form, pass the record id and update the CE record.

The requirement: allow a CE user to update marketing consent, and provide guidance and logic around the process – this app is the basis for the latter.

Solution: This could be achieved using a custom web page, or possibly a Dialog (deprecated soon), but the latest recommended approach is to use a canvas app embedded within CE, so here are the steps to achieve this:

Note: this screenshot/app was to prove the process works, so it has some random fields in it; ultimately there would be a lot more to it, with extra logic.

  1. Create the connection to D365
  2. Browse to Apps and Create a new blank Canvas app
  3. Insert a new Form (Edit)
  4. Select [Data Source] and Add new, select your connection to D365 from earlier
  5. Choose the appropriate D365 environment
  6. Select the [Accounts] table and [Connect]
    • This will add some fields to the Form for you, and is where you can select the ones you want/don’t want, format or rename them, change the colours etc.
    • The blue header/footer in my example is a [Label] – white text, blue background; the colour code is #3B79B7, which matches the CE UI theme
    • Rename the Form in the left panel from Form1 to [AccountForm]

  7. Next, you need to update the Form to take an input parameter holding the ID of the CE record, which we will pass in further on. On the Form, select [Advanced], then [Item], and enter:
    • LookUp(Accounts, accountid = Param("ID"))
  8. Next, insert an Icon – [Check] – so that we can submit the changed data back to CE. Go into Advanced on the Icon and update the OnSelect to:
    • SubmitForm(AccountForm)
  9. Save and Publish the app
  10. It should now look something like this, depending on the field types you selected. I added a footer with the CE ID value displayed, to check what was passed through.

  11. Browse to https://web.powerapps.com/environments/ then select [Apps] from the left menu

  12. Click on the ellipsis of the App and make a note of the Web Link from the [Details] tab. It will look like this – https://web.powerapps.com/apps/<AppID> – make a note of this AppID for the steps below.
  13. Next come the CE components. Add a new HTML web resource and paste in the following code, replacing the <AppId> with the one you recorded previously.
    • Set the width and height to your App sizes, taken from the App settings page.

  14. Open the [Account] entity form, add a new Tab, insert a new Web Resource onto it, select the web resource that you created in the previous step and set the following parameters;
    • Display Label on Form = false
    • Number of rows = adjust depending on the size of the App
    • Scrolling = As necessary
    • Display Border = false
  15. Save and Publish
  16. It should look like this once it’s finished.

 

Where does CE PSA fit if I have Finance and Operations?

Updated last: 23/12/2018

This is a live blog post that will be updated with changes that are applied to the application – I’ll also update it with input from the community too. 

Right, I thought it’d be best to write a quick post on this topic, as it’s a question I receive quite regularly, along the lines of… “Hey Will, I see you’ve been working on Customer Engagement PSA – I don’t really understand how that would fit in with an organisation that has a Finance and Operations system, or at all.” Then I take a deep breath and I say something along these lines…

(There are a few versions of this response depending on what the business does.)

PSA flow:

What we must remember is that PSA is there, ultimately, to help the prospect-to-cash process. But hey, we hear and read “Prospect to Cash” thrown around a lot and it doesn’t help explain anything, so what I mean by this is as follows:

  1. The ability to turn someone you may have been in contact with into a Lead
  2. Then qualify said Lead to an Opportunity
    1. During the opportunity process you will, hopefully, start creating a proposal. To provide as precise a quote as you can, it is best to create a project with a thorough work breakdown structure along with associated costs (expenses, role costs etc.), then import this structure along with its costs into the contract to provide a quote.
  3. Submit the quote to the customer and hopefully mark it as won – or you may have to create another until you ultimately, hopefully, win
  4. The quote then turns into an Order/Contract with an associated project, and all this richness can then be synced across to Finance and Operations – the contract will be pulled across along with the associated project details: project name, associated project contract, actual start date, work breakdown structure (if you’ve assigned resources then these can be brought across too) etc.

Where to place your personnel in a PSA & FinOps stack implementation:

Now, the more interesting piece is where you ask your employees to enter their time and expenses, where you ask the Project Manager to carry out their tasks and where you ask the Resourcing Manager to sit.

Now we must remember: PSA IS NOT A FINANCE SYSTEM, IT IS NOT TRYING TO BE A FINANCE SYSTEM, and ITS PURPOSE IS NOT TO DEAL WITH ANYTHING RELATED TO ACCOUNTING AND FINANCE. Its purpose is to provide a buffer between account management and back-office functions such as the accounts department, and to provide more granularity for items such as quoting (remember, this is from the perspective of an implementation where Finance & Operations exists).

However, what it does do well is provide the ability to price up quotes thoroughly, thanks to its project creation functionality, and it also performs well at some project processes that can then be handed over for further processing.

Now let’s take a quick dive into where to place the Project Managers, Employees and Resourcing Managers.

Employees – now, personally, as an employee I prefer the user interface in CE for entering timesheets and expenses to the one in Finance and Operations – it is more aesthetically pleasing. However, there are limitations around expenses – there are no expense policies out of the box, so these would need to be provided via customisation.

That goes for other workflow requirements too – and let’s face it, expense workflows (from my experience implementing systems, especially global ones) can be incredibly complex – which are better suited to Finance and Operations, as PSA only allows one-level approval when in reality multi-level approvals and conditions are required.

PSA does have the ability to bring in the hours you entered last week, or the appointments/projects you’ve been assigned in the resource scheduler, but Finance and Operations allows this too.

What I’m getting at here is that it is best to stick with Finance and Operations, and if you wish to make the user interface kinder on the eyes, then use the mobile application functionality or throw together a PowerApp.

Resourcing Manager – now this is where I lean towards PSA. As long as you sync proficiency models, skills, characteristics, roles, cost prices, sales prices etc. between Finance and Operations and CE PSA (or, if your company is using Talent, then have a network of the three: Talent > PSA > FinOps), I much prefer the Schedule Board within PSA and the way you submit requests to be fulfilled. Look at the screenshot below and how glorious it is – colours, pictures, charts – PSA has it all (you can even use the map functionality – living the dream)!

Project Manager – now this depends on the organisation. PSA allows the PM to manage their project team, monitor cost absorption (effort tracking as well), look at project estimates and submit resourcing requests (all of which also exists within Finance and Operations) – but if you want your PM to also invoice clients and perform a more advanced level of WIP adjustment, then this role will suit Finance and Operations.

Also, the dashboards are not that brilliant in PSA – yes, you can use the Power BI embedded functionality, but Finance and Operations has brilliant out-of-the-box reports, as well as enhanced areas such as the Project Manager Workspace (which provides an overview of the PM’s project-related activities and allows them to initiate their most frequent tasks) and Power BI integration – soooooo…..

General finance points related to PSA functionality: PSA does let you push through flexible journals, you can export actuals (or integrate them), you can adjust actuals (and view adjustment histories), and you can invoice through funding sources and billing rules (not as advanced as Finance and Operations) set out on the project contract.

It is important to note that there is no out-of-the-box functionality to tie Purchase Orders to projects, so these are not wrapped up and summed into items such as cost consumption etc. A journal can be used for this in the meantime, but creating the PO in FinOps and then pushing that across as a journal to keep track in PSA may be one route (depending on whether your PMs sit there; if not, it really does not matter). Furthermore, there is no commitment or encumbrance accounting to keep track of the financial health of a project with regard to Purchase Orders.

Another key part of project management is budget control. Unfortunately there is no budget control within PSA, only a cost consumption meter, so this will have to be validated/tracked through Finance & Operations – and the validation will only occur post-transaction if you choose to leave T&E within PSA (not a wise move).

Conclusion:

So let’s conclude – PSA DOES HAVE A FIT within the full suite of Dynamics 365, for organisations that use both CE and Finance and Operations, if it is used for its intended purpose – which in my eyes is to assist with quoting proposals and with some of the non-accounting project processes, allowing that smooth transition from sales to delivery.

And one more thing….. if the company DOES NOT have Finance and Operations, but another accounting system that does not include project management, and they also require a sales system, then PSA is a great fit!!!!

 

MICROSOFT FLOW BASICS AND LIMITATIONS WHEN WORKING WITH DYNAMICS 365

In this post I will be covering some Microsoft Flow basics and limitations when working with Dynamics 365. This will help you determine which Flow plan and/or connector suits your needs best.

Connecting to your Dynamics 365 instance

First, let’s look at the connectors for Dynamics 365. You have two options when it comes to connecting to a D365 instance.

  1. Dynamics 365 connector

D365Connector

The Dynamics 365 connector provides limited access to the Dynamics 365 organisation.

For more info on trigger events and actions please visit: https://docs.microsoft.com/en-us/connectors/dynamicscrmonline/

  2. Common Data Service (CDS) connector

CDSConnector

Provides access to the org-based database on the Microsoft Common Data Service.

For more info on trigger events and actions please visit: https://docs.microsoft.com/en-us/connectors/runtimeservice/

Now let’s do a side-by-side comparison of some of the notable features:

| Feature | Dynamics 365 Connector | CDS Connector |
| --- | --- | --- |
| Trigger Flow on create | Available | Available |
| Trigger Flow on updates | Available | Available |
| Trigger Flow on specific attribute updates | Not available – limited to record-level updates only* | Available |
| Change Tracking limitations | Requires Change Tracking to be enabled in D365 | Change Tracking is not required |
| Define level of scope for the Flow trigger | Not available – limited to Organisation level only | Available – Organisation, Parent: Child Business Unit, Business Unit or User level |
| Trigger Flow on deletes | Available | Available |
| Manually trigger when a flow is selected | Not available | Available |
| Action: Create Note (annotation) for a specified entity record | Manual | Special simplified action is available |
| Action: Retrieve all Notes (annotations) for the provided entity Id | Manual | Special simplified action is available |
| Action: Retrieve file content for a specified Note (annotation) | Manual | Special simplified action is available |
| Connector Type | Standard | Premium (only available in Flow Plan 1 and 2) |

* Which means you will have to take extra measures if you have to update the triggering record within the same flow, to stop the flow from triggering infinitely.

Triggers

Let’s have a look at the trigger event screens of each connector. I have selected the “When a record is updated” trigger event for the screenshots.

Dynamics 365 connector:

D365Trigger

CDS Connector:

CDSTrigger

The CDS connector gives you the option to select the Scope for event triggers. Scope can be set to Organisation, Parent: Child Business Unit, Business Unit or User level. This is similar to the native workflow engine in D365.

In addition to the scope, you will also have the option to select attribute filters. Attribute filters ensure the event trigger is only invoked when the specified attributes are updated.

Points to consider when using update triggers:

  • Update event triggers are invoked on update requests to the record. Event triggers do NOT check whether any attribute values have actually changed; as long as the update request is successful, the Flow will be triggered.

What does this mean?

For update triggers at record level, the flow will still be invoked even if the update request has not made any changes to the record (applies to both the D365 connector and the CDS connector).

For update triggers with attribute filters, the flow will be invoked even if the update request sets the attribute to its existing value (applies to the CDS connector).

Flow Plans

Now that we have covered triggers and actions, let’s have a look at Flow plans. Currently Flow offers three plans.

| Flow Free | Flow Plan 1 | Flow Plan 2 |
| --- | --- | --- |
| 750 runs per month | 4,500 runs per month | 15,000 runs per month |
| Unlimited flow creation | Unlimited flow creation | Unlimited flow creation |
| 15-minute checks | 3-minute checks | 1-minute checks |
| | Premium connectors | Premium connectors |
| | | Org policy settings |
| | | Business process flows |

You can check out the Microsoft Flow Plans page for more information.

Limits and configuration in Microsoft Flow

Documentation from Microsoft provides more information on current request limits, run duration and retention, looping and debatching limits, definition limits, SharePoint limits and IP address configuration.

For current limits and configuration details please visit Microsoft Docs here.

There are also some limitations in the Flow designer UI compared to the native workflow designer in D365, one of them being the ability to design grouped conditional statements. Currently, Flow does not allow grouped conditions to be configured in basic mode, which means you will have to use advanced mode to build your conditional statements. I have noticed that Logic Apps has already added the ability to group conditional statements in the basic designer, and hopefully this is on the roadmap for Flow too.

Flow:

FlowCondition

LogicApps:

LogicAppsCondition

Even with these limitations, Flow offers a lot more than the native D365 workflow engine.

You can check out the Microsoft Flow Documentation page for more information and how-to guides.

I would also highly recommend watching the “What the Flow” vlog series by Elaiza if you wish to learn more about Flow and how to transition from native D365 workflows to Flow.

Two-way Azure Plugin Walkthrough

My first blog, so go gentle 😀

Summary:

When you’re calling external integration services and you’re using an enterprise service bus, you need to send a message with your CRM data to the service bus (obvs!). But in order to use the built-in behaviour of a CRM plugin (rollback in particular), you’re going to need to run the plugin synchronously so that the user knows a problem has occurred. If you use the OOB plugin, it only works asynchronously. Luckily, service endpoints are baked into the CRM infrastructure, so we can get a leg up on passing the remote execution context to a listener. The listener can then call the integration service and, should it fail, return a value to the plugin so it can respond accordingly.
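
If you want a feel for the shape of the plugin before diving into the walkthrough, here is a bare-bones sketch. It assumes the service endpoint has already been registered with the two-way contract via the Plugin Registration Tool; the endpoint id and the “FAILED” reply convention are illustrative placeholders, not values from the walkthrough.

// Sketch of a synchronous plugin that posts the execution context to a
// two-way service endpoint and rolls back if the listener reports a failure.
using System;
using Microsoft.Xrm.Sdk;

public class TwoWayServiceBusPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var notificationService = (IServiceEndpointNotificationService)
            serviceProvider.GetService(typeof(IServiceEndpointNotificationService));

        // Post the remote execution context to the registered service endpoint and
        // block for the listener's reply (this is what the two-way contract gives us).
        var endpointId = new Guid("<service-endpoint-id>"); // placeholder
        string response = notificationService.Execute(
            new EntityReference("serviceendpoint", endpointId), context);

        // Because the plugin runs synchronously, throwing here rolls the
        // transaction back and surfaces the problem to the user.
        if (string.Equals(response, "FAILED", StringComparison.OrdinalIgnoreCase))
        {
            throw new InvalidPluginExecutionException(
                "The integration service reported a failure; the operation has been rolled back.");
        }
    }
}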

The blog is a technical walkthrough of the steps to accomplish it. Enjoy!

https://dynamicsjourney.wordpress.com/2018/09/21/two-way-azure-plugin-walkthrough/

D365 Social Analytics Solution

As promised! Demoed at our D365 Saturday Summer Boot Camp session on replacing Dynamics workflows with Flow.

This solution gathers tweets matching a specified hashtag, saving them into a custom entity in Dynamics. A second flow then uses the Cognitive Services API to extract useful information from the tweets, such as sentiment and key phrases, and also translates the tweet if it’s not in English. This blog post contains the two flows, as well as the solution used in Dynamics, with brief instructions on how to put everything back together.

 

Dynamics solution

Contains a Social Analytics custom entity with some magic in the background!

Unmanaged.zip

Managed.zip

Flows

GetTweets.zip (gets tweets matching hashtags and creates records in the Social Analytics entity)

DynamicsSocialAnalyticsV2.zip (on create of a record in the Social Analytics entity, uses the Text Analytics API to get sentiment, a translation if not English and key phrases, and updates these back into Dynamics)

Setup

Install the unmanaged or managed solution into your instance, whichever floats your boat 🙂.

Text Analytics API Key

You will need an Azure subscription with a Cognitive Services Text Analytics API service. You can get a trial API key with 5,000 executions for 7 days (a free Azure subscription will not limit you to the 7 days). Go to https://azure.microsoft.com/en-gb/try/cognitive-services/?api=text-analytics

Make sure Text Analytics is selected and hit Get API Key – choose Guest and get started.

You should eventually end up with your API key and endpoint, as shown in the image below; these will be needed later on in the flow.

Social Analytics Flow

Head to your Flow environment at https://flow.microsoft.com. Go to My Flows; you should see Import in the top right – hit that. Upload and import the flow DynamicsSocialAnalyticsV2.zip. You will need to fix the connections to your Dynamics instance. For Text Analytics, select “Select during Import”, create a new connection, search for Text Analytics, select it and enter one of your keys and your endpoint URL. Come back to the import screen, refresh the list and select your new Text Analytics connector. Do the same for the Translator connection, named “Microsoft Translator” – you shouldn’t need an API key for that. Once all the connectors have been fixed, import the flow.

Once complete, you should be able to see the Dynamics Social Analytics flow. Edit the flow and point both the Dynamics trigger at the start and the update action all the way at the bottom to your instance by clearing out the org name, selecting yours and then selecting the Social Analytics entity provided in the installed solution.

Before

After

Get Tweets Flow

Import the GetTweets.zip flow. Fix the connections again by adding your Twitter and Dynamics connections. After upload, you will need to fix the create Dynamics record action at the bottom of the flow, as before. Replace #D365Saturday with your favourite hashtag and Bob’s your uncle – you can duplicate this flow if you wish to track multiple hashtags.