Flow, HTTP Actions, and Files

I am working on a new presentation sample project and I wanted to test invoking an HTTP request from a Flow. Specifically, I want to invoke a Function App from a Flow using an HTTP Flow Action. In my sample, I will kick this off when a new Note is created with an Attachment.

To quickly test calling the HTTP Action, I used an existing Function App sample that I had worked on a few weeks ago: a small Function App that I put together to test populating a PDF template using CRM data.

Poking around with Function Apps

This sample is creatively named CRMToPDF because it retrieves a record from CRM and populates a fillable PDF form from the CRM record using iText, returning the updated PDF for download. Pretty simple in terms of code, but it was a nice proof of concept testing the iText libraries (more on that in another post!).

Since this Function App returns the PDF file as the response, I was curious as to how Flow would handle it. Could I “download” a file from a URL in Flow and attach it to an email using the Outlook email Action?

You bet I can!

With a few short steps, I was able to grab the resulting file and attach it to an email. This isn’t a huge surprise since so many Flow connectors already deal with moving files around. But it surprised me how simple it was to accomplish what I wanted to do.

So the Flow I created is really simple:

  • Trigger on a new Note
  • Invoke an HTTP Flow Action
  • Email the resulting PDF to myself

Here are the initial Flow steps, a Trigger and HTTP Action:

Trigger on new Note record, call HTTP GET

The trigger is on ALL Notes, so this would definitely change in the real world. And the HTTP GET only includes my Function App Authorization key. In a real example, I would pass in additional parameters, such as the Note ID or Object ID, as an additional query string parameter or as part of the request Body.

The Outlook Email Action looks like this:

New Email using the HTTP Response

You can see that this Action is pretty straightforward. It’s just an email to myself from the Owner of the Note. In Dynamics 365 CE, this means that the System User had to allow sending emails on their behalf, which is just a value under Personalization Settings. I just filled in a few bits of other info, like the Body and Subject.

The important part for me here is setting up the Attachment: set the Attachment Name to “CRM2PDF.pdf” and the Attachment Content to the Body of the HTTP Response.
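In expression form, the Attachment Content is just the HTTP action’s body (a minimal sketch, assuming the action keeps its default name of ‘HTTP’):

body('HTTP')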

That’s it. Yep. That’s all!

I first started looking at Flow a bit last year and wrote a short post about moving a document from Dynamics 365 CE to SharePoint: Flow Examples: Note attachment to SharePoint. That turned out to be relatively straightforward and a really cool Flow, but it had a few quirks, like converting the Note document body using the base64ToBinary function.
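That conversion looked something like this (a sketch; ‘documentbody’ is the base64-encoded attachment field on the D365 Note entity):

base64ToBinary(triggerBody()?['documentbody'])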

When I started looking at this sample, I expected some similar required steps, but setting the Body as the Attachment Content just worked. I put this entire Flow together in about 15 minutes, and it worked on the first try! (As a developer, I NEVER expect it to work on the first try!)

This tells me that the Flow engine is aware of the content type being returned by the HTTP GET and can handle it properly when moving between Actions: the file flows straight from the source HTTP Action to the next Outlook email Action.

That sounds like another obvious comment, but it makes me happy as a developer not having to do any kind of manipulation or parsing or other coding magic! For an idea of what is being returned from the HTTP action, we can look at the Flow Test logs for the HTTP GET Action and open the Outputs:
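The Outputs look something like this (an illustrative, trimmed example; note the $content-type/$content pair Flow uses to represent binary content – your headers and values will differ):

{
  "statusCode": 200,
  "headers": {
    "Content-Disposition": "attachment; filename=CRM2PDF.pdf",
    "Content-Type": "application/pdf"
  },
  "body": {
    "$content-type": "application/pdf",
    "$content": "JVBERi0xLjcK..."
  }
}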

This isn’t super complex JSON for most developers: HTTP response code, several headers, the filename, etc. But for non-developers, this could present an impossible roadblock. With the Flow designer and its huge library of existing Actions, a non-developer can point their Flow at a service endpoint and move files about without a single line of code.

That’s some powerful stuff.

Microsoft Flow – How to move SharePoint Online list items to folders

Let’s say you have a SharePoint list with folders organized by continent, as well as a choice column holding the continent name. You want all list items to be moved automatically to the folder for the chosen continent.

The first idea would be to use Microsoft Flow to do that, but there is no out-of-the-box action that moves list items to folders. At the time of writing this post, there is an action to move files but nothing for list items. Even looking at the SharePoint REST API through the ‘Send an HTTP request to SharePoint’ action, there is no endpoint dedicated to moving list items to a folder.

This is not clear in the Microsoft documentation, but the same REST API endpoint and action that is available for files can actually be used for list items, through the item’s file property. This post shows how to use the SharePoint REST API to move items with Microsoft Flow, so that as soon as an item is created or edited, it is moved to the right location.

After being triggered, this Flow will basically:

1. Get the current list root folder path

2. Get the current item file name and current path

3. Build the new file path

4. Check whether the path is actually new and, if so, move the item

To better understand the following steps, some knowledge of the SharePoint REST API and Microsoft Flow expressions is helpful.

To start building the flow, the trigger used is ‘When an item is created or modified’.

In this example, variables are used to store the list title, destination folder name and site URL. Because the same values will be used in different connectors, it is better practice not to leave them hard-coded in the actions. For example, if this Flow is exported to another environment or copied to be used on a different list, the changes will be minimal (only the initial variables and the trigger) instead of updating a bunch of connectors with hard-coded values.

The next step is to get the list folder URL from the SharePoint REST API, using the ‘Send an HTTP request to SharePoint’ action with the GET method. The action was renamed to ‘GetRootFolder’ so it is easier to reference its output later in expressions. All the variable actions in this example are renamed as well, just for easier maintenance.
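As a sketch, the action’s fields would look something like this (‘<ListTitle>’ stands for the list title variable; the Accept header requests the verbose OData format used by the expressions below):

Method: GET
Uri: _api/web/lists/getByTitle('<ListTitle>')/RootFolder
Headers: Accept: application/json;odata=verbose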

After this action, a new variable is initialized to store the list root folder URL with the following formula (expressions can access the JSON content returned by the SharePoint REST API; with the body function and the action name you can reach into the response):

body('GetRootFolder')['d']['ServerRelativeUrl']

The next action gets the current list item’s file server-relative URL using the SharePoint REST API, as was done for the list root folder. In this case, the REST endpoint needs to be told explicitly to load the ‘FileRef’ and ‘FileLeafRef’ properties.
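Again as a sketch (‘<ItemID>’ comes from the trigger output):

Method: GET
Uri: _api/web/lists/getByTitle('<ListTitle>')/items(<ItemID>)?$select=FileRef,FileLeafRef
Headers: Accept: application/json;odata=verbose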

The following variables are used to build the new list item folder URL, so the formulas below are used to assign values to the proper variables:

Item file name (Set with current file name only, used to build final destination path):

body('GetItem')['d']['FileLeafRef']

Item Url (Set with current item server relative Url, using the expression):

body('GetItem')['d']['FileRef']

New item Url (Root Folder/New Folder/Item File Name):

concat(variables('RootFolder'),'/',variables('MoveToFolderName'),'/',variables('ItemFileName'))

Then the ‘Send an HTTP request to SharePoint’ action is used again, to call the Files REST endpoint and move the item to the new location (yes, the same endpoint used for files).

First, the Flow validates that the file’s new folder path is different from the current one:
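In the condition step this can be expressed as (a sketch, assuming the variables are named ‘ItemUrl’ and ‘NewItemUrl’):

@not(equals(variables('ItemUrl'), variables('NewItemUrl')))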

If it is, a call to the File/moveTo method is made to perform the move:

This time a POST request is sent, because this call actually requests a change on the endpoint, which takes care of the item move (the action was also renamed, to ‘MoveItem’).
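The call itself looks roughly like this (‘<ItemUrl>’ and ‘<NewItemUrl>’ stand for the current and new item URL variables; flags=1 allows overwriting a file that already exists at the destination):

Method: POST
Uri: _api/web/getFileByServerRelativeUrl('<ItemUrl>')/moveTo(newUrl='<NewItemUrl>',flags=1)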

After this last action is called, the item is moved to the proper folder.

The final flow will look like this:

With this flow, regardless of where the items are saved, they will be moved to the proper folder after being created/edited.

Hack4Good – My First Hackathon

Hack4Good Group Photo

TL;DR

I’ll warn you – this is a long read! To summarise though – this Community is beyond awesome and the Hack4Good event just proved that we can genuinely change the world.

The Hype

When TDG announced that there was to be a hackathon in London, with the focus being the Non-Profit/Charity sector, I was straight in there on the registration (after which Mrs H was informed that I was booked in – easier to ask forgiveness than seek permission).

This was to be my first ever hackathon (a year ago I hadn’t even HEARD of hackathons), and it ticked so many boxes for me. For those who don’t know, it’s not hacking in the sense of breaking into systems – this is all about using the software and platforms at your disposal to hack together a solution to a scenario within a given time limit. The most innovative, practical, deliverable, and potential-filled solution would win the day.

When the emails started to come out Chris asked (in typical CAPS LOCK STYLE) if I would lead a team. Me being me, I jumped at the chance – in for a penny, in for a pound.

And so the excitement began. Weeks turned into days, and my poor family and friends got fed up of hearing how stoked I was. When I saw the list of other team leaders, and saw the people who were on my team, I started to question my credentials. There were so many legends of the community involved – people I look up to, and follow with eagerness and anticipation.

The Buildup

At 5:30am on Saturday 16th February, loaded with snacks and tech, I headed towards the railway station. Nerves meeting with excitement, doubts meeting determination.

Arriving just before 8am I was struck by just how, on first impressions, the Microsoft Reactor in London is a strange space. Fully stocked drinks area, with stereotypical caffeine overload available, games area, and then a large open space with tables and a large video screen. It almost seemed spartan in its simplicity.

As everyone started to arrive, and we set up our various laptops and devices, that open space suddenly became this hive of technology and potential.

Hugs and Hellos were dished out with abandon, and cries of “It’s so good to meet you at last” were deafening in their abundance. I moved from person to person and finally got to meet people who I’d talked to online or who I’d been following for ages. I was even surprised to find people who wanted to meet me!

The Morning

With typical fervour and energy the trio of Chris Huntingford, Kyle Hill and William Dorrington (who had come over for the start despite having removal lorries outside his front door!) kicked off the day.

A surprise video message from James Phillips, Corporate Vice-President of Microsoft, impressed upon all of us just how much the community is noticed by Microsoft and raised the expectations of all in the room another notch. If our dials were at 11 before that video, they were at 12 afterwards – and even Spinal Tap didn’t get to 12!

I’ll be honest at this point and admit that I can’t remember who presented exactly what and when – my mind was a maelstrom of ideas and planning.

The engaging Architect and Storyteller Alex Rijnoveanu (@WomanVsTech) from Microsoft delivered enthusiasm and encouragement.

The very funny, and trying-not-to-be-as-sweary, Sarah Critchley (@crmcat) presented in a way that only she could – with an idea about helping out stray cats using PowerApps and other bits.

m-hance presented alongside Solent Mind, and I related to what they did in a huge way because of the work I see in my day job at St. Andrew’s Healthcare. It was a sobering presentation in many ways, but it also opened our eyes to “the art of the possible”.

Saurabh Pant and Sameer Bhangar had flown in from Microsoft (yes, all the way from Seattle) just for this event and then threw away their planned roadmap presentation to give us all a major pep talk and stir us up even more. I have to say that the funniest thing was their very friendly (and also slightly sweary) rant about how much they had heard about Samit Saini in the past year! In so doing, it just served to show us all what was possible – those who knew Samit’s journey smiled and laughed, and those who didn’t had their eyes opened to a new level of potential.

Quantiq presented some of the work they had done with the Leonard Cheshire charity and also gave a glimpse of their toolkit for healthcare, and the ideas kept flowing. As I looked around at the other teams I could see people taking notes, typing away, and whispering to each other. This hackathon was going to be competitive, but boy was it going to deliver some amazing results.

I’ll apologise now to all the presenters as I haven’t done you justice in my few words, and I may have mangled your presentations up, but believe me when I say that all the presentations hit home with all of us listening. Those presentations took our plans, determination, and enthusiasm up to levels you just wouldn’t believe if you weren’t there!

Let The Hacking Commence

With a final presentation to lay down the rules of engagement, and also to make it clear that stopping for lunch was most definitely not an option, the starter’s gun was fired and the 4.5 hours of planning, building, and preparing began.

The buzz in the room was electric as each team discussed and planned out their scenario, then grabbed whiteboards and white space to map out what a solution could look like.

I’ll be writing more about the Team White proposal in the coming days, as there is more to come from that, but we settled on a solution that would utilise so much of the stack but would be able to be modularised and deployed as a “solution-in-a-box” approach.

With my amazing team of Penny, Josh, Denis and Raj, we set about building Microsoft Forms, PowerApps, Dynamics 365 solutions, Flows, and the concept of the HoloLens. Oh yes, Gadget King Raj had brought us a HoloLens – and that just expanded the possibilities for us. We weren’t looking at gimmicks and tech-for-tech’s-sake; we were looking at a genuinely life-changing solution using some amazing software and hardware.

With a soundtrack of some amazing 80’s rock being pumped out (and yes, thanks Chris for Rickrolling us!), everyone was doing something. If you could have harnessed the energy in that room at that point you would have been able to power half of London.

Floor walkers popped by each of the teams, each one listening and absorbing before offering advice, help, suggestions and more – but what was even more amazing was that the teams were all talking to each other. You read that right: the teams all talked to each other.

There was sharing of scenarios, encouragement, suggestions for improvement or additions, and helping hands. This was a competition that was like no other. This was a competition in which we ALL wanted to see every team achieve their goals. I’m a mildly (ok, seriously) competitive person at times and yet there was no sense of barging past each other to reach the finish line. This was collaboration and cooperation in competition towards a common goal.

The Winners

And with 4 and a half hours gone in the blink of an eye, the race was run. It was time to do the 5(ish) minute speed-dating presentation of the solutions.

As each team stepped up and shared I really do not know how I held it together. These were genuine scenarios, delivered with innovative solutions, and by passionate people.

Every last one.

We all watched, applauded and cheered. None of us could separate the competition. Judging was going to be tough, and so it proved.

With our hosts waffling as much as possible whilst the judges adjudicated, we all sat wondering just who it would be. We all wanted to win, but we all knew that whoever did win was fully deserving of it.

With the decision made, the announcement came that Team Grey (who had flown over from Germany to take part!) had won, with an app for rounding up as you ordered food or transport and donating the difference to your charity of choice. Writing that makes it sound simplistic, but if you think through the implications you soon realise that it has massive potential.

It Is NOT Over!

The final speeches and thank-yous were made, the applause leaving hands feeling rather raw and sore, but this isn’t the end. Every proposition in the room has legs, and every person in the room knew that this couldn’t stop just because the clock had run down.

Saturday saw the start of something, the spark that starts a fire. We all felt it, and reading all the posts on Twitter and LinkedIn after the event just reaffirms that determination.

We saw not a glimpse, but rather a bright shining beacon of the power of the community. I go on and on (and on) about Community but what happened in that room on Saturday, with just a part of the enthusiastic and passionate community present, just proved what we can all achieve if we put our minds to it.

Here at TDG we have the Community Collaboration Portal for working on community projects together, there’s the Power Platform Bank for making solutions available, and then there are all the social media channels out there as well.

Let’s turn this spark into a raging fire of change. Let’s use our collective skills to build new solutions to old problems.

Oh, and let’s do this again real soon!

IoT Button: Process automation with Microsoft Flow using NodeMCU and Arduino IDE

In this article, an IoT button will be developed and applied to the scenario of maintaining a coffee machine using Microsoft Flow. However, it can easily be adapted to any other scenario or application.

Requirements

  • Access to Microsoft Flow or Azure Logic Apps
  • Arduino IDE
  • NodeMCU development board
  • Push Button
  • 1 x 330 Ω resistor
  • 1 x 1M Ω resistor
  • Jumpers
  • Breadboard
  • Micro USB cable

Setup Microsoft Flow Environment

1) Microsoft Flow portal

Access Microsoft Flow, log in and click “My Flows”.

2) Create from blank

Click “Create from blank” to create a new workflow.

3) Request/Response

Give a name to your Flow. Select the Trigger “Request/Response”.

4) Method GET

In “Advanced Options”, choose the Method “GET”.

5) Add an action

Click “Add an action” to add a new action.

6) Send an email

Choose the action “Office 365 Outlook – Send an email”.

7) Create Flow

Complete all required fields (as you wish), and then click “Create Flow”.

8) HTTP GET URL

Then copy and save the HTTP GET URL:

https://prod-32.westus.logic.azure.com:443/workflows/<ID>/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<ID>

Hardware Setup

1) Building a Circuit on Breadboard

Build the circuit following the Fritzing breadboard diagram (MicrosoftFlow-LogicApps-Button-Frittzing Project_bb.png).

Software

The ESP8266 NodeMCU comes with firmware that lets you program the chip with the Lua scripting language. But if you are already familiar with the Arduino way of doing things, you can also use the Arduino IDE to program the ESP. In this tutorial we’ll use the Arduino IDE.

Arduino IDE setup

1) Package ESP8266

Download the Arduino IDE and install it. Open the IDE and choose File -> Preferences; in “Additional Boards Manager URLs” insert the URL “http://arduino.esp8266.com/stable/package_esp8266com_index.json” and then click “OK”. After that, open Tools -> Board -> Boards Manager, search for the ESP8266 package and install it. Once the installation is finished, restart the IDE.

Software Setup

Download the attached file “MicrosoftFlow_IoT_JoaoLucindo.zip” and replace the following values:

  • SSID with your wireless network name
  • PASSWORD with your wireless network password
  • HOST with the part of the HTTP GET URL before “:443” (in this case, “prod-32.westus.logic.azure.com”)
  • URL with the part of the HTTP GET URL after “443” (in this case, “/workflows/<ID>/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<ID>”)

By doing that, the final code will be:

#include <ESP8266WiFi.h>
//Include the SSL client
#include <WiFiClientSecure.h>

// NodeMCU pin mapping, for reference:
//static const uint8_t D0   = 16;
//static const uint8_t D1   = 5;
//static const uint8_t D2   = 4;
//static const uint8_t D3   = 0;
//static const uint8_t D4   = 2;
//static const uint8_t D5   = 14;
//static const uint8_t D6   = 12;
//static const uint8_t D7   = 13;
//static const uint8_t D8   = 15;
//static const uint8_t D9   = 3;
//static const uint8_t D10  = 1;

int inPin = 16;  // pushbutton connected to D0 (GPIO16)
int val = 0;     // variable to store the read value

char ssid[] = "<SSID>";          // your network SSID (name)
char password[] = "<PASSWORD>";  // your network key

//Add an SSL client
WiFiClientSecure client;

void setup() {

  pinMode(inPin, INPUT);  // sets the pushbutton pin (D0) as input

  Serial.begin(115200);

  // On recent ESP8266 cores WiFiClientSecure validates certificates by
  // default; skip validation for this demo
  client.setInsecure();

  // Set WiFi to station mode and disconnect from an AP if it was
  // previously connected
  WiFi.mode(WIFI_STA);
  WiFi.disconnect();
  delay(100);

  // Attempt to connect to the WiFi network:
  Serial.print("Connecting Wifi: ");
  Serial.println(ssid);
  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) {
    Serial.print(".");
    delay(500);
  }

  Serial.println("");
  Serial.println("WiFi connected");
  Serial.println("IP address: ");
  IPAddress ip = WiFi.localIP();
  Serial.println(ip);

}

// Calls the Flow HTTP trigger over HTTPS (HOST and URL taken from your HTTP GET URL)
void MicrosoftFlow() {

  char host[] = "prod-32.westus.logic.azure.com";

  if (client.connect(host, 443)) {
    Serial.println("connected");

    String URL = "/workflows/<ID>/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<ID>";
    Serial.println(URL);

    // Send a minimal HTTP GET request to trigger the Flow
    client.println("GET " + URL + " HTTP/1.1");
    client.print("Host: "); client.println(host);
    client.println("User-Agent: arduino/1.0");
    client.println("");
  }
}

void loop() {

  val = digitalRead(inPin);  // read the pushbutton state
  delay(200);

  if (val == HIGH) {
    MicrosoftFlow();  // button pressed: trigger the Flow
    delay(1000);
    setup();          // re-initialise WiFi before the next press
  }

}

Now you can compile and upload the code from your computer to the device. After pressing the push button, the Flow is triggered and the email is sent.

Download package for this from the Power Platform Bank – direct link here: LINK

Using Eircode (Ireland Postcodes) to get Geolocation from Google Maps with Microsoft Flow

Let’s say you have a SharePoint list storing information with an Eircode column (the postal code for addresses in Ireland), and you want to use that information in Power BI later to generate a map organizing all your items by location. Unfortunately, the Power BI maps don’t work well with Eircodes, so how can we get the most precise location information?

Let’s use Microsoft Flow and Google Maps API for that!

(Sorry Microsoft for not using Bing Maps)

Before starting to build the Flow, get a Google Maps developer API key at: https://developers.google.com/maps/documentation/javascript/get-api-key

And include two new columns of type Number with automatic decimal places in the SharePoint list: Latitude and Longitude.

With those two things set up, it’s time to begin the Flow creation.

We will use the SharePoint ‘When an item is created or modified’ trigger to start our Flow. After starting the Flow creation with this trigger, add an ‘HTTP’ action. To get the latitude and longitude information, make a GET request to the Google Maps API, using the Eircode coming from the SharePoint item as the address filter:

In this case, the search is filtered to Ireland only, as you can see in the parameters.
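The request is along these lines (a sketch; ‘<Eircode>’ is the value from the SharePoint item and components=country:IE is the parameter restricting the search to Ireland):

https://maps.googleapis.com/maps/api/geocode/json?address=<Eircode>&components=country:IE&key=<YOUR_API_KEY>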

The results we get from Google Maps API are in the following JSON structure:
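Trimmed to the relevant parts, it looks something like this:

{
  "results": [
    {
      "geometry": {
        "location": {
          "lat": 53.349805,
          "lng": -6.26031
        }
      }
    }
  ],
  "status": "OK"
}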

Note we have the latitude and longitude under geometry/location. So, to manage that information easily, create two variables, ‘Latitude’ and ‘Longitude’, of type Float.

Assign the value coming from the HTTP request body to the variables using an expression for each (which will navigate through the JSON object to access the data):

Latitude: float(body('HTTP')['results'][0]['geometry']['location']['lat'])

Longitude: float(body('HTTP')['results'][0]['geometry']['location']['lng'])

Now that you have your values properly assigned, check whether either of them differs from the existing ones in SharePoint; if one is different, the item needs to be updated. Then add an ‘Update List Item’ action and set it to update your list item with the new Latitude and Longitude values to finish.

The final layout of the Flow will be the following:

Now, as soon as your items are created or updated in the SharePoint list, the information will be ready to be used in the Map visual in Power BI, with the SharePoint list as the data source.

Have fun!

Extending Conditional Operators in Flow

If you have worked with Flow conditions, you will have noticed that the Flow designer filters the list of operators you can use based on the data type of the field. However, this does not mean that you cannot use other operators on that particular field.

For instance, if you use a Date/Time field in your condition, the designer will not give you the option to select the following operators:

  • greater than
  • greater than or equal to
  • less than
  • less than or equal to
Date field basic conditional operators

However, you can use operators that are not displayed in basic mode: switch to the advanced mode to extend the Flow conditional operators.

Extend Conditions on Advanced Editor
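For example, to apply greater than to a Date/Time field you can type the expression straight into the advanced editor (a sketch; ‘new_startdate’ is a hypothetical attribute name):

@greater(triggerBody()?['new_startdate'], utcNow())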

If you save and re-open the Flow, the extended operator is now displayed in the basic mode as well.

Extended conditions when re-opening

Here are the possible operators you can use in Flow:

contains: contains(attributename,value)

does not contain: not(contains(attributename,value))

equals: equals(attributename,value)

does not equal: not(equals(attributename,value))

starts with: startsWith(attributename,value)

does not start with: not(startsWith(attributename,value))

ends with: endsWith(attributename,value)

does not end with: not(endsWith(attributename,value))

greater than: greater(attributename,value)

greater than or equal: greaterOrEquals(attributename,value)

less than: less(attributename,value)

less than or equal: lessOrEquals(attributename,value)


Bonus Content:

Most of you who have worked with Flow will have come across situations where you had to use multiple or complex conditions in your Flow implementations. One of the most time-consuming aspects I found was implementing grouped conditions. One way to achieve this is to have multiple nested condition blocks, but this can get really messy if you have lots of conditions. The recommended way is to build your filter criteria in advanced mode and combine all conditions into one Flow condition step/block, as in the sketch below. I have noticed some users having difficulties building filter criteria in advanced mode for various reasons.
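For instance, two AND-ed conditions grouped inside an OR can be combined into a single condition block like this (a sketch; the attribute names are hypothetical):

@or(
  and(equals(triggerBody()?['statuscode'], 1), greater(triggerBody()?['new_estimatedvalue'], 10000)),
  equals(triggerBody()?['new_priority'], 'High')
)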

To simplify this process I decided to build an XrmToolBox plugin that can convert FetchXML filters to Flow conditions. The idea is to help users by letting them build the conditions in the D365 Advanced Find UI, export the FetchXML to the plugin, and generate the equivalent Flow condition. This is currently in test mode and can convert some of the basic FetchXML conditional operators to Flow. I will publish it soon for everyone to use, but if anyone would like to help me test it and give feedback prior to the release, please feel free to contact me.

XRM toolkit plugin

P.S. Azure Logic Apps (the big brother of Flow) currently supports building grouped conditions in the designer (without having to use advanced mode). Let’s hope this is on the roadmap for Flow too.

My Quest to D365 Saturday Stockholm

Recently I attended the Dynamics 365 Saturday event in Stockholm and I have to say: what an excellent event. I had never been to Stockholm, so I was already massively excited about this. I also got to meet a load of new people for the first time, which was AMAZING!

These events are so important for the community because they are often the only opportunities some community members really have to interact with other customers, partners, ISVs and Microsoft employees. I love running into people who have encountered the same issues as I have; that way I know I’m not going bonkers and we can work on a solution together.

The crowd was great! There were many enthusiastic people in the audience who were really getting involved in the sessions, looking for information and really testing all of the speakers’ knowledge. You can find the list of sessions and speakers HERE.

A big reason I really enjoyed this event was the different layers and levels of content being shared across the sessions. The sessions were split into three tracks, these being:

Applications (Dynamics 365 CE), Dev (Dynamics 365 CE) and Business & Project Management. This gave participants the opportunity to stick to a single themed track or weave between tracks, which is pretty much how my experience went: I went to at least one session from each track, as I wanted to get a flavour of everything. I also got to see some wizardry from folks like Julie Yack, Gus Gonzalez, Nick Doelman and Gustaf Westerlund.

Other presenters and panellists included Sara Lagerquist, Jonas Rapp, Fredrik Niederud, Katherine Hogseth, Mark Smith and Antti Pajunen. Each delivering some amazing content based on their experiences with Dynamics 365 and Power Platform.

There was a plethora of information and content being shared between speakers and passionate attendees. Everything from Microsoft Portals and Social Engagement to developing your own XrmToolBox tools (Careful with the spelling here….HAHA) was being talked about. I personally got involved in a number of Power Platform conversations, which suited me just fine because that’s kinda what I’m doing at the moment.

I had the pleasure of running two sessions, one in the Application track and one in the Dev track (I am no developer… don’t judge me). The two sessions were:

  1. Power Platform – How it all fits together (Download Here)
  2. Building your first Canvas App with CDS, and Azure (Download Here)

Apparently people don’t like the first two rows… great crowd though! Thanks to everyone who attended. Try to work out what Mark is doing in the background there! HAHA!

The first presentation focused on the different elements of the Power Platform and the way it all works together. Many Dynamics 365 users worry a bit about this because it seems so large and complicated, but it really isn’t once you have wrapped your head around the different technologies. To highlight the way the different elements of the technology work together, I included a Roadside Assist demonstration that was created during the PowerApps & CDS Hackathon that Those Dynamics Guys and Hitachi Solutions Europe hosted together.

My second presentation consisted of some of the “Do’s and Don’ts” of building your first Canvas App with your customer. I followed the presentation with the following:

  1. Adding several fields to a custom entity in the Common Data Service (CDS)
  2. Importing some data
  3. Creating a new canvas app
  4. Connecting the Canvas app to the CDS
  5. Adding in the Azure translation service to the app
  6. Publishing the app

The actual canvas app I created, with the little model-driven app solution including data, is available HERE.

The below pic gives off the impression that I am about to start having a conversation with my own hand, like an invisible Muppet. Might be a great trick for my next demo 😀

One of the BEST sessions I have been in was the “CAGE MATCH” moderated by the one and only Julie Yack. This was EPIC fun! We were split into teams of five and given problems by the audience to resolve. It was a little daunting being in the presence of some of these long-time MVPs, BUT THE SHOW MUST GO ON, so we got stuck right in. Unfortunately, the team I was in didn’t take the win 🙁 It’s cool, I am preparing my battle cards for the next one!

All in all, it was a fantastic event and a great opportunity to network with this amazing Dynamics and Power Platform community that we have all grown to know and learn from. A MASSIVE thank you to the sponsors of the event!

Also, a big thanks to all of the folks that hung out after the event and enjoyed several beverages with me. Was a great time and I’d love to do it again 🙂

Here are some more delightful images from the day 🙂 My camera skills aren’t great, so I had to grab a few from social. Thanks to those who grabbed pics in my session! I hope this encourages more people to attend these events, because I genuinely gain so much from being there.

Nick Doelman Smashing his Portals Presentation

One of my favourite Finns – Antti

Julie Yack doing her presentation on Social Engagement

MORE of the awesome Julie Yack

WHAT?? ANOTHER ONE of my favourite Finns – KIMMO!

JOOONNNAASS RAPP!!! The Legend!

We were all so excited! Mark, Jonas and me 🙂

MICROSOFT FLOW BASICS AND LIMITATIONS WHEN WORKING WITH DYNAMICS 365

In this post I will be covering some Microsoft Flow basics and limitations when working with Dynamics 365. This will help you determine which Flow plan and/or connectors suit your needs best.

Connecting to your Dynamics 365 instance

Firstly, let’s look at the connectors for Dynamics 365. You have two options when it comes to connecting to a D365 instance.

  1. Dynamics 365 connector

D365Connector

The Dynamics 365 connector provides limited access to the Dynamics 365 organisation.

For more info on trigger events and actions please visit: https://docs.microsoft.com/en-us/connectors/dynamicscrmonline/

  2. Common Data Service (CDS) connector

CDSConnector

Provides access to the org-based database on the Microsoft Common Data Service.

For more info on trigger events and actions please visit: https://docs.microsoft.com/en-us/connectors/runtimeservice/

Now let’s do a side-by-side comparison of some of the notable features:

| Feature | Dynamics 365 Connector | CDS Connector |
| --- | --- | --- |
| Trigger Flow on create | Available | Available |
| Trigger Flow on updates | Available | Available |
| Trigger Flow on specific attribute updates | Not available. Limited to record-level updates only* | Available |
| Change Tracking limitations | Requires Change Tracking to be enabled in D365 | Change Tracking is not required |
| Define level of scope for the Flow trigger | Not available. Limited to Organisation level only | Available: Organisation level, Parent: Child Business Unit level, Business Unit level, or User level |
| Trigger Flow on deletes | Available | Available |
| Manually trigger when a flow is selected | Not available | Available |
| Action: Create Note (annotation) for specified entity record | Manual | Special simplified action is available |
| Action: Retrieve all Notes (annotations) for the provided entity Id | Manual | Special simplified action is available |
| Action: Retrieve file content for specified Note (annotation) | Manual | Special simplified action is available |
| Connector Type | Standard | Premium (only available in Flow Plan 1 and 2) |

* Which means you will have to take extra measures if you have to update the triggering record within the same flow, to stop the flow from triggering infinitely.

Triggers

Let’s have a look at the trigger event screens of each connector. I have selected the “When a record is updated” trigger event for the screenshots.

Dynamics 365 connector:

D365Trigger

CDS Connector:

CDSTrigger

The CDS connector gives you the option to select the scope for event triggers. Scope can be set to Organisation, Parent: Child Business Unit, Business Unit or User level. This is similar to the native workflow engine in D365.

In addition to the scope you will also have the option to select attribute filters. Attribute filters will ensure the event trigger is only invoked when the specified attributes are updated.

Points to consider when using update triggers:

  • Update event triggers are invoked on update requests to the record. Event triggers do NOT check whether any attribute values actually changed; as long as the update request is successful, the Flow will be triggered.

What does this mean?

For update triggers at record level, the Flow will still be invoked even if the update request has not made any changes to the record (applies to both the D365 connector and the CDS connector).

For update triggers with attribute filters, the Flow will be invoked even if the update request sets the attribute to its existing value (applies to the CDS connector).

Flow Plans

Now that we have covered triggers and actions, let’s have a look at Flow plans. Currently Flow offers three plans.

| Flow Free | Flow Plan 1 | Flow Plan 2 |
| --- | --- | --- |
| 750 runs per month | 4,500 runs per month | 15,000 runs per month |
| Unlimited flow creation | Unlimited flow creation | Unlimited flow creation |
| 15-minute checks | 3-minute checks | 1-minute checks |
|  | Premium Connectors | Premium Connectors |
|  |  | Org policy settings |
|  |  | Business process flows |

You can check out Microsoft Flow Plans page for more information.

Limits and configuration in Microsoft Flow

Documentation from Microsoft provides more information on current request limits, run duration and retention, looping and debatching limits, definition limits, SharePoint limits and IP address configuration.

For current limits and configuration details please visit Microsoft Docs here.

There are also some limitations in the Flow designer UI compared to the native workflow designer in D365, one of them being the ability to design grouped conditional statements. Currently Flow does not allow grouped conditions to be configured in basic mode, which means you will have to use advanced mode to build such conditional statements. I have noticed that Logic Apps has already added the ability to group conditional statements in the basic designer, and hopefully this is on the roadmap for Flow too.

Flow:

FlowCondition

LogicApps:

LogicAppsCondition

Even with these limitations, Flow offers a lot more than the native D365 workflow engine.

You can checkout Microsoft Flow Documentation page for more information and how-to guides.

I would also highly recommend watching the “What the Flow” vlog series by Elaiza if you wish to learn more about Flow and how to transition from native D365 workflows to Flow.

D365 Social Analytics Solution

As promised! Demoed at our D365 Saturday Summer Boot Camp session on replacing Dynamics workflows with Flow.

This solution gathers tweets matching a specified hashtag, saving them into a custom entity in Dynamics. A second Flow then uses the Cognitive Services API to extract useful information from the tweets, such as sentiment and key phrases, and also translates the tweet if it’s not in English. This blog post contains the two Flows as well as the solution used in Dynamics, with brief instructions on how to put everything back together.

Dynamics solution

Contains a Social Analytics custom entity with some magic in the background!

Unmanaged.zip

Managed.zip

Flows

GetTweets.zip (Gets Tweets matching hashtags and creates records in social analytics entity)

DynamicsSocialAnalyticsV2.zip ( On create of a record in the Social analytics entity, use Text API to get sentiment, translation if not English, key words and update back into Dynamics)

Setup

Install the unmanaged or managed solution into your instance, whichever floats your boat 🙂.

Text Analytics API Key

You will need an Azure subscription with a Cognitive Services Text Analytics API service. You can get a trial API key with 5,000 executions for 7 days (a free Azure subscription will not limit you to the 7 days). Go to https://azure.microsoft.com/en-gb/try/cognitive-services/?api=text-analytics

Make sure Text Analytics is selected and hit Get API Key – choose Guest and get started.

You should eventually end up with your API key and endpoint, as shown in the image below; these will be needed later on in the Flow.

Social Analytics Flow

Head to your Flow environment at https://flow.microsoft.com. Go to My Flows; you should see Import in the top right – hit that. Upload and import the flow DynamicsSocialAnalyticsV2.zip. You will need to fix the connections to your Dynamics instance. For Text Analytics, choose “Select during import”, create a new connection, search for Text Analytics, select it, and enter one of your keys and your endpoint URL. Come back to the import screen, refresh the list and select your new Text Analytics connection. Do the same for the Translator connection, named “Microsoft Translator” – you shouldn’t need an API key for that one. Once all the connections have been fixed, import the flow.

Once complete, you should be able to see the Dynamics Social Analytics flow. Edit the flow and point both the Dynamics trigger at the start and the update action all the way at the bottom to your instance, by clearing out the org name, selecting yours, and then selecting the Social Analytics entity provided in the installed solution.

Before

After

Get Tweets Flow

Import the GetTweets.zip flow. Fix the connections again by adding a Twitter connection and your Dynamics connections. After upload, you will need to fix the create-Dynamics-record action at the bottom of the flow, as before. Replace #D365Saturday with your favourite hashtag and Bob’s your uncle. You can duplicate this flow if you wish to track multiple hashtags.