Power BI

How to detect car engine anomalies by analyzing engine noise?

Following my article on “Starting your exciting journey of Connected Field Service and Azure IoT Hub”, I started working on a practical scenario: measuring noise in your surroundings and generating alerts in #PowerPlatform. In this article I want to summarize all the resources required to implement such a scenario, along with my learnings. I hope this adds to the resources available in the community so you can use it as a walkthrough to implement a practical scenario.

In this article, you will see what architectural components are required to implement this simple scenario using Azure IoT and Connected Field Service. This article focuses on what is happening under the hood of the platform. Luckily, with the Connected Field Service application, everything is managed behind the scenes and you don't need to worry much, but this walkthrough enables you to understand what options you have in such scenarios.

Scenario

The scenario is about connecting an MXChip IoT DevKit to your car, or any place with noise, and analyzing the noise level by recording the noise and sending it as a wave stream to an Azure IoT Hub. The Azure IoT Hub passes the data to an #AzureFunction which calculates the noise level using a simple formula, and the function calls a #MicrosoftFlow to create alerts in #PowerPlatform. This can lead to an endless number of scenarios.

  • The function for calculating the noise level from a wave file is extremely simple as well (the exact formula is shown after this list). There is plenty of scientific background, which you can read here, here, here and here.
  • Calculating the noise level properly is not an easy task. There are many considerations involved, and if you want a real working model you will need to work on analyzing the audio files, which is beyond the scope of this demo.
  • It is possible, and desirable, to calculate the noise level on the device and send only the alerts to Azure IoT. This reduces the traffic and the load on your Azure resources. However, for the sake of the experiment, I am sending all the noise data to Azure and calculating the noise level in the Azure function.
  • In this demo, I am not listening to the noise all the time. I start recording on a press of button A and send the noise data to Azure on a press of button B. I made this change to the scenario to demonstrate working with buttons on the MXChip and also to reduce the traffic to Azure.
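For reference, the simple formula used in the Azure function below treats each 16-bit audio sample as a fraction of full scale and converts it to decibels:

decibels = 20 × log10(|sample| / 32768)

A full-scale sample therefore reads 0 dB, and quieter samples read increasingly negative values.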

Architecture

The architecture of this sample is very simple. I am using an IoT Hub and an Azure function to calculate and propagate the IoT events to the #PowerPlatform. On the device side, there is an Arduino application running which listens for noise and sends the recording to the IoT Hub.

A very comprehensive architecture for a connected field service is shown in the diagram below, which can simply be implemented using the #ConnectedFieldService application. However, I just wanted to implement it in a simpler way. Full details of the #ConnectedFieldService architecture can be seen in this documentation.

Components

The logical diagram of the components is shown below:

Arduino App

This component is a very simple program which reads the input from the audio peripheral, button A and button B of the device and does the following:

  1. On startup, it initializes the device and gets ready to listen to surrounding noise. It also checks the connectivity to Azure.
  2. On a press of button A, it records the surrounding noise and stores the stream in a buffer.
  3. On a press of button B, it sends the stream in the buffer to Azure.

To implement this part of the application, you will need to take the following actions:

  1. Set up your MXChip device. Please refer to this link to get started.
  2. Set up your Visual Studio environment. Please refer to this link.
  3. You will need to learn how to deploy your code to the MXChip device. The simplest way to upload your code to the device is to put your MXChip device into configuration mode. Every time you want to upload updated code: press button A (and keep pressing), press reset (while still pressing A), release reset (while still pressing A) and then release A. Now you are ready to upload your code.
  4. If you want to debug your code on the device, you can refer to this link.

Here is my sample code:


#include "AZ3166WiFi.h"
#include "DevKitMQTTClient.h"
#include "AudioClassV2.h"
#include "stm32412g_discovery_audio.h"
#define MFCC_WRAPPER_DEFINED
#define MODEL_WRAPPER_DEFINED
//Constants and variables- Start//
enum AppState
{
APPSTATE_Init,
APPSTATE_Error,
APPSTATE_Recording,
APPSTATE_ButtonAPressed,
APPSTATE_ButtonBPressed
};
// variables will change:
static AppState appstate;
static int buttonStateA = 0;
static int buttonStateB = 0;
static bool hasWifi = false;
static bool hasIoTHub = false;
AudioClass &Audio = AudioClass::getInstance();
const int AUDIO_SIZE = 32000 * 3 + 45;
char *audioBuffer;
int totalSize;
int monoSize;
static char emptyAudio[AUDIO_CHUNK_SIZE];
RingBuffer ringBuffer(AUDIO_SIZE);
char readBuffer[AUDIO_CHUNK_SIZE];
bool startPlay = false;
void SendMessage(char *message)
{
// Send message to Azure
if (hasIoTHub && hasWifi)
{
char buff[512];
// replace the following line with your data sent to Azure IoTHub
snprintf(buff, 512, message);
if (DevKitMQTTClient_SendEvent(buff))
{
Screen.print(1, "Sent...");
}
else
{
Screen.print(1, "Failure...");
}
delay(2000);
}
else
{
// turn LED on-off after 2 seconds wait:
Screen.print("NO BUTTON DETECTED");
delay(1000);
Screen.clean();
}
}
void setup()
{
// put your setup code here, to run once:
memset(emptyAudio, 0x0, AUDIO_CHUNK_SIZE);
if (WiFi.begin() == WL_CONNECTED)
{
hasWifi = true;
Screen.print(1, "Running!!!");
if (!DevKitMQTTClient_Init(false, true))
{
hasIoTHub = false;
return;
}
hasIoTHub = true;
// initialize the pushbutton pin as an input:
pinMode(USER_BUTTON_A, INPUT);
pinMode(USER_BUTTON_B, INPUT);
appstate = APPSTATE_Init;
}
else
{
hasWifi = false;
Screen.print(1, "No Wi-Fi");
}
}
void loop()
{
// put your main code here, to run repeatedly:
Screen.clean();
// while(1)
{
// read the state of the pushbutton value:
buttonStateA = digitalRead(USER_BUTTON_A);
buttonStateB = digitalRead(USER_BUTTON_B);
if (buttonStateA == LOW && buttonStateB == LOW)
{
//SendMessage("A + B");
}
else if (buttonStateA == LOW && buttonStateB == HIGH)
{
// WAVE FORMAT
Screen.clean();
Screen.print(0, "start recordig");
record();
while (digitalRead(USER_BUTTON_A) == LOW && ringBuffer.available() > 0)
{
delay(10);
}
if (Audio.getAudioState() == AUDIO_STATE_RECORDING)
{
Audio.stop();
}
startPlay = true;
}
else if (buttonStateA == HIGH && buttonStateB == LOW)
{
// WAVE FORMAT
if (startPlay == true)
{
Screen.clean();
Screen.print(0, "start playing");
play();
while (ringBuffer.use() >= AUDIO_CHUNK_SIZE)
{
delay(10);
}
Audio.stop();
startPlay = false;
SendMessage(readBuffer);
}
else if (buttonStateA == HIGH && buttonStateB == HIGH)
{
Screen.clean();
}
}
delay(100);
}
}
void record()
{
Serial.println("start recording");
ringBuffer.clear();
Audio.format(8000, 16);
Audio.startRecord(recordCallback);
}
void play()
{
Serial.println("start playing");
Audio.format(8000, 16);
Audio.setVolume(80);
Audio.startPlay(playCallback);
}
void playCallback(void)
{
if (ringBuffer.use() < AUDIO_CHUNK_SIZE)
{
Audio.writeToPlayBuffer(emptyAudio, AUDIO_CHUNK_SIZE);
return;
}
int length = ringBuffer.get((uint8_t *)readBuffer, AUDIO_CHUNK_SIZE);
Audio.writeToPlayBuffer(readBuffer, length);
}
void recordCallback(void)
{
int length = Audio.readFromRecordBuffer(readBuffer, AUDIO_CHUNK_SIZE);
ringBuffer.put((uint8_t *)readBuffer, length);
}

Azure function

This is the simplest part of all. All you have to do is receive the stream and calculate the noise level. The calculation can be made much more sophisticated, but that is out of the scope of this article.


using IoTHubTrigger = Microsoft.Azure.WebJobs.EventHubTriggerAttribute;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.EventHubs;
using System.Linq;
using System.Text;
using System.Net.Http;
using Microsoft.Extensions.Logging;
using System;

namespace IoTWorkbench
{
    public static class IoTHubTrigger1
    {
        private static HttpClient client = new HttpClient();

        [FunctionName("IoTHubTrigger1")]
        public static void Run([IoTHubTrigger("%eventHubConnectionPath%", Connection = "eventHubConnectionString")] EventData message, ILogger log)
        {
            log.LogInformation($"C# IoT Hub trigger function processed a message: {Encoding.UTF8.GetString(message.Body.Array)}");

            // Take the first 16-bit sample of the incoming audio chunk.
            byte[] buffer = message.Body.ToArray();
            short sample16Bit = BitConverter.ToInt16(buffer, 0);

            // Normalise the sample to 0..1 and convert to decibels (0 dB = full scale).
            double volume = Math.Abs(sample16Bit / 32768.0);
            double decibels = 20 * Math.Log10(volume);
            log.LogInformation(decibels.ToString());
        }
    }
}
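Note that the sample above looks only at the first sample of each message, and the call into #MicrosoftFlow described in the scenario is not shown. As a rough sketch of how both gaps could be filled (my illustration, not part of the original sample; the Flow URL and threshold are placeholders, and you would also need using System.Threading.Tasks;), you could average the whole chunk with a root mean square and post an alert to a Flow HTTP trigger:

// Sketch only: members you could add to the IoTHubTrigger1 class above.
private const double NoiseThresholdDb = -30.0; // placeholder threshold, in dB relative to full scale
private static readonly string flowUrl = "https://<your-flow-http-trigger-url>"; // placeholder

private static double RmsDecibels(byte[] buffer)
{
    // Average the squared 16-bit samples, then convert the RMS value to decibels.
    double sumOfSquares = 0;
    int sampleCount = buffer.Length / 2;
    for (int i = 0; i < sampleCount; i++)
    {
        double sample = BitConverter.ToInt16(buffer, i * 2) / 32768.0;
        sumOfSquares += sample * sample;
    }
    return 20 * Math.Log10(Math.Sqrt(sumOfSquares / sampleCount));
}

private static async Task NotifyFlowAsync(double decibels)
{
    // Only alert the Flow when the level exceeds the threshold, reusing the static HttpClient.
    if (decibels > NoiseThresholdDb)
    {
        var json = "{ \"decibels\": " + decibels.ToString(System.Globalization.CultureInfo.InvariantCulture) + " }";
        var payload = new StringContent(json, Encoding.UTF8, "application/json");
        await client.PostAsync(flowUrl, payload);
    }
}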

Handshaking

In order for the device to send messages to the Azure function, it must know the endpoint to which it should send the data. You can follow the steps in this link to register your device with the Azure function. It is all about using the Azure IoT Workbench.

References

https://docs.microsoft.com/en-us/dynamics365/field-service/developer/connected-field-service-architecture

Photo by Steinar Engeland on Unsplash

All the voices

“A woman with a voice is by definition a strong woman. But the search to find that voice can be remarkably difficult.”  – Melinda Gates

 

Companies love dashboards. The idea of progress, of something to announce, is like a drug. Naturally, companies use data and dashboards to measure diversity.  With one click, we can see how many people of what origin, education and sexual identity are employed anywhere within that company.  What those dashboards can’t tell you, no matter what your amazing PowerBI skillz (sic) may be, is the actual effectiveness and impact of that diversity down to the team or individual level.   Data and dashboards struggle with the intangible, with context. (I say this with all due respect to data scientists and my “blue” colleagues.)  Dashboards struggle to tell you if all those amazing voices that the company has invested so much in recruiting are actually being heard. This is the nuance of inclusion.

This is where checking the box on the dashboard stops and the application of the sought-after differing points of views begins.  And honestly, this is where so many teams fail.   The representation is in the room, but the team culture hasn’t evolved, the manager is still talking at people, the environment isn’t functioning.  The loudest voice still stands out.  Suggestions are quickly brushed aside until repeated by another more well-known contributor.  Questions are directed at the wrong person.  And then people just shut down, go back to their old ways, and that highly sought-after talent leaves.  Oh well, she wasn’t a good fit anyway. 

The pressure on groups to produce results quickly isn’t going away.  This intangible nuance of hearing all voices is easily pushed aside in the name of speed since it can be very difficult to measure. Worse yet, incorporating all the voices can actually slow things down at first, while in the end making the output so much better. How to show that the end justifies the means?

I propose that the best way to measure something is to start with a remarkable subset.

Enter the #msdyn365 community at 365 Saturday.  For me, it started in Dublin.  Actually, it started way before then; it just became more deliberate in Dublin thanks to the event organizers (looking at you, Janet and Raz), then took further shape in London and most recently solidified in Glasgow.  At these all-day events (on a Saturday, just like the name implies), informal groups of women at various stages of their careers gathered for an hour under the umbrella of Women in Technology (#wit), not quite sure what to expect.

Each session has been different because, as with many things, the conversation is the sum of its amazing, diverse parts.  Topics varied, yet it all came down to one overarching theme: communication.  Whether that be the how, the when or the why of using our voices.  We talked about #confidencehacks, about how to establish ourselves without crossing a line that makes us uncomfortable (and practicing not caring about making others uncomfortable), about connecting and expanding our networks, and then, most importantly, we talked about amplification – how we can help others' voices be heard.  All voices, not just female.

Note: There are so many other cultural considerations here, for which I lack a point of reference.  There is also a whole discussion to be had about how people consume, digest and respond to information.  For example, the work culture that I grew up in was as follows: get in a room, review a PowerPoint, have a passionate discussion where the loudest voice usually wins, determine next steps, assign actions items, repeat.  That format doesn’t work for all.  What about the voice of the incredible introvert in the room that needs time to digest the info, consider all sides, and then voice their opinion?

And there is the other amazing thing about our #msdyn365 community.  Others want to know how they can help.  Sure, I was teased about "super-secret lunches" by male colleagues.  I saw that for exactly what it was – curiosity and a sincere wish for dialogue.  Why is it necessary to have a "women's anything"? Shouldn't it just be about hiring the best person for the job?  How should we feel about this?  We all treat each other with respect, right? Isn't it up to individuals to make themselves heard?

Truth is, I agree with everything above.  Inclusion, by its intent, is about everyone.  And therefore, everyone has a responsibility to feed this culture and in the end everyone will benefit. We all can and should help amplify the voices of others. What I love about getting small groups of women together is that the coaching and dialogue that happens in a really safe environment then goes out into the diverse world and multiplies. It starts with a subset. Never underestimate the ripple effect of small actions.

Fifty percent (50%) of the speakers at 365Saturday Scotland identified as female.  Fifty percent.  That is crazy insane goodness.  It did not just happen.  This was the result of a community (led by Marc, Janet, Claire, Iain and so many others) rallying to make sure that opportunities were presented and seized, that a safe place was created and maintained, and that voices were heard.  Shouldn't that just happen naturally?  Yes, ideally someday the flywheel will be spinning with such momentum that this will be the case (oh, and 50% of the attendees will also be women… work to do there as well).  Then the focus will become how to maintain and feed that system.  The moment you take your eye off something, you risk losing the muscle memory. Omission by unintentional oversight does not remove responsibility.

There is a meme about equity vs equality running around our social media feeds: the one that shows people of different heights trying to watch a baseball game over a fence.  The size of the boxes they are standing on depicts the difference between being treated equally (same-sized box) and equitably (different-sized boxes raising all to the same level).  The lesser-known version has a twist – it shows what it would look like if there were no fence at all.

This is the nuance of inclusion.  This is how the #msdyn365 community is working to remove the fence.  It starts with these conversations, these opportunities. Listening to all the voices takes time and deliberate effort.  This community is all in.

Raise your voices. 🙂

Carissa

 

Power BI & Emoji

This video showcases the following:

  • How to add emoji in a text box in Power BI reports.
  • How to add emoji using the UNICHAR() function (a small sketch follows below).
  • How to create your own emoji reference table in Power BI.
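As a flavour of the UNICHAR() approach, here is a minimal DAX sketch of my own (the measure, the table and the codes are illustrative, not taken from the video):

Smiley = UNICHAR(128512)    -- returns 😀

EmojiRef =
    DATATABLE(
        "Name", STRING,
        "Code", INTEGER,
        { { "Smiley", 128512 }, { "Thumbs Up", 128077 } }
    )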

Please feel free to answer the question ❓❓❓ at the end of the video 😃😃😃.

The question is also mentioned below. Please guess the message communicated in the picture below using emoji.

Please send your answers – let's see who gets it correct. 👍🏋️‍♂️🎁

Hope this helps !!!

Please download the Power BI file from the Power Platform Bank using the link below.

Power BI & Emoji

Streaming Data Sets into Dynamics 365 Customer Engagement

In this post, we are going to look at the challenge of how to display streaming data sets directly on a Dynamics 365 Customer Engagement form. While there already exists a way to embed Power BI dashboards and reports within Dynamics 365 Customer Engagement, these are not on a form level. To see how to do this currently, have a look here. When followed, you should observe results similar to the following, where a dashboard is initially displayed and you can then click through to the underlying report(s):
 
 
What you’ll notice from this is that these are personal dashboards that lack the ability to be contextually filtered. So to resolve this, we are going to create a Web Resource that has the ability to display a contextual (and streaming) dashboard on a Dynamics 365 Customer Engagement form!
 
To get started, let's have a look at what this will look like architecturally:
 
 
From the architecture, you should notice that we need to create a custom HTML Web Resource that will serve as a placeholder for the Power BI dashboard. When the form loads, we are going to use JavaScript to process the incoming parameters, which can include both configuration and contextual data based on the record (form) that the Web Resource is being rendered on. The JavaScript will then call a reusable Dynamics 365 Action that will consume the incoming parameters before calling a Dynamics 365 Plugin. This plugin is necessary as it will help us execute a token exchange with the Azure Key Vault based on the currently logged-in user. This token is then used to retrieve a specific secret which contains the configuration necessary to render the Power BI report contextually, and in an authenticated state, back on the Dynamics 365 Customer Engagement form.
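To make the plugin step more concrete, here is a minimal C# sketch (my own illustration, not the actual plugin from this solution) of the secret retrieval against the Key Vault REST API. It assumes the token exchange has already produced a bearer token for the logged-in user, and the vault and secret names are placeholders:

using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class KeyVaultSecretReader
{
    private static readonly HttpClient client = new HttpClient();

    // vaultName, secretName and accessToken are placeholders supplied by the caller.
    public static async Task<string> GetSecretJsonAsync(string vaultName, string secretName, string accessToken)
    {
        // Key Vault REST API: GET {vault}/secrets/{name}?api-version=7.0
        var url = $"https://{vaultName}.vault.azure.net/secrets/{secretName}?api-version=7.0";
        var request = new HttpRequestMessage(HttpMethod.Get, url);
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

        var response = await client.SendAsync(request);
        response.EnsureSuccessStatusCode();

        // The JSON body carries the secret in its "value" property - here, the
        // configuration needed to render the Power BI report in an authenticated state.
        return await response.Content.ReadAsStringAsync();
    }
}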
 
Simultaneously, the Power BI dashboard will be receiving a continuous stream of data from an MX Chip (IoT device) that is connected to an Azure IoT Hub. This stream of data is provided through the Stream Analytics service, which continually processes the incoming data and sends it as an output directly to Power BI, where it is visualised. For reference, the Stream Analytics job should look something similar to this:
 
 
You will notice that there is a dedicated Power BI output in the above, and that we have limited the Stream Analytics job to look only for our MX Chip device. We also need to include a bit of DAX to format the incoming IoTAlert data to make it a bit more readable. Examples of the incoming data, the DAX and the Power BI configs are below:
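As a rough indication of the shape of that job (the screenshots carry the specifics; input, output and device names here are placeholders), the query boils down to something like this:

SELECT
    deviceId,
    EventEnqueuedUtcTime AS ReceivedAt,
    message AS IoTAlert
INTO
    [powerbi-output]
FROM
    [iothub-input]
WHERE
    deviceId = 'MyMXChipDevice'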
 
 
As a result of this, we should now be able to see the streaming data set on the Dynamics 365 Customer Engagement form after a bit of Power BI visualisation magic as follows:
 
 
As we have parameterised the initial Web Resource on the form, this Dashboard is able to pre-filter visuals should we wish, and can also easily be embedded on the form and record type of your choosing! The following video demonstrates the complete pattern in action:

 

Using Eircode (Ireland Postcodes) to get Geolocation from Google Maps with Microsoft Flow

Let's say you have a SharePoint list storing information with an Eircode column (the postal code for addresses in Ireland), and you want to use that information in Power BI later to generate a map organising all your items by location. Unfortunately, the Power BI maps don't work well with Eircodes, so how can we get the most precise location information?

Let’s use Microsoft Flow and Google Maps API for that!

(Sorry Microsoft for not using Bing Maps)

Before starting to build the Flow, get a Google Maps developer API key at: https://developers.google.com/maps/documentation/javascript/get-api-key

And include two new columns of type Number, with automatic decimal places, in the SharePoint list: Latitude and Longitude.

With those two things set up, it’s time to begin the Flow creation.

We will use the SharePoint 'When an item is created or modified' trigger to start our Flow. After starting the Flow creation with this trigger, add an 'HTTP' action. To get the latitude and longitude information, make a GET request to the Google Maps API using the Eircode coming from the SharePoint item as the address filter:

In this case, the search is filtered to be just in Ireland as you can see in the parameters.
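The request itself is a plain GET against the geocoding endpoint, along these lines (the Eircode and key shown are placeholders):

https://maps.googleapis.com/maps/api/geocode/json?address=D02AF30&components=country:IE&key=YOUR_API_KEY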

The results we get from Google Maps API are in the following JSON structure:
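Trimmed down to just the parts we use (the values shown are examples), the structure looks like this:

{
  "results": [
    {
      "geometry": {
        "location": {
          "lat": 53.349,
          "lng": -6.260
        }
      }
    }
  ],
  "status": "OK"
}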

Note we have the latitude and longitude under geometry/location. So, to manage that information easily, create two variables ‘Latitude’ and ‘Longitude’, of type Float.

Assign the value coming from the HTTP request body to the variables using an expression for each (which will navigate through the JSON object to access the data):

Latitude: float(body('HTTP')['results'][0]['geometry']['location']['lat'])

Longitude: float(body('HTTP')['results'][0]['geometry']['location']['lng'])

Now that you have your values properly assigned, check whether either of them differs from the existing values in SharePoint; if it is different, the item needs to be updated. Then add an 'Update List Item' action and set it to update your list items with the new Latitude and Longitude values to finish.

 The final layout of the flow will be the following:

Now, as soon as your items are updated or created in the SharePoint list, the information will be ready to be used in the Map visual on Power BI with the SharePoint list as data source.

Have fun!

My Quest to D365 Saturday Stockholm

Recently I attended the Dynamics 365 Saturday event in Stockholm and I have to say, what an excellent event. I have never been to Stockholm, so I was already massively excited about this. I also got to meet a load of new people for the first time which was AMAZING!

These events are so important for the community because they are often the only opportunities some community members really have to interact with other customers, partners, ISVs and Microsoft employees. I love running into people that have encountered the same issues as I have; that way I know I'm not going bonkers and we can work on a solution together.

The crowd was great! There were many enthusiastic people in the audience who were really getting involved in the sessions, looking for information and really testing all of the speakers' knowledge. You can find the list of sessions and speakers HERE.

A big reason I really enjoyed this event was the different layers and levels of content being shared across the sessions. The sessions were split into three tracks, these being:

Applications (Dynamics 365 CE), Dev (Dynamics 365 CE) and Business & Project Management. This gave participants the opportunity to stick to a single themed track or weave between tracks, which is pretty much how my experience went. I went to at least one session from each track, as I wanted to get a flavour of everything. I also got to see some wizardry from folks like Julie Yack, Gus Gonzalez, Nick Doelman and Gustaf Westerlund.

Other presenters and panellists included Sara Lagerquist, Jonas Rapp, Fredrik Niederud, Katherine Hogseth, Mark Smith and Antti Pajunen, each delivering some amazing content based on their experiences with Dynamics 365 and the Power Platform.

There was a plethora of information and content being shared between speakers and passionate attendees. Everything from Microsoft Portals and Social Engagement to developing your own XrmToolBox tools (Careful with the spelling here….HAHA) was being talked about. I personally got involved in a number of Power Platform conversations, which suited me just fine because that’s kinda what I’m doing at the moment.

I had the pleasure of running 2 sessions. One in the Application track and one in the Dev track (I am no developer… Don’t judge me). The 2 sessions were:

  1. Power Platform – How it all fits together (Download Here)
  2. Building your first Canvas App with CDS, and Azure (Download Here)

Apparently people don't like the first two rows… great crowd though! Thanks to everyone who attended. Try to work out what Mark is doing in the background there! HAHA!

The first presentation focused on the different elements of the Power Platform and the way it all works together. Many Dynamics 365 users often worry a bit about this because it seems so large and complicated, but it really isn't once you have wrapped your head around the different technologies. To highlight the way the different elements of the technology work together, I included a Roadside Assist demonstration that was created during the PowerApps & CDS Hackathon that Those Dynamics Guys and Hitachi Solutions Europe hosted together.

My second presentation consisted of some of the "Do's and Don'ts" around building your first canvas app with your customer. I followed the presentation with the following:

  1. Adding several fields to a custom entity in the Common Data Service (CDS)
  2. Importing some data
  3. Creating a new canvas app
  4. Connecting the Canvas app to the CDS
  5. Adding in the Azure translation service to the app
  6. Publishing the app

The actual canvas app I created, along with the little model-driven app solution including data, is available HERE.

The below pic gives off the impression that I am about to start having a conversation with my own hand, like an invisible Muppet. Maybe a great trick for my next demo 😀

One of the BEST sessions that I have been in was the "CAGE MATCH" moderated by the one and only Julie Yack. This was EPIC fun! We were split into teams of five and given problems by the audience to resolve. It was a little daunting being in the presence of some of these long-time MVPs, BUT, THE SHOW MUST GO ON, so we got stuck right in. Unfortunately, the team I was in didn't take the win 🙁 It's cool, I am preparing my battle cards for the next one!

All in all, it was a fantastic event and a great opportunity to network with this amazing Dynamics and Power Platform community that we have all grown to know and learn from. A MASSIVE thank you to the sponsors of the event!

Also, a big thanks to all of the folks that hung out after the event and enjoyed several beverages with me. Was a great time and I’d love to do it again 🙂

Here are some more delightful images from the day 🙂 My camera skills aren't great so I had to grab a few from social. Thanks to those that grabbed pics in my session! I hope that this encourages more people to attend these events because I genuinely gain so much from being there.

Nick Doelman Smashing his Portals Presentation

One of my favourite Finns – Antti

Julie Yack doing her presentation on Social Engagement

MORE of the awesome Julie Yack

WHAT?? ANOTHER ONE of my favourite Finns – KIMMO!

JOOONNNAASS RAPP!!! The Legend!

We were all so excited! Mark, Jonas and me 🙂

 

 

Where does CE PSA fit if I have Finance and Operations?

Updated last: 23/12/2018

This is a live blog post that will be updated with changes that are applied to the application – I’ll also update it with input from the community too. 

Right, I thought it'd be best to write a quick post on this topic as it is a question I receive quite regularly, which goes along the lines of… "Hey Will, I see you've been working on Customer Engagement PSA – I don't really understand how that would fit in with an organisation that has a Finance and Operations system, or at all." Then I take a deep breath and I say something along these lines…

(There are a few versions of this response depending on what the business does)

PSA flow:

What we must remember is that PSA is ultimately there to help the prospect-to-cash process – but hey, we hear and read "Prospect to Cash" thrown around a lot and it doesn't help explain anything. What I mean by this is as follows:

  1. the ability to turn someone you may have been in contact with into a Lead
  2. then qualify said Lead into an Opportunity
    1. During the opportunity process you will, hopefully, start creating a proposal, and to provide as precise a quote as possible it is best to create a project with a thorough work breakdown structure along with associated costs (expenses, role costs etc.), then import this structure along with its costs into the contract to provide a quote.
  3. Submit the quote to the customer and hopefully mark it as won – or you may have to create another until you ultimately, hopefully, win
  4. The quote then turns into an Order/Contract with an associated project, and all this richness can then be synced across to Finance and Operations – the contract will be pulled across along with the associated project details: project name, associated project contract, actual start date, work breakdown structure (if you've assigned resources then these can be brought across too) etc.

Where to place your personnel in a PSA & FinOps stack implementation:

Now the more interesting piece is: where do you ask your employees to enter their time and expenses, where do you ask the Project Manager to carry out their tasks, and where do you ask the Resourcing Manager to sit?

Now we must remember: PSA IS NOT A FINANCE SYSTEM, IT IS NOT TRYING TO BE A FINANCE SYSTEM, ITS PURPOSE IS NOT TO DEAL WITH ANYTHING RELATED TO ACCOUNTING AND FINANCE. Its purpose is to provide a buffer between account management and back-office functions such as the accounts department, and to provide more granularity to items such as quoting (remember, this is from the perspective of an implementation where Finance & Operations exists).

However, what it does do well is price up quotes thoroughly, thanks to its project creation functionality, and it also performs some project processes well that can then be handed over for further processing.

Now let’s take a quick dive into where to place the Project Managers, Employees and Resourcing Managers.

Employees – now, personally, as an employee I prefer the user interface in CE for entering timesheets and expenses to the one in Finance and Operations – it is more aesthetically pleasing. However, there are limitations around expenses – there are no expense policies out of the box, so this would need to be provided via customisation.

Along with other workflow requirements – and let's face it, expense workflows (from my experience implementing systems, especially global ones) can be incredibly complex – this is an area better suited to Finance and Operations, as PSA only allows one-level approval when in reality multi-level approvals and conditions are required.

PSA does have the ability to bring in the hours you entered last week, or the appointments/projects you’ve been assigned in the resource scheduler but Finance and Operations allows this too.

What I'm getting at here is that it is best to stick with Finance and Operations, and if you wish to make the user interface kinder on the eyes then use the mobile application functionality or throw together a PowerApp.

Resourcing Manager – now this is where I lean towards PSA. As long as you sync proficiency models, skills, characteristics, roles, cost prices, sales prices etc. between Finance and Operations and CE PSA (or, if your company is using Talent, have a network of the three: Talent > PSA > FinOps), then I much prefer the scheduling board within PSA and the way you submit requests to be fulfilled. Look at the screenshot below and how glorious it is – colours, pictures, charts – PSA has it all (you can even use the map functionality – living the dream)!

Project Manager – now this depends on the organisation. PSA allows the PM to manage their project team, monitor cost absorption (effort tracking as well), look at project estimates and submit resourcing requests (all of which also exists within Finance and Operations) – but if you want your PM to also invoice clients and perform a more advanced level of WIP adjustments, then this role will suit Finance and Operations better.

Also, the dashboards are not that brilliant in PSA – yes, you can use the Power BI embedded functionality, but Finance and Operations has brilliant out-of-the-box reports, as well as enhanced areas such as the Project Manager Workspace (which provides an overview of the PM's project-related activities and allows them to initiate their most frequent tasks) and Power BI integration – soooooo…..

General finance points related to PSA functionality: PSA does let you push through flexible journals, you can export actuals (or integrate them), you can adjust actuals (as well as view adjustment histories), and you can invoice through funding sources and billing rules (not as advanced as Finance and Operations) set out on the project contract.

It is important to note that there is no out-of-the-box functionality to tie Purchase Orders to projects, so these are not wrapped up and summed into items such as cost consumption etc. A journal can be used for this in the meantime, but creating the PO in FinOps and then pushing that across as a journal to keep track in PSA may be one route (dependent on whether your PMs sit there – if not, it really does not matter). Furthermore, there is no commitment or encumbrance accounting to keep track of the financial health of a project with regards to Purchase Orders.

Another key part of project management is budget control. Unfortunately there is no budget control within PSA, only a cost consumption meter, so this will have to be validated/tracked through Finance & Operations – but the validation will only occur post-transaction if you choose to leave T&E within PSA (not a wise move).

Conclusion:

So let's conclude – PSA DOES HAVE A FIT within the full suite of Dynamics 365, including for organisations that use both CE and Finance and Operations, if it is used for its intended purpose – which in my eyes is to assist with quoting proposals and with some of the non-accounting project processes, allowing that smooth transition from sales to delivery.

And one more thing….. if the company DOES NOT have Finance and Operations but another accounting system that does not include project management, and they also require a sales system, then PSA is a great fit!!!!

 

Dynamics 365 Saturday – Dublin 2018

Dynamics 365 Saturday 2018 took place at Microsoft Dublin (One Microsoft Place) and was hosted by Janet Robb and her team. Congratulations to Janet for doing a great job organising such a well-structured event.

Lots of cool sessions were provided, as well as workshops on PowerApps and CRM for those who were interested.

 

Even though I'm not a Dynamics consultant, to me as a SharePoint/Office 365 consultant it was very cool to see how the same stack of products (Power Platform + Azure) that we use on top of SharePoint/Office 365 has been adopted in the Dynamics community as well. As mentioned in the keynote, it is constant work for all D365 specialists to keep up to date with the tons of new features being released (I can assure you the same goes for Office 365), and an event like this helps to connect the community and to get a summarised overview of what's out there in the market.

Special thanks to our Dynamics Guys who came from the UK and smashed in their presentations:

Chris Huntingford – Dynamics 365 for Marketing

Kylie Hill – My Favorite new D365 Features

It was a pleasure to meet those legends in person.

Also, the interaction from the community posting and sharing content and pictures on social media (hashtag #365SatDub) was really nice – cool to see such engagement and nice pictures from everyone on LinkedIn and Twitter.

Looking forward to the next #365SatDub!!

Demystifying the Common Data Service & PowerApps – Hackathon Summary

On the 28th of July, Those Dynamics Guys held a Hackathon to help people demystify the Common Data Service and PowerApps. The goal of the Hackathon was to educate the audience in various elements of the Common Data Service and PowerApps and then provide them with a task that required various levels of technical and business skills to complete.

The hackathon was timed in such a way that it took place right after Microsoft Inspire and The Business Applications Summit, where various elements of the Microsoft technology were focused on. The Microsoft Power Platform got a particularly large amount of focus due to the nature of the product and how it is perceived to change the very way people within the Microsoft family (Employees, partners, ISVs) work.

There is unfortunately no way for me to make this a short post, buckle up…. This will get interesting.

For those of you that are unfamiliar with terms such as “PowerApps” and “Power Platform”, allow me to take this moment to briefly educate you. For those of you that are familiar… Skip the section full of product definitions and go on to the Hackathon bit.

The Section Full Of Product Definitions

Microsoft Power Platform is the term used to describe the Microsoft Business Application platform which includes PowerApps, PowerBI, Flow, the Common Data Service (CDS) and a series of gateways & connectors. This platform is used to customise both Office 365 and Dynamics 365 and utilises Microsoft Azure. For the less technical people, it is a platform that enables the creation of relevant applications across all areas of a business and promotes their utilisation by all types of users.

PowerApps is the term used to describe a service within the Microsoft Power Platform that gives users the ability to build and use business applications that connect them to data and work across various devices without the need for expensive software development. Basically, you can build AWESOME-looking applications that work on all your devices and connect you to the data and functionality you need in order to do your job more effectively.

The Hackathon bit

On a very sunny Saturday, just over 60 people from Microsoft, various partners, users and students gathered at the Microsoft Reactor in Shoreditch to learn, teach and evangelise all about Microsoft PowerApps and the Common Data Service. They came early and enthusiastic about getting to grips with this amazing technology.

The participants were split into teams that were branded by a certain colour: Green, Red, Pink, Yellow, Brown, Orange, Blue and Purple. The participants were split based on a skills survey that was filled in prior to the event. Unfortunately, some people did not arrive, which left some teams with only four people. A MASSIVE thanks to those that did arrive and gave up their Saturday!

Each team was assigned a Power Platform environment with PowerApps, PowerBI and Flow enabled, as well as a configured instance of the Common Data Service that was fully populated with data. There was also an example of a model-driven application that was linked to the demo scenario described below.

The Presentations

The teams were taken through various elements of the technology by some amazing folks from Microsoft: Bruce Nicholson, Anna Waight, David Reid and Craig Bird. They absolutely wowed the attendees with the Microsoft roadmap, some amazing technology and kick-ass demonstrations of what the tech could do.

My partners in crime, Will Dorrington and Kyle Hill, did some fantastic demonstrations of what could be done with canvas apps and model-driven apps, which gave the audience an idea of what is possible when utilising the Microsoft Power Platform to create business-relevant applications.

Anna and Bruce did a stunning presentation opening up the event, talking about the Power Platform roadmap and how it fits into the different communities, such as partner and user. The two of them really set the mood and tone of a very interactive and exciting day.

I took the teams through a scenario-based solution that surfaced data and functionality to various user types utilising all elements of the Power Platform. The demonstration of this process can be found on the PowerApps Bank.

David Reid came on next with a brilliant presentation on the Common Data Service and how it is technically structured, as well as how to add new entities, views, fields and more. David also dipped his toes into how the CDS surfaces data within a Model Driven App. I’m sure David’s brain just oozes excellence! Download David’s presentation HERE.

Kyle came on next with a pretty sweet-looking model-driven app and spoke about how the data could be surfaced from the CDS within a functionally rich user interface that promotes process-driven interactions with the data. Poor Kyle was massively jet-lagged after just getting back from the BusApps summit, so we had to heckle him as much as possible. You can download the CDS data structure and model-driven app HERE. You can download instructions on how to set up your own environment with all of the relevant data HERE.

Next on were Craig and Will, which was a riot! They explored the deep functionality available within canvas apps and how data could be surfaced from the CDS as well as LOADS of other data sources. This was coupled with some pretty alternative functionality, shown in the form of different connectors to different apps and data sources. There is an example Road Side Assist canvas app that is connected to the CDS, which can be downloaded HERE.

The concept was around Road Side Assistance and how you could use services within Power Platform to handle a breakdown process. The concept was very light touch due to the fact that the participants didn’t have a lot of time to form a bond and work together as a team. This is also why a large amount of the configuration was also pre-built. Road Side Assist was also selected because it’s a common scenario and most participants understood the concept as well as how the process could work, without having too much explained to them.

The Judging Criteria

Based on what was demonstrated, the goals were set for the afternoon where the participants had to deliver an innovative solution that worked with the data structure provided and show how Microsoft Power Platform could assist in the management of vehicle breakdown calls. There were two key criteria that the judges were looking for:

  1. Team work: How well did the members of the team work together to generate a cohesive story and solution.
  2. Innovation: What did teams do to build on the provided structure? How did they leverage the platform?

A massive thanks to the Hitachi Solutions team who came through and did the judging.

Sajeel Afzal

David Singh

Jesmond Giordmaina

Let’s hack Stuff!!!

The teams had lunch and got cracking during the lunch break on networking, planning and even starting the build of some of their solutions. After lunch the teams only had three hours to build something amazing… which THEY DID! In fact, some of the stuff that came out of the different groups was amazing. The Microsoft and Those Dynamics Guys team members were roaming the floor acting as support for all questions, both technical and non-technical. The teams really put us to the test with a number of questions… which means that the boundaries of the product were really pushed!

There were some REALLY awesome apps that were created, with a massive focus on utilising canvas apps to surface data and functionality. All the teams had very well-thought-out stories and themes for their apps, which made the judging really tough.

We saw everything from bright pink user interfaces (WHICH WAS AWESOME) to an actual integration to the DVLA (WTF…that’s crazy). Teams were using all sorts of combinations like Microsoft Flow, PowerBI, Microsoft Forms, SharePoint, Twilio SMS and loads more. Basically, it was a tech fest! 😀

Ultimately, there was a lot going on in the room. Lots of chatting, disagreeing (in that cuddly, healthy way), lots of tech talk and many flow diagrams going on.

Speed Dating for Apps

After the countdown clock had completed its cycle, each team had five minutes on stage to show us their awesomeness. Each team showed no fear, got up and spoke about what they had done. They told the judges their story and showed us evidence of what they had built. I gotta say, most of them put me to shame from a demo point of view. It was just amazing to see all these proud new "App Parents" showing us how cool their creations were. I almost shed a few tears of joy 😀

Without handing out participation medals, I need to say that each team smashed it! The teams shared some similar functionality, but each also had functionality that really stood out from the rest. Some were technically stronger than others and some had a better story.

You can download each team's model-driven & canvas applications from the Those Dynamics Guys PowerApp bank!

Wrap-up, high fives and Prizes

After all the presentations were done Sajeel Afzal from Hitachi Solutions, our primary sponsor, came up and spoke a bit about the magic he had seen happen in the room and how proud he and Hitachi were to be part of such a collaborative event. He pointed out that no matter if you were a partner, user or student, everyone came together to participate in a very cool event, which is what Hitachi Solutions is about.

After a VERY tough decision the judges finally decided on who the winner was. Taking into account the two key criteria, the judges selected the RED team as the winners. They were selected because they worked together as a cohesive unit, they had a strong plan, they had a great story, and they were able to present a solution that utilised the Power Platform in an innovative way through the creation of a VERY strong canvas app and model-driven app. The red team also had three students in the team who had never even seen the product stack before, which just goes to show that the citizen developer tag given to PowerApps is a real thing. Great job, RED team.

The red team members each won a Philips Hue starter set, plus a sweet award which can be used as a paperweight, door stop, weapon and many other useful things.

There were two other awards that were given out to two members of the blue team who the judges felt were incredibly collaborative and worked hard at bringing their teams together. Incidentally, this team had the highest number of members… So a great job to:

Ryan Mclean

Laura Graham Brown

And finally, the social award went to Marcus Mattus for his incredible contribution on social media and getting our event out there to the rest of the world.

Ultimately, what we were hoping to achieve during the hackathon was to create that "penny drop" moment for all types of users: the realisation that this platform is so robust and dynamic that the sky is the limit here. Together, we can create almost anything! We don't all need to be software developers to do this. People from all backgrounds can get stuck in here and enjoy it. It is as simple or as complicated as your knowledge and skill level allows.

 

THE PUB

We finished off the day at a pub in Shoreditch called Flight Club where many delicious pints were consumed and many high fives and handshakes were portioned out amongst the participants for a job well done.

A Big Thanks is in Order

To each and every participant, thank you so much for giving up your Saturday to come and play with technology and learn about how epic the Microsoft product stack really is!

THANK YOU to Hitachi Solutions UK for sponsoring this event. We could not have done this without you. You were amazing in fronting the cash needed to do this, putting up with my nonsense and just rolling with the event, because you believe it grows the community and the perception of the product and how it can be used. You are amazing!

To The Cognitive Group… Thanks for the delicious drinkies and snacks at the bar at the very end. They were needed! AND for the amazing VR headset that went to Ryan… he loves it!!

Anna, Bruce, David and Craig from Microsoft, you were unbelievable! Thank you so much for giving up your time to us and for sharing all of your amazingness. It was amazing having you there representing such a strong brand.

The Microsoft Reactor team; a massive thanks for all of your hard work and for helping us through the event process. You were fantastic, the venue was fantastic and you were incredibly hospitable. Amazing job.

To the TDG Team… great job boys! Hugs! 😀

Microsoft Cognitive Services with PowerApps – Snap It and Translate

Microsoft Cognitive Services brings rich Artificial Intelligence capabilities into applications. These capabilities can be used with PowerApps when building apps; Cognitive Services can help to make your apps smarter.

Artificial Intelligence and data science go hand in hand. Microsoft has packaged these capabilities together to create easy-to-consume APIs such as Cognitive Services and the Bot Framework.

Microsoft provides 23 Cognitive Services APIs, categorised into five areas, namely Vision, Speech, Language, Knowledge and Search. For the experiment below I have used the Computer Vision & Microsoft Translator APIs. They help to detect text content in an image using OCR (Optical Character Recognition) and to translate text into different languages.

You can find further information on Cognitive Services by following the link below.

https://azure.microsoft.com/en-us/services/cognitive-services/

Creating the connections inside PowerApps requires an Account Key and Site URL. Use the link below to get them.

https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/

If you do not have an Azure Account yet you can use the 7-day trial to get started.

The image below depicts the simple flow of the actions involved in the app.

Set the properties as follows.

Image

Image = First(colPhoto).Url

Camera

OnSelect = ClearCollect(colPhoto, Camera1.Photo)

Text box

Text = varOCR

Overflow = Overflow.Scroll

Capture Text button

OnSelect = Set(varOCR, ImageOCR.Run(First(colPhoto).Url).ocrtext)

To translate

Text = MicrosoftTranslator.Translate(varOCR, Dropdown1.Selected.Value)

Use Microsoft Flow to perform the Optical Character Recognition.

Special note: images can be saved in different formats. The Computer Vision API expects the binary format of the image, therefore the expression below must be used in the Image Content box of the 'OCR to Text' action to convert the image to binary format.

dataUriToBinary(triggerBody()['Createfile_FileContent'])

Now you can start creating your own Translator Buddy app. 😊

Skills for the PowerApp developer

It seems to me that PowerApps will prove to be a game-changer by enabling businesses and organisations to build bespoke apps designed to meet very specific needs.

The curious thing about PowerApps is that the product is pretty well where it needs to be as of today, but the major blocker for a global revolution is that there simply aren't enough people out there with anything like the skills they would need to build robust business products.

So let's assume that you can't buy in PowerApps skills, or maybe that you have some but need to grow some more.  What would you be looking for in an individual to identify them as a potential PowerApps developer?  In this article I'm going to outline some of the skills I would be looking for when identifying candidates with the potential to be high performers.

Excel

It feels kind of strange to start with a product rather than personal skills, but in this case the Excel formula structure and cell construction are so closely related to those of PowerApps that it is the number one skill.

The Excel and PowerApps teams work closely together so that any new formulas created are aligned with each other.  With this in mind, this also opens the door to the many years of formulas created by individuals within Excel that can, in theory, be borrowed or referenced to enhance a PowerApp.

Maths

PowerApps is the most mathematical product I've ever seen.  Variables can easily be created and used in an algebraic fashion, and X and Y properties exist to govern the position of everything that you see on a screen.

Empathy

You need to be able to consider your products from the perspective of other users.  If people don't like your product they won't use it, and you should be aiming for them to love it.

Logic

To my mind there are two kinds of logic.  Firstly, the more mathematical/Excel-based 'if X = 1 do this, otherwise do that'; and secondly, the type of logic where you put yourself in the shoes of a user or administrator and see whether or not a form (in the business world this is the most likely application) has a logical flow.

Design

Having an eye for design is a skill people have to a greater or lesser extent.  To be fair, it doesn't come naturally to me, but I do know from making lots of products that some designs work better than others, and it is a skill that can be learned.

Initiative

PowerApps won't come to you; you have to go to PowerApps.  If you create a blank app it will stay blank until you do something.  Additionally, many technical solutions may require techniques that feel like workarounds, mainly because the problems you are facing haven't been faced before, so looking them up on Google won't necessarily yield any results.

Lifelong Learning

The product is ready, but it is changing all the time.  Good sources for keeping up to date are Twitter and the PowerApps blog, but even the latter tends to focus on 'big' updates, whereas some of the smaller adjustments can still be game changers.  Keep up with it.  I'll be honest in saying that there are times when staying at the cutting edge can be difficult to manage compared with traditional tools and skills that stay very similar over long periods of time.

Perseverance

Some problems will require high levels of determination to find the exact syntax you need to solve your problem, and when that fails you may need to pull back, stick your head up and work out whether there are ways around the problem, or whether the item is even needed (i.e. do you really need to collect the data at all, or is it a nice-to-have?).

Patience

Not necessarily my forte, but patience may be required of yourself, as you may not have the skills you need at your fingertips, and similarly the same may be true of those around you.  You'll also need a level of patience on the part of your sponsors or users.  I can almost guarantee that, unless they are very familiar with digital form building, you may produce a product to the exact specification requested but nevertheless you, and they, will know that it's not quite right and that adjustments, sometimes significant ones, may be required.

Use the community

The PowerApps community at this stage is populated by a relatively small but passionate group of individuals building highly imaginative solutions designed to stretch the product.  Make use of them by joining https://powerusers.microsoft.com and following people on Twitter posting with the #powerapps or @PowerApps references.  By all means take a look at my YouTube channel www.youtube.com/dataspinners, and don't forget to join https://dynamics365society.uk, which contains a PowerApps bank that you can make use of once you sign up (for free).

Go on a course

Personally, I'm quite happy with an online course, as typically these are completed over a longer period of time, which for me is a better way of allowing the learning to sink in.  The best single free resource is https://courses.edx.org, where DAT207x is a very useful starting point.  You can search for courses run in a classroom setting, but you will need to ensure that you invest time in utilising the product once you've completed the course, as otherwise you'll lose your knowledge quite quickly.

Concluding remarks

This article was intended for hirers or individuals trying to make some sense of the range of skills needed for PowerApps excellence.  Hopefully it isn't too daunting.  Whatever you do, start small but build usable solutions.  This particular tool offers us astonishing scope to run our businesses in efficient ways, and the more we grow as a community, the closer we will all get to a culture of efficiency whose impact may be widespread: smoother operations, better scheduling and less wastage, all by putting the right information in front of the right people at the right time.

 

 

PowerApps – Camera Integration Part 2

So we left the last article (Part 1 – if you have not read Part 1, please do so before moving on to Part 2) with a glorious selfie app that involved putting a camera element into a new screen, then configuring the 'OnSelect' command to capture the image from the camera and store it in a collection. Which looked a little something like this:

 

Please note I foolishly forgot to 'Save' my app (more a rush to start the weekend and grab a beer), so some of the control and component names may have changed – please just engage logic for this.

 

Displaying the collection to show the person using the app the image they have just captured:

Now the next stage is to allow the users of the application to quickly see the image they have just taken. The first step in this glorious journey is to add an image under the camera component: go to 'Insert' > Media > Image and place it under the camera element – which should look something like this:

I think it's important that I publicly apologise for the number of times readers of this article have to keep seeing my face. Sorry ;-).

Now we need to instruct the image to show what is in the collection (remember, we are using the 'ClearCollect' formula to store the captured camera image in the collection, which means only one image will be stored at a time; this is the main reason for choosing an 'image' component rather than a 'gallery'). Select the image, go to the advanced settings and set the 'Image' property to: Camera1.Photo (in the previous article this would be Camera3), as shown below:

Now the last piece that I'd like us to do is flip the image horizontally. To achieve this, select the image, go to the 'Design' grouping area within the advanced options and then select 'More options':

Then scroll down to the 'FlipHorizontal' property and change it from 'False' to 'True'. This will flip the image horizontally and marry up with the camera.
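Pulling the pieces of this mini-series together, the properties involved end up roughly as follows (control names follow this series, so adjust them to your own):

Camera1.OnSelect = ClearCollect(colPhoto, Camera1.Photo)

Image1.Image = Camera1.Photo

Image1.FlipHorizontal = true

(If you would rather read the image straight from the collection, Image1.Image = First(colPhoto).Url works too.)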

This finishes Part 2. Part 3 will focus on sending this image via Flow to an email address as an attachment, as well as storing it in SharePoint!

If you liked the article then please do 'Like' it below, and if you have any questions please use the questions functionality in the main menu of the site.