How to detect car engine anomalies by analyzing engine noise?

Following my article on “Starting your exciting journey of Connected Field Service and Azure IoT Hub“, I started working on a practical scenario: measuring noise in your surroundings and generating alerts in #PowerPlatform. In this article I want to summarize all the resources required to implement such a scenario, along with my learnings. I hope this adds to the resources available in the community so you can use it as a walkthrough to implement a practical scenario.

In this article, you will see what architectural components are required to implement this simple scenario using Azure IoT and Connected Field Service. This article focuses on what is happening under the hood of the platform. Luckily, with the Connected Field Service application, everything is managed behind the scenes and you don’t need to worry much, but this walkthrough enables you to understand what options you have in such scenarios.

Scenario

The scenario is about connecting an MXChip IoT DevKit to your car, or any place with noise, and analyzing the noise level by recording the noise and sending it in the form of a wave stream to an Azure IoT Hub. The Azure IoT Hub sends the data to an #Azurefunction which calculates the noise level using a simple formula, and the function calls a #MicrosoftFlow to create alerts in #PowerPlatform. This can lead to an endless number of scenarios.

  • The function for calculating the noise level from a wave file is extremely simple as well. There is plenty of scientific material on the topic, which you can read here, here, here and here.
  • Calculating the noise level properly is not an easy task. There are many considerations involved, and if you want a real working model, you will need to work on analyzing audio files, which is beyond the scope of this demo.
  • It is possible, and desirable, to calculate the noise level on the device and send only the alerts to Azure IoT. This would reduce the traffic and the load on your Azure services. However, for the sake of the experiment, I am sending all the noise data to Azure and calculating the noise level in the Azure function.
  • In this demo, I am not listening to the noise all the time. I start recording on a press of button A and send the noise data to Azure on a press of button B. I made this change to the scenario to demonstrate working with buttons on the MXChip and also to reduce the traffic to Azure.

Architecture

The architecture of this sample is very simple. I am using an IoT Hub and an Azure function to calculate and propagate the IoT events to #PowerPlatform. On the device side, there is an Arduino application running which listens to noise and sends the recorded stream to the Azure IoT Hub.

A very comprehensive architecture for a connected field service is shown in the diagram below, which can simply be implemented using the #ConnectedFieldService application. However, I just wanted to implement it in a simpler way. Full details of the #ConnectedFieldService architecture can be seen in this documentation.

Components

The logical diagram of components is demonstrated below:

Arduino App

This component is a very simple program which reads input from the audio controller, Button A and Button B of the device and does the following:

  1. On startup, it initializes the device and gets ready to listen to the surrounding noise. It also checks connectivity to Azure.
  2. On a press of Button A, it records the surrounding noise and stores the stream in a buffer.
  3. On a press of Button B, it sends the stream in the buffer to Azure.

To implement this part of the application, you will need to take following actions:

  1. Set up your MXChip device. Please refer to this link to get started.
  2. Set up your Visual Studio environment. Please refer to this link.
  3. You will need to learn how to deploy your code to the MXChip device. The simple way to upload your code to the device is to bring your MXChip device into configuration mode: every time you want to upload your updated code, press A (and keep pressing), then press reset (while still pressing A). Then release reset (while still pressing A) and finally release A. Now you are ready to upload your code.
  4. If you want to debug your code on the device, you can refer to this link.

Here is my sample code:


#include "AZ3166WiFi.h"
#include "DevKitMQTTClient.h"
#include "AudioClassV2.h"
#include "stm32412g_discovery_audio.h"
#define MFCC_WRAPPER_DEFINED
#define MODEL_WRAPPER_DEFINED
//Constants and variables- Start//
enum AppState
{
APPSTATE_Init,
APPSTATE_Error,
APPSTATE_Recording,
APPSTATE_ButtonAPressed,
APPSTATE_ButtonBPressed
};
// variables will change:
static AppState appstate;
static int buttonStateA = 0;
static int buttonStateB = 0;
static bool hasWifi = false;
static bool hasIoTHub = false;
AudioClass &Audio = AudioClass::getInstance();
const int AUDIO_SIZE = 32000 * 3 + 45;
char *audioBuffer;
int totalSize;
int monoSize;
static char emptyAudio[AUDIO_CHUNK_SIZE];
RingBuffer ringBuffer(AUDIO_SIZE);
char readBuffer[AUDIO_CHUNK_SIZE];
bool startPlay = false;
void SendMessage(char *message)
{
// Send message to Azure
if (hasIoTHub && hasWifi)
{
char buff[512];
// replace the following line with your data sent to Azure IoTHub
snprintf(buff, 512, message);
if (DevKitMQTTClient_SendEvent(buff))
{
Screen.print(1, "Sent...");
}
else
{
Screen.print(1, "Failure...");
}
delay(2000);
}
else
{
// turn LED on-off after 2 seconds wait:
Screen.print("NO BUTTON DETECTED");
delay(1000);
Screen.clean();
}
}
void setup()
{
// put your setup code here, to run once:
memset(emptyAudio, 0x0, AUDIO_CHUNK_SIZE);
if (WiFi.begin() == WL_CONNECTED)
{
hasWifi = true;
Screen.print(1, "Running!!!");
if (!DevKitMQTTClient_Init(false, true))
{
hasIoTHub = false;
return;
}
hasIoTHub = true;
// initialize the pushbutton pin as an input:
pinMode(USER_BUTTON_A, INPUT);
pinMode(USER_BUTTON_B, INPUT);
appstate = APPSTATE_Init;
}
else
{
hasWifi = false;
Screen.print(1, "No Wi-Fi");
}
}
void loop()
{
// put your main code here, to run repeatedly:
Screen.clean();
// while(1)
{
// read the state of the pushbutton value:
buttonStateA = digitalRead(USER_BUTTON_A);
buttonStateB = digitalRead(USER_BUTTON_B);
if (buttonStateA == LOW && buttonStateB == LOW)
{
//SendMessage("A + B");
}
else if (buttonStateA == LOW && buttonStateB == HIGH)
{
// WAVE FORMAT
Screen.clean();
Screen.print(0, "start recordig");
record();
while (digitalRead(USER_BUTTON_A) == LOW && ringBuffer.available() > 0)
{
delay(10);
}
if (Audio.getAudioState() == AUDIO_STATE_RECORDING)
{
Audio.stop();
}
startPlay = true;
}
else if (buttonStateA == HIGH && buttonStateB == LOW)
{
// WAVE FORMAT
if (startPlay == true)
{
Screen.clean();
Screen.print(0, "start playing");
play();
while (ringBuffer.use() >= AUDIO_CHUNK_SIZE)
{
delay(10);
}
Audio.stop();
startPlay = false;
SendMessage(readBuffer);
}
else if (buttonStateA == HIGH && buttonStateB == HIGH)
{
Screen.clean();
}
}
delay(100);
}
}
void record()
{
Serial.println("start recording");
ringBuffer.clear();
Audio.format(8000, 16);
Audio.startRecord(recordCallback);
}
void play()
{
Serial.println("start playing");
Audio.format(8000, 16);
Audio.setVolume(80);
Audio.startPlay(playCallback);
}
void playCallback(void)
{
if (ringBuffer.use() < AUDIO_CHUNK_SIZE)
{
Audio.writeToPlayBuffer(emptyAudio, AUDIO_CHUNK_SIZE);
return;
}
int length = ringBuffer.get((uint8_t *)readBuffer, AUDIO_CHUNK_SIZE);
Audio.writeToPlayBuffer(readBuffer, length);
}
void recordCallback(void)
{
int length = Audio.readFromRecordBuffer(readBuffer, AUDIO_CHUNK_SIZE);
ringBuffer.put((uint8_t *)readBuffer, length);
}

Azure function

This is the simplest part of all. All you have to do is receive the stream and calculate the noise level. The calculation can get very sophisticated, but that is out of the scope of this article.


using IoTHubTrigger = Microsoft.Azure.WebJobs.EventHubTriggerAttribute;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.EventHubs;
using System;
using System.Linq;
using System.Net.Http;
using System.Text;
using Microsoft.Extensions.Logging;

namespace IoTWorkbench
{
    public static class IoTHubTrigger1
    {
        private static HttpClient client = new HttpClient();

        [FunctionName("IoTHubTrigger1")]
        public static void Run([IoTHubTrigger("%eventHubConnectionPath%", Connection = "eventHubConnectionString")] EventData message, ILogger log)
        {
            log.LogInformation($"C# IoT Hub trigger function processed a message: {Encoding.UTF8.GetString(message.Body.Array)}");

            // Interpret the first two bytes of the payload as one 16-bit PCM sample
            // and convert its amplitude to decibels relative to full scale.
            byte[] buffer = message.Body.ToArray();
            short sample16Bit = BitConverter.ToInt16(buffer, 0);
            double volume = Math.Abs(sample16Bit / 32768.0);
            double decibels = 20 * Math.Log10(volume);
            log.LogInformation(decibels.ToString());
        }
    }
}
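
To put the formula in context: 16-bit samples range from -32768 to 32767, so dividing by 32768.0 normalizes the amplitude to the 0..1 range, and the result is decibels relative to full scale (dBFS). For example, a sample value of 3277 normalizes to roughly 0.1, and 20 × log10(0.1) = -20 dBFS; the quieter the noise, the more negative the number. Note that this demo derives the level from only the first sample of each message; a more faithful measure would aggregate over all samples (for example, an RMS average), which is part of the audio analysis that is out of scope here.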

Handshaking

In order for the device to send messages to the Azure function, the device must know the endpoint to which it should send the data. You can follow the steps in this link to register your device with the Azure function; it is all done using the Azure IoT Workbench.

References

https://docs.microsoft.com/en-us/dynamics365/field-service/developer/connected-field-service-architecture

Photo by Steinar Engeland on Unsplash

Improve efficiency of Call centers using Dynamics 365 and Azure cognitive services

Photo by Hrayr Movsisyan on Unsplash

I am fascinated by the sophistication of Azure services and how they help us improve our solutions and extend the way we can solve customer problems. Recently I had a requirement to implement a Dynamics 365 solution to enable a call center to capture cases while their operators are offline.

One solution was to provide a self-service portal where customers could log cases while call center operators are offline. But in this case, the customer was looking for something very quick to implement, with the ability to link incoming cases to their call center channel and derive some reporting based on it.

Approach

I started looking at Azure services to see how I could use Azure cognitive services and speech recognition to solve this requirement, and as always, Azure did not disappoint. In this post I would like to share my experience with you and take you through the steps needed to create such a solution. Of course, the possibilities are endless; however, this post will give you a starting point to begin your journey.

I have seen solutions where telephony systems send voice recordings of callers as an email attachment to a queue in CRM. The CRM then converts that queue item to a case and attaches the voice recording as a note to the case. The challenge with this solution is that the call center operators have to open attachments manually and write the description of the case after listening to the audio file. This means their time is spent on inefficient activities when it could be utilized in better ways.

Another problem with this approach is the size of the attachments. As time goes by, audio attachments increase the database size, impacting the maintenance of the solution.

Scenario

Our scenario is based on the fact that call center agents are not working 24 hours a day.

While agents are offline, customers should still be able to contact the call center and record voice messages that are used to create cases.

We will use the following components:

  1. Azure Blob storage to receive recorded audio files from the telephony system.
  2. Azure cognitive services to listen to the recorded audio files and translate the content to text. The audio file stays in Azure Blob storage (which is cheaper than CRM database storage).
  3. An Azure function (with an Azure Blob binding) to recognize the text from the audio file and extract the case description.
  4. The Dynamics 365 Web API to create a case in CRM using the description extracted by Azure cognitive services. We can also add blob metadata, like the filename, to case properties. (A sketch of this step follows the list.)
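
Step 4 is worth a quick illustration, since the rest of this post focuses on the speech side. Below is a minimal sketch of creating a case through the Dynamics 365 Web API; the org URL and case title are placeholders, and acquiring the Azure AD access token is omitted:

using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

public static async Task CreateCaseAsync(HttpClient client, string accessToken, string description)
{
    // The bearer token comes from Azure AD; token acquisition is omitted in this sketch.
    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

    // "incidents" is the Web API entity set name for cases.
    var payload = JsonConvert.SerializeObject(new { title = "Case from voice message", description });
    var content = new StringContent(payload, Encoding.UTF8, "application/json");

    var response = await client.PostAsync("https://yourorg.crm.dynamics.com/api/data/v9.1/incidents", content);
    response.EnsureSuccessStatusCode();
}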
Solution Architecture

The full source code is available on GitHub.

However, the main code snippet to perform conversion is below:

public static async Task<string> RecognitionWithPullAudioStreamAsync(string key, string region, Stream myBlob, ILogger log)
{
    // Creates an instance of a speech config with the specified subscription key and service region.
    // Replace with your own subscription key and service region (e.g., "westus").
    var config = SpeechConfig.FromSubscription(key, region);
    string finalText = string.Empty;
    var stopRecognition = new TaskCompletionSource<int>();

    // Create an audio stream from the wav blob.
    using (var audioInput = Helper.OpenWavFile(myBlob))
    {
        // Creates a speech recognizer using audio stream input.
        using (var recognizer = new SpeechRecognizer(config, audioInput))
        {
            // Subscribes to events.
            recognizer.Recognizing += (s, e) =>
            {
            };

            recognizer.Recognized += (s, e) =>
            {
                if (e.Result.Reason == ResultReason.RecognizedSpeech)
                {
                    finalText += e.Result.Text + " ";
                }
                else if (e.Result.Reason == ResultReason.NoMatch)
                {
                    log.LogInformation("NOMATCH: Speech could not be recognized.");
                }
            };

            recognizer.Canceled += (s, e) =>
            {
                log.LogInformation($"CANCELED: Reason={e.Reason}");
                if (e.Reason == CancellationReason.Error)
                {
                    log.LogInformation($"CANCELED: ErrorCode={e.ErrorCode}");
                    log.LogInformation($"CANCELED: ErrorDetails={e.ErrorDetails}");
                    log.LogInformation("CANCELED: Did you update the subscription info?");
                }
                stopRecognition.TrySetResult(0);
            };

            recognizer.SessionStarted += (s, e) =>
            {
                log.LogInformation("\nSession started event.");
            };

            recognizer.SessionStopped += (s, e) =>
            {
                log.LogInformation("\nSession stopped event.");
                log.LogInformation("\nStop recognition.");
                stopRecognition.TrySetResult(0);
            };

            // Starts continuous recognition. Uses StopContinuousRecognitionAsync() to stop recognition.
            await recognizer.StartContinuousRecognitionAsync().ConfigureAwait(false);

            // Waits for completion. Use Task.WaitAny to keep the task rooted.
            Task.WaitAny(new[] { stopRecognition.Task });

            // Stops recognition.
            await recognizer.StopContinuousRecognitionAsync().ConfigureAwait(false);

            return finalText;
        }
    }
}

Important considerations:

  1. [This point is optional, if you use the Web API to create cases in CRM] You will need to use a multi-tenant configuration if your Azure Function tenant and the tenant in which your CRM API is registered are different. If they are the same tenant, you can use a single-tenant configuration.
  2. The input file from the telephony system to Azure Blob storage must be in a specific format. The required format specification is:
    • File Format: RIFF (WAV)
    • Sampling Rate: 8000 Hz or 16000 Hz
    • Channels: 1 (mono)
    • Sample Format: PCM, 16-bit integers
    • File Duration: 0.1 seconds < duration < 60 seconds
    • Silence Collar: > 0.1 seconds
  3. You can use the ffmpeg tool to convert your recording to this specific format. For your testing, you can download and use the tool as below:
    • Download ffmpeg from this link.
    • Use the command: ffmpeg -i "<source>.mp3" -acodec pcm_s16le -ac 1 -ar 16000 "<output>.wav"
  4. My sample in GitHub covers input as one single chunk of audio. However, if you wish to have continuous streaming, you will need to implement the StartContinuousRecognitionAsync method accordingly.
  5. The Azure function should be configured with a blob trigger.
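
To tie these pieces together, here is a minimal sketch of what the blob-triggered entry point could look like. The function name, container name, and app-setting keys are assumptions for illustration, and RecognitionWithPullAudioStreamAsync is the method shown earlier:

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ProcessCallRecording
{
    [FunctionName("ProcessCallRecording")]
    public static async Task Run([BlobTrigger("recordings/{name}")] Stream myBlob, string name, ILogger log)
    {
        // The speech subscription key and region would live in application settings.
        string key = Environment.GetEnvironmentVariable("SpeechKey");
        string region = Environment.GetEnvironmentVariable("SpeechRegion");

        // Recognize the speech in the uploaded recording; the resulting description
        // is then passed to the Dynamics 365 Web API to create the case.
        string description = await RecognitionWithPullAudioStreamAsync(key, region, myBlob, log);
        log.LogInformation($"Recognized text for {name}: {description}");
    }
}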

Open entity records from Power BI dashboard

In my earlier post, I discussed how to show CRM entities on a Power BI visual map control. The use of Power BI dashboards on Dynamics CRM dashboards is not limited to displaying multiple entities on maps. We usually want to do more, and since dashboards show only limited information, we would love to see entities in a tabular format and navigate to CRM records when needed. In this post, I will discuss how we can open CRM records from a Power BI dashboard.

Scenario

Users should be able to see multiple entity types on a Power BI map, and record details in a table under the map control, with the ability to open CRM records using a hyperlink. I will focus on displaying records in a table with a direct link to CRM entity records. After configuring the visual map control, we will need to do the following:

Note that all the required information, i.e. name, etc., and complementary information, i.e. entity logical name and entity ID, are available in our temporary table. Refer to the previous post.

  1. Drag and drop a Table control underneath our visual map control.
  2. Drag and drop the fields we would like to display into the table columns.
  3. Next, add a custom column to the table to hold the hyperlink to CRM entity records, and configure its type as WEB LINK.
  4. You can do this by selecting “NEW COLUMN” from the “Modeling” tab. Remember, you will need the following three components to construct the link:
    1. The CRM base URL (this will be known to you from your org URL).
    2. The entity logical name (this is what we captured in the previous post as a custom column in our temporary table).
    3. The entity GUID (this was also selected as part of the entity retrieve query in the previous post).
  5. The formula for the column is:
    Link = "https://[CRM BASE URL]?pagetype=entityrecord&etn=" & 'ENTITY_LOGICAL_NAME' & "&id=" & 'ENTITY_ID'
  6. You will need to set the field type as WEB LINK.
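
For illustration, if the base URL of a hypothetical org is https://contoso.crm.dynamics.com/main.aspx, the formula yields links such as https://contoso.crm.dynamics.com/main.aspx?pagetype=entityrecord&etn=account&id=<ENTITY GUID>, which opens the record directly in CRM.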

PowerApp Admin Tools

I’ve been working on XrmToolBox Tools for a bit now, both speaking and posting on the huge number of cool tools and how we can build our own. I’m still working on a few XrmToolBox related projects, but when I started diving into Power Apps, I immediately wondered if I could replicate some Tools in Canvas or Model Driven Apps. Wouldn’t it be cool if we could build out a suite of some administrator or developer tools as Power Apps?

In my post Building XrmToolBox Tools, Part 2, we built an example Tool that allows admins to view the list of Users with a Security Role assignment. It’s a relatively simple tool, but it can be pretty useful if you need a quick check on a Role for migrations, troubleshooting, etc. This seemed like as good a candidate as any for a new ‘Admin Tool’ Power App.

Security Role Member Manager!

The proposed functionality for the tool is pretty simple: provide a list of Security Roles in the system, and when the user selects a Role, show the list of assignments. The original was meant to be written in about an hour, so that was the extent of its capabilities. Since we have a bit more time, we can add some features. How about we display the list of Teams that have the selected Role assignment and allow adding or removing a User or Team for the selected Role?

These requirements are a pretty good candidate for a Canvas App. We can build this using the following components:

  • Common Data Service (CDS) connector – provides the list of Security Roles
  • Office 365 Users Connector – provides the user picture
  • Gallery – bound to the Security Roles list
  • Gallery – bound to the list of Users related to the Security Role
  • Gallery – bound to the list of Teams related to the Security Role

The main screen layout and functionality is also fairly simple. Bind the main grid to the Security Roles list, and on the select event, bind the secondary lists to the related Users and Teams. No real code behind, just some simple data binding to galleries.

Here is the initial main screen for the Power App:

Security Role Member Manager

Some challenges, some solutions

With the XrmToolBox Tool, I wrote some code to retrieve Security Roles and User Security Role assignments using the standard SDK Query Expression methods. The list of Security Roles was a simple Retrieve Multiple, while the User Role Assignment is a many-to-many relationship.

The many-to-many is where I stumbled a bit. When I select my Security Role, I want to retrieve the Users and Teams, which are both many-to-many relations to Security Roles. The CDS connector does not list the join table as an Entity, so I couldn’t simply add a new Data Source for User or Team and filter by the selected Role. Fortunately, support has been added for many-to-many relationships in the CDS connector. Here is an excellent blog post on the feature by Greg Lindhorst, Principal Program Manager at Microsoft: Relate records in Many-to-Many relationships

So to render the list of Users and Teams, I can bind the galleries using a simple formula. The main screen gallery from which you select a Security Role is named ‘Security Roles List’. So the User and Teams gallery Items property can be set using these simple formulas respectively:

'Security Roles List'.Selected.Users
'Security Roles List'.Selected.Teams

As you can see in Greg’s post, adding and removing Users and Teams are fairly easy too. To remove a User from the selected Security Role, we need a single line formula added to our gallery button:

Unrelate('Security Roles List'.Selected.Users, ThisItem)

That statement passes the selected Security role and the currently selected User to the Unrelate formula, and we’re done!
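
Adding a User or Team to the Role should be the mirror image: per Greg’s post above, the Relate function does the reverse, along the lines of:

Relate('Security Roles List'.Selected.Users, ThisItem)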

Next Steps

I plan on a follow-up post with a bit more functionality. For example, I like the inline model for selecting a User shown in the post by Greg above, but I think selecting multiple Users and Teams works better. Another nice feature would be to distinguish between Business Units; right now, this pulls all Security Roles for the entire organization.

This sounds like an obvious one, but I also plan on adding a confirmation dialog before removing the User or Team from a Security Role. This was a bit more complicated than I had expected, so I will write up more detail on how this will be implemented.

As I was working on this sample Power App, I came across a great post User Admin PowerApp (Part 4) by d365Cooky that proposes a similar tool, but for managing Security Roles for a selected User. I like the idea of embedding this into Dynamics 365 CE. By the way, I saw the link via Guido Preite’s dynamicsweekly.com newsletter. If you have not already signed up for this, get to it!

I’ll post notes on all these updates, with more detail on how the app was built, including the full solution for download, in a follow-up post.

Powerful stuff!

I feel like I have said that a LOT recently! In a relatively short time, I built a functioning Power App that will allow administrators to manage the Security Role Users and Teams. I put this together in a few hours, including some reading on the CDS connector capabilities and designing a few screens. All of this was done using the existing connectors and no custom code outside of the standard Canvas App formulas.

This is not as complex a tool as you may find in the XrmToolBox, and I am definitely going to continue any contribution I can to the XrmToolBox! But I think this once again demonstrates how the Power Platform allows us to provide low-code/no-code solutions to our users.

In the meantime, as always, any comments, suggestions, or questions are appreciated!

D365UG Bristol

Man, oh MAN! I am just on the train back from Bristol after attending their first 365 User Group, and it was AWESOME!

First up, big props to the organisers @lubo @dbarber @xrmjoel @robpeledie @leembaker for doing such an amazing job. The venue was great, the food was too (including vegan options :-)) and the swag bags were epic.

But more importantly, the speakers were amazing.

First up, Mark Smith did a fantastic presentation to explain the (seemingly unexplainable) Common Data Model, which is something that sits on the CDS and exists to make sense of your data so that it is usable and, more importantly, means you can build apps faster. Considering he knew nothing about the CDM 5 days ago, this was a great presentation, which goes to show how quickly you can assimilate information when the data is available in a usable format (see what I did there *winks*).

Second up was @scott-durowdevelop1-net, showing us how much the Power Platform has changed in such a short space of time, and why that’s such a good thing: it enables access for a much wider audience, gives rise to the Citizen Developers who can effect such great change in an organisation, and reminds us that we all need to embrace change and be adaptable.

And finally, @themarkchristie gave an entertaining presentation on how he bought headphones at the Microsoft store and @lubo busted them 5 minutes later – but not to worry: Virtual Agent was able to help him raise a support ticket through Forms Pro, which could then be assigned to someone in Field Service, leading to the headphones being fixed (whilst being worn by Mark Smith).

I learned a lot about how to better use the tech we have openly available to us in a more inventive way, and how to work with the community to find answers to stuff that you are struggling with.

I love these community events and I hope to attend each and every one that I can.

Again, big thanks and well done to the organisers.

Love to all

Alison

(Me when I won a Flic button from @themarkchristie)

MBAS Heterogeneous Storage – Atlanta – June 2019 Session Notes


Graham “show-me-the-code” Tomkins

Introduction to Storage Cake: (Cake Level: Jaffa Cake)

Currently, one of the ways to store actual files in CDS (CRM/XrM/XRM/Xrm/CrM/Dynamics/CE/D365) is to utilise notes with attachments, forced into the GUI – relatively ugly and non-jaffa-cake-like. This was easily do-able, but attachments generated out of the back of processes (e.g. Azure logs, contract storage, image/audio files, etc.) normally had to be fed in through code.

Accessing these files proved simple but frustrating, and the one-image-field-per-entity limit was circumvented with sometimes-unnecessary child entities – un-normalisation is no-one’s friend! There was also a size and resolution limit that restricted many fundamental uses for the image field.

New Cake: (Cake Level: Profiterole)

During the MBAS session “Heterogeneous Storage…” <link>, the speakers lifted the lid on what is coming – this article, with all its sarcasm, is my summary of the session, which covered:

  • Licensing
  • Data Types
  • Auditing & Logs
  • New Field Types

Licensing: (Cake Level: Petit Four)

Licensing <link> around Microsoft products has always been a dirty word; I find understanding quantum mechanics easier than attaining the PhD required to give a simple, clear answer to most MS CRM/CDS/XRM/Dynamics/PowerApps/CE based licensing questions…

The storage aspect has got slightly more complex; however, I believe this is a good thing, as the previous one-size-fits-all approach isn’t good for today’s usage of the platform. The PowerPoint gives more detail on the PowerApps storage; however, the slide below relates specifically to CRM (D365/CE/XRM etc. etc.)

 

Now, I believe clients will need to have switched to the new licensing model (automatic at the start of the next renewal period) in order for the above to apply. As you can see, the database size (old money: MDF file size) is now split into Data (flour and eggs), File Data (butter) and Logs (sugar).

This will most likely help remove laziness on the system deployers’ front, where logs are cleared infrequently, if ever. It also looks like they are treating all BASE64 blobs in SQL very differently (why so much cheaper, I wonder….)

Data Types: (Cake Level: Rocky Road and Custard)

From the slide extract below, it’s clear that they are also going to treat indexes and search metadata differently. In my experience, separating this out at the SQL level can yield performance improvements – I assume this already happens in the mystical world of PowerPlatform/CRM/D365/CE/CDS/XRM SQL 😊

Relational Data is the basis of all the good stuff inside CRM – it’s most of the field types stored in entities and tables within the system, from the system tables all the way up to the custom tables you added to store information about your pet’s food preferences.

Files, Images, Attachments and Other Binaries is, it seems, anything we used to store as BASE64 data types in SQL: your files serialised into SQL, taking up crazy amounts of memory and disk space for infrequent access, plus all the email attachments. It will also be used for the 2 new field types (see below).

Audit data etc. is the small amount of space (that never grows automatically…) we will be given to store standard system audit logs, plug-in trace logs (oh, we need to make sure this is all turned off in dev…) etc. Now, I believe this is separate in space from the Office 365 access logs; however, I’m not entirely sure.

N.B.: it was alluded to that coming data types would include observational IoT data, i.e. events from devices… inter-ma-resting.

Auditing & Logs: (Cake Level: Peanut Butter Cheesecake with Cream)

With the new restrictions on size, MS are going to give us new tooling to help clean up the audit logs in different ways (other than the one-size-fits-all by-date option that currently exists).

By Entity will help us clean up when PetFoodPlugin.dll has gone crazy and sequentially updated all dogs to liking fish instead of chicken.

Access logs by people and systems will help correct for when my specific personal workflow has been executed across 9 million pet hamster records.

All logs by date is the good old sweep up.


New Field Types: (Cake Level: Black Forest Gateau)

Ok, so storing both images and files against current CRM records is a pain in the yaris. Two new data types for fields (attributes, if you must) are on their way:

  1. File
    • Multiple Files allowed per record
    • Configurable Size limits per field
      • Up to 128 MB
    • Binary Access
    • Searching Inside files is coming later

 

  2. Image
    • Full High Resolution Images Support
    • Thumbnail Generation on Upload
    • Any entity
    • Backwards Compatible
    • N.B: It was unclear to me, even with a question, if we are going to be allowed to store multiple image fields per entity – I’m going to assume that we will be allowed (as it’s currently limited to one field per entity).

 

 

Summary:

All in all, a very informative session – the team were honest about the timelines being the end of 2019 (ish), which gives us enough time to make sure any future-looking designs can incorporate and make the most of the new fields. With smart moves shuffling data types around behind the scenes, you can see they are aiming to turn on a new version of the old SharePoint file search – but, I assume, done properly and with correctly controlled scale. And to finish – some cake:

PowerApps Licensing Cheat Sheet

During the App in a Day sessions I’ve held, there were often questions regarding the type of license that is required, based on the options/features that are part of the built solution. Given that an image is worth a thousand words, here’s how I’ve been able to easily demonstrate the basic differences between the Office 365 license and the PowerApps P1 and P2 licenses. It’s not about all the nitty-gritty details, but the base is there. When in doubt, always refer to Microsoft’s official documentation!

 

How to fix Ribbon Button issues in Unified Client Interface (UCI)

A while ago I was working on creating a ribbon button for the Contact form. I used the fabulous Ribbon Workbench to add the button to my classic form. My requirement was to run a workflow from my custom ribbon button. The post I referred to was from Scott Durow on his website.

When I changed my form from the classic web form to UCI, I saw two strange behaviors.

The button was no longer showing on UCI

  • Somewhere in using the Ribbon Workbench, the Enable Rule got messed up. The button was showing on the classic form but not on the UCI form. I tried creating new solutions. I even tried isolating the button. None of these worked. Then I examined my “Customization.xml” file. I used the Microsoft documentation to write the enable rules again and published my solution. Once published, it worked fine. The bottom line is that in case of any issues with your ribbon (when you have tried all possible options), make sure to examine Customization.xml to ensure your configuration is correct. Customization.xml is the ultimate source of truth.
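
For reference, an Enable Rule in Customization.xml has roughly the shape below; the rule Id, web resource path and function name here are hypothetical:

<EnableRules>
  <EnableRule Id="sample.contact.RunWorkflow.EnableRule">
    <CustomRule Library="$webresource:sample_/scripts/ribbon.js" FunctionName="enableRunWorkflowButton" />
  </EnableRule>
</EnableRules>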

Pressing the ribbon button was not doing anything

  • The ribbon button was not doing anything because the action came from a static .js library (/_static/_common/scripts/RibbonActions.js). Actions in UCI must come from web resources; static files are no longer supported in UCI.

Under the Hood of the Dynamics 365 Portal Profile Page

Photo by Emanuel Villalobos on Unsplash
In part 1 of the series, I shed light on some of the planning and analysis activities involved in a Portal project. From this part on, I will be writing about some common portal project requirements and the way I have addressed them.

Site Settings

One of the most basic portal configurations happens in Site Settings. Site Settings contain global configuration used by the portal framework. The complete list of portal site settings can be found here. However, some of those settings are deprecated, since that link is based on the ADXStudio product, and there are other settings you will not find in the given link; I am listing those settings below:
  • Search/Enabled – A true or false value. If set to false, it disables the search functionality on the portal, so you will not see the magnifier icon on the primary navigation.
  • DateTime/DateFormat – The format in which you want to show dates on your portal. For example, for the British date format, I use d/M/yyyy.
  • profile/forcesignup – A true or false value. If set to true, it forces users to update their profile after signup; the portal will redirect the user to the Profile page after a successful signup.
  • Authentication/Registration/TermsAgreementEnabled – A true or false value. If set to true, the portal will display the terms and conditions of the site. Users must agree to the terms and conditions before they are considered authenticated and can use the site.
  • Authentication/Registration/ProfileRedirectEnabled – A true or false value. If set to false and the profile page is disabled on the portal, the Redeem Invitation workflow doesn’t work properly and keeps redirecting the user to the same page instead of the home page. (From https://support.microsoft.com/en-au/help/4496222/portal-capabilities-for-microsoft-dynamics-365-version-9-1-4-29-releas)
  • Authentication/Registration/LoginButtonAuthenticationType – If a portal only requires a single external identity provider, this allows the Sign-In link on the primary navigation to link directly to the sign-in page of that external identity provider.
  • Authentication/Registration/OpenRegistrationEnabled – A true or false value. If set to true, it allows any anonymous visitor to the portal to create a new user account.
  • Profile/ShowMarketingOptionsPanel – A true or false value. If set to false, it hides the marketing preferences area on the contact profile.

Profile page

The Profile page is a custom .aspx page which displays the Contact entity’s “Profile Web Form”. If you want to change the fields on this page, you must modify the “Profile Web Form” on the contact entity. In addition to the fields of the “Profile Web Form”, the profile page shows Marketing Preferences, which can be enabled or disabled using the Profile/ShowMarketingOptionsPanel site setting.

Customising Profile Page

The Profile page is a special page which you cannot customise using Entity Forms and metadata. An ordinary web page, like a case form in the portal, relies on “Entity Forms”. The case of the Profile page is different: it does not rely on Entity Forms, but re-writes the request to “Profile.aspx”. So, if you want to change the form behavior or add validation to fields on the screen, you will need to come up with a different approach.

Scenario:

  1. The mobile phone on the profile should be in the format xxxx-xxx-xxx
  2. The profile page should be editable if the contact has NO active cases
  3. The profile page should be read-only if the contact has active cases

Implementation:

  1. Deactivate the existing Profile web page.
  2. Create a new Entity Form called “Editable Profile Entity Form” interfacing “Profile Web Form” in Edit mode.
  3. Create a new Entity Form called “Read-only Profile Entity Form” interfacing “Profile Web Form” in Read-only mode.
  4. Create a new Web Template called “Profile Web Template” – we will talk about this template in detail later.
  5. Create a new Page Template named “Profile”.
  6. Open the Profile page template and set the following values:
    1. Type = Web Template
    2. Entity Name = Web Page (adx_webpage)
    3. Web Template = Profile Web Template (created in step 4)
  7. Create a new Web Page named “Profile”.
    1. The profile page’s partial URL must be “Profile”.
    2. Set the parent page to “Home”.
    3. Set the Page Template to the Profile template (created in step 6).
  8. Open the Profile Web Template and add the following Liquid code:

For adding the breadcrumbs, add the following Liquid:

{% block breadcrumbs %}
{% include 'Breadcrumbs' %}
{% endblock %}

For adding the title to the page, add the following Liquid:

{% block title %}
{% include 'Page Header' %}
{% endblock %}

For adding side navigation, add the following Liquid:

{% block aside %}
{%include "side_navigation_portal" %}
{% endblock %}

The main form will be in the main block:

<div class="col-sm-8 col-lg-8 left-column">
{% block main %}
{% endblock %}
</div>

Now the below code is the magic behind meeting the requirement:

Use FetchXML to check if there are any active cases related to the contact:

{% fetchxml my_query %}
<fetch version="1.0" output-format="xml-platform" mapping="logical" returntotalrecordcount="true" distinct="false">
  <entity name="incident">
    <attribute name="name" />
    <attribute name="status" />
    <attribute name="createdon" />
    <filter type="and">
      <condition attribute="parent_contact" operator="eq" value="{{User.Id}}" />
      <condition attribute="status" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>
{% endfetchxml %}
  1. The FetchXML block must be enclosed in {% fetchxml my_query %} … {% endfetchxml %} tags, where my_query holds the result of the query.
  2. If you want to check the total number of records returned by the query, you must use returntotalrecordcount="true"; otherwise you will always get -1 as the record count.
  3. The total count of the query result can be accessed via my_query.results.total_record_count.

The final piece of code in the page template will be the following

{% if my_query.results.total_record_count > 0 %}
{% entityform name: 'Read-only Profile Entity Form' %}
{% else %}
{% entityform name: 'Editable Profile Entity Form' %}
{% endif %}

With this simple if/else statement you can control the behavior of the profile page.

Validating Mobile phone

Since we are using Entity Forms to show profile information, we can use Entity Form Metadata to control the behavior of fields on the form. To ensure our mobile number is always in the xxxx-xxx-xxx format, do the following:

  1. Open the “Editable Profile Entity Form”.
  2. From related records, go to “Entity Form Metadata”.
  3. Add a new metadata record with the following properties:
  4. Type: Attribute
  5. Attribute Logical Name: Mobile Phone (mobilephone)
  6. Find the Validation section further down the form and add a Validation Error Message.
  7. Use ^\d{4}-\d{3}-\d{3}$ as the regular expression to ensure the mobile phone is in the xxxx-xxx-xxx format.
  8. You can tick “Is Field Required” to make the field required on the screen.
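
With that pattern in place, a value such as 0412-345-678 passes validation, while 0412345678 or 0412 345 678 is rejected and the validation error message is shown.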

 

Converting Dynamics’ Geolocation To SQL Geolocation Using Microsoft Flow And Azure Function

Background

One of the awesome features of the Azure Search service is the ability to search information based on location. Azure Search processes, filters, and displays geographic locations. It enables users to explore data based on the proximity of a search result to a physical location. This feature is powered by the SQL Server geography data type. Since SQL Server 2008, developers have been able to store geospatial data in SQL Server using geography fields, which allow querying data with location-based queries. To facilitate the Azure Search service searching within CRM accounts and contacts, I had to push my account and contact searchable information to a SQL Server hosted in Azure. To copy information from Dynamics to Azure SQL Server, I used Microsoft Flow. Everything worked well except copying the CRM longitude and latitude to SQL Server.

The problem

The problem with copying longitude and latitude to a SQL Server geography field is compatibility: when you try to insert plain longitude and latitude values into a geography column, you encounter a casting error.

The solution

The solution I used to tackle this problem makes use of an Azure Function: it converts the longitude and latitude to the SQL geography type and returns the response before the Insert action in the flow. See the steps below:

  1. Step 1 is self-explanatory.
  2. The “CC Contact” step extracts the Contact name (or any lookup name property) from a lookup.
  3. The “Http” step calls the Azure Function to convert the CRM longitude and latitude to a SQL geography value.
  4. The “Insert Row” step inserts our data into the SQL Server row.
Microsoft Flow

The Azure Function

The Azure function is a very simple function. You will need to import the Microsoft.SqlServer.Types NuGet package and use the below code:
string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
Coordinates data = JsonConvert.DeserializeObject<Coordinates>(requestBody);
SqlGeography point = data.GetGeography();
return (ActionResult)new OkObjectResult($"{point}");

public class Coordinates
{
    public double Longitude { get; set; }
    public double Latitude { get; set; }

    public SqlGeography GetGeography()
    {
        try
        {
            // 4326 is the SRID for the WGS 84 coordinate system used by GPS.
            return SqlGeography.Point(Latitude, Longitude, 4326);
        }
        catch (Exception ex)
        {
            // Log ex and handle the exception; rethrow without losing the stack trace.
            throw;
        }
    }
}
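
For context, the body snippet above sits inside an HTTP-triggered function. A minimal sketch of the full function (class and function names assumed) could look like this:

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using Microsoft.SqlServer.Types;

public static class ConvertToGeography
{
    [FunctionName("ConvertToGeography")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req, ILogger log)
    {
        // Read the longitude/latitude JSON posted by the flow's "Http" step.
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        Coordinates data = JsonConvert.DeserializeObject<Coordinates>(requestBody);

        // Convert to a SQL Server geography point and return its text form to the flow.
        SqlGeography point = data.GetGeography();
        return new OkObjectResult($"{point}");
    }
}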