Dynamics 365 for Customer Engagement

Opening a Dynamics CRM Entity Form by Passing a Query String

One of the awesome features of the Power Platform is its extension capabilities. We often talk about integrating the Power Platform using web services, Azure services, or plugins, but we overlook the platform's client-side capabilities. The Dynamics platform allows interacting with resources through URL-addressable elements, which enable you to include links to Dynamics 365 for Customer Engagement apps forms, views, dialogs, and reports in other applications. In this manner, you can easily extend other applications, reports, or websites so that users can view information and perform actions without switching applications.

Requirement

I had a requirement to open an Account entity form based on the account's telephone number. The Dynamics platform only allows opening an entity record in edit mode by passing the entity ID. However, my requirement was to open the entity form based on the telephone number.

Considerations

  • Opening a form in edit mode is possible only if we know the ID (GUID) of the record. If you pass any other query string, such as a telephone number or employee number, you will receive a 500 Internal Server Error.
  • You will need an HTML web resource as an intermediate component to resolve your query string (in my case, the telephone number) to the entity ID, and then open the form in edit mode by passing that ID.
  • The only query string parameter name you can pass to the organization URL is “data”. Using any other name, such as employeeId or contactid, will lead to a 500 Internal Server Error. (See the example URL after this list.)
  • You will need to use the GlobalContext by calling the getGlobalContext method in your web resource. The getQueryStringParameters method is deprecated, so you will need another way to get the value of your query string. I used Andrew Butenko's post to extract the query string. A big shout out to Andrew Butenko, and another to Jason Lattimer for his great CRMRestBuilder.
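
For illustration, a link from another application would pass the telephone number to the web resource through the “data” parameter. A minimal sketch of such a URL (the organization URL and web resource name are placeholders):

https://yourorg.crm.dynamics.com/WebResources/new_openaccount.htm?data=0123456789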

Solution

I used an HTML web resource with a JavaScript function that extracts and resolves the query string and then calls an OpenForm function to open the form.
<!DOCTYPE html>
<html lang="en" xmlns="http://www.w3.org/1999/xhtml">
<head>
    <meta charset="utf-8" />
    <title>Web Resource</title>
    <script src="ClientGlobalContext.js.aspx" type="text/javascript"></script>
    <script src="https://code.jquery.com/jquery-3.4.1.min.js"> </script>
    <script>
        function Onload() {
            // Parse the query string into a map of name/value pairs.
            var queryString = location.search.substring(1);
            var params = {};
            var queryStringParts = queryString.split("&");
            for (var i = 0; i < queryStringParts.length; i++) {
                var pieces = queryStringParts[i].split("=");
                params[pieces[0]] = pieces.length == 1 ? null : decodeURIComponent(pieces[1]);
            }

            // "data" is the only custom parameter the platform passes through to a web resource.
            var phone = params["data"];
            var req = new XMLHttpRequest();
            req.open("GET", Xrm.Page.context.getClientUrl() + "/api/data/v9.1/accounts?$select=accountid&$filter=telephone1 eq '" + phone + "'", true);
            req.setRequestHeader("OData-MaxVersion", "4.0");
            req.setRequestHeader("OData-Version", "4.0");
            req.setRequestHeader("Accept", "application/json");
            req.setRequestHeader("Content-Type", "application/json; charset=utf-8");
            req.setRequestHeader("Prefer", "odata.include-annotations=\"*\",odata.maxpagesize=1");
            req.onreadystatechange = function () {
                if (this.readyState === 4) {
                    req.onreadystatechange = null;
                    if (this.status === 200) {
                        var results = JSON.parse(this.response);
                        for (var i = 0; i < results.value.length; i++) {
                            var accountid = results.value[i]["accountid"];
                            OpenForm("account", accountid);
                        }
                    } else {
                        Xrm.Utility.alertDialog(this.statusText);
                    }
                }
            };
            req.send();           
        }

        function OpenForm(entity, id) {
            var entityFormOptions = {};
            entityFormOptions["entityName"] = entity;
            entityFormOptions["openInNewWindow"] = true;
            entityFormOptions["entityId"] = id;
            Xrm.Navigation.openForm(entityFormOptions).then(
                function (success) {
                    console.log(success);
                },
                function (error) {
                    console.log(error);
                });
        }
</script>
</head>
<body>
<script>Onload();</script>
</body>
</html>

Review the Power Platform release plan with MVPs

In case you missed it, the 2019 wave 2 release plan for Dynamics 365 and the Power Platform was released today. You can read James Phillips' blog summary at https://cloudblogs.microsoft.com/dynamics365/bdm/2019/06/10/announcing-new-features-growing-demand-for-dynamics-365-and-power-platform/ and you can download the full release notes from https://docs.microsoft.com/en-us/dynamics365-release-plan/2019wave2/.

This afternoon, I was joined by MVPs Megan Walker, Ulrik Carlsson, and Andrew Bibby to review the release plan. Watch the video below.

 

Converting Dynamics Geolocation To SQL Geography Using Microsoft Flow And An Azure Function

Background

One of the awesome features of the Azure Search service is the ability to search information based on location. Azure Search processes, filters, and displays geographic locations, enabling users to explore data based on the proximity of a search result to a physical location. This feature is powered by the SQL Server geography data type. Since SQL Server 2008, developers have been able to store geospatial data in SQL Server using geography fields, which allow querying data with location-based queries. To let the Azure Search service search within CRM accounts and contacts, I had to push my account and contact searchable information to a SQL Server database hosted in Azure. To copy information from Dynamics to Azure SQL, I used Microsoft Flow. Everything worked well except copying the CRM longitude and latitude to SQL Server.

The problem

The problem with copying longitude and latitude values into a SQL Server geography field is type compatibility: the two decimal values cannot be inserted directly into a geography column, and attempting to do so results in a casting error.

The solution

The solution I used to tackle this problem was an Azure Function that converts the longitude and latitude to the geography type, returning the converted value before the Insert action in the Flow. See the below steps:

  1. Step 1 is self-explanatory.
  2. The “CC Contact” step extracts the Contact name (or any lookup name property) from a lookup.
  3. The “Http” step calls the Azure Function to convert the CRM longitude and latitude to a SQL geography point (a sketch of the request body follows below).
  4. The “Insert Row” step inserts our data into a SQL Server row.
Microsoft Flow
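
For reference, the “Http” step simply POSTs the two coordinate values to the function. A minimal sketch of the request body (the property names are assumptions and must match the Coordinates class below):

{
  "Latitude": 47.6062,
  "Longitude": -122.3321
}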

The Azure Function

The Azure function is a very simple one. You will need to import the Microsoft.SqlServer.Types NuGet package and use the below code:
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.SqlServer.Types;
using Newtonsoft.Json;

public static class ConvertToGeography
{
    [FunctionName("ConvertToGeography")] // the function name is illustrative
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        // Read the longitude/latitude JSON posted by the Flow "Http" step.
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        Coordinates data = JsonConvert.DeserializeObject<Coordinates>(requestBody);

        // Convert to a SQL Server geography point and return it as text.
        SqlGeography point = data.GetGeography();
        return (ActionResult)new OkObjectResult($"{point}");
    }
}

public class Coordinates
{
    public double Longitude { get; set; }
    public double Latitude { get; set; }

    public SqlGeography GetGeography()
    {
        // 4326 is the SRID for WGS 84, the coordinate system used by GPS.
        // Wrap in try/catch and log if you need custom error handling.
        return SqlGeography.Point(Latitude, Longitude, 4326);
    }
}
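
The function returns the point in well-known text (WKT) form, for example POINT (-122.3321 47.6062) with longitude first, which the “Insert Row” step then passes on to the geography column.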

 

 

Implementing Enterprise Search In Power Platform

Providing good search capabilities is a key feature in modern business applications to support usability and end-user satisfaction. We have seen the search capabilities of the Dynamics platform evolve from “Quick Search” and “Advanced Find” to “Relevance Search”. The goal of the platform's search features has been to help users find the relevant information they need in the quickest and easiest way. These search features are out of the box and easy to enable, configure, and use. As the platform progresses to offer richer features and enables users to search better, the demand for richer and better search techniques grows, and we see instances where the platform cannot meet user demands with its out-of-the-box capabilities. Before going further into advanced search scenarios, you can read about the platform's out-of-the-box search capabilities in the official documentation. In this article, I share why you might decide to implement a search solution for your Dynamics solution using the Azure Search service.
In enterprise implementations, business applications are not the only systems used in the organization. We often see call center agents and sales representatives who need to obtain information from various systems to service customers. Searching every system separately is a cumbersome job which may set back end-user adoption. Integrating Dynamics with Azure Search consolidates search operations in one specialized search service, with the ability to connect to various data sources and apply modern search techniques to find the most relevant data. A practical example of this scenario can be seen in one of my recent projects, where users had to search for information in CRM, SharePoint, Sybase, and a pool of CSV files.

Customized Search experience

To facilitate user adoption, customized search techniques are highly desirable. In all modern search engines we see “Auto complete”, “Suggestions”, and “Highlighting” features, which can be added to your Dynamics solution's search experience. Displaying search results with support for “Document Preview”, “Document Opening in customized containers”, “Facets”, “Filters”, and “Sorting” are examples that enhance your Dynamics solution's capabilities.

Customized Search Behavior

The true power of search is demonstrated when different pieces of information are linked together to make sense of a bigger picture. Extracting words and sentences from documents (including images and PDF files), and extracting key phrases, people names, location names, languages, and other custom entities with the help of AI, is another unique feature you can add to your Dynamics solution's search capabilities. Another amazing capability you can have in your Dynamics implementation is the ability to search based on geolocation information; for example, you can search across your partner network from CRM or get the location of your field service force. The beauty of implementing your own enterprise search lies in the fact that you can search information across your data stores and link it using AI to generate knowledge and better insight into your data.
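
As an illustration, Azure Search exposes this through OData geo-spatial functions. A sketch of a query that returns documents within 10 km of a point (the service, index, and field names are placeholders):

https://myservice.search.windows.net/indexes/accounts-index/docs?api-version=2019-05-06&search=*&$filter=geo.distance(location, geography'POINT(-122.1316 47.6786)') le 10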

Customized Search Result

Another benefit of customized search in your Dynamics solution is the ability to refine your search result profile. When you use AI in your search, the system shows you how relevant each search result is to your search keywords. Knowing this, you can refine your search profiles so that the same keywords generate different results. This way you train the search engine to work better for you and enable users to get more accurate search results.
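
In Azure Search, this kind of refinement is typically done with scoring profiles defined on the index. A minimal sketch (the profile and field names are assumptions):

{
  "name": "boost-names",
  "text": {
    "weights": { "name": 5, "description": 2 }
  }
}
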
Architecture

Dynamics can be integrated with the Azure Search service using the following patterns:

 

  1. Integration through web resources: These web resources host a web application acting as a client to the search service. The web resource can be an HTML file or an iFrame hosted on forms. The important points in this approach are to ensure correct cross-origin settings in the client application and to write your HTML securely and according to best practices.
  2. Integration through custom Power Platform controls: You may build your own custom control which sends REST requests to Azure Search and displays the results by consuming the REST responses. The custom control can call the Azure Search service using Actions or direct REST calls.
  3. Azure Search works based on indexes, so your first step is to push your CRM searchable data to Azure Search indexes. This can be done using Microsoft Flow, Azure Logic Apps, custom solutions, or Azure Data Factory. I have used all of these tools in my implementations, and you can opt for any of them based on your requirements.
  4. Once the data is in your data store, you can create your indexes in Azure Search. You can go for separate indexes for each data source or combine multiple data sources in one index. Each approach has its own requirements, which will need to be met either in your client web application or in a separate Azure compute resource. Once indexing is done, you can use the Azure Search REST API directly or use Azure API Management to expose your search service to your Dynamics solution. (A sketch of a direct REST call follows this list.)
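
As an example of the direct approach, a web resource client could query an index along these lines. This is a minimal sketch; the service URL, index name, and key are placeholders:

// Query an Azure Search index from client-side JavaScript.
var serviceUrl = "https://myservice.search.windows.net";
fetch(serviceUrl + "/indexes/accounts-index/docs?api-version=2019-05-06&search=contoso&$top=10", {
    headers: { "api-key": "<query-key>" } // use a read-only query key, never an admin key, in client code
})
    .then(function (response) { return response.json(); })
    .then(function (results) { console.log(results.value); });
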
Summing this all up: as business application products get more sophisticated and organizations move from data to big data, engineers must look for innovative approaches to implementing Dynamics solutions. Microsoft Azure, along with the Dynamics platform, offers solution architects the necessary tools to design such solutions.

The easy way to compare environments

Have you ever had to compare two CDS/D365 environments and see what the differences were? Maybe you want to do a data migration into an existing environment to combine environments, and you need to know what the field and metadata differences are between the two environments.

Traditionally I would export the customizations XML and compare it in Notepad++ or some other tedious method, but I discovered (thanks to Tanguy Touzard's recommendation) that there is a tool in the XrmToolBox plugin store that makes the job much easier.

Introducing the System Customization Comparer.

This awesome tool was created by Lars Muller, and it lets you select a source and target environment.

Then it will compare the metadata, showing you the differences in entities, fields, and views.

Thank you Lars for making a frustrating task less frustrating.

Virtual Entities 0x80040203 Invalid Argument Error

I stumbled upon this issue after creating a custom virtual entity data provider. Ivan Ficko has a great tutorial on this here.

The subgrid displayed records perfectly fine in the old web client, but in the UCI I received the error message “0x80040203 Invalid Argument”. After some searching, and finding only a single post on this with no answers, I decided to take matters into my own hands! Digging through my browser's console, I managed to find additional information about this error. Inspecting the exception, I found the message “entity name is invalid”.


Flow, HTTP Actions, and Files

I am working on a new presentation sample project and wanted to test invoking an HTTP request from a Flow. Specifically, I wanted to invoke a Function App from a Flow using an HTTP Flow Action. In my sample, I will kick this off when a new Note is created with an attachment.

To quickly test calling the HTTP Action, I used an existing Function App sample that I had worked on a few weeks ago: a small Function App I put together to test populating a PDF template with CRM data.

Poking around with Function Apps

This sample is creatively named CRMToPDF because it retrieves a record from CRM and populates a fillable PDF form from the CRM record using iText, returning the updated PDF for download. Pretty simple in terms of code, but it was a nice proof of concept for testing the iText libraries (more on that in another post!).

Since this Function App returns the PDF file as the response, I was curious how Flow would handle it. Could I “download” a file from a URL in Flow and attach it to an email using the Outlook email Action?

You bet I can!

With a few short steps, I was able to grab the resulting file and attach it to an email. This isn’t a huge surprise since so many Flow connectors already deal with moving files around. But it surprised me how simple it was to accomplish what I wanted to do.

So the Flow I created is really simple:

  • Trigger on a new Note
  • Invoke an HTTP Flow Action
  • Email the resulting PDF to myself

Here are the initial Flow steps, a Trigger and HTTP Action:

Trigger on new Note record, call HTTP GET

The trigger is on ALL Notes, so this would definitely change in the real world. And the HTTP GET only includes my Function App authorization key. In a real example, I would pass in additional parameters, such as the Note ID or Object ID, as an additional query parameter or as part of the request body.

The Outlook Email Action looks like this:

New Email using the HTTP Response

You can see that this Action is pretty straightforward. It's just an email to myself from the Owner of the Note. In Dynamics 365 CE, this means the System User had to enable sending emails on their behalf, which is just a value under Personalized Settings. I just filled in a few bits of other info, like the Body and Subject.

The important part for me here is setting up the Attachment: set the Attachment Name to “CRM2PDF.pdf” and the Attachment Content to the Body of the HTTP Response.

That’s it. Yep. That’s all!

I first started looking at Flow a bit last year and wrote a short post about moving a document from Dynamics 365 CE to SharePoint, Flow Examples: Note attachment to SharePoint. That turned out to be relatively straightforward and a really cool Flow, but it had a few quirks, like converting the Note document body using the base64ToBinary function.
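
For reference, that conversion was a one-line Flow expression along these lines (the exact trigger output path is an assumption; documentbody is the Note attribute holding the base64 content):

base64ToBinary(triggerOutputs()?['body/documentbody'])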

When I started looking at this sample, I expected some similar required steps, but setting the Body as the Attachment Content just worked. I put this entire Flow together in about 15 minutes, and it worked on the first try! (As a developer, I NEVER expect it to work on the first try!)

This tells me that the Flow engine is aware of the content type returned by the HTTP GET and can handle it properly when moving between the Actions. The Actions know how to work with the files between the source HTTP Action and the next Outlook email Action.

That sounds like another obvious comment, but it makes me happy as a developer not to have to do any kind of manipulation, parsing, or other coding magic! For an idea of what is returned from the HTTP Action, we can look at the Flow Test logs for the HTTP GET Action and open the Outputs:
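
The Outputs have roughly this shape (a simplified sketch; the header values are illustrative):

{
  "statusCode": 200,
  "headers": {
    "Content-Type": "application/pdf",
    "Content-Disposition": "attachment; filename=CRM2PDF.pdf",
    "Content-Length": "55905"
  },
  "body": "...the binary PDF content..."
}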

This isn’t super complex JSON for most developers: HTTP response code, several headers, the filename, etc. But for non developers, this could present an impossible roadblock. With the Flow designer and this huge library of existing Actions, a non developer can point their Flow to a service endpoint and move files about without a single line of code.

That’s some powerful stuff.

Solution Layering

I’ve recently noticed the Solution Layers button but knew next to nothing about its functionality.  It was added to my ever growing list of, “Ok, I need to check that out when I have some time!” While on a call this past week, the Solution Layers feature came up. After a brief overview on the call and some poking around afterwards, it looks to be a useful feature for developers, business analysts, and administrators.

What are Solution Layers?

Solution Layers is not some hidden, mystery feature. Microsoft has done a great job recently with their online documentation, and the article titled View solution layers includes a nice, quick explanation of solution layers:

  • Let you see the order in which a solution changed a component.
  • Let you view all properties of a component within a specific solution, including the changes to the component.
  • Can be used to troubleshoot dependency or solution-layering issues by displaying change details for a component that was introduced by a solution change.

So the Solution Layers tool offers insight into system components and their relationships to Solution deployments. The significant bit for me is that it shows the changes to a component and when the installation or updates were introduced.

Where do I find Solution Layers?

When you select a Solution component, such as an Entity, Process, or Web Resource, or a sub-component such as an Entity Form or Attribute, you will now see a button labeled Solution Layers.

For example, I opened the Power Apps Checker solution in a recently provisioned demo environment. Expanding the Entities, we can see the button on the Analysis Result Detail Entity. Drilling into the Forms list, we see the button available on the Information main Form.

Solution Layers for the Analysis Result Detail Entity
Solution Layers for the Analysis Result Detail Entity Information Form

If you open the Solution Layers dialog for the Analysis Result Detail Entity, you see a one-item list of Solutions. This is the list of Solutions to which this Entity is related.

Entity level Solution Layers

Select the Solution listed and you can view the Analysis Result Detail Entity details that are related to the Solution.

Analysis Result Detail Entity Solution Layer Details

This view provides the list of properties that changed when the Solution was imported on the Changed Properties ‘tab’, and the full list of Entity properties on the All Properties tab. If we open the Information Form for this Entity, we see very similar information: a single Solution and the detailed changes of the selected Entity Form for that Solution import.

We only see one item at both the Entity and Entity Form levels because this Entity and all of its components are unique to this Solution. We can also see that the list of Changed Properties is the same as the list of All Properties. This tells us that the Analysis Result Detail Entity was installed with the Power Apps Checker solution and has not been affected by any other Solution installs.

That is some nice information, but not especially useful. The Solution Layers feature really shines when we look at Entities that can be impacted by other solution imports. For example, a system Entity like Contact can be impacted by many different Solutions on your system. Or you may have a custom Entity being deployed as part of a product or an ongoing project that will see regular changes, whether through major Solution releases or hotfix-style solution deployments.

Contact is a popular Entity

If we open a different solution that contains the Contact Entity, we see the real power behind this tool. Opening the solution named Sales Navigator for Dynamics 365 Unified Interface that comes with my demo environment and viewing the Contact Entity Solution Layers, we see some immediate differences.

Contact Solution Layers Detail – lots of changes!

The Contact Entity has been changed by 21 separate Solutions. The first, at the bottom of the list, is System, and at the top we see Active as the latest. This means that the Entity, or one or more of its sub-components, was updated with each of these 21 Solution imports. So how do we see more detail on all of these Entity changes?

Deltas!

If we dig deeper into the Solution components, we can see more granular detail of the changes. We can drill into the Contact Forms list for this Solution and open the Contact Form Solution Layers dialog.

In this view, we can see that the Contact Form has been updated by 11 different Solution Imports. But what has been changed? Open up a solution from the list to find out:

Contact Form Solution Layers Detail

In this view, under Changed Properties, we can see the detailed changes that were made with the Solution import. In this example, we see that the underlying Form JSON value was updated, and if you scroll a bit, you will see that the Form XML was updated as well. With other value types, such as numbers or Boolean values, it's easy to see the changed value.

For more complex types like Form JSON or XML, you can compare the differences against the previous Solution Layer value. Simply open the previous Solution Layer from the list and compare the property value under the All Properties view using a standard text diff tool such as WinDiff or Visual Studio.

Why is this a big deal?

Dynamics 365 CE and the Power Platform with CDS now have a built-in method for change tracking across the layers of solution components. I include the Power Platform here because when you view an Entity from a model-driven Power App, you have the option of switching to Classic View. In Classic View, you can view the Solution Layers exactly as if you were working within a Dynamics 365 CE solution.

This can be incredibly useful when troubleshooting issues or just managing your own deployments. With solid DevOps practices in place, you should be able to view content like this using source control tools. But if you are working on a project for which those practices were not well established, I can see this feature being a huge help for developers, business analysts, and system administrators.

I recommend reviewing the article listed above and playing around with the feature. For example, check out changes to solution components like Workflows where you can view the changes to the underlying XAML that contains the workflow logic.

I will be looking into it in more detail myself because I can see the possibility for some nice tools built around this capability!

An easier way to export data from your D365 environments

Exporting/importing data can sometimes be a tedious and time-consuming task, but thanks to this feature in PowerApps, exporting data from multiple entities is as easy as ever. Huge thanks to our PowerApps SME at Barhead, Mary Rose Bagtas, for helping out. 🙂

 

To quickly export data, go to https://web.powerapps.com

Click on Data -> Entities

 

Then click on Export Data

 

Select the entities you want to export, then click on Export Data again.

 

Download the exported data and you’re good to go. 🙂

 

Enable Unified Interface Only in Dynamics 365

Users can now enable Unified Interface Only in Dynamics 365 environments. Enabling this feature allows users to land directly in an app available to their security roles, or to navigate to existing apps by selecting an app tile.

However, please take note that this setting defaults to No in your environment. You can enable/disable this feature in two ways:

  • In Customer Engagement, go to Settings > Administration > System Settings > General tab. Under Use the new Unified Interface only (recommended), select Yes for Enable only the Unified Interface.
  • In the Power Platform Admin center, go to Environments and select an environment. Then go to Settings > Behavior > Interface settings and turn on Use Unified Interface only.

What happens to the legacy web client app?

The legacy web client app, also known as Dynamics 365 – custom, will be hidden from end users when a new environment is provisioned. But it will always be visible to those with System Administrator and System Customizer roles, and to other custom roles with similar privileges.

Read more: https://docs.microsoft.com/en-us/dynamics365/customer-engagement/admin/enable-unified-interface-only#how-to-enable-unified-interface-only-mode

Official Documentation: https://docs.microsoft.com/en-us/dynamics365/customer-engagement/admin/enable-unified-interface-only

 

 

Privacy Preferences can now be found on Power Platform Admin Center

I recently encountered a requirement to update the Privacy Preferences as part of our post-deployment checklist, and noticed that the Privacy Preferences option is missing from where it used to be located: Settings -> Administration -> Privacy Preferences.

That same setting can now be found in the Power Platform Admin Center by going to https://admin.powerplatform.microsoft.com/ and navigating to Environments -> Environment Name -> Settings -> Privacy & Security.

 

 

Hack4Good – My First Hackathon

Hack4Good Group Photo

TL;DR

I’ll warn you – this is a long read! To summarise though – this Community is beyond awesome and the Hack4Good event just proved that we can genuinely change the world.

The Hype

When TDG announced that there was to be a hackathon in London, with the focus being the non-profit/charity sector, I was straight in there on the registration (after which Mrs H was informed that I was booked in; easier to ask forgiveness than seek permission).

This was to be my first ever hackathon. A year ago I hadn't even HEARD of hackathons, and it ticked so many boxes for me. For those who don't know, it's not hacking in the sense of breaking into systems; it's about using the software and platforms at your disposal to hack together a solution to a scenario within a given time limit. The most innovative, practical, deliverable, and potential-filled solution would win the day.

When the emails started to come out, Chris asked (in typical CAPS LOCK STYLE) if I would lead a team. Me being me, I jumped at the chance: in for a penny, in for a pound.

And so the excitement began. Weeks turned into days, and my poor family and friends got fed up of hearing how stoked I was. When I saw the list of other team leaders, and saw the people who were on my team, I started to question my credentials. There were so many legends of the community involved: people I look up to and follow with eagerness and anticipation.

The Buildup

At 5:30am on Saturday 16th April, loaded with snacks and tech, I headed towards the railway station. Nerves meeting with excitement, doubts meeting determination.

Arriving just before 8am, I was struck by how, on first impressions, the Microsoft Reactor in London is a strange space: a fully stocked drinks area with the stereotypical caffeine overload available, a games area, and then a large open space with tables and a large video screen. It almost seemed spartan in its simplicity.

As everyone started to arrive and we set up our various laptops and devices, that open space suddenly became a hive of technology and potential.

Hugs and Hellos were dished out with abandon, and cries of “It’s so good to meet you at last” were deafening in their abundance. I moved from person to person and finally got to meet people who I’d talked to online or who I’d been following for ages. I was even surprised to find people who wanted to meet me!

The Morning

With typical fervour and energy the trio of Chris Huntingford, Kyle Hill and William Dorrington (who had come over for the start despite having removal lorries outside his front door!) kicked off the day.

A surprise video message from James Phillips, Corporate Vice-President of Microsoft, impressed upon all of us just how much the community is noticed by Microsoft and raised the expectations of all in the room another notch. If our dials were at 11 before that video, they were at 12 afterwards – and even Spinal Tap didn’t get to 12!

I’ll be honest at this point and admit that I can’t remember who presented exactly what and when – my mind was a maelstrom of ideas and planning.

The engaging Architect and Storyteller Alex Rijnoveanu (@WomanVsTech) from Microsoft delivered enthusiasm and encouragement.

The very funny, and trying-not-to-be-as-sweary, Sarah Critchley (@crmcat) presented in a way that only she could: with an idea about helping out stray cats using PowerApps and other bits.

m-hance presented alongside Solent Mind, and I related to what they did in a huge way because of the work I see in my day job at St. Andrew's Healthcare. It was a sobering presentation in many ways, but it also opened our eyes to “the art of the possible”.

Saurabh Pant and Sameer Bhangar had flown in from Microsoft (yes, all the way from Seattle) just for this event, and then threw away their planned roadmap presentation to give us all a major pep talk and stir us up even more. I have to say that the funniest thing was their very friendly (and also slightly sweary) rant about how much they had heard about Samit Saini in the past year! In doing so, they showed us all just what was possible: those who knew Samit's journey smiled and laughed, and those who didn't had their eyes opened to a new level of potential.

Quantiq presented some of the work they had done with the Leonard Cheshire charity and also gave a glimpse of their toolkit for healthcare, and the ideas kept flowing. As I looked around at the other teams, I could see people taking notes, typing away, and whispering to each other. This hackathon was going to be competitive, but boy was it going to deliver some amazing results.

I’ll apologise now to all the presenters as I haven’t done you justice in my few words, and I may have mangled your presentations up, but believe me when I say that all the presentations hit home with all of us listening. Those presentations took our plans, determination, and enthusiasm up to levels you just wouldn’t believe if you weren’t there!

Let The Hacking Commence

With a final presentation to lay down the rules of engagement, and to make it clear that stopping for lunch was most definitely not an option, the starter's gun was fired and the 4.5 hours of planning, building, and preparing began.

The buzz in the room was electric as each team discussed and planned out their scenario, then grabbed whiteboards and white space to map out what a solution could look like.

I’ll be writing more about the Team White proposal in the coming days, as there is more to come from that, but we settled on a solution that would utilise so much of the stack but would be able to be modularised and deployed as a “solution-in-a-box” approach.

With my amazing team of Penny, Josh, Denis, and Raj, we set about building Microsoft Forms, PowerApps, Dynamics 365 solutions, Flows, and a HoloLens concept. Oh yes, Gadget King Raj had brought us a HoloLens, and that just expanded the possibilities for us. We weren't looking at gimmicks and tech for tech's sake; we were looking at a genuinely life-changing solution using some amazing software and hardware.

With a soundtrack of some amazing 80s rock being pumped out (and yes, thanks Chris for Rickrolling us!), everyone was doing something. If you could have harnessed the energy in that room at that point, you would have been able to power half of London.

Floor walkers popped by each of the teams, each one listening and absorbing before offering advice, help, suggestions, and more. What was even more amazing was that the teams were all talking to each other. You read that right: the teams all talked to each other.

There was sharing of scenarios, encouragement, suggestions for improvement or additions, and helping hands. This was a competition like no other: one in which we ALL wanted to see every team achieve their goals. I'm a mildly (OK, seriously) competitive person at times, and yet there was no sense of barging past each other to reach the finish line. This was collaboration and cooperation in competition towards a common goal.

The Winners

And with the 4 and a half hours gone in the blink of an eye, the race was run. It was time for the 5(ish)-minute speed-dating presentations of the solutions.

As each team stepped up and shared, I really do not know how I held it together. These were genuine scenarios, delivered with innovative solutions, by passionate people.

Every last one.

We all watched, applauded, and cheered. None of us could separate the entries. Judging was going to be tough, and so it proved.

With our hosts waffling as much as possible whilst the judges deliberated, we all sat wondering just who it would be. We all wanted to win, but we all knew that whoever did win would fully deserve it.

When the decision was made, the announcement came that Team Grey (who had flown over from Germany to take part!) had won, with an app for rounding up as you ordered food or transport and donating the difference to your charity of choice. Writing that makes it sound simplistic, but if you think about the implications you soon realise that it has massive potential.

It Is NOT Over!

The final speeches and thank you’s were made, the applause leaving hands feeling rather raw and sore, but this isn’t the end. Every proposition in the room has legs, and every person in the room knew that this couldn’t stop just because the clock had run down.

Saturday saw the start of something: the spark that starts a fire. We all felt it, and reading all the posts on Twitter and LinkedIn after the event just reaffirms that determination.

We saw not a glimpse, but rather a bright shining beacon of the power of the community. I go on and on (and on) about community, but what happened in that room on Saturday, with just a part of the enthusiastic and passionate community present, proved what we can all achieve if we put our minds to it.

Here at TDG we have the Community Collaboration Portal for working on community projects together, there's the Power Platform Bank for making solutions available, and then there are all the social media channels out there as well.

Let’s turn this spark into a raging fire of change. Let’s use our collective skills to build new solutions to old problems.

Oh, and let’s do this again real soon!