PowerApps

How to Enable the PowerApps AI Builder

Hey, Will here – just a quick post. I’ve had quite a few people ask me how to enable the new “AI Builder” for Power Apps in their environment – which strikes me as slightly strange, as it comes enabled by default. Regardless, here is how you do it.

 

  1. Go to: https://admin.powerplatform.microsoft.com/
  2. Using the navigation pane, select “Environments”, then select the environment you wish to turn ‘AI Builder’ on for:
  3. Select “Settings”:
  4. Select “Features”:
  5. Enable the PowerApps AI Builder:

Now go forth and build something AWESOME!

Review the Power Platform release plan with MVPs

In case you missed it, the 2019 wave 2 release plan for Dynamics 365 and the Power Platform was released today. You can read James Phillips’ blog summary at https://cloudblogs.microsoft.com/dynamics365/bdm/2019/06/10/announcing-new-features-growing-demand-for-dynamics-365-and-power-platform/ and you can download the full release notes from https://docs.microsoft.com/en-us/dynamics365-release-plan/2019wave2/.

This afternoon, I was joined by MVPs Megan Walker, Ulrik Carlsson, and Andrew Bibby to review the release plan. Watch the video below.

 

Implementing Enterprise Search In Power Platform

Photo by Anthony Martino on Unsplash
Providing good search capabilities is a key feature in modern business applications to support usability and end-user satisfaction. We have seen the search capabilities of the Dynamics platform evolve from “Quick Search” and “Advanced Find” to “Relevance Search”. The goal of the platform’s search features has been to help users find the relevant information they need in the quickest and easiest way. These search features are out-of-the-box and easy to enable, configure, and use. As the platform progresses to offer richer features, the demand for better search techniques grows, and we see instances where the out-of-the-box capabilities cannot meet user demands. Before going further into advanced search scenarios, you can read about the platform’s out-of-the-box search capabilities in the official documentation. In this article I share why we may decide to implement a search solution for our Dynamics solution using the Azure Search service.
In enterprise implementations, business applications are not the only systems used in the organization. We often see call center agents and sales representatives needing to obtain information from various systems to service customers. Making users search every system is a cumbersome job which can set back end-user adoption. Integrating Dynamics with Azure Search consolidates search operations into one specialized search service, with the ability to connect to various data sources and apply modern search techniques to find the most relevant data. A practical example of this scenario can be seen in one of my recent projects, where users had to search for information in CRM, SharePoint, Sybase, and a pool of CSV files.

Customized Search experience

To facilitate user adoption, customized search techniques are highly desirable. In all modern search engines we see “autocomplete”, “suggestions”, and “highlighting” features, which can be added to your Dynamics solution’s search experience. Displaying search results with support for “document preview”, “document opening in customized containers”, “facets”, “filtering”, and “sorting” are examples that enhance your Dynamics solution’s capabilities.

Customized Search Behavior

The true power of search is demonstrated when different pieces of information are linked together to make sense of a bigger picture. Extracting words and sentences from documents (including images and PDF files), and extracting key phrases, people names, location names, languages, and other custom entities with the help of AI, are further unique features you can add to your Dynamics solution’s search capabilities. Another powerful capability is search based on geolocation information: for example, you can search your entire partner network from CRM or get the location of your field service force. The beauty of implementing your own enterprise search lies in the fact that you can search information across your data stores and link it using AI to generate knowledge and better insight into your data.

Customized Search Result

Another reason for customized search in your Dynamics solution is the ability to refine your search result profile. When you use AI in your search, the system gives you the power to see how relevant the search results are to your search keywords. Knowing this, you can refine your search profiles to generate different results for the same keywords. This way you train the AI engine to work better for you and enable users to get more accurate search results.

Architecture

Dynamics can be integrated with the Azure Search service using the following patterns:

 

  1. Integration through web resources: These web resources host a web application acting as a client to the search service. The web resource can be an HTML file or an iFrame hosted on forms. The important points in this approach are to ensure cross-origin settings are configured in the client application and to write your HTML securely and according to best practices.
  2. Integration through custom Power Platform controls: You may build your own custom control which sends REST requests to Azure Search and displays results by consuming the REST responses. The custom control can call the Azure Search service using Actions or direct REST calls.
  3. Azure Search works based on indexes, so your first step is to push your searchable CRM data into Azure Search indexes. This can be done using Microsoft Flow, Azure Logic Apps, custom solutions, or Azure Data Factory. I have used all of these tools in my implementations, and you can opt for any of them based on your requirements.
  4. Once the data is in your data store, you can create your indexes in Azure Search. You can go for separate indexes for each data source or combine multiple data sources in one index. Each approach has its own requirements, which will need to be met either in your client web application or in a separate Azure compute resource. Once indexing is done, you can use the Azure Search REST API directly or expose your search service to your Dynamics solution through Azure API Management.
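As a rough illustration of pattern 3, pushing records into an index is just a REST call. The sketch below (Python, with a hypothetical service name, index name, and admin key) builds the URL, headers, and JSON body for the Azure Search “index documents” endpoint without sending anything:

```python
import json

# Hypothetical values - substitute your own Azure Search service name,
# index name, and admin key from the portal.
SERVICE = "my-search-service"
INDEX = "crm-contacts"
API_VERSION = "2019-05-06"
ADMIN_KEY = "<admin-key>"

def build_upload_request(records):
    """Build the URL, headers, and JSON body for the Azure Search
    'index documents' REST call (nothing is sent here)."""
    url = (f"https://{SERVICE}.search.windows.net/indexes/{INDEX}"
           f"/docs/index?api-version={API_VERSION}")
    headers = {"Content-Type": "application/json", "api-key": ADMIN_KEY}
    # Every document carries a @search.action: upload, merge, or delete.
    body = {"value": [{**r, "@search.action": "upload"} for r in records]}
    return url, headers, json.dumps(body)

url, headers, body = build_upload_request(
    [{"id": "1", "fullname": "Jane Doe", "email": "jane@contoso.com"}]
)
```

In a real implementation this body would be POSTed by whichever tool moves the data – Flow, Logic Apps, Data Factory, or custom code.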
Summing this all up: as business application products get more sophisticated and organizations move from data to big data, engineers must look for innovative approaches to implementing Dynamics solutions. Microsoft Azure, along with the Dynamics platform, offers solution architects the tools necessary to design such solutions.

Solution Layering

I’ve recently noticed the Solution Layers button but knew next to nothing about its functionality. It was added to my ever-growing list of “Ok, I need to check that out when I have some time!” While on a call this past week, the Solution Layers feature came up. After a brief overview on the call and some poking around afterwards, it looks to be a useful feature for developers, business analysts, and administrators.

What are Solution Layers?

Solution Layers is not some hidden mystery feature. Microsoft has done a great job recently with their online documentation, and the article titled View solution layers includes a nice quick explanation. Solution layers:

  • Let you see the order in which a solution changed a component.
  • Let you view all properties of a component within a specific solution, including the changes to the component.
  • Can be used to troubleshoot dependency or solution-layering issues by displaying change details for a component that was introduced by a solution change.

So the Solution Layers tool offers insight into system components and their relationships to Solution deployments. The significant bit here to me is that it shows changes to the component and when the installation or updates were introduced.

Where do I find Solution Layers?

When you select a Solution component, such as an Entity, Process, or Web Resource, or a subcomponent such as an Entity Form or Attribute, you will now see a button labeled Solution Layers.

For example, I opened the Power Apps Checker solution in a recently provisioned demo environment.  Expanding the Entities, we can see the button on the Analysis Result Detail Entity. Drilling into the Forms list, we see the tool button available with the Information main Form.  

Solution Layers for the Analysis Result Detail Entity
Solution Layers for the Analysis Result Detail Entity Information Form

If you open the Solution Layers dialog for the Analysis Result Detail Entity, we can see a one item list of Solutions.  This is a list of the Solutions to which this Entity is related.

Entity level Solution Layers

Select the Solution listed and you can view the Analysis Result Detail Entity details that are related to the Solution.

Analysis Result Detail Entity Solution Layer Details

This view provides the list of the changed properties for the Entity when the Solution was imported in the first Changed Properties ‘tab’, and the full list of Entity properties in the All Properties tab. If we open the Information Form for this Entity, we see very similar information: a single Solution and the detailed changes to the selected Entity Form for that Solution import.

We only see one item in both the Entity and Entity Form levels because this Entity and all of its components are unique to this Solution. We can also see the list of Changed Properties is the same as the list of All Properties. This tells us that the Analysis Result Detail Entity was installed with Power Apps Checker solution and has not been affected by any other Solution installs.

That is some nice information, but not especially useful. The Solution Layers component really shines when we look at Entities that can be impacted by other solution imports. For example, a system Entity like Contact can be impacted by many different Solutions on your system. Or you may have a custom Entity being deployed as part of a product or an ongoing project that will see regular changes, whether through major Solution releases or hotfix-style solution deployments.

Contact is a popular Entity

If we open a different solution that contains the Contact Entity, we see the real power behind this tool. If we open the solution named Sales Navigator for Dynamics 365 Unified Interface that comes with my demo environment, and view the Contact Entity Solution Layers, we see some immediate differences.

Contact Solution Layers Detail – lots of changes!

The Contact Entity has been changed by 21 separate Solutions. The first at the bottom of the list is System, but at the top we see Active as the latest. This means that the Entity or one or more Entity sub components were updated with each of these 21 Solution imports. So, how do we see more detail on all of these Entity changes?

Deltas!

If we dig deeper into the Solution components, we can see more granular detail of the changes. We can drill into the Contact Forms list for this Solution and open the Contact Form Solution Layers dialog.

In this view, we can see that the Contact Form has been updated by 11 different Solution Imports. But what has been changed? Open up a solution from the list to find out:

Contact Form Solution Layers Detail

In this view, under Changed Properties, we can see the detailed changes that were made with the Solution import. In this example, we see the underlying Form JSON value was updated, and if you scroll a bit, you will also see that the Form XML was updated. With simpler value types, such as numbers or boolean values, it’s easy to see the changed value.

For more complex types like Form JSON or XML, you can compare the differences against the previous Solution Layer value. Simply open the previous Solution Layer from the list, view the property value under the All Properties view, and compare the two using a standard text diff tool such as WinDiff or Visual Studio.
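If you prefer to script the comparison, Python’s difflib produces the same kind of diff a dedicated tool would. The two property values below are hypothetical Form XML snippets standing in for what you would copy from the All Properties views of two layers:

```python
import difflib

# Hypothetical property values copied from the All Properties views of
# two adjacent Solution Layers (e.g. Form XML before and after an import).
previous_layer = """<form>
  <tabs>
    <tab name="general" />
  </tabs>
</form>"""

current_layer = """<form>
  <tabs>
    <tab name="general" />
    <tab name="details" />
  </tabs>
</form>"""

diff = list(difflib.unified_diff(
    previous_layer.splitlines(),
    current_layer.splitlines(),
    fromfile="previous-layer",
    tofile="current-layer",
    lineterm=""))
print("\n".join(diff))  # lines starting with '+' were added by the import
```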

Why is this a big deal?

Dynamics 365 CE and the Power Platform with CDS now have a built-in method for change tracking across the layers of solution components. I include the Power Platform here because when you view an Entity from a Model-Driven Power App, you have the option of switching to Classic View. In Classic View, you can view the Solution Layers exactly as if you were working within a Dynamics 365 CE solution.

This can be incredibly useful when troubleshooting issues or just managing your own deployments. With solid DevOps practices in place, you should be able to view content like this using source code control tools. But if you are working on a project for which those practices were not well established, I can see this feature as a huge help for developers, business analysts, or system administrators.

I recommend reviewing the article listed above and playing around with the feature. For example, check out changes to solution components like Workflows where you can view the changes to the underlying XAML that contains the workflow logic.

I will be looking into it in more detail myself because I can see the possibility for some nice tools built around this capability!

An easier way to export data from your D365 environments

Exporting/Importing data can sometimes be a tedious and time-consuming task but thanks to this feature in PowerApps, exporting data from multiple entities is as easy as ever. Huge thanks to our PowerApps SME from Barhead, Mary Rose Bagtas, for helping out. 🙂

 

To quickly export data, go to https://web.powerapps.com

Click on Data -> Entities

 

Then click on Export Data

 

Select the entities you want to export, then click on Export Data again.

 

Download the exported data and you’re good to go. 🙂

 

Formula result view is coming to Preview environment!!

Today (April 3rd), one of the BIG highlights in the April 2019 updates of the Power Platform was released to our preview environment: “View results of formulas and subformulas in canvas apps”.

In this post, I briefly summarise this update: how to enable it, what we can do, limitations, etc.

* As of April 3rd, this experimental feature is delivered ONLY to the Preview environment.

Enable experimental feature

To use this feature, go to “Advanced settings” and turn on a toggle “Enable formula bar result view”.


There is no need to save/reload the app to use this feature.

View collection records

At present, to view collection records in a canvas app for testing purposes, we usually add a temporary gallery or data table control into the app and then inspect the result of our formula, such as Filter, AddColumns, LookUp…

However, as compositions get more complex, it can become difficult to understand the impact of each function on the result.  This feature helps us to understand what’s happening.

To view formula results, you simply select a formula, collection, or variable. Below is a quite simple example: if you select a collection in the formula bar (colTest), you will see a summary of the table records.


For more practical use, if we want to preview the result of multiple filter conditions on a collection, we can select the formula Filter(xxx, condition1, condition2…) and get the result of the operation.

This feature is not only for viewing collections/tables, but also for getting the result of other formulas, such as Concatenate(text1, text2…):

We can now easily and quickly understand and debug our apps with this feature.

Limitation

As far as I have confirmed, there are some limitations to this feature:

  • Data sources are not supported (you are unable to view records of a data source directly)
  • Nested collections are not supported (they are suppressed in the view)

I don’t think either of these is critical for everyday app making.

If I find any further limitations/updates, I will update this post.

Thank you!

Hiro


How to Start Your PowerApps Journey

I have been asked this question very often – I want to learn PowerApps, but HOW? Where do I get started? I am not familiar with Excel, so how do I get started? Well, we all know there are lots and lots of resources out there, but sometimes it gets overwhelming.

Here are a few simple steps to get started:

1. Initiate a PowerApps Project

The easiest way to learn is by getting your hands dirty and building something of your interest. It does not need to be a massive, full-fledged app. It can be something simple and functional, like a game. The satisfaction of getting an app up and running will motivate you to do more.

2. Try-Modify-Expand other Apps

There are a lot of complete/partially built apps in the TDG PowerPlatform Bank and PowerApps App Gallery. I find you learn a lot by playing with, modifying, and expanding an already-built app. Not only does it help push your imagination, it gives you a different perspective in terms of build and design.

3. Take the EDX Powerapps course 

If you like more structured learning, I would highly recommend taking the EDX course. It has cool tutorials by PowerApps guru Shane Young and hands-on labs which cover PowerApps, CDS, and Microsoft Flow. The best part is that it is free, but you can also choose to pay for a verified certification.

4. Attend App-In-a-Day Training

Or even better, reach out to Microsoft to check if there is any App-in-a-Day training in your area, or convince your manager to organise one. It is always fun to do it with your colleagues and friends.

5. Ask-Ask-Ask

The Power Platform community is awesome and generous, so don’t be shy to ask if you hit a roadblock. You can post your questions on the following platforms:

a. TDG “Ask a Question”
b. Powerapps Forum

Or simply post a question to any Powerappers on Linkedin or Twitter. 🙂

6. Collaborate

One thing I learned being part of the TDG community is that sharing is caring :). Being selfish does not get you anywhere. Do not hesitate to collaborate with anyone in the community. I have learned heaps by collaborating with the Japanese PowerApps community as well as the TDG community.

7. Powerapps Bible

There are a few good resources if you want to learn more about PowerApps functions:

  1. The PowerApps Microsoft Docs are your go-to bible when you need a function reference.
  2. Vlogs and blogs – You can find very useful PowerApps videos and blogs in TDG Media.
  3. Twitter – Follow PowerAppers like Brian Dang, Audrie Gordon, Hiro, and Rory Neary, and many many more.

8. Practice – Practice – Practice

Yes, it takes time, patience, and determination to learn something new. But it also takes lots of practice. I am not an Excel expert, and although I came from a programming background (I am talking about 13 years ago, so I am extremely outdated and rusty), this was a huge learning curve for me. But anything is possible as long as you approach it with the right mindset, and you will get there.

Happy Power-apping! 🙂

How to create resource in azure: Cognitive Services & PowerApps Part two

This is a carry-on from How to create resource in azure: Cognitive Services & PowerApps Part one.

So now we have the “Text Analytics” resource set up, let’s create the canvas PowerApp. Head over to web.powerapps.com and select Apps, then Import package, and import the Text Analytics canvas app from our TDG PowerPlatform Bank by clicking here!

Once the app has been imported and you have opened it, go to “View” > “Data Sources”:

Select, at the top left of the pane that opens, “+ Add Data Source” then “+ New Connection”,  then search for “Text Analytics”:

Then select “Text Analytics”. It will ask you for the “Account Key” & “Site URL” (which is actually the Endpoint URL); both can be found in Azure. Go to Azure > All Resources, then select the “Text Analytics” resource you created in part one.

Then select either “Keys” from the ‘Grab my keys’ main grouping or “Keys” from the navigation pane’s “Resource Management” grouping:

Then copy your first key:

Then paste that back into the PowerApp Data Source under “Account Key”:

Now to grab the URL Endpoint, head back to Azure – go to “Overview”:

Then copy the “Endpoint” and paste it back into PowerApps in the “Site URL” field:

Then hit create.

Now you’re ready to use the app – explore the formulas and controls used, and you’ll find it’s easy to replicate. If you have a question, comment on this blog or reach out to William Dorrington directly via LinkedIn.


How to create resource in azure: Cognitive Services & PowerApps Part one

So you’ve seen all this discussion around Cognitive Services and now want a piece for yourself – you open your computer, you down a coffee, and now you’re thinking “what now”?

Well don’t worry, I’ve got you!

Let’s start with Text Analytics, crack open Azure (https://portal.azure.com/#home) and select “Create a resource”:

Then in the search bar search for “Text” and then select “Text Analytics”:

Then select “Create”:

Then enter the following: Name, Subscription, Location (where the resource will be held), Pricing tier, and Resource group (for filtering and searching purposes). Once happy, select “Create” and you’ll be notified when the resource is ready.

Once it has completed, it will be available via the completion notification message (via a link) or in “All Resources” (from the navigation pane on the left-hand side of the screen); look for “TextAnalytics” or whatever you named your resource. From here you can view your “Keys”, or if you need the endpoint URL, select “Overview” and you’ll see a URL under the category “Endpoint”, e.g. https://uksouth.api.cognitive.microsoft.com/text/analytics/v2.0 (the URL only changes depending on the location you selected, e.g. https://[location].api.cognitive.microsoft.com/text/analytics/v2.0).
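To sanity-check the key and endpoint outside PowerApps, you could call the REST API directly. Here is a minimal sketch (the account key is a placeholder, and the request is only constructed, not sent) of a Text Analytics v2.0 sentiment call:

```python
import json
import urllib.request

# Both values are placeholders - take the key from the "Keys" blade and
# the endpoint from "Overview" in the Azure portal.
ACCOUNT_KEY = "<your-account-key>"
ENDPOINT = "https://uksouth.api.cognitive.microsoft.com/text/analytics/v2.0"

def build_sentiment_request(text, language="en"):
    """Construct (but do not send) a Text Analytics v2.0 sentiment request."""
    body = json.dumps(
        {"documents": [{"id": "1", "language": language, "text": text}]})
    return urllib.request.Request(
        f"{ENDPOINT}/sentiment",
        data=body.encode("utf-8"),
        headers={"Ocp-Apim-Subscription-Key": ACCOUNT_KEY,
                 "Content-Type": "application/json"},
        method="POST")

req = build_sentiment_request("PowerApps is great!")
# urllib.request.urlopen(req) would return the sentiment scores as JSON.
```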

Now to create the PowerApp data source, find part two here..

PowerApps Export Collection Data as CSV

This came in handy when I needed the user to be able to export data in a collection to be used in Excel. A sample app is available in the PowerApps Bank.

As an example, I have a collection called Contacts with 3 records; this is the collection I will be exporting.

ClearCollect(Contacts,{Firstname:"Charles",LastName:"Osei",Number:"123456"},{Firstname:"Charlie",LastName:"Bradbury",Number:"8765432"},{Firstname:"Joh",LastName:"Smith",Number:"123456"})

On a button, create a collection called ExportCSV with the same columns as the Contacts collection, but pre-populate the first row with the column names; this will be the CSV’s column headers.

ClearCollect(ExportCSV,{Firstname:"First Name",LastName:"Last Name",Number:"Number"})

 

The next step is to copy all the Contacts into the ExportCSV collection:

ForAll(Contacts,Collect(ExportCSV,{Firstname:Firstname,LastName:LastName,Number:Number}))

 

Create the CSV string in a variable by concatenating all the columns with commas. You can add extra columns by adding &","&, but the Concat must end with & Char(10), as this separates each line from the next.

Set(MyString,Concat(ExportCSV,Firstname&","&LastName&","&Number&Char(10)))

 

Output the MyString variable to a multi-line text box by setting its default value to MyString, which should look like the below:

First Name,Last Name,Number
Charles,Osei,123456
Charlie,Bradbury,8765432
Joh,Smith,123456

You can now copy and paste that text into Notepad and save it as something.csv. The file can now be opened in Excel, where you can use the column headers as filters or to create a table.
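For clarity, the same header-plus-rows concatenation can be sketched outside PowerApps. In this Python mock of the two collections, Char(10) corresponds to "\n":

```python
# Mock of the Contacts collection from the post.
contacts = [
    {"Firstname": "Charles", "LastName": "Osei", "Number": "123456"},
    {"Firstname": "Charlie", "LastName": "Bradbury", "Number": "8765432"},
    {"Firstname": "Joh", "LastName": "Smith", "Number": "123456"},
]

# Equivalent of ExportCSV: a header row first, then every contact.
header = {"Firstname": "First Name", "LastName": "Last Name", "Number": "Number"}
export_rows = [header] + contacts

# Concat(ExportCSV, Firstname & "," & LastName & "," & Number & Char(10)):
# join each row's fields with commas and end each row with a newline.
my_string = "".join(
    f"{row['Firstname']},{row['LastName']},{row['Number']}\n"
    for row in export_rows
)
print(my_string)
```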


PowerApps – Running Functions in Parallel using Timers

In a previous post I used PowerApps to query Dynamics to check if a list of email addresses existed in my instance as contacts. I used the ForAll function to look up each email in my collection in Dynamics sequentially. This worked fine until I had to check whether 500 emails existed in an instance containing just over 300,000 contacts.

After doing some performance benchmarking (beginning the search and starting a stopwatch on my phone), this query took 12 minutes 53 seconds to run. It would be great if multiple lookups could run in parallel. I used timers to run the exact same ForAll statement three extra times, similar to separate threads, which greatly improved the speed, although I had to make sure each email wasn’t processed multiple times.

My original statement was:

ForAll(SearchEmails,Collect(Matches,LookUp(Contacts,emailaddress1 = Result)))

SearchEmails has a column called Result which contains the emails I want to find. For every email in SearchEmails, check if it matches the emailaddress1 field of a contact record in Dynamics. If one is found, it is added to a collection called Matches.

ClearCollect(ProcessedEmails,{email:" "});

I created a collection, ProcessedEmails, which will store emails that have already been searched for.

I edited the original ForAll statement to only look up emails that do not exist in the ProcessedEmails collection. Before an email is looked up, it is added to the processed list so that another thread/timer doesn’t pick it up and instead moves on to the next email.

ForAll(SearchEmails,If(!(Result in ProcessedEmails.email), Collect(ProcessedEmails,{email:Result}); Collect(Matches,LookUp(Contacts,emailaddress1 = Result))))

The above ForAll runs on the OnVisible of my search page. I then also created 3 timer controls, each with the same ForAll statement in their OnTimerEnd functions.

Timer Properties

AutoStart: true, Repeat: false.

Durations

Timer 1: 4000

Timer 2: 8000

Timer 3: 12000

When the search page is visible,

  • The original search will run in the OnVisible function
  • Timer 1 will start searching four seconds later
  • Timer 2 four seconds after Timer 1
  • Timer 3 four seconds after Timer 2

This was to stop all my timers starting the search at the exact same time and processing the same emails.

With the ForAll lookup running 4 times, the search went down from 12:53 to 3:21.

I’m sure additional timers could be added to possibly improve performance even further.

Full formulas used
Search Page on visible

(This can be applied to a button instead, but make sure you start the timers.)

Clear(Matches);ClearCollect(ProcessedEmails,{email:"something"});Set(vSearching,true);ForAll(SearchEmails,If(!(Result in ProcessedEmails.email), Collect(ProcessedEmails,{email:Result}); Collect(Matches,LookUp(Contacts,emailaddress1 = Result))));Set(vSearching,false)

Timers OnTimerEnd

ForAll(SearchEmails,If(!(Result in ProcessedEmails.email), Collect(ProcessedEmails,{email:Result}); Collect(Matches,LookUp(Contacts,emailaddress1 = Result))))
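The claim-then-search pattern behind these formulas can be sketched in miniature. In this Python mock (all names and data are invented for illustration), each “timer” is a pass over the search list that claims an email before looking it up, which is what keeps the four passes from duplicating work; real timers interleave, but the claim-first ordering is the same:

```python
# Sketch of the claim-then-search pattern from the formulas above:
# each "timer" is modelled as a pass over the search list that first
# claims an email (adds it to processed) and only then looks it up,
# so no email is processed twice.
search_emails = ["a@x.com", "b@x.com", "c@x.com", "d@x.com"]
crm_contacts = {"a@x.com": "Alice", "c@x.com": "Carol"}  # mock Dynamics data

processed = set()   # the ProcessedEmails collection
matches = []        # the Matches collection

def run_timer_pass():
    """One ForAll pass: claim each unprocessed email, then look it up."""
    for email in search_emails:
        if email not in processed:
            processed.add(email)               # claim before searching
            contact = crm_contacts.get(email)  # LookUp(Contacts, ...)
            if contact is not None:
                matches.append(contact)

# OnVisible plus three timers -> four passes over the same list.
for _ in range(4):
    run_timer_pass()
```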

Hack4Good – My First Hackathon

Hack4Good Group Photo

TL;DR

I’ll warn you – this is a long read! To summarise though – this Community is beyond awesome and the Hack4Good event just proved that we can genuinely change the world.

The Hype

When TDG announced that there was to be a hackathon in London, with the focus of it being the Non-Profit/Charity sector, I was straight in there on the registration (after which Mrs H was then informed  that I was booked in – easier to ask forgiveness than seek permission)

This was to be my first ever hackathon, a year ago I hadn’t even HEARD of hackathons, and it ticked so many boxes for me. For those who don’t know, it’s not hacking in the sense of breaking into systems etc – this is all about using software and platforms that are at your disposal to hack together a solution to a scenario within a given time limit. The most innovative, practical, deliverable, and potential-filled solution would win the day.

When the emails started to come out Chris asked (in typical CAPS LOCK STYLE) if I would lead a team. Me being me, I jumped at the chance – in for a penny, in for a pound.

And so the excitement began. Weeks turned into days, and my poor family and friends got fed up of hearing how stoked I was. When I saw this list of other team leaders, and saw the people who were on my team, I started to question my credentials. There were so many legends of the community involved – people I look up to, and follow with eagerness and anticipation.

The Buildup

At 5:30am on Saturday 16th February, loaded with snacks and tech, I headed towards the railway station. Nerves meeting with excitement, doubts meeting determination.

Arriving just before 8am I was struck by just how, on first impressions, the Microsoft Reactor in London is a strange space. Fully stocked drinks area, with stereotypical caffeine overload available, games area, and then a large open space with tables and a large video screen. It almost seemed spartan in its simplicity.

As everyone started to arrive, and we set up our various laptops and devices, that open space suddenly became this hive of technology and potential.

Hugs and Hellos were dished out with abandon, and cries of “It’s so good to meet you at last” were deafening in their abundance. I moved from person to person and finally got to meet people who I’d talked to online or who I’d been following for ages. I was even surprised to find people who wanted to meet me!

The Morning

With typical fervour and energy the trio of Chris Huntingford, Kyle Hill and William Dorrington (who had come over for the start despite having removal lorries outside his front door!) kicked off the day.

A surprise video message from James Phillips, Corporate Vice-President of Microsoft, impressed upon all of us just how much the community is noticed by Microsoft and raised the expectations of all in the room another notch. If our dials were at 11 before that video, they were at 12 afterwards – and even Spinal Tap didn’t get to 12!

I’ll be honest at this point and admit that I can’t remember who presented exactly what and when – my mind was a maelstrom of ideas and planning.

The engaging Architect and Storyteller Alex Rijnoveanu (@WomanVsTech) from Microsoft delivered enthusiasm and encouragement.

The very funny, and trying-not-to-be-as-sweary, Sarah Critchley (@crmcat) presented in a way that only she could – with an idea about helping out stray cats using PowerApps and other bits.

m-hance presented alongside Solent Mind, and I related to what they did in a huge way because of the work I see in my day job at St. Andrew’s Healthcare. It was a sobering presentation in many ways, but it also opened our eyes to “the art of the possible”.

Saurabh Pant and Sameer Bhangar had flown in from Microsoft (yes, all the way from Seattle) just for this event and then threw away their planned roadmap presentation to give us all a major pep talk and stir us up even more. I have to say that the funniest thing was their very friendly (and also slightly sweary) rant about how much they had heard about Samit Saini in the past year! In so doing, it just served to show us all what was possible – those who knew Samit’s journey smiled and laughed, and those who didn’t had their eyes opened to a new level of potential.

Quantiq presented some of the work they had done with the Leonard Cheshire charity and also gave a glimpse of their toolkit for healthcare, and the ideas kept flowing. As I looked around at the other teams, I could see people taking notes, typing away, and whispering to each other. This hackathon was going to be competitive, but boy, was it going to deliver some amazing results.

I’ll apologise now to all the presenters as I haven’t done you justice in my few words, and I may have mangled your presentations up, but believe me when I say that all the presentations hit home with all of us listening. Those presentations took our plans, determination, and enthusiasm up to levels you just wouldn’t believe if you weren’t there!

Let The Hacking Commence

With a final presentation to lay down the rules of engagement, and to make it clear that stopping for lunch was most definitely not an option, the starter’s gun was fired and the 4.5 hours of planning, building, and preparing began.

The buzz in the room was electric as each team discussed and planned out their scenario, then grabbed whiteboards and any free wall space to map out what a solution could look like.

I’ll be writing more about the Team White proposal in the coming days, as there is more to come from that, but we settled on a solution that would utilise so much of the stack but would be able to be modularised and deployed as a “solution-in-a-box” approach.

With my amazing team of Penny, Josh, Denis and Raj, we set about building Microsoft Forms, Power Apps, Dynamics 365 solutions, Flows, and a HoloLens concept. Oh yes, Gadget King Raj had brought us a HoloLens – and that just expanded the possibilities for us. We weren’t looking at gimmicks and tech-for-tech’s-sake; we were looking at a genuinely life-changing solution using some amazing software and hardware.

With a soundtrack of some amazing 80s rock being pumped out (and yes, thanks Chris for Rickrolling us!), everyone was doing something. If you could have harnessed the energy in that room at that point, you would have been able to power half of London.

Floor walkers popped by each of the teams, listening and absorbing before offering advice, help, suggestions and more – but what was even more amazing was that the teams were all talking to each other. You read that right: the teams all talked to each other.

There was sharing of scenarios, encouragement, suggestions for improvement or additions, and helping hands. This was a competition that was like no other. This was a competition in which we ALL wanted to see every team achieve their goals. I’m a mildly (ok, seriously) competitive person at times and yet there was no sense of barging past each other to reach the finish line. This was collaboration and cooperation in competition towards a common goal.

The Winners

And with four and a half hours gone in the blink of an eye, the race was run. It was time for the 5(ish)-minute speed-dating presentations of the solutions.

As each team stepped up and shared I really do not know how I held it together. These were genuine scenarios, delivered with innovative solutions, and by passionate people.

Every last one.

We all watched, applauded and cheered. None of us could separate the competition. Judging was going to be tough, and so it proved.

With our hosts waffling as much as possible whilst the judges adjudicated, we all sat wondering just who it would be. We all wanted to win, but we all knew that whoever did win was fully deserving of it.

With the decision made, the announcement came that Team Grey (who had flown over from Germany to take part!) had won with an app that rounds up the price as you order food or transport and donates the difference to your charity of choice. Writing that makes it sound simplistic, but if you think through the implications you soon realise it has massive potential.
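The core arithmetic behind a round-up-and-donate app is tiny, which is part of its charm. Here is a minimal sketch of the idea – the function name, currency handling, and whole-amount behaviour are my own illustration, not Team Grey’s actual implementation:

```python
from decimal import Decimal, ROUND_UP

def round_up_donation(amount: Decimal) -> Decimal:
    """Return the 'spare change' between a purchase amount
    and the next whole currency unit (e.g. pound or euro)."""
    next_whole = amount.to_integral_value(rounding=ROUND_UP)
    return next_whole - amount

# A £3.40 order would yield a £0.60 donation;
# a whole-pound order yields nothing (a real app might
# choose to donate a full unit in that case instead).
print(round_up_donation(Decimal("3.40")))
```

Using `Decimal` rather than floats matters here: binary floating point cannot represent amounts like 3.40 exactly, and donation totals would drift over many transactions.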

It Is NOT Over!

The final speeches and thank-yous were made, the applause leaving hands feeling rather raw and sore, but this isn’t the end. Every proposition in the room has legs, and every person in the room knew that this couldn’t stop just because the clock had run down.

Saturday saw the start of something, the spark that starts a fire. We all felt it, and reading all the posts on Twitter, LinkedIn and elsewhere after the event just reaffirms that determination.

We saw not a glimpse, but rather a bright shining beacon of the power of the community. I go on and on (and on) about Community but what happened in that room on Saturday, with just a part of the enthusiastic and passionate community present, just proved what we can all achieve if we put our minds to it.

Here at TDG we have the Community Collaboration Portal for working on community projects together, there’s the Power Platform Bank for making solutions available, and then there are all the social media channels out there as well.

Let’s turn this spark into a raging fire of change. Let’s use our collective skills to build new solutions to old problems.

Oh, and let’s do this again real soon!