PowerApps

How to resolve “Error Executing the api /eventhubs” in #MicrosoftFlow?

While trying to connect Microsoft Flow to Azure Event Hubs, you cannot retrieve the event hub name and instead get the “Error Executing the api /eventhubs” error. The Event Hubs connector in Flow allows you to connect to an event hub using connection strings and get notified as soon as a new event is available in the hub. However, there are certain things you will need to know.

Event Hub Namespace vs. Event Hub

An Event Hubs namespace provides a unique scoping container, referenced by its fully qualified domain name, in which you create one or more event hubs; event hubs therefore live inside an Event Hubs namespace. Both the namespace and the individual event hub have their own connection string which can be used to access the resource. However, it is important to know that the Microsoft Flow connector for Event Hubs accepts the namespace’s connection string, not the Event Hub resource’s connection string.

The “Error Executing the api /eventhubs” error

You will see the below error while trying to use the Event Hub resource’s connection string.

The solution is to use the Event Hubs namespace’s connection string.

To confirm whether your connection string is associated with your Event Hubs namespace or with a specific event hub, make sure the connection string doesn’t have the EntityPath parameter. If you find this parameter, the connection string is for a specific Event Hub “entity” and is not the correct string to use with your logic app.
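
For example, a namespace-level connection string looks like the first line below, while an entity-scoped one carries the extra EntityPath parameter at the end (the namespace, key and hub names here are placeholders):

Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<your-key>

Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<your-key>;EntityPath=myeventhub

Only the first form will work with the Flow connector.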

Reference

https://docs.microsoft.com/en-us/azure/connectors/connectors-create-api-azure-event-hubs

How To Start Your Exciting Journey Of “Connected Field Service” And “Azure IoT Hub”?

After my last article, I spent some time exploring Azure IoT and Connected Field Service. I have been watching presentations and demos on YouTube, and I saw many good demonstrations of the capabilities of Azure IoT Hub and Connected Field Service. However, I wanted to explore more and see how things really work under the hood of the platform. So, in this and a few upcoming posts, I am summarising my key learnings for the benefit of fellows who want to start.

Note: I structured this blog series based on my learning journey. It is designed around the questions of CRM consultants who want to start their Connected Field Service journey.

Let’s start with some key definitions which you will hear a lot in your journey and had better be familiar with from the start:

IoT Device

An IoT device is a piece of hardware with a small circuit (called a microcontroller), capable of connecting to the internet using WiFi or mobile services. The purpose of the microcontroller is to capture data from the surroundings and send it to consuming services over the internet. In other words, an IoT device is a minicomputer capable of sending (and sometimes processing) data over the internet.

Raspberry Pi / Windows IoT

The IoT device mentioned above can be a small circuit that reacts to an event like switch on/switch off, or it can be powerful enough to process images on the device itself. When we develop very powerful devices capable of processing data and applying complex logic, we need a program to run the logic or process the data. To enable this processing, we need a lightweight operating system to host our application, such as Raspbian running on a Raspberry Pi, or Windows 10 IoT.

Azure Internet of Things (IoT)

Azure IoT is a series of managed services combined to connect, monitor and control smart IoT-enabled devices. So basically, the device’s sensors detect events from the surrounding environment and pass the event and data via the microcontroller to the Azure IoT service. Azure IoT receives the data from millions of devices and sends it on to further processing services that give the data meaning.

Azure IoT Hub

IoT Hub is a central message hub for bi-directional communication between devices and Azure IoT. So, imagine Azure IoT Hub as the heart of the Azure IoT service.

Azure IoT Edge

As we move towards the future, we will have more IoT devices connected to Azure IoT Hub. Each of these devices will send raw messages to Azure IoT Hub for processing. This is fine, since Azure IoT Hub is a scalable service, but wouldn’t it be more efficient to use the powerful hardware to pre-process data on the devices before sending it to Azure IoT Hub? Azure IoT Edge enables us to write programs that process data inside the device before sending it to Azure, and/or send only the necessary data/events. This helps offload processing from Azure and reduce traffic. If you have a powerful tool, why not use it?

IoT Device Twin

A device twin is a logical representation of the device’s status in Azure. For example, you have a smart bulb which is installed in Branch A. The device twin can contain the location, so you can query twins and identify the devices installed in Branch A. Or you can store the firmware version in device twins, and at the time of a firmware upgrade, you can upgrade only the devices on old firmware versions.
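
For example, a device twin query like the one below (a sketch; tags.location is a tag name you would define yourself) returns all devices installed in Branch A:

SELECT * FROM devices WHERE tags.location = 'Branch A'

The same style of query covers the firmware scenario, e.g. WHERE tags.firmwareVersion = '1.0'.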

Connected Field Service

Azure IoT gives you the technology enablement to interact with devices. However, the real value of IoT emerges when you give it a business context, and Connected Field Service gives that business context to your IoT data. Connected Field Service comes as an accelerator: when you install it from AppSource, you get all the entities and required components in your Dynamics environment. It gives you the ability to configure Dynamics to interact with devices, listen to them, and be notified when they don’t feel right.

I have seen IoT Central. What is it, and how is it different from Azure IoT Hub?

According to the Microsoft documentation, Azure IoT Central is a software-as-a-service (SaaS) solution that uses a model-based approach to help you build enterprise-grade IoT solutions without requiring expertise in cloud-solution development. Azure IoT Hub, in contrast, is the core Azure PaaS that both Azure IoT Central and the Azure IoT solution accelerators use. IoT Hub supports reliable and secure bidirectional communications between millions of IoT devices and a cloud solution.

So, in summary: if you have Azure skills and want the flexibility to expand and manage your solution, go for IoT Hub. But if you want an accelerator with no extension requirements, then IoT Central is the way forward.

Where do I start my journey?

Where to begin – Learning Path

If you want to start directly from Dynamics, you can simply follow the instructions here.

Otherwise, follow these steps:

  1. Provision your Azure IoT Service (Link)
  2. Create a simulated device (Link)
  3. Send and receive messages (Link)

I have a smart door lock. I want to be notified when it is unlocked. Where do I start?

Azure IoT Hub integrates with Azure Event Grid so that you can send event notifications to other services and trigger downstream processes. To implement this scenario, you will need to register your smart door lock in Azure IoT as mentioned above. Once the device is registered, you will need to define an Unlock event in the IoT service to enable the service to receive Unlock events. By default, Azure IoT provides 4 event types:

  • Device Created
  • Device Deleted
  • Device Connected
  • Device Disconnected

However, for our scenario we will need to create a custom Unlock event. To define the custom event, you can follow this link.
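
For illustration, the telemetry message the door lock sends could look like the hypothetical payload below; the property names are your own convention, and your event definition and routing would match on them:

{ "deviceId": "frontdoor-01", "eventType": "Unlock", "timestamp": "2019-07-01T10:00:00Z" }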

How do I route incoming messages to Dynamics?

Once you define your events above, you will need to create a routing channel for the desired events. Once the Unlock event is detected in the IoT service, your routing sends the message to an endpoint. You can configure your routing as per this link.

How to detect if my device disconnects from IoT?

It is very simple: as mentioned above, Device Disconnected is a default event which is triggered once your device disconnects. All you must do is configure a service to react to the disconnection event.

How to orchestrate events from IoT to Dynamics?

If you are not using the Connected Field Service solution from AppSource, you can configure a Flow to send messages across. There are triggers for IoT Hub and its events which you can use.

IoT Flow Template

What kind of messages can be transmitted between device and IoT?

In short: device-to-cloud telemetry and cloud-to-device messages, plus direct methods, device twin updates and file upload notifications.

Power Platform 24 Live!

We recently completed the first-ever 24-hour event exclusively focused on the Power Platform – Power Platform 24. The Dynamics and Power Platform community has fantastic events literally across the globe. As amazing as these in-person events are, not everyone can attend as either a speaker or attendee. We wanted to remove geography as an obstacle!

Moving to a virtual format allows the team to include both speakers and attendees who may otherwise miss the opportunity to share with and learn from the community. For organizers, the added benefit is lower overhead, because you don’t need to secure a venue, coordinate speaker travel, provide prizes, or feed anyone. All said, there are lots of pros to this virtual event format.

Our first event went off without a hitch! So maybe we had a very minor glitch or two, but the sessions were fantastic and the event ran like a well-oiled machine, all because of amazing speakers and a group of dedicated organizers.

I personally learned much from the experience, so I wanted to share some thoughts for those who might want to organize a similar event. This post covers a bit about the approach to organizing the event and some tools used to run the show.

Organizing the Event

The event came together fairly quickly after the idea was thrown out in a group conversation: anyone interested in hosting a virtual Power Platform event? The response was, of course, “Heck yeah!” The volunteers immediately began throwing around ideas. We had a lot to discuss, but most of it boiled down to these main topics:

  • How do we choose speakers?
  • How do we host each presentation?
  • How do we register attendees?

Most of the organizers have some experience running in-person events, virtual events, or both, so we started with some best practices in mind. We also brought experience with a variety of tools from past events. This experience gave us a nice head start; we then simply needed to choose what works best for a virtual event spanning a full 24 hours!

Choosing speakers

The team chose sessionize.com as the platform for the call for speakers and for vetting the submissions. If you have not used the platform, definitely check it out. Sessionize offers excellent tools for both organizers and speakers to organize event submissions and manage sessions across multiple events. Another huge plus is that the service is free for free community events. Sessionize features alone could take up a full post!

Once the call for speakers closed, we used Sessionize to categorize, review, and rate submissions. The organizing team reviewed each of the more than 70 submissions, ranking them based on the information provided by the speaker. This was honestly one of the hardest parts of the process, because we received so many excellent submissions.

We considered multiple tracks because of the number of submissions, meaning we could run two or three concurrent hour-long sessions. But since this was our first 24-hour event, we chose a single track of 24 one-hour sessions, starting at 8:30 AM EST and running through 8:30 AM the next day.

Hosting the event

This was the big decision: what platform do we use to host the event? We can all list a dozen virtual event platforms in just a few minutes, but that doesn’t actually make things easier. We ended up choosing Teams and a Teams Live Event. This makes sense as this is a Microsoft Power Platform event, but here is an excellent article that Purvin Patel shared which helped make our decision: Produce a live event using Teams. This article outlines how to set up a Teams Live Event and details around Producer roles.

The Teams Live Event setup means assigning users to a producer role where they can monitor and control the live stream, handle the Q&A channel, manage the event notes, and start/stop the event. Another important feature is the ability to record each session. This may not be a requirement for other events, but we wanted to provide recordings for both attendees and speakers. This is an excellent feature, but the limitation of 4 hours per recording is something to keep in mind if you choose this platform. We needed to keep this limitation in mind when scheduling the sessions and producers.

Using a Teams Live Event requires Office 365 and Teams licenses. Fortunately, the XrmVirtual crew already delivers live events using Teams, so they offered to run the event for us. We now had a chosen platform, so we needed to decide how to run the sessions.

Delivering the Sessions

We broke the 24 hours into six blocks of 4 hours, and we took volunteers as producers for each segment, which worked perfectly with our 4-hour cap on recording. A producer was logged in with the speaker during each session to handle connection issues, answer or raise questions, and manage transitions between speakers.

This meant that we posted six 4-hour Teams Live Events that ran in sequence. Once these were established, individual invites were sent to each speaker with a link for the correct block of time. This was all handled by the XrmVirtual team, and I felt it worked out great as both a speaker and a producer. It was easy for me, but I know it took a lot of time to set up!

At the start of each session, a producer logged in to Teams with the correct account, shared any slides that were required at the time, kicked off the session, and began recording. The speaker could then just share their screen and deliver the session. Once the session was complete, the producer shut down the event to end recording, while the next producer was already up and running with the next speaker.

Registering event attendees

Registering event attendees seems pretty important, so why is it last in the list?

Well, our solution for registering users for the event was pretty simple: we didn’t register users. Fortunately, the Teams Live Event platform allows users to connect without prior registration and post questions anonymously. We had no need to track any user info, manage cancellations, etc. Attendees could jump on to catch a session and disconnect when done.

This could be an issue with different virtual delivery platforms, but it did not seem to be an issue for us. I believe we averaged about 100 attendees per session, which is a pretty nice number. I’ve had in-person sessions with only 5 people, so 100 is pretty nice! We had some excellent questions from attendees, which really adds to the delivery. And of course, attendees who missed the live session can jump online and view the recorded sessions on demand!

Testing, 1…2…

One practice that made this event run so smoothly was… practice! The week prior to the event, the XrmVirtual team set up test sessions to ensure speakers could connect without issue. Each speaker jumped on the Teams event, shared their screen, and tested their audio. It sounds simple, and it was, but it saved us from potential issues on the day of the event.

We also made sure that producers understood how the Teams setup operates. The XrmVirtual team provided a new account from their Office organization for each 4-hour block. Each account was granted producer rights on its respective Teams Live Event. Having enough accounts is another item to consider if you choose a Teams Live Event as a platform.

I was not the one who set up the Teams Live Events for all of the sessions, but from an end-user perspective, I found the event went smoothly and the Teams platform is fairly easy to use.

Thanks once again!

I will call out the organizing team here in case you want to reach out and say thanks! For me, I wanted to say thanks once again to the organizing team for gathering and vetting the speakers, setting up the infrastructure, communicating with speakers and attendees, taking time to act as producers (at really crazy hours!), processing all of the recorded videos, and advertising the event.

Thanks for simply giving up a chunk of your free time to make this event happen.

Julie Yack
David Yack
Joel Lindstrom
Aiden Kaskela
Beth Burrell
Michael Ochs
Sarah Jelinek

Everyone on the team pitched in, but I think a few special shout-outs are in order – thanks to Julie for owning the meetings and the technical bits of the producer setup, and for being online for 16 or so hours monitoring the event in real time. And thanks to David Yack for spending his weekend breaking down all of the videos and hosting them for our viewing pleasure.

And thanks to all of the speakers who gave up their time to plan and deliver some excellent sessions for the community! Check out the full list of speakers at the Power Platform 24 site! You can view the recorded sessions now that they have been posted here!

I am looking forward to another Power Platform 24 event… Keep an eye out for the next event announcement!

Are you a Passionate #PowerPlatform developer, looking for your next big challenge?

Looking for your next big thing?
What I love about the #PowerPlatform is that we get tons of new features every 6 months, and if you are an avid learner, it gives you no reason to be stagnant and stop learning new things every day. You have numerous options, from creating #powerapps and #pcf components to simply exploring new enhancements in the #dynamics365 applications.
After spending some time exploring some core capabilities of the #PowerPlatform and playing with the Field Service application, I have been thinking about the next big thing I should focus on. I was looking for something challenging which has an impact on us. To decide, I took a step back and looked at the options I have. I looked at technology blogs, read books and listened to many podcasts, which helped me clear my mind and shortlist some major places I could start. I would like to share what I did, in the hope that it may help other folks who are also looking for their next challenge:

Industry Reports & Microsoft investments

Like it or not, business needs drive product development. It is the business, the industry and eventually profits which drive investment into the products we implement or extend. I think one of the best sources for finding your next best challenge is industry or economic reports. I always have an eye on #Gartner reports. I see people share #Gartner reports back and forth on LinkedIn to show how a product is doing against the competition. I look at them from a different angle: I always read the #Gartner report, but looking at the weaknesses. Weaknesses in the competition drive demand for innovations and solutions. By reading reports, you will understand what the weaknesses in a product or platform are, and how these weaknesses matter to end customers. The next big challenge can be addressing some of those weaknesses!
Another indicator is Microsoft’s investments. When I see Microsoft investing heavily in a platform or product, it gives me a good idea of what the next big thing is. Of course, Microsoft has a big team of industry experts and awesome people familiar with various businesses. Speaking for myself, I trust the investment numbers of the big players, because that is going to create demand for my innovative mind!
What do you think about this one?

What about cross-domain ideas?

Our field is fluid. It changes every day. It is not only #PowerPlatform; it is the nature of our field. There was a time when implementing a CRM was the goal of organisations, but now CRM is only one piece in the big picture! Have you ever thought of creating something that spans multiple technologies? How about combining #Azure and #PowerPlatform? Cross-domain work is a topic I am very interested in. I have explored Azure Search, Azure Speech, Azure Bots, Azure Documents and Azure IoT Hub alongside #PowerPlatform. The synergy between #Azure and #PowerPlatform creates a system which really serves customers better. If you are out of ideas in #PowerPlatform, then start looking into #Azure. I bet you will have a lovely time learning new things and exploring how #Azure can help you do better implementations!

Working on community Ideas

There is a long list (and it is growing day by day) of product ideas in the IDEAS PORTAL. The #PowerPlatform product team does a good job of assessing, shortlisting and working on the great ideas to release in future waves. However, it is not possible to implement all of the ideas, for various reasons such as criticality, demand, priorities, etc. Passionate people in the community can assess these ideas and see which ones they like. Once you have chosen one, you can work on it and release your source to GitHub to spread its benefit. Publishing your source helps the idea get bigger, and your code quality will be better, too!

XrmToolBox plugins

I think XrmToolBox is the single best thing that the Dynamics community has come up with. I use it in every project. It inspires me when I see friends out there develop such lovely plugins that work like a charm. (Big shoutout to Tanguy Touzard, Jonas Rapp and Aiden Kaskela, whom I follow when it comes to the XrmToolBox.) I have never had a major problem using the tool, which tells me how many good coders are out there. Your next best thing can be an XrmToolBox plugin or an extension of an existing one. If you think your idea is small or bad, you’d better think again. There is no small or bad idea. Once you float your idea, it will become a snowball: other awesome people will contribute to it, and you will see your small idea become a big one helping others do better. Even if you have no idea of your own, you can extend the functionality of an existing plugin. You always have an option to choose from 🙂 but remember one thing: being stagnant is NO option!
The above list is not everything I explored, but it shortlists some of the things you can relate to. I would love to hear from you about your experience in finding your next big thing.

P.S: A friendly reminder by @rappen on how to type: XrmToolBox (https://twitter.com/rappen/status/951009993989459969)

Image by Joanna Kosinska on Unsplash

Email Sentiment Analysis in Power Platform to improve customer service

What I love about my life as a #consultant is having the opportunity to hear customer problems and respond to them with something of value which improves their business in their own industry and market. What I love about being a #Microsoft #Technology #consultant is working on a technology which not only cares about end users but also makes it easy for me (or any citizen developer) to come up with solutions that are simple to implement; with the #powerplatform and #msflow, many things don’t even require opening my #visualstudio (which I love and open every day even if I am not coding – sounds crazy, nah! :D). Let’s get back on track now!

Scenario

I had a request from a customer whose support department was getting bombarded with case emails. The customer asked me to find a solution to prioritize emails based on urgency and the probability of the customer defecting.

My initial thought was: “How do I quantify whether a customer is going to defect because they are not satisfied?” After pondering a few options, I came up with the idea of “email sentiment” as a KPI for customer defection. If a customer is not satisfied with a service, their first reaction is to send an angry email to the company (at least that’s what I do) before going to social media. So I took the initial complaining email as a sign of a customer at risk of leaving. The next question was how to implement the idea, and this is how I did it:

Solutions

  1. The basis of the solution was to use the Azure Text Analytics service to detect the email message sentiment; the underlying service utilized was the Azure Text Sentiment Analysis API.
  2. The next step was to customize the email message entity to hold the sentiment value, and potentially trigger a notification to a manager or simply sort emails by their negative sentiment value.
  3. The last step was to connect the Power Platform to the Azure Text Sentiment Analysis service and get the sentiment value of the email message from Azure. I had two ways to implement this:
    • The first was to write a custom action to call the service and pass the email text to the Azure endpoint. On receiving the response from the analysis service, the action would return the sentiment as its output. Finally, the action would be called from a workflow triggered on the creation of an Email Activity!
    • The second was to use #MicrosoftFlow and do everything without writing a single line of code. Obviously, I used this technique.

The solution is extremely easy because #MicrosoftFlow provides an out-of-the-box connector to the text analytics service, and all you need to do is provide the service key and service endpoint. Below is how my #MicrosoftFlow looks:

Microsoft Flow Sentiment Analysis

Azure returns the sentiment score along with its analysis as Positive, Negative and Neutral. The API returns a numeric score between 0 and 1. Scores close to 1 indicate positive sentiment, while scores close to 0 indicate negative sentiment. A score of 0.5 indicates the lack of sentiment (e.g. a factoid statement).

In my solution, I stored the sentiment value as a Whole Number, so I had to cast the decimal value between 0 and 1 to a number between 0 and 100. To do this, I used an Operation step to multiply the sentiment score by 100 and cast it to an integer value, using the formula below:

int(substring(string(mul(body('Detect_Sentiment')?['score'],100)),0,indexof(string(mul(body('Detect_Sentiment')?['score'],100)),'.')))

Note: #MicrosoftFlow does not have a round function, so I had to convert the value to a string and take the substring up to the decimal point.
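
If you prefer to avoid the indexof lookup, here is a slightly simpler sketch of the same cast in the Flow expression language (split returns the whole string when no decimal point is present, so it also handles whole-number scores):

int(first(split(string(mul(body('Detect_Sentiment')?['score'],100)),'.')))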

Key Points:

  1. All of the Text Analytics API endpoints accept raw text data. The current limit is 5,120 characters for each document; if you need to analyze larger documents, you can break them up into smaller chunks.
  2. Your rate limit will vary with your pricing tier.
  3. The Text Analytics API uses Unicode encoding for text representation and character count calculations. Requests can be submitted in both UTF-8 and UTF-16 with no measurable differences in the character count.

Open entity records from Power BI dashboard

In my earlier post, I discussed how to show CRM entities on the Power BI visual map control. The usage of Power BI on Dynamics CRM dashboards is not limited to displaying multiple entities on maps. We usually want to do more, and since dashboards hold little information, we would love to see entities in tabular format and navigate to CRM records when needed. In this post, I will discuss how we can open CRM records from a Power BI dashboard.

Scenario

Users should be able to see multiple entity types on a Power BI map. Users should be able to see record details in a table under the map control, with the ability to open CRM records using a hyperlink. I will focus on displaying records in a table with a direct link to CRM entity records. After configuring the visual map control, we will need to do the following:

Note that all the required information (i.e. name, etc.) and complementary information (i.e. entity logical name, entity ID) are available in our temporary table. Refer to the previous post.

  1. Drag and drop a Table control underneath of our visual map control.
  2. Drag and drop the fields we would like to display on table columns.

  3. The next step is adding a custom column to the table to hold the hyperlink to CRM entity records, configured as type WEB LINK.
  4. You can do this by selecting “NEW COLUMN” from the “Modeling” tab. Remember, you will need the following three components to construct the link:
    1. The CRM base URL (known to you from your org URL).
    2. The entity logical name (captured in the previous post as a custom column in our temporary table).
    3. The entity GUID (also selected as part of the entity retrieve query in the previous post).
  5. The formula for the column is (a concrete sketch follows below):
    Link = "https://[CRM BASE URL]?pagetype=entityrecord&etn=" & 'ENTITY_LOGICAL_NAME' & "&id=" & 'ENTITY_ID'
  6. You will need to set the field type as WEB LINK.
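
To make that concrete, here is a hedged sketch of the calculated column in DAX; the org URL is a placeholder, main.aspx is the usual Dynamics record page, and entityname and id are hypothetical names for the columns captured in the previous post:

Link = "https://contoso.crm.dynamics.com/main.aspx?pagetype=entityrecord&etn=" & TempTable[entityname] & "&id=" & TempTable[id]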

Display multiple entities on Power BI map control

 

Photo by Susannah Burleson on Unsplash

Recently I had to display the location of multiple entities on a CRM dashboard. The requirement was to display all Work Orders, Projects, Resources and Bookings on a map control, so the project scheduler / field service dispatcher could see where each of them is located on the map. The Bing map control works fine on individual entities which are enabled for geolocation; however, in this scenario I had to plot all the different entities on a single map.

My thought was that I could choose one of the following methods:

  1. Use the Bing map control on a dashboard: use a web resource to retrieve all entities in Work Orders, Projects, Resources and Bookings, and then use a draw function to place each entity’s location on the Bing map.
  2. Use Power BI and its visual map control to plot all entities on a map, then host the Power BI control on my dashboard. I decided to use this approach.

Power BI Map control to show multiple entities

The map control in Power BI uses one source table with longitude and latitude information to display table rows on the map. The challenge with this approach is that the visual map control supports only one entity’s longitude and latitude, and therefore we can only use one entity as the source of the map data. In my scenario I had multiple entity types, i.e. Work Orders, Projects, Resources and Bookings. Each of these entities has its own longitude and latitude, and we cannot use them all together as a source for our Power BI map.

The way I overcame this challenge was to use a temporary table, union the data from all Work Orders, Projects, Resources and Bookings into it, and use this temporary table as the source of the Power BI map control. This is how I did it:

  1. Connect to the CRM Bookings table. This will bring all columns of the table to the Power BI.
  2. Remove unwanted columns in the Query Editor (optional).
    = Table.SelectColumns(Source,{"name", "msdyn_longitude", "msdyn_worklocation", "bookableresourcebookingid", "msdyn_latitude"})
  3. Reorder remaining columns in a way that you like to see your data (optional).
    = Table.ReorderColumns(#"Removed Other Columns",{"name", "msdyn_longitude", "msdyn_worklocation", "msdyn_latitude", "bookableresourcebookingid"})
  4. Rename column headings (optional).
    = Table.RenameColumns(#"Reordered Columns1",{{"bookableresourcebookingid", "id"}, {"msdyn_latitude", "latitude"}, {"msdyn_longitude", "longitude"}})
  5. Filter rows that you want to exclude from map (optional).
    = Table.SelectRows(#"Renamed Columns", each [latitude] <> null)
  6. Add a custom column to the query as TABLE Identifier/Category so you can identify workorder rows in the union table.
    = Table.AddColumn(#"Filtered Rows", "category", each Text.Upper("Bookable Resource Booking"))
  7. Change the column types (optional).
    = Table.TransformColumnTypes(#"Added Custom",{{"category", type text}, {"longitude", type number}, {"latitude", type number}})

If you have more than one entity, repeat the above steps for each table in your Query Editor.

The next step is to create a temporary table and union all the above tables’ data into it using a DAX query.

  1. Go to the Modeling tab.
  2. Click on New Table. Use the query below to fill the table (alter the table names based on your scenario):
    TempTable = UNION('Bookable Resource Booking','Bookable Resources','Work Orders','Project Sites')
  3. Drag a map visualisation control onto the Power BI canvas.
  4. Select “Category” (the entity name) from TempTable as the Legend. This ensures your entities are shown in different colors.
  5. Drag the longitude and latitude fields to the X and Y axes.
  6. Note: by default, Power BI adds a SUM function to summarize longitude and latitude. Columns with this summarization don’t work on maps; you must remove the summarize attribute by choosing “Don’t summarize”.


How to upload PowerApps audio to SharePoint?

Problem: Recently I had a requirement to upload recorded audio from PowerApps to SharePoint.

The requirement seems straightforward if you have a basic knowledge of PowerApps and Microsoft Flow, doesn’t it? But there is a catch (we’ll get to that shortly :-)).

Just for the completeness of this blog, I will reiterate certain hows.

How to record and listen to audio in PowerApps?

In PowerApps we have a media control named Microphone to record audio, and another media control named Audio to listen to the recorded audio.

Here is how to do that:

In the Microphone control’s OnStop property (I named the control MyMic), collect the recorded audio as follows:

ClearCollect(collInspectionAudio, MyMic.Audio);

In the Audio control’s Media property, place the collection you used to collect the recorded audio, as follows:

First(collInspectionAudio).Url

We now know how to record and listen to the audio within PowerApps. Now let’s discuss the actual challenge: how to upload it to SharePoint.

Analysis

We use Microsoft Flow to upload to SharePoint, and in my Flow it looked straightforward to pass the audio as First(collInspectionAudio).Url to my SharePoint file content, as shown below.

When I execute the PowerApp, and hence my Flow, it creates an audio file in my SharePoint; however, it does not play. When I checked the outcome of the Flow, I found it strange, because the file content looked as below:

I drilled down further into the problem by checking the data types in Flow Studio, and realized that the Create File action of the SharePoint connector expects binary format, whereas PowerApps sends its audio content in byte format.

Solution:

So the only way around this is to accept a byte parameter from PowerApps and use it to upload the file to SharePoint.

For that, I used the Outlook connector to send an email: when you ask for attachment parameters in PowerApps, it can send them in byte format.

Note: If you also have a requirement to email the audio file along with uploading it to SharePoint, then that’s perfect. If you don’t have such an emailing requirement, another slight hack is required.

  1. Create a condition which never becomes true (in my case, 100 equals 200, which never becomes true ;-)).
  2. In the true branch, call the Outlook connector and ask for parameters for the attachment name and attachment content.
  3. In the false branch, call the SharePoint connector and create a file using the same parameters we captured in step 2. (A sketch of the PowerApps side follows below.)
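
On the PowerApps side, the call that kicks off the upload might then look like this (a sketch; UploadInspectionAudio is a hypothetical name for the Flow added to the app, and its two parameters are the attachment name and content requested in step 2):

UploadInspectionAudio.Run("inspection.wav", First(collInspectionAudio).Url)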

The final working Microsoft Flow looks like this:

Warm Regards,

Pavan Kumar Garlapati

 

 

Want to auto-tweet? Here’s a way to do it with a Flow

I figured this one was valuable enough to copy over here 😉

As per my recent blogpost.

While I’m away

The trigger for this post is simple: I’m heading into a well-deserved summer vacation where I want to remain active on some of the social media channels while being able to just relax and enjoy life!

So, I figured that building an automated Flow which would capture tweets of interest (based on a keyword search) and then retweet them with a specified delay could well do the job!

This would allow me to put content in my feed, adding value to the original tweets by issuing my message later on, thus maybe appearing on someone’s radar who would not have caught the original post.

Issue

The Twitter connector, as of now, doesn’t offer an action to retweet, nor does it allow including direct references to users / authors (@xxx). So in order to point back to the original post, I had to build the Twitter URL to it. I also included a signature note where I thank the author and notify readers of this limitation.

I’ve added this functionality as one of my project goals; creating a custom connector will probably allow me to resolve it. Right now, no time, so let’s just do it in a quick and dirty way and sort it out when I’m back in the saddle after drinking all the margaritas in the world 😉

Flow

Overall, here are the core functions that my Flow will execute:

  • Act on tweets that use predefined keywords, for example:
    • #powerapps
    • #powerplatform
    • #poweraddicts
  • Only process original tweets (I don’t want to flood the feed by considering RTs)
  • Only process other people’s tweets (it’d be kind of awkward if I processed my own tweets)
  • Manage and process French and English tweets only
  • Keep track of the tweets I re-posted as a reference for later BI / processing

Prerequisite

Since I want a log of the tweets I posted, with references to the original post as well as the posted message the Flow generated, I created an entity in CDS.

Flow details

  • Trigger: Twitter – When a new tweet is posted

  • Criteria: set the Search text criteria to
    • #powerapps OR #powerplatform OR #microsoftflow OR #poweraddicts

  • Initialize variables required in the flow
    • baseMsg : the new message to post (based on the language of the original tweet)
    • linkToTweet : the URL to the original tweet
    • proceedTweet : Boolean flag allowing the Flow to stop if the tweet is not in a supported language

  • Only proceed on original tweets (if RetweetCount = 0)

  • Only proceed with tweets from other authors (if TweetedBy is not equal to ZePowerDiver)

  • Look for tweets in French to adapt the message accordingly (if TweetLanguageCode = fr)

  • Assemble the message and assign it to the baseMsg variable
    • insert the linkToTweet variable for the URL to the original tweet
    • insert the User detail’s full name (Original tweet user full name)
    • insert their tagname too (Tweeted by)

  • If not in French, then validate if it’s in English (if TweetLanguageCode = en)

    • If so, assemble the message and assign it to the baseMsg variable
      • insert the linkToTweet variable for the URL to the original tweet
      • insert the User detail’s full name (Original tweet user full name)
      • insert their tagname too (Tweeted by)

 

  • Otherwise, set the proceedTweet variable to false so the post will not be processed in the next step

  • At the outcome of the previous steps, validate that proceedTweet is still true

  • If we can proceed, then inject a delay of (n) hours, as per your preference, then post the tweet using the baseMsg variable as the tweet text
    • For the injection of the delay, the calculation is based on the current time, as follows:
    • addHours(utcNow(),1)

  • Last step is to insert a new record in my CDS entity to track history

Last minute modifications

After testing the Flow and letting it run for a while, I made a few adjustments to ensure the posted tweet would be a maximum of 280 characters; otherwise an error is thrown by the connector.

  • First modification: I added a call to bit.ly to generate a shorter URL to the original tweet

  • Second modification: I shortened the baseMsg in both languages to be right to the point, nothing more

  • Third modification: as a safety net, I applied a substring function to the baseMsg variable submitted to the Twitter action
    • substring(variables('baseMsg'),1,280)
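
One caveat if you reuse this: substring in the Flow expression language is zero-based, and it throws an error when the requested length runs past the end of the string, so a more defensive sketch would be:

if(greater(length(variables('baseMsg')), 280), substring(variables('baseMsg'), 0, 280), variables('baseMsg'))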

And then the failure!

After implementing those last-minute changes, I looked back at the generated tweets and noticed that the preview feature you normally get with embedded links had disappeared!

So, to quickly resolve this, I removed the bit.ly part of the Flow but left the substring part.

Next step

To resolve the constraints regarding RTs and tagging, which are due to Twitter’s anti-spam rules, I will attempt to follow Tomas Poszytek’s blog post (which I saw on my Twitter radar yesterday while finishing up the details of this article).

Keep on diving!

PowerApp Admin Tools

I’ve been working on XrmToolBox Tools for a bit now, both speaking and posting on the huge number of cool tools and how we can build our own. I’m still working on a few XrmToolBox-related projects, but when I started diving into Power Apps, I immediately wondered if I could replicate some tools as Canvas or Model-Driven Apps. Wouldn’t it be cool if we could build out a suite of administrator or developer tools as Power Apps?

In my post Building XrmToolBox Tools, Part 2, we built an example tool that allows admins to view the list of Users with a Security Role assignment. It’s a relatively simple tool, but it can be pretty useful if you need a quick check on a Role for migrations, troubleshooting, etc. This seemed like as good a candidate as any for a new ‘Admin Tool’ Power App.

Security Role Member Manager!

The proposed functionality for the tool is pretty simple: provide a list of Security Roles in the system, and when the user selects a Role, show the list of assignments. The original was meant to be written in about an hour, so that was the extent of its capabilities. Since we have a bit more time, we can add some features: how about we display the list of Teams that have the selected Role assigned, and allow adding or removing a User or Team for the selected Role?

These requirements are a pretty good candidate for a Canvas App. We can build this using the following components:

  • Common Data Service (CDS) connector – provides the list of Security Roles
  • Office 365 Users Connector – provides the user picture
  • Gallery – bound to the Security Roles list
  • Gallery – bound to the list of Users related to the Security Role
  • Gallery – bound to the list of Teams related to the Security Role

The main screen layout and functionality are also fairly simple: bind the main grid to the Security Roles list, and on the select event, bind the secondary lists to the related Users and Teams. No real code-behind, just some simple data binding to galleries.

Here is the initial main screen for the Power App:

Security Role Member Manager

Some challenges, some solutions

With the XrmToolBox Tool, I wrote some code to retrieve Security Roles and User Security Role assignments using the standard SDK QueryExpression methods. The list of Security Roles was a simple RetrieveMultiple, while the User Role assignment is a many-to-many relationship.

The many-to-many is where I stumbled a bit. When I select my Security Role, I want to retrieve the Users and Teams, which are both many-to-many relations to Security Roles. The CDS connector does not list the join table as an Entity, so I couldn’t simply add a new data source for User or Team and filter by the selected Role. Fortunately, support has been added for many-to-many relationships in the CDS connector. Here is an excellent blog post on the feature by Greg Lindhorst, Principal Program Manager at Microsoft: Relate records in Many-to-Many relationships.

So to render the lists of Users and Teams, I can bind the galleries using a simple formula. The main screen gallery from which you select a Security Role is named ‘Security Roles List’, so the User and Teams gallery Items properties can be set using these simple formulas, respectively:

'Security Roles List'.Selected.Users
'Security Roles List'.Selected.Teams

As you can see in Greg’s post, adding and removing Users and Teams is fairly easy too. To remove a User from the selected Security Role, we need a single-line formula added to our gallery button:

Unrelate('Security Roles List'.Selected.Users, ThisItem)

That statement passes the selected Security Role’s Users relationship and the currently selected User to the Unrelate function, and we’re done! Adding a member is just as easy, as the sketch below shows.
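
Assuming a hypothetical combo box named UserPicker bound to the Users entity, the Add button could use the mirror-image Relate function:

Relate('Security Roles List'.Selected.Users, UserPicker.Selected)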

Next Steps

I plan on a follow-up post with a bit more functionality. For example, I like the inline model for selecting a User shown in Greg’s post above, but I think selecting multiple Users and Teams works better. Another nice feature will be to distinguish between Business Units; right now, this pulls all Security Roles for the entire organization.

This sounds like an obvious one, but I also plan on adding a confirmation dialog before removing a User or Team from a Security Role. This was a bit more complicated than I had expected, so I will write up more detail on how it will be implemented.

As I was working on this sample Power App, I came across a great post, User Admin PowerApp (Part 4) by d365Cooky, that proposes a similar tool, but for managing Security Roles for a selected User. I like the idea of embedding this into Dynamics 365 CE. By the way, I saw the link via Guido Preite’s dynamicsweekly.com newsletter. If you have not already signed up for it, get to it!

I’ll post notes on all these updates with more detail on how it was built, including the full solution for download in a follow up post.

Powerful stuff!

I feel like I have said that a LOT recently! In a relatively short time, I built a functioning Power App that allows administrators to manage Security Role Users and Teams. I put this together in a few hours, including some reading on the CDS connector capabilities and designing a few screens. All of this was done using the existing connectors and no custom code outside of the standard Canvas App formulas.

This is not as complex a tool as you may find in the XrmToolBox, and I am definitely going to continue contributing what I can to the XrmToolBox! But I think this once again demonstrates how the Power Platform allows us to provide low-code/no-code solutions to our users.

In the meantime, as always, any comments, suggestions, or questions are appreciated!

Dynamics 365 CE Approval Dialogs Using Canvas Apps

There are scenarios where we need to configure approvals in Dynamics 365: for example, marking an account as a premium customer after approval, or qualifying leads after approval. We used dialog controls to capture the approval request and comments, but dialog controls are now deprecated and not advised for new projects.

As per Microsoft’s initial announcement:

Dialogs are deprecated and are replaced by mobile task flows (available as of the December 2016 update), and business process flows. Both task flows and business process flows will continue to evolve to make the transition easier.

But neither task flows nor business process flows were a perfect replacement for dialogs. Knowing this pain from users, Microsoft has now modified the announcement:

Dialogs are deprecated, and should be replaced by business process flows or canvas apps

Even though I knew canvas apps can now be embedded in model-driven apps, I hadn’t thought of this option until I came across this new announcement, so I tried replicating my approval dialogs with a canvas app, and it works fine. Pheww!!!! 🙂

For testing purposes, I replicated the dialog for creating an approval request for the Account entity.

  1. I created a canvas app to create an approval request. This sample app changes the account status to Pending Verification and captures the comments in one custom field.

2. Now we need to call this app from the account form; obtain the app ID from the app details section.

Select app details to get the app GUID

3. I need to call this canvas app as a popup when the user clicks a button. I created a custom button for the account entity, added JavaScript as the button action to call an HTML web resource, and embedded my canvas app in this HTML iframe.

<html><head>
<title>Approval</title>
<meta http-equiv="content-type" content="text/html; charset=UTF-8">
</head>
<body style="padding: 0px; font-family: arial; overflow-wrap: break-word;" onload="LoadPowerApp()">
<center>
<iframe id="Approval" width="800" height="600"></iframe>
</center>
<script id="myScript">
function LoadPowerApp()
{
// Get the current account record's GUID from the parent form and strip the surrounding braces
var AccountID = window.parent.opener.Xrm.Page.data.entity.getId().slice(1, -1);
// Build the canvas app URL, passing the record GUID as the custom ID parameter
var url = "https://web.powerapps.com/webplayer/iframeapp?source=iframe&appId=/providers/Microsoft.PowerApps/apps/56123673-f45c-4b96-b9e6-ece1b0a8069a&ID=" + AccountID;
document.getElementById("Approval").src=url;
}
</script>
</body>
</html>

I know you have many questions now. The key is the URL assigned to the iframe:

"https://web.powerapps.com/webplayer/iframeapp?source=iframe&appId=/providers/Microsoft.PowerApps/apps/56123673-f45c-4b96-b9e6-ece1b0a8069a&ID=" + AccountID;

Breaking it down into parts: "https://web.powerapps.com/webplayer/iframeapp?source=iframe&appId=/providers/Microsoft.PowerApps/apps/APP GUID&CUSTOM PARAMETER NAME=" + PARAMETER VALUE

The app GUID I explained in step 2. As for the custom parameter, I deliberately didn’t mention it when we discussed the app creation and kept it for this section: when we open this canvas app from an account form (like starting a dialog), the app needs the record GUID to update the account status.

I used a form control in the canvas app and filtered the item using the ID parameter, as sketched below.
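
For example, the form’s Item property might be set like this (a sketch; accountid is the key column name in the classic CDS connector, and with newer connectors you may need to wrap the parameter as GUID(Param("ID"))):

LookUp(Accounts, accountid = Param("ID"))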

4. Now try your button, and you can see the magic.

You can download the sample APP from TDG Power Apps bank.

Please note this is a basic app I built for testing purposes, and it needs many improvements for use in a live project. You are always welcome to discuss this app with me.

Hope this helps….. 🙂

Set canvas PowerApp date to the last day of the month

When working with canvas PowerApps, you may need some date ranges set by default. A typical scenario might be where you have 2 date filters and you want to show records that occurred between those dates. A scenario I worked with recently was where the customer required the default filter dates to be the first day of the current month and the last day of the current month.

Setting a date to the first day of the month is relatively straightforward. Using a combination of the Date and Now functions, you can set up your first-day-of-the-month variable as follows:

Set(varStart,Date(Year(Now()),Month(Now()),1))

The tricky part is getting the last day of the month. How do we know if it’s got 30 days or 31? What about the month of February? What happens in a leap year!?
It turns out there is a handy undocumented “feature” of the Date function: if you set the day property to the number 0, it returns the last day of the previous month. So if I wanted to get the last day of February, for example, my function would look as follows:

Set(varEnd,Date(2019,3,0))

Binding that variable to a date control I had, I was able to get the last day of February accurately! And the trick generalizes, as the sketch below shows.
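
Combining the two formulas gives the last day of the current month; this sketch relies on the Date function rolling a month value of 13 into January of the following year:

Set(varEnd, Date(Year(Now()), Month(Now()) + 1, 0))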


Hope this helps.