SharePoint

How to upload PowerApps audio to SharePoint?

Problem: Recently I had a requirement to upload recorded audio from PowerApps to SharePoint.

The requirement seems straightforward if you have a basic knowledge of PowerApps and Microsoft Flow, doesn’t it? But there is a catch (we’ll get to that shortly :-)).

Just for the completeness of this blog, I will reiterate a few of the hows.

How to record and listen to audio in PowerApps?

In PowerApps we have a media control named Microphone to record audio, and another media control named Audio to play back the recorded audio.

Here is how to do that:

In the OnStop property of the Microphone control (which I named MyMic), collect the recorded audio as follows:

ClearCollect(collInspectionAudio, MyMic.Audio);

In the Audio control's Media property, reference the collection you used earlier to collect the recorded audio:

First(collInspectionAudio).Url

We now know how to record and listen to audio within PowerApps. Now let's discuss the actual challenge: how to upload it to SharePoint.

Analysis

We use Microsoft Flow to upload to SharePoint. In my flow, it looked straightforward to pass the audio as First(collInspectionAudio).Url to the SharePoint file content, as shown below.

When I ran the PowerApp, and hence my flow, it created an audio file in SharePoint, but the file would not play. When I checked the outcome of the flow, I found it strange, because the file content looked as below:

Drilling further into the problem by checking the data types in Flow studio, I realized that the Create file action of the SharePoint connector expects binary format, whereas PowerApps sends its audio content in byte format.

Solution:

So the only way around this is to accept a byte parameter from PowerApps and use it to upload the file to SharePoint.

For that, I used the Outlook connector to send an email: when the flow asks for attachment parameters from PowerApps, they can be sent in byte format.
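On the PowerApps side, the flow is then invoked with the recorded audio passed into that attachment content parameter. A minimal sketch (the flow name UploadAudioFlow, the file name and the parameter order are my assumptions, not from the original flow):

```
UploadAudioFlow.Run(
    "InspectionAudio.mp3",           // attachment name asked for in the flow
    First(collInspectionAudio).Url   // attachment content – sent in byte format
)
```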

Note: If you also have a requirement to email the audio file along with uploading it to SharePoint, then that's perfect. If you don't have such an emailing requirement, another slight hack is required.

  1. Create a condition that never becomes true (in my case, 100 equals 200, which is never true ;-)).
  2. In the true branch, call the Outlook connector and ask for parameters for the attachment name and attachment content.
  3. In the false branch, call the SharePoint connector and create a file using the same parameters from step 2.
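Put together, the branch looks roughly like this (labels are illustrative, not the exact action names):

```
Condition: 100 is equal to 200            // always false
├─ If yes: Outlook – Send an email
│          Attachment Name:    Ask in PowerApps
│          Attachment Content: Ask in PowerApps
└─ If no:  SharePoint – Create file
           File Name:    Attachment Name      (parameter from the "yes" branch)
           File Content: Attachment Content   (parameter from the "yes" branch)
```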

The final working Microsoft Flow looks like this:

Warm Regards,

Pavan Kumar Garlapati

 

 

Implementing Enterprise Search In Power Platform

Photo by Anthony Martino on Unsplash
Providing good search capabilities is a key feature in modern business applications to support usability and end-user satisfaction. We have seen the search capabilities of the Dynamics platform evolve from "Quick Find" and "Advanced Find" to "Relevance Search". The goal of the platform's search features has been to help users find the relevant information they need in the quickest and easiest way. These search features are out of the box and easy to enable, configure and use. As the platform progresses to offer richer features and better search, the demand for richer search techniques grows, and we see instances where the out-of-the-box capabilities cannot meet user demands. Before going further into advanced search scenarios, you can read about the platform's out-of-the-box search capabilities in the official documentation. In this article I share why we may decide to implement a search solution for our Dynamics solution using the Azure Search service.
In enterprise implementations, business applications are not the only systems used in the organization. We often see call center agents and sales representatives needing to obtain information from various systems to service customers. Making users search every system is a cumbersome job which may cause setbacks in end-user adoption. Integrating Dynamics with Azure Search consolidates search operations in one specialized search service, with the ability to connect to various data sources and apply modern search techniques to find the most relevant data. A practical example of this scenario can be seen in one of my recent experiences, where the organization's users had to search for user information in CRM, SharePoint, Sybase and a pool of CSV files.

Customized Search experience

To facilitate user adoption, customized search techniques are highly favorable. In all modern search engines we see "auto-complete", "suggestions" and "highlighting" features, which can be added to the Dynamics solution's search experience. Displaying search results with support for "document preview", "document opening in a customized container", "facets", "filtering" and "sorting" are examples that enhance your Dynamics solution's capabilities.

Customized Search Behavior

The true power of search is demonstrated when different pieces of information are linked together to make sense of a bigger picture. Extracting words and sentences from documents, including images and PDF files, and extracting key phrases, people names, location names, languages and other custom entities with the help of AI are further unique features you can add to your Dynamics solution's search capabilities. Another amazing capability you can have in your Dynamics implementation is searching based on geolocation information, i.e. you can search across your partner network from CRM or get the location of your field service force. The beauty of implementing your own enterprise search lies in the fact that you can search information in your data stores and link it using AI to generate knowledge and better insight into your data.

Customized Search Result

Another reason for customized search in your Dynamics solution is the ability to refine your search result profile. When you use AI in your search, the system shows you how relevant search results are to your search keywords. Knowing this, you can refine your search profiles to generate different results for the same keywords. This way you train the AI engine to work better for you and enable users to get more accurate search results.
Architecture

Dynamics can be integrated with the Azure Search service in the following patterns:

 

  1. Integration through web resources: these web resources host a web application acting as a client to the search service. The web resource can be an HTML file or an iFrame hosted on forms. The important points in this approach are ensuring correct cross-origin settings in the client application and writing your HTML securely and according to best practices.
  2. Integration through custom Power Platform controls: you may build your own custom control which sends REST requests to Azure Search and displays the results by consuming the REST responses. The custom control can call the Azure Search service using actions or direct REST calls.
  3. Azure Search works based on indexes, so your first step is to push your searchable CRM data to Azure Search indexes. This can be done using Microsoft Flow, Azure Logic Apps, custom solutions or Azure Data Factory. I have used all of these tools in my implementations, and you can opt for any of them based on your requirements.
  4. Once the data is in your data store, you can create your indexes in Azure Search. You can go for separate indexes for each data source or combine multiple data sources in one index. Each approach has its own requirements, which will need to be met either in your client web application or in a separate Azure compute resource. Once indexing is done, you can use the Azure Search REST API directly, or use Azure API Management to expose your search service to your Dynamics solution.
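As an illustration, querying Azure Search directly is a single REST call. The service name, index name, API version and query parameters below are placeholders, so check them against your own service:

```
GET https://<your-service>.search.windows.net/indexes/<your-index>/docs?api-version=2019-05-06&search=contoso&highlight=name&facet=country
api-key: <your-query-key>
Accept: application/json
```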
Summing this all up: as business application products get more sophisticated and organizations move from data to big data, engineers must look for innovative approaches to implementing Dynamics solutions. Microsoft Azure, along with the Dynamics platform, offers solution architects the necessary tools to design such solutions.

Microsoft Flow – How to move SharePoint Online list items to folders

Let’s say you have a SharePoint list with folders organized by continent, and you also have a choice column holding the continent name. You want all list items to move automatically to the folder of the chosen continent.

The first idea would be to use Microsoft Flow to do that, but there is no out-of-the-box action that moves list items to folders. At the time of writing this post, there is an action to move files but nothing for list items. Even looking at the SharePoint REST API with the ‘Send an HTTP request to SharePoint’ action, there is no endpoint to move list items to a folder.

This is not clear in the Microsoft documentation, but the same REST API endpoint and action available for files can actually be used for list items, through the item's file property. This post shows how to use the SharePoint REST API to move items with Microsoft Flow, so that as soon as an item is created or edited, it is moved to the right location.

Basically, after being triggered, this flow will:

1.   Get current list root folder path

2.   Get current item file name and the current path

3.   Build a new file path

4.   Check if the path is actually new and if so move the item

To better understand the following steps, knowledge of the SharePoint REST API and Microsoft Flow expressions is helpful.

To start building the flow, the trigger used is ‘When an item is created or modified’.

In this example, variables are used to store the list title, destination folder name and site URL. Because the same values will be used in different actions, it is better practice not to leave them hard-coded. For example, if this flow is exported to another environment or copied for use with a different list, the changes will be minimal: only the initial variables and the trigger, instead of updating a bunch of actions with hard-coded values.

The next step is to get the list folder URL using the SharePoint REST API, via the ‘Send an HTTP request to SharePoint’ action with the GET method. The action was renamed to ‘GetRootFolder’ so it is easier to access its output later in expressions. All the variable actions in this example are renamed as well, for easier maintenance.
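For reference, the request behind ‘GetRootFolder’ can be sketched as below (the Accept header requests the verbose OData format, which is why the response is wrapped in a ‘d’ object):

```
GET _api/web/lists/getbytitle('<ListTitle>')/rootFolder
Accept: application/json;odata=verbose
```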

After this action, a new variable is initialized to store the list root folder URL with the following formula (use expressions to access the JSON content returned by the SharePoint REST API; with the body function and the action name you can access the content):

body('GetRootFolder')['d']['ServerRelativeUrl']

The next action gets the current list item's file server-relative URL using the SharePoint REST API, as was done for the list root folder. In this case, the REST endpoint needs to be told explicitly to load the ‘FileRef’ and ‘FileLeafRef’ properties.
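The ‘GetItem’ request can be sketched like this, with $select loading the two file properties explicitly:

```
GET _api/web/lists/getbytitle('<ListTitle>')/items(<ItemID>)?$select=FileRef,FileLeafRef
Accept: application/json;odata=verbose
```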

The following variables are used to build the new list item folder Url, so the below formulas are used to assign values to the proper variables:

Item file name (Set with current file name only, used to build final destination path):

body('GetItem')['d']['FileLeafRef']

Item Url (Set with current item server relative Url, using the expression):

body('GetItem')['d']['FileRef']

New item Url (Root Folder/New Folder/Item File Name):

concat(variables('RootFolder'),'/',variables('MoveToFolderName'),'/',variables('ItemFileName'))

Then the ‘Send an HTTP request to SharePoint’ action is used again, to call the files REST endpoint and move the item to the new location (yes, the same endpoint used for files).

It is validated that the file's new folder path is different from the current one:

If it is, a call to the File/moveTo method of the API is made to perform the move:

This time a POST request is sent, because this call actually requests a change: the endpoint takes care of moving the item (this action was also renamed, to ‘MoveItem’).
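The ‘MoveItem’ request can be sketched as below; flags=1 allows overwriting a file that already exists at the destination (omit it to fail on conflicts):

```
POST _api/web/getfilebyserverrelativeurl('<ItemUrl>')/moveto(newurl='<NewItemUrl>',flags=1)
Accept: application/json;odata=verbose
```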

After this last action is called, the item is moved to the proper folder.

The final flow will look like this:

With this flow, regardless of where the items are saved, they will be moved to the proper folder after being created/edited.

All the voices

“A woman with a voice is by definition a strong woman. But the search to find that voice can be remarkably difficult.”  – Melinda Gates

 

Companies love dashboards. The idea of progress, of something to announce, is like a drug. Naturally, companies use data and dashboards to measure diversity.  With one click, we can see how many people of what origin, education and sexual identity are employed anywhere within that company.  What those dashboards can’t tell you, no matter what your amazing PowerBI skillz (sic) may be, is the actual effectiveness and impact of that diversity down to the team or individual level.   Data and dashboards struggle with the intangible, with context. (I say this with all due respect to data scientists and my “blue” colleagues.)  Dashboards struggle to tell you if all those amazing voices that the company has invested so much in recruiting are actually being heard. This is the nuance of inclusion.

This is where checking the box on the dashboard stops and the application of the sought-after differing points of views begins.  And honestly, this is where so many teams fail.   The representation is in the room, but the team culture hasn’t evolved, the manager is still talking at people, the environment isn’t functioning.  The loudest voice still stands out.  Suggestions are quickly brushed aside until repeated by another more well-known contributor.  Questions are directed at the wrong person.  And then people just shut down, go back to their old ways, and that highly sought-after talent leaves.  Oh well, she wasn’t a good fit anyway. 

The pressure on groups to produce results quickly isn’t going away.  This intangible nuance of hearing all voices is easily pushed aside in the name of speed since it can be very difficult to measure. Worse yet, incorporating all the voices can actually slow things down at first, while in the end making the output so much better. How to show that the end justifies the means?

I propose that the best way to measure something is to start with a remarkable subset.

Enter the #msdyn365 community at 365 Saturday.  For me, it started in Dublin.  Actually, it started way before then, it just became more deliberate in Dublin thanks to the event organizers (looking at you, Janet and Raz) then took further shape in London and most recently solidified in Glasgow.  At these all day events (on a Saturday, just like the name implies), informal groups of women at various stages of career gathered for an hour under the umbrella of Women in Technology (#wit), not quite sure what to expect.

Each session has been different, because as with many things, the conversation is a result of the sum of the amazing diverse parts.  Topics varied, yet it all came down to one overarching theme: communication.  Whether that be the how, the when or the why of when to use our voices.  We talked about #confidencehacks, about how to establish ourselves without crossing a line that makes us uncomfortable (and practicing not caring about making others uncomfortable), about connecting and expanding our networks, and then most importantly we talk about amplification – how we can help others’ voices be heard.  All voices, not just female.

Note: There are so many other cultural considerations here, for which I lack a point of reference.  There is also a whole discussion to be had about how people consume, digest and respond to information.  For example, the work culture that I grew up in was as follows: get in a room, review a PowerPoint, have a passionate discussion where the loudest voice usually wins, determine next steps, assign actions items, repeat.  That format doesn’t work for all.  What about the voice of the incredible introvert in the room that needs time to digest the info, consider all sides, and then voice their opinion?

And there is the other amazing thing about our #msdyn365 community.  Others want to know how they can help.  Sure, I was teased about “super-secret lunches” by male colleagues.  I saw that for exactly what it was – curiosity and a sincere wish for dialogue.  Why is it necessary to have a “women’s anything”? Shouldn’t it just be about hiring the best person for the job?  How should we feel about this?  We all treat each other with respect, right? Isn’t it up to individuals to make themselves heard?

Truth is, I agree with everything above.  Inclusion, by its intent, is about everyone.  And therefore, everyone has a responsibility to feed this culture and in the end everyone will benefit. We all can and should help amplify the voices of others. What I love about getting small groups of women together is that the coaching and dialogue that happens in a really safe environment then goes out into the diverse world and multiplies. It starts with a subset. Never underestimate the ripple effect of small actions.

Fifty percent (50%) of the speakers at 365Saturday Scotland identified as female.  Fifty percent.  That is crazy insane goodness.  It did not just happen.  This was the result of a community (led by Marc, Janet, Claire, Iain and so many others) rallying to make sure that opportunities were presented and seized, that a safe place was created and maintained, and that voices were heard.  Shouldn’t that just happen naturally?  Yes, ideally someday the flywheel will be spinning with such momentum that this will be the case (oh, and 50% of the attendees will also be women… work to do there as well).  Then the focus will become how to maintain and feed that system.  The moment you take your eye off something, you risk losing the muscle memory. Omission by unintentional oversight does not remove responsibility.

There is a meme about equity vs equality running around our social media feeds.  The one that shows people of different heights trying to watch a baseball game over a fence.  The size of the boxes they are standing on depicts the difference between being treated equally (same sized box) and equitably (different sized boxes raising all to the same level).  The lesser known version has a twist – it shows what it would look like if there was no fence at all.

This is the nuance of inclusion.  This is how the #msdyn365 community is working to remove the fence.  It starts with these conversations, these opportunities. Listening to all the voices takes time and deliberate effort.  This community is all in.

Raise your voices. 🙂

Carissa

 

SharePoint Modern Lists: Force forms to open in full size

SharePoint list forms in the modern experience open in a side dialog box by design (at the time of writing this post, no out-of-the-box options are available to change this).

In my opinion that's fine for non-customized forms, but for PowerApps forms in SharePoint lists the layout suffers when the form becomes large, because the forms inside the lists are currently not perfectly responsive.

Depending on the screen resolution, we get unnecessary scrollbars, which makes it look poorly done. If you use the form link, which can be copied from the form dialog, you can open the item in a full-screen view.

It would be nice if there were an out-of-the-box option to configure items to open in this view, but for the moment there is not. Even changing the list settings to disable dialogs has no effect on modern experience lists.

But we can count on SharePoint column formatting to overcome this!

With column formatting, we can use custom JSON script to change how a field is displayed, editing or adding HTML elements to the field view. You can find a full overview of column formatting here.

To force the form to open in full screen, what we need to do is:

  • Choose a field to change the formatting of and include a link
  • Format the link based on the item ID and the list forms URL

If the ID field allowed column formatting, we could build the link directly in the ID field, but unfortunately that's not an option. So we'll have to change another field's formatting, while accessing the ID field value to build the URL.

You can use the Title field, which is normally used to open items, but keep in mind that by overriding its formatting you'll lose the field's button that opens the contextual menu.

The JSON script for the field is quite simple, as we will add just one HTML element. Here's the example:
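The original screenshot of the script is not available here, but a minimal sketch of such a formatting JSON could look like the following (‘/lists/yourlist’ is a placeholder path, and using the ‘@currentWeb’ token to build the absolute URL is my assumption – verify it is supported in your tenant):

```json
{
  "$schema": "https://developer.microsoft.com/json-schemas/sp/column-formatting.schema.json",
  "elmType": "a",
  "txtContent": "@currentField",
  "attributes": {
    "target": "_self",
    "href": {
      "operator": "+",
      "operands": [
        "@currentWeb",
        {
          "operator": "+",
          "operands": [
            "/lists/yourlist/DispForm.aspx?Source=/lists/yourlist&ID=",
            "[$ID]"
          ]
        }
      ]
    }
  }
}
```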

Quick explanation:

  • elmType: the HTML tag type to display
  • txtContent: the value displayed inside the HTML element – in this case, the current field
  • attributes: a set of other attributes added to the HTML element; in our case:
  • href: the link URL, built dynamically using the ID field value and the current site URL. Add the ‘Source’ parameter to the query string so that after saving the form you are redirected back to the list view (change ‘/lists/yourlist’ to your list's site-relative path).
  • target: ‘_self’ was chosen in this case, to open the form in the same window.

In the targeted list's settings, open the column you want to format, paste the JSON content under ‘Column Formatting’ and save the changes.

Now, with this little trick, any item opened via the chosen field will open as a full page.

Using Eircode (Ireland Postcodes) to get Geolocation from Google Maps with Microsoft Flow

Let’s say you have a SharePoint list storing information with an Eircode column (the postal code for addresses in Ireland), and you want to use that information in Power BI later to generate a map organizing all your items by location. Unfortunately, the Power BI maps don't work well with Eircodes, so how can we get the most precise location information?

Let’s use Microsoft Flow and Google Maps API for that!

(Sorry Microsoft for not using Bing Maps)

Before starting to build the Flow, get a Google Maps developer API key at: https://developers.google.com/maps/documentation/javascript/get-api-key

Then add two new columns of type Number, with automatic decimal places, to the SharePoint list: Latitude and Longitude.

With those two things set up, it’s time to begin the Flow creation.

We will use the SharePoint ‘When an item is created or modified’ trigger to start our flow. After starting the flow creation with this trigger, add an ‘HTTP’ action. To get the latitude and longitude information, make a GET request to the Google Maps API using the Eircode from the SharePoint item as the address filter:
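The request in the ‘HTTP’ action follows the Google Geocoding API format; the API key and the Eircode value below are placeholders:

```
GET https://maps.googleapis.com/maps/api/geocode/json?address=<Eircode>&components=country:IE&key=<YourApiKey>
```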

In this case, the search is restricted to Ireland, as you can see in the parameters.

The results we get from Google Maps API are in the following JSON structure:
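Trimmed down to the parts used later in the flow, the response has this shape (the values are illustrative):

```json
{
  "status": "OK",
  "results": [
    {
      "formatted_address": "Dublin, Ireland",
      "geometry": {
        "location": {
          "lat": 53.3498,
          "lng": -6.2603
        }
      }
    }
  ]
}
```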

Note that the latitude and longitude are under geometry/location. To manage that information easily, create two variables, ‘Latitude’ and ‘Longitude’, of type Float.

Assign the values coming from the HTTP request body to the variables using an expression for each (which navigates through the JSON object to access the data):

Latitude: float(body('HTTP')['results'][0]['geometry']['location']['lat'])

Longitude: float(body('HTTP')['results'][0]['geometry']['location']['lng'])

Now that the values are properly assigned, check whether either of them differs from the existing values in SharePoint; if so, the item needs to be updated. Then add an ‘Update List Item’ action and set it to update your list item with the new Latitude and Longitude values to finish.

 The final layout of the flow will be the following:

Now, as soon as your items are updated or created in the SharePoint list, the information will be ready to be used in the Map visual on Power BI with the SharePoint list as data source.

Have fun!

SharePoint capped at 500 records – Data Row Limit for non-delegable queries

So I was busy creating a nice canvas app using the brilliant Patch command, when my records stopped transferring across to SharePoint at exactly 500 records.

Knowing that SharePoint can hold many more entries than this, I was slightly confused. On deeper investigation I found the little gremlin blocking my Patch commands (File > App Settings > Advanced Settings):

Increasing this limit let my records through. I hope this helps someone out there!!!
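For context, the kind of pattern that runs into this limit is one where the rows being patched come from a non-delegable query, so PowerApps only fetches rows up to the data row limit before patching. A hedged Power Fx sketch (list, column and function choices are made up for illustration):

```
// Len() is not delegable to SharePoint, so this Filter is evaluated locally
// and returns at most the data row limit (default 500, configurable up to 2000).
ForAll(
    Filter(SourceList, Len(Title) > 3),
    Patch(TargetList, Defaults(TargetList), { Title: ThisRecord.Title })
)
```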