Share Power Apps with Microsoft Teams Users

You can share an app with Microsoft Teams users since each Microsoft team also creates a Microsoft 365 group in Azure AD. The only prerequisite is that the Microsoft 365 group associated with the team must be security enabled (its securityEnabled property set to true). By default, the Microsoft team's group is not security enabled. If you try to share a Power App with a Microsoft team which is not security enabled, you will not be able to add or select the group.

Microsoft has documentation on how to security enable a Microsoft 365 group, but it requires PowerShell to be installed on your computer. In this blog post, let us see how to enable the securityEnabled property using Graph Explorer, which is a web portal.

Pre-requisites:

  • Owner of the Microsoft Team
  • Access to Graph Explorer

Step 1: Find the Microsoft Teams group Object ID. Log in to the Azure Active Directory admin portal and search for the Microsoft Teams group using the display name of the team. From the Overview section of the group, copy the group Object ID.
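
Alternatively, if you prefer to stay in Graph Explorer, the following GET request should return the group (and its current securityEnabled value) by display name; the display name used here is just a placeholder:

https://graph.microsoft.com/v1.0/groups?$filter=displayName eq 'Display name of the Team'&$select=id,displayName,securityEnabled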

Step 2: Sign in to Graph Explorer. Find below the request details

Request Type: PATCH

URL: https://graph.microsoft.com/v1.0/groups/groupObjectId

Request Body:

{
    "securityEnabled": true
}

Replace groupObjectId in the URL with the Object ID copied in Step 1. If you have not yet consented to the required permissions, click Modify permissions as shown above to grant consent. Once everything is set, click Run query to set the securityEnabled property. Once the property is set to true, you can share the Power App with Teams users. This setting is not required to share a cloud flow.
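
To double-check the change, a quick verification (a hedged sketch assuming the same groupObjectId) is to run a GET request in Graph Explorer and confirm that securityEnabled now returns true:

Request Type: GET

URL: https://graph.microsoft.com/v1.0/groups/groupObjectId?$select=displayName,securityEnabled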

Hope you have found this informative & thanks for reading. If you are visiting my blog for the first time, please do look at my other blog posts.


Get deleted SharePoint site details using Microsoft Graph API

Deleted sites are retained for 93 days, and an admin can restore them. In this blog post, let us see how to get the deleted SharePoint site details using the Microsoft Graph API with application permission.

Step 1: Register an application in Azure AD and obtain the client ID, client secret & tenant ID of the registered application. Add the Sites.Read.All Microsoft Graph application permission.

Step 2: Find the list GUID of the hidden list DO_NOT_DELETE_SPLIST_TENANTADMIN_ALL_SITES_AGGREGA, which holds the information about all deleted sites in the tenant.

Make a GET request to the following Graph API endpoint with the token generated from the above AD app, using Postman, or using Graph Explorer if you are a Global or SharePoint administrator:

https://graph.microsoft.com/v1.0/sites/tenantName-admin.sharepoint.com/lists

Replace tenantName in the above URL with the name of your tenant.
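
Each list in the response has an id and a name; the entry to look for is the one whose name matches the hidden list. A trimmed, illustrative sketch of such an entry (the id shown here is only an example, reused in the next step):

{
    "id": "1dd71312-bceb-48bb-b853-7c0d33ac5618",
    "name": "DO_NOT_DELETE_SPLIST_TENANTADMIN_ALL_SITES_AGGREGA"
}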

Step 3: Graph API to list all Deleted Sites within the tenant

https://graph.microsoft.com/v1.0/sites/tenantName-admin.sharepoint.com/lists/1dd71312-bceb-48bb-b853-7c0d33ac5618/items?$expand=fields

Make a GET request after replacing tenantName and the list GUID of DO_NOT_DELETE_SPLIST_TENANTADMIN_ALL_SITES_AGGREGA obtained in Step 2.

You can filter on the different fields available [Title, SiteUrl, etc.]. To filter based on the SharePoint site URL:

https://graph.microsoft.com/v1.0/sites/tenantName-admin.sharepoint.com/lists/1dd71312-bceb-48bb-b853-7c0d33ac5618/items?$expand=fields&$filter=fields/SiteUrl eq 'https://tenantName.sharepoint.com/sites/siteName'
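
Similarly, a hedged variation of the same query filtering on the Title field (if the column is not indexed, Graph may additionally require the Prefer: HonorNonIndexedQueriesWarningMayFailRandomly request header):

https://graph.microsoft.com/v1.0/sites/tenantName-admin.sharepoint.com/lists/1dd71312-bceb-48bb-b853-7c0d33ac5618/items?$expand=fields&$filter=fields/Title eq 'Title of the deleted site'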

Step 4: To do this from Power Automate or an Azure Logic App, follow this post:

https://ashiqf.com/2021/03/16/call-microsoft-graph-api-as-a-daemon-application-with-application-permission-from-power-automate-using-http-connector/

There are activity alerts which you can set up from the Security center for deleted sites, but the alert only gives you the site URL and the name of the user who deleted the site; as of now it does not provide the Title, Site ID, etc. So this API can provide you additional details. Hope you have found this informative & thanks for reading. If you are visiting my blog for the first time, please do look at my other blog posts.

Handling nonexistent, null and multi value type properties in Parse JSON action

In a Power Automate cloud flow, the Parse JSON action is used to access properties in JSON content, enabling you to select those properties from the dynamic content list in your subsequent actions. Typically the JSON content will come from a response to an API call. The first step after adding the action is to select the source of the JSON content and to add the schema; you can either provide a JSON schema from your request payload or let the action generate one based on a sample content. If you choose to generate the schema from a sample, the schema is generated based on the first object of the sample JSON content.

For the following sample JSON content, the property

  • Name in the first element is of type string and in the second element is of type null
  • EmpNo in the first element is of type integer and in the second element is of type string
  • Country in the first element is of type string and in the second element it does not exist
[
  {
    "Name": "Mohamed Ashiq Faleel",
    "EmpNo": 123456,
    "Country": "Sweden"
  },
  {
    "Name": null,
    "EmpNo": "EX-123456"
  }
]

the generated schema is

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "Name": {
                "type": "string"
            },
            "EmpNo": {
                "type": "integer"
            },
            "Country": {
                "type": "string"
            }
        },
        "required": [
            "Name",
            "EmpNo"
        ]
    }
}
  • The type of the properties in the above schema is based on the property values from the first element [Name – String, EmpNo – Integer & Country – String]
  • The required list has only Name and EmpNo since Country is not available in the second element

In this blog post, let us see how to handle the following in Parse JSON action

  1. Null and Multi value type property
  2. Nonexistent property

I have added the above sample content in an Array variable and the above generated schema in the Parse JSON action as shown below

Null and Multi value type property:

The sample content has an element with a null property value [Name – second element], and the type of one property is different in each element [EmpNo – Integer and String]. If you run the flow, it will fail with the message ValidationFailed – the schema validation failed for the Parse JSON action – and the run output will have the following errors:

  • Invalid type. Expected String but got Null.
    • Reason – The property Name in the second element is null and not a string as defined in the schema
  • Invalid type. Expected Integer but got String.
    • Reason – The property EmpNo in the second element has a string value and not an integer as defined in the schema

There are two methods to solve this problem.

Method 1:

Add additional data types to the properties Name [null] and EmpNo [string] as shown below

{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "Name": {
        "type": [
          "string",
          "null"
        ]
      },
      "EmpNo": {
        "type": [
          "integer",
          "string"
        ]
      },
      "Country": {
        "type": "string"
      }
    },
    "required": [
      "Name",
      "EmpNo"
    ]
  }
}

The drawback with this approach is that the property values will not show up in the dynamic content panel.

You will have to write an expression to get the values:

items('Apply_to_each')?['Name']
items('Apply_to_each')?['EmpNo']

Method 2:

Remove the type information for the properties Name and EmpNo from the schema as shown below

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "Name": {},
            "EmpNo": {},
            "Country": {
                "type": "string"
            }
        },
        "required": [
            "Name",
            "EmpNo"
        ]
    }
}

This way you will not lose the selection of the properties from the dynamic content panel.

Nonexistent property:

The sample content does not have the property Country in its second element. The first check is to validate the required list from the schema:

"required": [
            "Name",
            "EmpNo"
        ]

If you look at the snippet above, taken from the schema, Country is not listed, so all is good. The expression to get the value is

items('Apply_to_each')?['Country']

Find below screenshot of the run for the property Country from the second element

To have a meaningful output, you can write an expression to show some default text (Not Available) if the property is nonexistent, or the actual value otherwise. Find below the expression:

if(empty(items('Apply_to_each')?['Country']),'Not Available',items('Apply_to_each')?['Country'])

You can use the above condition to show some default text for null values.
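
As an alternative sketch, the coalesce expression returns its first non-null argument, so it can also supply the default text for null or nonexistent properties (unlike empty, it will not catch an empty string):

coalesce(items('Apply_to_each')?['Country'],'Not Available')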

Summary:

The Parse JSON action makes your flow cleaner by making the properties available in the dynamic content panel. It is always recommended to go through the generated schema of the action, clean up unwanted or unused properties, and update the remaining properties based on all the scenarios of the expected content. Hope you have found this informative & thanks for reading. If you are visiting my blog for the first time, please do look at my other blog posts.


Parse an array without using Parse JSON action in Power Automate cloud flow

In this blog post let us see how to access the property of an array object without using Parse JSON action.

Find below the sample array which has been initialized in an array variable

[
  {
    "Name": "Mohamed Ashiq Faleel",
    "Location": "Stockholm"
  },
  {
    "Name": "Megan Bowen",
    "Location": "New York"
  }
]

Add an Apply to each control with the output selected from the array variable EmployeeArray as shown below

Add a Compose action inside the Apply to each loop to access the property Name from the array. In the Compose action, add the following expression to get the Name value:

item()?['Name']

Find below the screenshot with the expression

For Location it should be

item()?['Location']

The generic expression is

item()?['Property-Name']

Hope you have found this informative & thanks for reading. If you are visiting my blog for the first time, please do look at my other blog posts.

Conditional Power Automate flow triggers for SharePoint Online Pages and NEWS Post

The SharePoint Online Site Pages library is a container for the different types of pages (News post, Page, Space, News Link) created in a Communication or Team site. There can be various scenarios where a Power Automate flow associated with a SharePoint Site Pages library has to handle additional processes after a page or a News post is published. In this blog post, let us see how to

  1. Trigger the flow if a News post is published
  2. Trigger the flow only for Major versions
  3. Trigger the flow for a specific Content Type
  4. Avoid infinite trigger loop on an Item Created/Modified trigger if a page/list item is updated by the flow

using trigger conditions. Trigger conditions can be used on a trigger to stop your automated flow from running if the conditions are not met. Unnecessary flow runs can eat into your quota limits, based on the license type, without providing any value. To begin with, create an automated cloud flow with the SharePoint trigger When an item is created or modified and configure it for the Site Pages library. Once you provide the site URL where your Site Pages library exists, you will notice the Site Pages library doesn't show up in the drop-down. In the List Name property, just provide the GUID of the library instead.

To get the GUID, browse to the Site Pages library on the SharePoint site, go to Library settings and take the value of the List= parameter from the URL, decoded.

Trigger the flow if a News post is published

There can be scenarios to trigger the Flow when a News post is created or modified. A SharePoint property PromotedState can help identify if the SharePoint page is a News post or a normal page since all the different types of pages are stored in the same library.

Label              Value   What it means
NotPromoted        0       Regular Page
PromoteOnPublish   1       News post in draft mode
Promoted           2       Published News post

The following trigger condition will make sure the trigger fires only whenever a News post is published or saved as draft (all major and minor versions).

@equals(triggerOutputs()?['body/PromotedState'],2)

Now add the above trigger condition in the settings of the trigger as shown below

The above trigger condition will have the flow triggered for all major and minor versions (1.0, 1.1, .., 2.0, 2.1, ..) of a News post.

There can be multiple trigger conditions, each accepting a Boolean value (true or false); all conditions must be true for the trigger to fire.

To trigger the flow only on the first published version of the News post, add the following trigger condition.

@and(equals(triggerOutputs()?['body/PromotedState'],2),equals(triggerOutputs()?['body/{VersionNumber}'],'1.0'))

To trigger the flow only on major versions of a News post, add the following trigger condition:

@and(equals(triggerOutputs()?['body/PromotedState'],2),contains(triggerOutputs()?['body/{VersionNumber}'],'.0'))

Trigger the flow only for Major versions

The following trigger condition will make sure the flow fires only for major versions (1.0, 2.0, 3.0, etc.) and not for minor versions, aka draft versions (0.1, 0.2, etc.)

@contains(triggerBody()?['{VersionNumber}'],'.0')

Trigger the flow for a specific Content Type

A content type in SharePoint is a set of columns grouped together to serve a specific type of content (Crisis News, Marketing News, etc.). A Page or a News post in a SharePoint site can be associated with a content type. The trigger condition for the flow to be triggered only for a specific content type is

@equals(triggerOutputs()?['body/{ContentType}/Name'], 'Name of the Content Type')

Avoid infinite trigger loop on an Item Created/Modified trigger if a page/list item is updated by the flow

In your automated cloud flow, if you have a Created or Modified trigger with an action that updates the same item, there will be an infinite trigger loop.

The Flow checker will give you the warning Actions in this flow may result in an infinite trigger loop. To overcome the above warning, trigger conditions to the rescue.

How it will be done

The Update item action in the flow should use a different connection (service account), other than the user who will be using the site to create or update pages. The trigger condition will make sure the flow run does not happen if the update to the page or News post was done by the service account through the Update item action. SharePoint libraries and lists have the out of the box column Modified By, which holds the information on who most recently updated the item, be it from the SharePoint UI or through a program. The trigger condition will be written based on this Modified By column: if the column holds a value other than the service account, the flow will be triggered.

Step 1: Create a service account with the password set to never expire. Licenses are not required for this account if the flow connection is going to be used only on SharePoint connectors. The Never expires password setting will make sure the connection is not invalidated due to a password change on the account.

Step 2: Grant edit access for the service account to the SharePoint site. This step allows the account to update the list or library item.

Step 3: Add a new connection using the service account.

Step 4: Add the following trigger condition to the SharePoint trigger if the service account does not have an Exchange Email License

@not(equals(triggerOutputs()?['body/Editor/Claims'],'i:0#.f|membership|serviceaccountupn@domain.com'))

Replace serviceaccountupn@domain.com with the actual UPN of the service account.

If the service account has an email address or a license for the email service, then the trigger condition should be

@not(equals(triggerOutputs()?['body/Editor/Email'],'serviceaccountemail@domain.com'))

Tip to write the trigger condition:

Before adding the condition to the trigger, evaluate the condition on a compose action using expressions and data fields selected from Dynamic content.

After the condition is added on the compose action, click Peek code

Copy the expression from the inputs parameter

The condition to be added on the trigger must be True for the trigger to fire.
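
For instance, a hedged illustration using the Modified By check from Step 4: after selecting the relevant dynamic content and expression in the Compose action, Peek code shows the serialized expression in the inputs parameter, something like the line below, and that exact expression (with the leading @) is what goes into the trigger condition box.

"inputs": "@not(equals(triggerOutputs()?['body/Editor/Email'],'serviceaccountemail@domain.com'))"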

Summary:

Trigger conditions are powerful if used wisely to avoid unnecessary runs. I've shown some examples for the SharePoint Site Pages library, but they can be used on list triggers as well. The condition can be written based on any data available in the trigger output. Hope you have found this informative & thanks for reading. If you are visiting my blog for the first time, please do look at my other blog posts.


How to use form-urlencoded content type in Power Automate Custom Connector

The content type x-www-form-urlencoded is generally used to send text data as name-value pairs separated by ampersands, in a query-string format. In this blog post, let us see how to use the content type

  • x-www-form-urlencoded

in a Power Automate custom connector. Refer to this post if you would like to find out how to use it in an HTTP connector. Find below the screenshot from Postman with a sample Twilio API to send a WhatsApp message with the content type x-www-form-urlencoded.

x-www-form-urlencoded in a Custom Connector:

With the x-www-form-urlencoded content type, the form data is encoded and sent as a single block in the HTTP request body.

Custom Connector:

To call the above API with the content type x-www-form-urlencoded in a custom connector, the first step is to create a connector from blank with the authentication type filled in (Basic, API Key, etc.) on the Security tab. Now add a new action to call the above API. Click + Import from sample to enter the details of the API request like the verb, URL, Headers (Content-Type application/x-www-form-urlencoded) and Body. For the Body, just add {}; the body content will be sent from the Power Automate cloud flow. Find below the screenshot for the action definition.

After the above details are entered, click Import.

In the Request section, click Content-Type under Headers, enter the default value application/x-www-form-urlencoded, and then make it required with the visibility set to Internal. This setting will hide the parameter in the cloud flow.

Make the body required. Create the connector after all the details have been entered.

Custom Connector in Power Automate Cloud Flow:

The form values to be sent in the API request body with the x-www-form-urlencoded implementation must be URL encoded and separated by ampersands. The expression encodeUriComponent can be used to encode the form values.

In the cloud flow, add a Compose action with all the values encoded and separated by ampersands (&), as in the sketch below. Now add the custom connector action, which will prompt you to create a connection. In the Body section, select the Outputs of the Compose action from the dynamic content.
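
A rough sketch of the Compose input for the Twilio-style example above (To, From and Body are the form fields from the earlier screenshot; the values are placeholders):

concat('To=',encodeUriComponent('whatsapp:+123456'),'&From=',encodeUriComponent('whatsapp:+178910'),'&Body=',encodeUriComponent('Your appointment is coming up on July 21 at 4PM'))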

Hope you have found this informative & thanks for reading. If you are visiting my blog for the first time, please do look at my other blog posts.


Create a Power Automate custom connector from Postman V2 Collection

Postman collections make the creation of custom connectors in Power Automate easier and quicker. As of the time I am writing this article, to create a custom connector from a Postman collection in Power Automate, the collection has to be in the V1 format, while the current version of collections exported from Postman is V2. There is an NPM package by the name Postman Collection Transformer to the rescue, which helps convert a collection to V1 and vice versa.

Pre-Requisites: Node.js with NPM installed on your computer.

Step 1: Install the NPM package postman-collection-transformer using the following command

npm install -g postman-collection-transformer

Step 2: Export the collection from Postman

Step 3: Run the following command to generate the V1 collection. For more information on the NPM package go through this link.

postman-collection-transformer convert --input ./Postman_collection-V2.json --input-version 2.0.0 --output ./Postman_collection-V1.json --output-version 1.0.0 --pretty --overwrite

Step 4: The V1 Postman collection is ready; you can now proceed with the creation of the custom connector in the flow portal.

As pointed out by Richard Wilson, there are third-party portals (registration required) available which help in converting the format of the Postman collection.

Hope you have found this informative & thanks for reading. If you are visiting my blog for the first time, please do look at my other blog posts.


How to use form-data and form-urlencoded content type in Power Automate or Logic Apps HTTP action

The content type multipart/form-data is used to send both text and binary data to the server, and x-www-form-urlencoded is generally used to send text data as name-value pairs separated by ampersands, in a query-string format. In this blog post, let us see how to use the content types

  • multipart/form-data
  • x-www-form-urlencoded

in a Power Automate or Logic Apps HTTP action to post data to an API which has implemented the content type. Find below the screenshot from Postman with a sample API.

multipart/form-data in HTTP Action:

From the above screenshot, the API is called using the content type multipart/form-data. The multipart refers to the data (in the above screenshot it is To, From & Body) which is divided into multiple parts and sent to the server. For each key-value pair, aka part, you will have to construct something like

{
      "headers": {
        "Content-Disposition": "form-data; name=\"KEY\""
      },
      "body": "whatever VALUE you would like to send"
}

A backslash is used to escape the quotes around the Content-Disposition header value; otherwise you will get an Invalid JSON error.

To call the API displayed in the above screenshot from the HTTP action, the body of the HTTP action should have the two attributes $content-type and $multipart as shown below

{
  "$content-type": "multipart/form-data",
  "$multipart": [
    {
      "headers": {
        "Content-Disposition": "form-data; name=\"To\""
      },
      "body": "whatsapp:+123456"
    },
    {
      "headers": {
        "Content-Disposition": "form-data; name=\"From\""
      },
      "body": "whatsapp:+178910"
    },
    {
      "headers": {
        "Content-Disposition": "form-data; name=\"Body\""
      },
      "body": "Your appointment is coming up on July 21 at 4PM"
    }
  ]
}

You can upload files using the form-data content type

{
      "headers": {
        "Content-Disposition": "form-data; name=\"file\"; filename=\"fileName.png\""
      },
      "body": "file-content"
}

The file content can be the output of the SharePoint or OneDrive connector.
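
For example, a hedged sketch of how that part could look in code view, assuming the file comes from a SharePoint Get file content action (referenced as Get_file_content in expressions; the action and file names are placeholders):

{
      "headers": {
        "Content-Disposition": "form-data; name=\"file\"; filename=\"fileName.png\""
      },
      "body": "@body('Get_file_content')"
}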

x-www-form-urlencoded in HTTP Action:

With the x-www-form-urlencoded content type, the form data is encoded and sent as a single block in the HTTP request body. To call the sample API from the screenshot posted at the top of this post in the HTTP action, the form values must be encoded and separated by ampersands. The expression encodeUriComponent can be used to encode the form values.

Headers:

Key: Content-Type

Value: application/x-www-form-urlencoded

Body (Separated by &):

Key=Value&Key=Value

Find below screenshot for your reference
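
As a hedged sketch based on the Twilio-style fields shown earlier (placeholder values), the Body of the HTTP action could be written with inline expressions like this:

To=@{encodeUriComponent('whatsapp:+123456')}&From=@{encodeUriComponent('whatsapp:+178910')}&Body=@{encodeUriComponent('Your appointment is coming up on July 21 at 4PM')}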

Hope you have found this informative & thanks for reading. If you are visiting my blog for the first time, please do look at my other blog posts.


Call a SharePoint REST API as an Application in Power Automate HTTP Connector

The SharePoint connector in Power Automate is very rich, with various actions that can make a developer's or maker's life simple when it comes to interacting with SharePoint data. Still, there might be some operations like

  • Breaking permission to a list item
  • Creating a site
  • Adding user to a SharePoint group etc

which are not possible through the standard SharePoint connector or the MS Graph API as of the time I am writing this article; the SharePoint REST API to the rescue. The SharePoint Online REST API enables developers to remotely interact with SharePoint data. There is an action Send an HTTP request to SharePoint which can come in handy in many scenarios; the point to note here is that the action uses the context of the user, aka the flow creator, while executing the API. In this blog post, let us see how to call a SharePoint REST API to create a modern SharePoint communication site as an application in a Power Automate cloud flow using the HTTP connector with the help of a self-signed certificate. Find below the list of steps to enable calling the SharePoint REST API using certificate credentials

  1. Creation of Self-Signed certificate
  2. Application Registration in Azure AD Portal
  3. Creation of Power Automate cloud flow with the HTTP Connector
    • Method 1: Without using Azure Key Vault
    • Method 2: Azure Key Vault to store Certificate

Pre-Requisites:

Creation of Self-Signed certificate:

The first step is to create a certificate. Refer to this blog post for instructions on creating a self-signed certificate using the PnP utility:

https://ashiqf.com/2021/07/05/call-microsoft-graph-api-using-a-certificate-in-a-power-automate-http-connector#self-signed-certificate

Application Registration in Azure AD Portal:

Register an application in Azure AD and obtain the client ID & tenant ID of the registered application. In this example I have added the Sites.Read.All application permission with admin consent to create the SharePoint communication site; this permission is more than enough to create the site as an application. Grant appropriate permissions based on the requirements; e.g., to break permission on list items, grant Sites.Manage.All. Find below a screenshot for your reference for granting permissions.

To add the above created self-signed certificate, click Certificates & secrets under the Manage blade. Click Upload certificate > Select the certificate file MSFlow.cer > Add

Creation of Power Automate cloud flow with the HTTP Connector:

Let us see below how to access the SharePoint REST API to create a SharePoint site with & without using the Azure Key Vault.

  1. Method 1: Without using Azure Key Vault
  2. Method 2: Azure Key Vault to store Certificate

Method 1: Without using Azure Key Vault

In the cloud flow, add a Compose action to store the PfxBase64 value copied during the creation of the certificate. Now add the HTTP action to create a Modern Communication site

Request Type: POST

URL: https://tenantname.sharepoint.com/_api/SPSiteManager/create

Headers:

Key: accept

Value: application/json

Body:

{
  "request": {
    "Title": "Communication Site from Cloud Flow",
    "Url": "https://tenantname.sharepoint.com/sites/commsitefromPA",
    "Lcid": 1033,
    "ShareByEmailEnabled": false,
    "Description": "Description",
    "WebTemplate": "SITEPAGEPUBLISHING#0",
    "SiteDesignId": "6142d2a0-63a5-4ba0-aede-d9fefca2c767",
    "Owner": "UPNoftheSiteAdministrator@domain.com",
    "WebTemplateExtensionId": "00000000-0000-0000-0000-000000000000"
  }
}

Change the SiteDesignId for the different site templates: Topic, Showcase or Blank.

Authentication: Active Directory OAuth

  • Tenant: TenantId
  • Audience: https://tenantname.sharepoint.com
  • Client ID: Azure AD Client Id
  • Pfx: Output of the compose action
  • Password: Certificate password given during the creation

Find below screenshot for your reference

Run the flow; it should be able to create the site. Find below a screenshot of the flow run.
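
For reference, the SPSiteManager/create endpoint responds with the provisioning status; a trimmed, illustrative sketch of the kind of response to expect (the SiteId value here is made up, and a SiteStatus of 2 indicates the site was created successfully):

{
  "SiteId": "d1b79b5a-3e4f-4e89-9c4b-7f1f2e3a4b5c",
  "SiteStatus": 2,
  "SiteUrl": "https://tenantname.sharepoint.com/sites/commsitefromPA"
}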

Method 2: Azure Key Vault to store Certificate

Azure Key Vault is a cloud service for storing and accessing secrets, enabling your applications to access them in a secure manner. Follow the blog article I have written on calling the Microsoft Graph API with a certificate, using Azure Key Vault to store the certificate:

https://ashiqf.com/2021/07/05/call-microsoft-graph-api-using-a-certificate-in-a-power-automate-http-connector/#azure-key-vault

Summary:

A custom connector can be used to call a SharePoint REST API in the context of the user. If you are visiting my blog for the first time, please do look at my other blog posts.
