Microsoft Flow Execute Sql Query


In September 2018, Microsoft announced the integration of Power Query Online with Microsoft Flow. The goal was to offer an alternative to OData and T-SQL for getting data from the SQL Server connector, and Microsoft hinted that other data sources would follow, though without details on when or which ones.

One common pattern with the "Execute a SQL query" action is to trigger the flow from an action in PowerApps, capture the query statement that a user typed into a text box in the PowerApp, execute the SQL query, and, on detecting a 200 (success) response, send the data back into PowerApps. Because multiple rows can be retrieved from SQL Server, the Flow designer will automatically place an "Append to string variable" action inside a loop that iterates through each SQL row.

To pass a value into the query, declare the formal parameter as a variable within the SQL statement. For example:

DECLARE @userid VARCHAR(10);
SELECT * FROM a WHERE userid = @userid;

This way, parameters can be passed into the query using the corresponding variable name.

We have recently shipped, in public preview, Power Query Online integration for Microsoft Flow. Using this capability allows flow makers to transform and shape their data coming from SQL Server using Power Query Online.

Why did we build this capability?

We built this capability for many reasons including:

  • An alternative to OData which can be cumbersome to use for many of our users.
  • A simpler approach to joining tables than T-SQL or writing Stored Procedures.
  • Future opportunities to include additional data sources in mashups that allow makers to build data transformations across multiple data sources.

Licensing

The Power Query functionality will be included in our Premium Connector offering, which requires a Flow Plan 1 or higher. Currently this is not enforced as part of the preview, but will be in the future.

Scenario

Let’s take a closer look at how we can use this new capability to enhance our data extraction capabilities.

In this scenario, we have an Apartment Rental company called Contoso Apartments. Naturally, customers will inform the main office when there are maintenance issues across their many properties. Much like any Work Order system, data is organized in multiple tables including Customers, Apartments and Work Orders. The Customer Service representatives are very interested in keeping their customers happy and want to proactively ensure that customers are content. If they are not, they want to quickly course-correct to avoid them leaving to live somewhere else.

We can monitor customer sentiment by ‘mashing’ up our Customer, Apartment and Work Order data from Power Query. When we do this, we have aggregated data that can be passed to a sentiment analysis tool. When we detect that a customer is unhappy, we can then publish a message to Microsoft Teams where a customer service rep can follow-up with the customer.

To build this solution we will perform the following:

  • We will add a Recurrence trigger that will run every day
  • Next, we will add the Transform data using Power Query action, which is provided as part of the SQL Server connector.
  • To connect to SQL Server, we need to create a Connection that includes a SQL Server name, Database name, Username and Password.
  • With our connection established, we can now create a Power Query query.
  • Next, we need to select the tables that we would like to include in our mash-up. In this case we are going to select Customers, Apartments and Work Orders.
  • We want to join these different tables so that we have enriched Work Order data that includes Customer and Apartment related data for our Work Orders. To do this, we will click on Combine tables and then select Merge queries as new.
  • When we merge, we need to select the type of join that we want to use. In this case we will select a Left Join and declare our Work Order table as our core table that we want to enrich with Customer and Apartment data.

Note: In this scenario, I ran the Merge twice: once with Work Orders and Customers, and then once again with that (Work Orders + Customers) result and Apartments.

  • Once we have merged our tables, we can now trim our dataset by only including the columns that we need.
  • Before we configure the rest of our Flow, we do need to declare our new aggregated query as the query we will Enable Load for.

Note: At this time, only one query can be enabled to return results to Flow. However, as we have discovered, we can merge multiple queries into a single query for our use.
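The two-step merge described above can be sketched in plain Python, standing in for Power Query's Merge with a Left Join. The table contents, column names and join keys here are illustrative assumptions, not the actual Contoso schema.

```python
# Sample tables; in the scenario these come from SQL Server.
customers = [{"CustomerId": 1, "CustomerName": "Alice"}]
apartments = [{"ApartmentId": 10, "Unit": "4B"}]
work_orders = [
    {"WorkOrderId": 100, "CustomerId": 1, "ApartmentId": 10, "Comment": "Leaky faucet"},
    {"WorkOrderId": 101, "CustomerId": 2, "ApartmentId": 10, "Comment": "No heat"},
]

def left_join(left, right, left_key, right_key):
    """Left join: keep every row of `left`, attach matching `right` columns."""
    index = {row[right_key]: row for row in right}
    return [{**row, **index.get(row[left_key], {})} for row in left]

# Merge 1: Work Orders + Customers.  Merge 2: that result + Apartments.
merged = left_join(work_orders, customers, "CustomerId", "CustomerId")
merged = left_join(merged, apartments, "ApartmentId", "ApartmentId")
```

As in Power Query, Work Orders is the core table: every work order survives the join, and rows with no matching customer simply carry no customer columns.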

  • With our Power Query query configured, we can now use the result set and dynamic content, much like we can with other connectors. In our use case, we will loop through each record returned and send the customer's Work Order Comments through the Azure Cognitive Services sentiment analysis API.
  • Next, we will evaluate the sentiment returned to see if it is less than 0.4 (which is really bad). When this occurs, we will combine the related Apartment, Customer and Work Order information with this sentiment value and add it to an array. After we have iterated through all of the recent Work Orders, we will check the length of the array to see if we have records. If we do, we will convert the array to an HTML table, which we can then publish to a Microsoft Teams channel.
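The evaluation loop above can be sketched as follows. The sentiment scores would come back from the Cognitive Services API; here they are stubbed, and the field names are illustrative.

```python
# Records enriched by the Power Query merge, with stubbed sentiment scores.
records = [
    {"Customer": "Alice", "Apartment": "4B", "Comment": "Leaky faucet", "Sentiment": 0.8},
    {"Customer": "Bob", "Apartment": "2A", "Comment": "Still no heat!", "Sentiment": 0.1},
]

# Flag any work order whose sentiment is below the 0.4 threshold.
flagged = [r for r in records if r["Sentiment"] < 0.4]

def to_html_table(rows):
    """Render a list of dicts as a simple HTML table (empty string if no rows)."""
    if not rows:
        return ""  # nothing to publish to Teams
    headers = list(rows[0].keys())
    head = "".join(f"<th>{h}</th>" for h in headers)
    body = "".join(
        "<tr>" + "".join(f"<td>{r[h]}</td>" for h in headers) + "</tr>" for r in rows
    )
    return f"<table><tr>{head}</tr>{body}</table>"

html = to_html_table(flagged)
```

Only when `flagged` is non-empty does a table get produced, mirroring the length check before the Teams post.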

Testing

We can now go and run this flow from within the Microsoft Flow maker portal or by calling it from the Microsoft Teams Flow Bot. Once the flow runs, it will publish the results of our flow in Microsoft Teams. This allows the Customer Service channel to target customers who are unhappy without performing a lot of data exploration.

Features

  • Only SQL Server is supported as a data source. This is deliberate in our first release as we do not want to expose additional data sources that are not protected by Microsoft Flow Data Loss Prevention (DLP) policies. We do want to include additional data sources, but those will be future investments.
  • We do throttle Power Query Online usage based upon:
    • 2 Hours/Day
    • 10 Hours/Week

This is based upon the amount of time it takes for your Power Query queries to execute. If these values don’t work for you, we would love to hear what they need to be.


  • As described previously, we will only output 1 query. To avoid unexpected results, ensure that you Enable Load on the desired query.

What’s Next?

Working with the Power Query team to unlock this capability has been really exciting. Both teams see an opportunity to empower Power platform users to do more using these technologies. Since Power Query is a very rich and deep platform, we would love to hear more about Power Query + Flow use cases that you envision. This feedback will help us prioritize future investments. Please comment below.

OData (Open Data Protocol) is an OASIS standard that establishes best practices for designing RESTful APIs. One of the capabilities of OData is the ability to filter data using a standardized method across RESTful APIs, regardless of whether they are vendor provided or custom developed. Since Microsoft Flow's connectors are built upon RESTful APIs, many of our connectors support the ability to filter datasets server-side using OData. The benefits of using OData include reducing the amount of data you bring into your flow, which in turn reduces the need to loop through a record set to find values of interest.

In this blog post we are going to explore some popular OData filter expressions that you can use with some of our most popular connectors including SQL Server, Dynamics 365 and SharePoint Online.

Scenario #1: Get Rows from SQL Server and filter on Customer Name

We have the following Azure SQL database with a table that contains many work orders. From Microsoft Flow, we want to return only rows where the Customer Name is equal to 'Contoso'.

Inside of Microsoft Flow, we can add a SQL Server – Get Rows action. After providing a Table name, we also have the ability to provide a Filter Query. Inside this textbox we will provide the statement CustomerName eq 'Contoso'. The breakdown of this syntax: we provide the name of the field in the source system (i.e. SQL Server), followed by an operator. In this case we want 'equals', which is represented as eq in OData; don't use the = symbol, otherwise you will get a runtime error. Lastly, we provide the value that we want to filter on, in this case Contoso. Since it is a string, we need to wrap it in single quotes (' ').
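As a sketch of the quoting rule, a small hypothetical helper can build the filter string. The helper itself is an assumption for illustration; the quoting convention (string literals in single quotes, embedded quotes doubled) follows the OData standard.

```python
def odata_eq(field, value):
    """Build an OData equality filter such as: CustomerName eq 'Contoso'."""
    if isinstance(value, str):
        escaped = value.replace("'", "''")  # OData escapes ' by doubling it
        return f"{field} eq '{escaped}'"
    return f"{field} eq {value}"  # numbers are emitted without quotes

filter_query = odata_eq("CustomerName", "Contoso")
```

Pasting the resulting string into the Filter Query textbox gives exactly the statement used above.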

For the purposes of this blog post, we will wrap the results in HTML and send them via Office 365 Outlook connector so we can verify our results.

After the flow executes, we will see our results rendered successfully, and only records with a Customer Name of Contoso are displayed.

Scenario #2: Get Rows from SQL Server and filter on date


In this scenario we want to filter out older records and only retrieve records that have a Work Order Create Date that is less than 30 days old. To accomplish this we will also use a flow expression that will calculate the date, 30 days ago. We will then look for any records that have a Work Order Create Date that is greater than this date. The complete expression is: WorkOrderCreatedTime gt addDays(utcnow('yyyy-MM-ddTHH:mm:ssZ'),-30). In this scenario, WorkOrderCreatedTime is our source field, gt ​represents our 'greater than' operator and addDays(utcnow('yyyy-MM-ddTHH:mm:ssZ'),-30) will calculate a date of 30 days prior.
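The same 30-day cutoff can be sketched in Python: addDays(utcNow(), -30) becomes datetime arithmetic, and the gt operator becomes a plain > comparison. The record structure is illustrative.

```python
from datetime import datetime, timedelta, timezone

# Equivalent of addDays(utcNow('yyyy-MM-ddTHH:mm:ssZ'), -30).
cutoff = datetime.now(timezone.utc) - timedelta(days=30)

work_orders = [
    {"Id": 1, "WorkOrderCreatedTime": datetime.now(timezone.utc) - timedelta(days=5)},
    {"Id": 2, "WorkOrderCreatedTime": datetime.now(timezone.utc) - timedelta(days=90)},
]

# WorkOrderCreatedTime gt <cutoff>: keep only records newer than 30 days.
recent = [w for w in work_orders if w["WorkOrderCreatedTime"] > cutoff]
```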

The results only include records that are less than 30 days old.


Scenario #3: List Records from Dynamics 365 using an AND clause

We will now move on to the Dynamics 365 connector, where we can also use OData to filter records. In this case we want to retrieve only records where the Account Name is Contoso Hospital AND the City is Phoenix.

To accomplish this, we will use an and clause to join two conditions: the account's name equal to 'Contoso Hospital', and its address1_city equal to 'Phoenix'. Our complete statement is name eq 'Contoso Hospital' and address1_city eq 'Phoenix'.
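Combining conditions with and can be sketched with a small hypothetical helper (an illustration, not part of the connector):

```python
def odata_and(*clauses):
    """Join any number of OData conditions with the `and` operator."""
    return " and ".join(clauses)

filter_query = odata_and("name eq 'Contoso Hospital'", "address1_city eq 'Phoenix'")
```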

When we execute our flow, we will see results only related to the Contoso Hospital in Phoenix.

Scenario #4: List Records from SharePoint Online that Starts With

In our final scenario, we are going to filter records from a custom SharePoint list. In this particular example, we have 4 records within a SharePoint list and we want to filter on all sites that start with the word 'Contoso'.

From a flow perspective, we will include the following OData query within our SharePoint action: startswith(Title,'Contoso'), where Title is the name of the column that we want to filter on and Contoso is the value we want the column to start with.
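The startswith(Title,'Contoso') filter behaves like Python's str.startswith, as this small sketch shows (the site names are made up):

```python
sites = ["Contoso East", "Contoso West", "Fabrikam HQ", "Northwind Annex"]

# Equivalent of the OData filter startswith(Title,'Contoso').
matches = [s for s in sites if s.startswith("Contoso")]
```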

When our flow runs, we will discover that only the Site Names that begin with the word Contoso are included in our results.

Conclusion

In this blog post we covered 4 different OData queries across 3 different connectors: SQL Server, Dynamics 365 and SharePoint Online. While the syntax is a little different from what you may be used to in T-SQL, it unlocks new ways to filter your data in Microsoft Flow. Using OData to filter at the data source reduces execution times, since it avoids looping through data sets to find specific records. Sending smaller messages around is not only more efficient, it also allows your flows to run faster.

For more examples of OData filter expressions, please check out the following Microsoft page.