Amey Holden

Using Power Automate with Azure Queues & Cognitive Services for AI-fuelled social media analytics

In a previous article [1] I shared an immersive, AI-enriched social media monitoring and analysis model-driven app to monitor any Twitter hashtags or mentions of your choice! This article takes a technical deep dive into how we leveraged Power Automate to integrate Twitter with a range of Azure services.

The ins and outs behind the scenes to get from Tweet to Azure

This post focuses on the data processing and enrichment; a future post will look into how we visualised and surfaced the data in the model-driven app, so watch out for that coming soon.

Power Automate

Capture the tweet data, establish unique identifiers in Dataflex Pro for the model-driven app, calculate location and pass a message to an Azure Queue

Azure

We leverage Azure Functions, Queues, Cognitive Services & Cosmos DB to do the following:

  1. Consume messages from a queue with Azure Functions & integrate with Cognitive Services

  2. Write data to Cosmos DB

  3. Write messages to a queue which can be consumed by Power Automate

  4. Visualise data in a web app

Part 1: capture Tweet data, calculate location & send to Azure

Capture Tweets with #PowerPlatform and initialise some variables to use later on; these are also handy for debugging or demonstrating flows.

Next, we start to establish unique record identifiers for the Tweet and the Tweeter, to use later when receiving the message back from Azure. This starts with the Tweeter, a ‘Contact’ in Dataflex Pro: first we check if the Tweeter is already known to us; if so, we capture the ID for that record, and if not, we create one.

When using the ‘List records’ action, even when you limit the top count to 1, referencing the outputs will automatically create a loop for each item returned. This gets messy, and we only want the first match, so to prevent the loop we use the formula:
first(outputs('Search_for_contact_with_twitter_handle')?['body/value'])?['contactid']

Then we want to check the output from the search to see if the contact exists. We do this by checking whether ‘List records’ returned any results, using the formula:
empty(body('Search_for_contact_with_twitter_handle')?['value'])

Based on this condition, we then either get the existing record or create a new one.

Some of the data enhancements can be made within Power Automate; Azure does not have to do all the heavy lifting. Where the data is available, we can use the Bing Maps connector to calculate the longitude and latitude of the Tweeter’s profile location. We search on city by splitting the location string to take only the words before the first comma (,) using the following formula:
first(split(triggerOutputs()?['body/UserDetails/Location'], ','))

Then we convert the longitude and latitude outputs from Bing from strings to floating point numbers, which can be used later by our PCF map control, using the following formulas:
float(variables('Longitude'))
float(variables('Latitude'))

I have previously blogged about this in more detail [2].

Now we are ready to send a message to an Azure Queue that provides all the information needed to enrich the data and send it back in a way which can be pushed back into Dataflex Pro.

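As a rough sketch of the idea, the flow puts a JSON message on the queue (the Azure Queues connector exposes this as the ‘Put a message on a queue’ action). The property names and referenced variables below are illustrative assumptions, not the flow’s exact schema:

```json
{
  "tweetId": "@{variables('TweetId')}",
  "contactId": "@{variables('ContactId')}",
  "tweetText": "@{variables('TweetText')}",
  "latitude": "@{variables('Latitude')}",
  "longitude": "@{variables('Longitude')}"
}
```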

Finally, you may also see that we do another parallel step here, checking to see if it’s a tweet or a retweet - this relates to a gotcha with the multiple ‘Tweet Text’ holding variables. For a more detailed breakdown on this one, check out my previous article [2].

Part 2: enrich and visualise data in Azure

This is where we hand over to the magic of Azure to enrich the data and create a web app visualisation. The full code package can be found on GitHub [3], but three important snippets to highlight here are:

Capture the input from the Azure Queue the message was sent to, cleaning the data a bit and turning it into a JavaScript object.

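A minimal sketch of what this step can look like in a queue-triggered JavaScript Azure Function (not the repo’s exact code; the property names follow the illustrative payload above):

```javascript
// Queue-triggered Azure Function: receives the message Power Automate
// put on the queue (the trigger binding is declared in function.json).
module.exports = async function (context, queueItem) {
    // The Functions runtime deserialises JSON queue messages automatically;
    // fall back to JSON.parse in case the message arrives as a raw string.
    const tweet = typeof queueItem === "string" ? JSON.parse(queueItem) : queueItem;

    context.log(`Processing tweet ${tweet.tweetId}`);
    // ...enrichment happens next...
};
```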

Call the helper functions that will talk to Cognitive Services and do our machine learning 

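For example, with the @azure/ai-text-analytics package the helpers might look like the sketch below; the repo structures its own helpers differently, and the environment variable names here are assumptions:

```javascript
const { TextAnalyticsClient, AzureKeyCredential } = require("@azure/ai-text-analytics");

// Endpoint and key variable names are illustrative assumptions.
const client = new TextAnalyticsClient(
    process.env.COGNITIVE_SERVICES_ENDPOINT,
    new AzureKeyCredential(process.env.COGNITIVE_SERVICES_KEY)
);

async function enrichTweet(text) {
    // Sentiment analysis and key phrase extraction over the tweet text.
    const [sentiment] = await client.analyzeSentiment([text]);
    const [keyPhrases] = await client.extractKeyPhrases([text]);

    return {
        sentiment: sentiment.sentiment,            // "positive" | "neutral" | "negative" | "mixed"
        confidenceScores: sentiment.confidenceScores,
        keyPhrases: keyPhrases.keyPhrases          // array of extracted phrases
    };
}
```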

Pop a new message on an Azure Queue for Flow to pick up

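One idiomatic way to do this is a queue output binding, so the function simply assigns the outgoing message. In this sketch, "outputQueueItem" is an assumed binding name that must match function.json, and `enriched` is the result of the helper above:

```javascript
// Put the enriched result on a second queue for Power Automate to consume.
context.bindings.outputQueueItem = JSON.stringify({
    tweetId: tweet.tweetId,
    sentiment: enriched.sentiment,
    keyPhrases: enriched.keyPhrases
});
```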

Finally, we store the enriched data in Cosmos DB and surface it in an Azure Web App as a Key Phrase word cloud (a sketch of the Cosmos DB write is below). We will come back to this beauty in the next article to bring it into our Model Driven App.
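The Cosmos DB write can likewise be a simple output binding; "outputDocument" is again an illustrative binding name, not necessarily the repo’s:

```javascript
// Persist the enriched tweet as a document in Cosmos DB.
context.bindings.outputDocument = {
    id: tweet.tweetId,
    text: tweet.tweetText,
    sentiment: enriched.sentiment,
    keyPhrases: enriched.keyPhrases,
    location: { latitude: tweet.latitude, longitude: tweet.longitude }
};
```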

What’s next?

There is no point enriching the data in Azure if we’re not going to get something back out of it into our model-driven app. The next article will explore how we receive a message from an Azure Queue to update Dataflex Pro, then build a key interface to all the data with Model Driven Apps & Power Apps Component Framework to display this data to users in ways which create an enjoyable user experience.

1) Retrieve and store the enriched data from Azure into Dataflex Pro

2) Surface the data and visualisations in a model-driven app

  • Dashboards to visualise and drill down on the tweet data with simple, user-driven point & click filtering

  • Many-to-many relationship between Key Phrases & Tweets so drill-down can be seen from both the tweet and the particular key phrase

3) Extend the model-driven app user interface with PCF

[1] A GIF-heavy overview on social media monitoring & analytics with the Power Platform and Azure - https://www.ameyholden.com/articles/azure-javascript-pcf-social-media-monitoring-analysis

[2] Locate tweets with Power Automate & Tweet Text gotchas - https://www.ameyholden.com/articles/locate-tweets

[3] GitHub repository for the Azure magic behind the scenes from Part 2 - https://github.com/aaronpowell/event-twitter-insights