While I was working on an updated Flow, I ran into a challenge: data which isn’t regularly written to Log Analytics may not be there when I need it. For my example, I have a Log Analytics custom log which contains the current state of the windows in my house (open or closed). The workspace I’m using only stores data for 7 days, so if the state hasn’t changed in 7 days it may not be available when I need the record. To handle this, I put together a simple Flow which reads the most recent Log Analytics records and re-writes them (effectively making sure that the record will be there when I need it to be).
For background, I’m working on a Microsoft Flow which provides “intelligent” notification on whether to open or close the windows at my house. This is both for energy efficiency and because it’s darn nice to get some fresh air once in a while! I have used this as my use case in these blog posts:
- QuickTricks: How to join unrelated data types in Log Analytics
- Tips when debugging sub-queries in the New Query Language For Log Analytics
- How to send any data you want to Log Analytics without code!
- Scheduling Log Analytics queries to run in Microsoft Flow
- Creating complex queries in the new query language for Log Analytics
The Flow which does this is pretty straightforward. Below is what it does:
- Uses a Recurrence trigger of 1 day to make sure that the data will be available when I need it to be.
- Uses the Azure Log Analytics “Run query and list results” action to gather the most recent data record.
- Uses the Azure Log Analytics Data Collector “Send Data” action to re-write the record which was gathered.
Below are the screenshots of this configuration:
For something like this, the recurrence could be once a week, but since the amount of data is small I went with once a day so that the record is virtually certain to be re-written within the seven-day retention period.
Azure Log Analytics (run query and list results):
You need to provide your subscription, resource group, and workspace name, as well as the query which will gather your data. Note the use of “top 1” to gather only the most recent record.
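For reference, a query along these lines returns the single newest record. This is only a sketch: the custom log name `WindowState_CL` is a stand-in (yours will be whatever your custom log is called, with the `_CL` suffix Log Analytics appends), while `OpenWindow_s` and `City_s` are the fields from my example:

```
WindowState_CL
| top 1 by TimeGenerated desc
| project TimeGenerated, OpenWindow_s, City_s
```

`top 1 by TimeGenerated desc` sorts by the record timestamp and keeps only the newest row, which is exactly the record we want to re-send.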
Send Data (Azure Log Analytics Data Collector):
Next, we need to re-send the data back to the Log Analytics workspace by building a JSON payload. The sample below takes the two fields I gathered (OpenWindow_s and City_s) and re-writes them with the field name in front (OpenWindow and City), plus the values of the records which were retrieved from Log Analytics. We also need to provide the name of the custom log that this will be re-written to.
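As a sketch, the JSON body handed to the Send Data action could look like the payload below. The values shown are illustrative placeholders; in the Flow designer they would come from the dynamic content of the query result. Note that the field names are sent without the `_s` suffix, since that suffix is the string-type marker Log Analytics appends on ingestion (so sending `OpenWindow` recreates `OpenWindow_s` in the workspace):

```json
{
  "OpenWindow": "Closed",
  "City": "Seattle"
}
```

For the same reason, the Custom Log Name in the action is given without the `_CL` suffix.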
Tip: I use an online validator to check my JSON structure.
If you need a way to re-write data into Log Analytics to avoid the data being unavailable after a certain timeframe, try using a Flow to re-write the data – it’s simple and quite effective for this requirement. This same approach could also be used to read data from one Log Analytics workspace and write it to another workspace. An example of where this could be useful is when you have multiple customer workspaces with short data retention and a separate workspace which is used for longer-term data retention.