By Mitesh Patekar
Not being a big data or business intelligence (BI) specialist myself, I've found it fascinating to familiarize myself with the two areas. BI can be linked to almost every domain, whether it's business development or making strategic decisions about application monitoring.
As an application developer, it's good practice to enable trace logs inside your code. Ever wondered where these logs are stored, though? It's something of a black box, and many developers aren't sure how they can monitor an application's health or check its functionality, not to mention spot runtime logic errors. Well, with Microsoft Azure, you can store these logs in persistent storage and analyze them with Power BI.
Power BI is one of the trending BI tools generating a lot of industry buzz. It's Microsoft's tool for visualizing big data through modern, real-time dashboards and data visualizations. It has a super cool freeform, drag-and-drop canvas, a broad range of modern data visualizations and a user-friendly report authoring experience. It automatically retains all your reports and dashboards, synchronized with your data. It supports scheduled refreshes as well as the ability to view data in real time, and it lets you access your data from anywhere at any time with live, interactive mobile access to your business information.
Let’s walk through using the diagnostic logs in the Azure App Service to leverage data logs stored in Azure Blob Storage and then visualize them using Power BI.
1. How to enable diagnostic logs in Azure Blob Storage for web apps
Log into portal.azure.com. Click “Diagnostic Logs” in the App Service for which you want to enable trace logs. This opens a “Diagnostic Logs” blade on the right; enable “Application Logging (Blob)” to send all log information to Azure Blob Storage. Next, select the level you want to track (Verbose, Error, Warning, Information):
Then, select the “Storage Account” settings and add a new storage account. This will create a new storage account in the same resource group as your App Service:
Next, select the storage account you just created and add a new container. Select the access type depending on whether you’re keeping the data private or public:
The logs are stored in the storage account as CSV files, spread across a number of files and folders, which makes them difficult to view and monitor directly. However, Power BI has amazing query tools designed to analyze and read this raw data.
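For a sense of what that raw data looks like, here is a sketch that parses one such CSV row with Python's standard `csv` module. The column names and sample values below are illustrative assumptions about the App Service blob log format, not an exact specification:

```python
import csv
import io

# Illustrative sample of the CSV rows App Service writes to blob storage.
# Treat the column set and values as assumptions for demonstration.
sample = (
    "date,level,applicationName,instanceId,eventTickCount,"
    "eventId,pid,tid,message,activityId\n"
    "2017-05-01T10:15:30,Error,mywebapp,abc123,636290000000000000,"
    "0,4284,12,Payment gateway timed out,\n"
)

reader = csv.DictReader(io.StringIO(sample))
rows = list(reader)
for row in rows:
    print(row["date"], row["level"], row["message"])
```

Power BI's query editor does this parsing for you across all the files at once, which is exactly what makes it a better fit than inspecting the blobs by hand.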
So, let’s go through the steps to build and create magic with this data.
2. Now you have the data, but how do you view and monitor it?
You will need to download and install the latest version of Power BI Desktop to use this feature.
Launch Power BI Desktop and click the “Get Data” option:
Choose the “Microsoft Azure Blob Storage” option and click “Connect”:
Enter the name and key of the Azure Storage account that is configured to receive the diagnostic logs:
Check the box for the container in which the logs are stored (you can see the list of CSV files on the right). Click “Edit” to open the query editor window:
Click the expand icon on the “Content” column (highlighted in yellow) to load all the data from the CSV files:
After loading all the content, you can perform operations such as extracting the year, month and day from the date column, renaming columns and so on. Click the “Close & Apply” option to close the query editor window and load the data on the report page:
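The year/month/day extraction that the query editor performs can be sketched in plain Python, to show what the derived columns contain. The timestamp string here is a hypothetical value from a log row's date column:

```python
from datetime import datetime

# Hypothetical timestamp taken from a log row's "date" column.
stamp = "2017-05-01T10:15:30"
parsed = datetime.strptime(stamp, "%Y-%m-%dT%H:%M:%S")

# The derived columns you would add in the query editor: Year, Month, Day.
year, month, day = parsed.year, parsed.month, parsed.day
print(year, month, day)  # 2017 5 1
```

These derived columns are what make the date filters in the finished report possible.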
Now, you can drag and drop fields from the table onto the report canvas. There are different visualizations to choose from; here, I’m choosing a filter visualization for “Year”:
Similarly, you can create a table list showing different columns from the table:
For a guide to how you can create visualizations, refer to this page.
After exploring different fields and visualizations, you can develop a report dashboard similar to the one below:
With this report, you can easily view all logs by filtering the day, month and year. You can also check for a particular error message by using the “Page Level” or “Report Level” filters on the right.
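Conceptually, those page-level and report-level filters work like the following Python sketch over the parsed rows; the field names and values are hypothetical:

```python
# Hypothetical parsed log rows, mirroring the derived report columns.
logs = [
    {"year": 2017, "month": 5, "level": "Error", "message": "timeout"},
    {"year": 2017, "month": 5, "level": "Information", "message": "ok"},
    {"year": 2016, "month": 1, "level": "Error", "message": "old failure"},
]

# The equivalent of filtering the report by year and by error level.
errors_2017 = [r for r in logs if r["year"] == 2017 and r["level"] == "Error"]
for r in errors_2017:
    print(r["message"])
```

In Power BI you get the same result interactively, without writing any code.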
3. Publishing to the Power BI Web portal
Use the Home -> “Publish” option to publish this report to the Power BI web app. You’ll be asked for your Power BI login credentials; then select the workspace where you want to publish:
After publishing, you can log into the Power BI portal.
You can check the newly created report and dataset marked by *:
On the Power BI web app, you can schedule refreshes up to eight times a day. To schedule a refresh, right-click the dataset’s name under “Datasets”:
(Note: You might need to update the Data Source credentials of the Azure Blob Storage account.)
Voilà! The dataset will now automatically load new data at the refresh times you scheduled. Not as hard as you thought, right?
While a refresh schedule runs at most eight times a day, you can also perform real-time analysis by feeding data directly into Power BI. To learn even more, check out Microsoft’s Stream Analytics and Power BI dashboard page.
Lastly, follow 10th Magnitude on Twitter to stay on top of all things cloud and big data.
As always, keep calm…and cloud on.