If you’ve been working in Azure over the past couple of years, you have probably seen the term “Log Analytics workspace” (or possibly just “Log Analytics”) thrown around when looking at different resources. Log Analytics has some Azure history; it used to be part of the Operations Management Suite (OMS) portal before that portal was retired and Log Analytics was rolled into Azure Monitor.
Essentially, Log Analytics is a centralized repository for collecting log data. As Azure has matured, Log Analytics has been leveraged more frequently for solutions as varied as:
- VM Insights (VM health and performance metrics)
- Security Center (security-related events and alerts)
- Application Insights (telemetry for Azure App Services)
- Update Management (automated update deployment to Windows/Linux servers)
- Change Tracking (track configuration and data changes on servers)
- Service Map (map out inbound/outbound connections for servers)
- Azure SQL Auditing (for access and activity on Azure SQL databases)
And many more.
Microsoft offers other Solutions in the Azure Marketplace, and custom Solutions can be found on the Azure Quickstart Templates page.
You can also build your own tables and graphs and compile them into Dashboards by writing your own log queries in Kusto Query Language (KQL). If you are new to KQL, I recommend reading up on the documentation to get a conceptual understanding of how it works.
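For instance, a minimal KQL query against the Event table (assuming your workspace is already collecting Windows event logs) might look like this:
Event
| where TimeGenerated > ago(24h)
| summarize EventCount = count() by Computer
| sort by EventCount desc
Each pipe (|) passes the results of one step to the next, which is the core pattern behind most KQL queries.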
Configuring Log Analytics to collect log data is pretty simple. If you have servers running in Azure already, you can deploy the Microsoft Monitoring Agent (MMA) from the Log Analytics workspace itself, and it will automatically install the agent as a VM extension. If you have servers running on-prem or in another cloud environment, you can still install the MMA manually or through some other automated deployment method.
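Once agents are reporting, a quick way to confirm which machines are actually sending data is to query the Heartbeat table (a minimal sketch; each connected agent writes a heartbeat record roughly once a minute):
Heartbeat
| where TimeGenerated > ago(1h)
| summarize LastHeartbeat = max(TimeGenerated) by Computer
| sort by LastHeartbeat desc
Any machine you expect to see that is missing from the results likely has an agent or connectivity problem.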
While Log Analytics can centralize and help visualize the log data you store in it, it does not come with any pre-built reports or Dashboards out of the box. So a Log Analytics workspace doesn’t do much until you install/configure Azure Solutions or build your own Dashboards using log queries (which is why it is highly recommended to get comfortable with KQL if you wish to leverage Log Analytics in your environment).
Also, it’s important to note that Log Analytics can get expensive; at a Pay-As-You-Go rate of $2.30 per GB of ingested data, you can easily rack up high Azure costs if you don’t fully understand what your Azure resources are sending to Log Analytics (looking at you, SQL Server Auditing and VM Insights). Because of this, it is very important to monitor your Log Analytics deployments closely (especially the first few days after enabling a Solution). You don’t want to be surprised by your Azure bill suddenly skyrocketing because a Solution unexpectedly starts sending 300GB of log data each month; at that rate, that is roughly $690 per month in ingestion charges alone.
Luckily, monitoring log usage is easy. The Usage and Estimated Costs page provides a high-level overview of how much log data is in your workspace and how much it is costing you per month. However, if you want to dig into the details and see exactly what is sending log data, then you will need to head to the Logs blade and run some queries.
For more detail on log usage, see the Kusto query examples below. These can help you analyze the log data volumes in your environment.
1. View billable usage over the past 31 days, summarized by type of log data:
Usage
| where TimeGenerated > startofday(ago(31d))
| where IsBillable == true
| summarize TotalVolumeGB = sum(Quantity) / 1000 by DataType
| sort by TotalVolumeGB desc
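A quick note on units: Quantity in the Usage table is reported in megabytes (hence the division by 1,000 to express the totals in GB), while _BilledSize in the per-table queries below is reported in bytes (hence the division by 1073741824, i.e. 1024^3).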
2. Take a DataType from the previous query’s results and summarize its logs by the column(s) you specify (replace DataType and the column names with values from your own environment):
DataType
| where TimeGenerated > startofday(ago(31d)) and TimeGenerated < startofday(now())
| where _IsBillable == true
| summarize count(), SizeGB=sum(_BilledSize)/1073741824 by Column1, Column2, ...
| sort by SizeGB desc
So for example, if I wanted to see all log data for the Event data type and break it down by Computer (the source machine) and EventID (to see what is generating event data), then I would use this query:
Event
| where TimeGenerated > startofday(ago(31d)) and TimeGenerated < startofday(now())
| where _IsBillable == true
| summarize count(), SizeGB=sum(_BilledSize)/1073741824 by Computer, EventID
| sort by SizeGB desc
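If you would rather see a chart than a table of results, you can append KQL’s render operator to a query like this (a rough sketch; which chart type reads best will depend on your data):
Event
| where TimeGenerated > startofday(ago(31d)) and TimeGenerated < startofday(now())
| where _IsBillable == true
| summarize SizeGB=sum(_BilledSize)/1073741824 by Computer
| sort by SizeGB desc
| render barchart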
Or if I wanted to look at the AzureDiagnostics data type and break it down by ResourceId (the Azure resource sending log data) and Category (the type of Azure log data being sent), then I would use the below query:
AzureDiagnostics
| where TimeGenerated > startofday(ago(31d)) and TimeGenerated < startofday(now())
| where _IsBillable == true
| summarize count(), SizeGB=sum(_BilledSize)/1073741824 by ResourceId, Category
| sort by SizeGB desc
If you’re not sure which columns are relevant to you, you can run the data type by itself as a Log Analytics query, with no other filters or lines. This returns all log data for that data type, so you can look through the results yourself and see which columns you can filter on. For example, running a query with just the keyword Event will return all Event logs in the workspace, and AzureDiagnostics will return any diagnostic log data collected from Azure resources (as you can see above). Queries like this will give you a good idea of which columns matter to the question you are trying to answer and will help you write new queries that are even more precise.
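If you don’t want to pull back every row just to inspect the columns, two lighter-weight options are KQL’s take operator (which returns a small sample of rows) and getschema (which returns only the column names and types):
Event
| take 10

Event
| getschema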
More methods for monitoring and controlling costs in Azure Log Analytics can be found in Microsoft’s documentation on managing usage and costs for Log Analytics workspaces.
Overall, if you keep a tight leash on log usage, Log Analytics can be a helpful tool that provides insight into the resources and machines reporting to it. While it does not provide any pre-built dashboards out of the box, it does give you the freedom and tools to build your own and tailor them to your environment. In addition, some Azure Solutions in the Portal do provide a pre-configured dashboard and can help you visualize specific areas of your infrastructure. We at Anexinet have deep experience with Azure and have successfully helped countless companies like yours on their journey to and through the cloud. We would love to help you get started; please feel free to reach out and contact us with any questions.

Tim O'Sullivan
Consultant
Tim O’Sullivan is a consultant with Anexinet’s Managed Services. As a member of the Infrastructure group, Tim specializes in on-premises and cloud infrastructure management in Azure/O365.