Azure Log Analytics is a platform in which you do just that: aggregate VM and Azure resource log files into a single data lake (called a Log Analytics workspace) and then run queries against the data, using Kusto Query Language (KQL), a Microsoft-created data access language. (Kusto is pronounced KOO-stoh.)
You’ll find that Log Analytics normalizes all these different log streams into a tabular structure. You’ll also discover that KQL is similar to Structured Query Language (SQL), the standard data access language for relational databases.
Creating a Log Analytics workspace
The first order of business is to deploy a Log Analytics workspace. Then you can on-board as few or as many Azure resources to the workspace as you need. You can also deploy more than one Log Analytics workspace to keep your log data separate. To create a new Azure Log Analytics workspace, follow these steps:
- In the Azure portal, browse to the Log Analytics Workspaces blade, and click Add.
The Log Analytics workspace blade appears.
- Complete the Log Analytics workspace blade.
You'll need to provide the following details:
- Workspace name
- Subscription name
- Resource group name
- Location
- Pricing tier
- Click OK to create the workspace and submit your deployment.
Keep in mind two limits of the free data allowance:
- Data ingestion limit of 5 GB per month
- 30-day data retention limit
Connecting data sources to the Azure Log Analytics workspace
With your workspace online, you’re ready to on-board Azure resources into it. To connect Azure resources to the workspace, go back to Monitor > Diagnostic settings, enable diagnostics, and point the log streams to your workspace. You can connect VMs to the workspace directly from the workspace’s Settings menu. Follow these steps:
- In your Log Analytics workspace settings menu, click Virtual Machines.
You see a list of all VMs in the workspace’s region. You can see which VMs are connected to the workspace and which are not.
- If necessary, use the filter controls until you see the VM you want to connect.
You can link a VM to only one workspace at a time. Below, for example, the vm1 virtual machine is linked to another workspace.
Connecting VMs to an Azure Log Analytics workspace.
- Select the desired VM, and click Connect.
Behind the scenes, Azure deploys the Log Analytics agent (formerly called Microsoft Monitoring Agent) to the VM.
- Verify that the VM is connected to the workspace.
You can see this information in your workspace settings. Or you can revisit your VM’s Extensions blade and verify that the MicrosoftMonitoringAgent extension is installed.
You should know that Log Analytics can on-board on-premises VMs, particularly those managed by System Center Operations Manager, just as it can native cloud Linux and Windows Server VMs.
You can disconnect a VM from its current workspace and connect it to another one. This operation is trivial, taking only two minutes or so to complete. To do this, simply select the VM from within the workspace and click Disconnect from the toolbar.
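After a VM’s agent connects, it writes a record to the workspace’s Heartbeat table roughly every minute. As a quick sketch (Heartbeat, TimeGenerated, and Computer are standard Log Analytics schema names), a query along these lines lists each connected computer and when it last checked in:

```kusto
Heartbeat
| summarize LastHeartbeat = max(TimeGenerated) by Computer
| sort by LastHeartbeat desc
```

A computer whose LastHeartbeat value is more than a few minutes old likely has a disconnected or unhealthy agent.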
Writing KQL queries
You need to know a bit about how to access your Log Analytics workspace data with KQL. KQL is fast and easy to learn, and it should seem familiar to you if you’ve used Splunk Search Processing Language, SQL, PowerShell, or the Bash shell.
Touring the Log Search interface
You can get to the Log Search interface by opening Monitor and selecting the Logs blade. Another way to get there is to go to your Log Analytics workspace and click the Logs setting. A third method is to use the Log Analytics Query Playground, where you can work with an enormous data set, getting to know Log Analytics before generating a meaningful data set of your own.
Follow these steps to run some sample KQL queries:
- Go to the Log Analytics portal demo.
The site requires you to authenticate, but don’t worry: you’re using Microsoft’s subscription, not your own.
- Expand some of the tables in the Schema list.
There’s a lot in this list. Log Analytics normalizes all incoming data streams and projects them into a table-based structure.
Expand the LogManagement category; then expand the Alert table, where you can use KQL to query Azure Monitor alerts. The t entries (shown under the expanded SecurityEvent item below) are properties that behave like columns in a relational database table.
Azure Log Analytics Log Search interface.
- On the Log Search toolbar, click Query Explorer, expand the Favorites list, and run the query Security Events Count by Computer During the Last 12 Hours.
This environment is a sandbox. Microsoft has not only on-boarded untold resources into this workspace but also written sample queries to let you kick the tires.
- In the results list, click Chart to switch from Table to Chart view.
You can visualize your query results automatically with a single button click. Not every results set lends itself to graphical representation, but the capability is tremendous.
- Click Export, and save your query results (displayed columns only) to a CSV file.
Note the link to Power BI, Microsoft’s cloud-based business intelligence/dashboard generation tool.
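The favorite query you ran above boils down to a short KQL statement. Here’s a sketch of its logic (the exact saved query in the demo workspace may differ):

```kusto
SecurityEvent
| where TimeGenerated > ago(12h)
| summarize EventCount = count() by Computer
| sort by EventCount desc
```

The ago(12h) expression restricts results to the past 12 hours, and summarize ... by Computer groups and counts records per machine, much like SQL’s GROUP BY.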
Writing basic KQL queries
For fun, let’s try an obstacle course of common KQL queries. Click the plus sign in the Log Search query interface to open a new tab; it’s a multitab interface like those in Visual Studio and Visual Studio Code. To get a feel for a table, you can instruct Azure to display any number of rows in no particular order. To display 10 records from the SecurityEvent table, for example, use the following command:
SecurityEvent | take 10
Did you notice that the query editor attempted to autocomplete your query as you typed? Take advantage of that convenience by pressing Tab when you see the appropriate autocomplete choice appear.
Use the search keyword to perform a free-text query. The following query looks in the SecurityEvent table for any records that include the string "Cryptographic":
search in (SecurityEvent) "Cryptographic" | take 20
When you press Enter, you’ll doubtless notice the pipe character (|). This character functions the same way here as it does in PowerShell or the Bash shell: output from one query segment is passed to the next segment via the pipe, a powerful construct for sure. You can ramp up the complexity by finishing with filtering and sorting. The following code both filters on a condition and sorts the results in descending order based on time:
SecurityEvent | where Level == 8 and EventID == 4672 | sort by TimeGenerated desc
If you’re thinking, “Wow, these KQL queries act an awful lot like SQL!” you’re right on the money. Welcome to Log Analytics!
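Putting these pieces together, you can also trim the columns with project and feed a summarized result straight into a chart with the render operator. Here’s a sketch (the EventID filter reuses the special-logon event from the preceding query):

```kusto
SecurityEvent
| where TimeGenerated > ago(1d) and EventID == 4672
| project TimeGenerated, Computer, Account
| summarize SpecialLogons = count() by Computer
| render barchart
```

The render operator produces the same kind of visualization you got earlier by clicking Chart, but this time the chart type is specified in the query itself.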