May 9, 2025

This article explores how to centralize logging from on-premises servers—both physical and virtual—into a single Log Analytics Workspace.
The goal is to enhance monitoring capabilities for the Azure Arc Connected Machine Agent running on these servers. Rather than relying on scattered and unstructured .log files on individual machines, this approach enables customers to collect, analyze, and gain insights from multiple agents in one centralized location. This not only simplifies troubleshooting but also unlocks richer observability across the hybrid environment.
The architecture of the solution consists of the following main components:
- Azure Arc Connected Machine Agent
- Azure Monitor Extension
- Data Collection Rule
- Data Collection Endpoint
- Log Analytics Workspace
Prerequisites
- A hybrid server connected to Azure Arc
- For demo purposes, you can use one of the following resources:
- Azure Arc Jumpstart
- How to evaluate Azure Arc-enabled servers with an Azure virtual machine – Azure Arc | Microsoft Learn
- An Azure subscription
Implementation
Before we can jump into the practical steps, you first need to create the following resources in Azure:
- Log Analytics Workspace (you may re-use an existing one, depending on your company policies): more information here.
- Data Collection Rule: more information here.
- Data Collection Endpoint: more information here.
Step 1: Create a custom table in the Log Analytics Workspace
More information on how to do this: Add or delete tables and columns in Azure Monitor Logs – Azure Monitor | Microsoft Learn.
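As a minimal sketch, the custom table can also be created programmatically via the Azure Monitor Tables REST API. The snippet below only builds the request body for the schema matching the sample logs in this article; the table name "AzcmAgentLogs_CL" is a hypothetical example (custom log tables must carry the `_CL` suffix), and sending the actual PUT request with authentication is out of scope here.

```python
def build_table_payload(table_name: str) -> dict:
    """Build the Tables API request body for a custom table whose schema
    matches the sample log fields (TimeGenerated, level, msg)."""
    if not table_name.endswith("_CL"):
        raise ValueError("Custom log table names must end with '_CL'")
    return {
        "properties": {
            "schema": {
                "name": table_name,
                "columns": [
                    {"name": "TimeGenerated", "type": "datetime"},
                    {"name": "level", "type": "string"},
                    {"name": "msg", "type": "string"},
                ],
            }
        }
    }

# Hypothetical table name used throughout this sketch.
payload = build_table_payload("AzcmAgentLogs_CL")
```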
For the sample logs you will need in one of the steps, use the following sample JSON file:
[
  {
    "TimeGenerated": "2025-04-02T12:39:42Z",
    "level": "debug",
    "msg": "Running as elevated user"
  },
  {
    "TimeGenerated": "2025-04-02T12:39:42Z",
    "level": "debug",
    "msg": "Agent Command: show "
  },
  {
    "TimeGenerated": "2025-04-02T12:39:42Z",
    "level": "debug",
    "msg": "Agent Version: 1.50.02986.2095"
  }
]
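As a quick sanity check, the sample records above can be validated against the custom table's schema before uploading them. This is just an illustrative sketch; the schema (TimeGenerated as ISO-8601 UTC, plus level and msg) is taken from the sample file itself.

```python
import json
from datetime import datetime

# The sample records from the JSON file above.
sample = '''[
  {"TimeGenerated": "2025-04-02T12:39:42Z", "level": "debug", "msg": "Running as elevated user"},
  {"TimeGenerated": "2025-04-02T12:39:42Z", "level": "debug", "msg": "Agent Command: show "},
  {"TimeGenerated": "2025-04-02T12:39:42Z", "level": "debug", "msg": "Agent Version: 1.50.02986.2095"}
]'''

records = json.loads(sample)
for rec in records:
    # Each record must carry exactly the three schema columns.
    assert set(rec) == {"TimeGenerated", "level", "msg"}
    # Raises ValueError if the timestamp is not valid ISO-8601.
    datetime.fromisoformat(rec["TimeGenerated"].replace("Z", "+00:00"))
```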
Step 2: Configure the Data Collection Rule to collect custom text files
More information on how to do this: Collect text file from virtual machine with Azure Monitor – Azure Monitor | Microsoft Learn.
The Data Source configuration should look like this:
- File pattern: C:\ProgramData\AzureConnectedMachineAgent\Log\azcmagent.log
- Table name: enter the name of the custom table created in Step 1
- Record delimiter: End-of-Line
- Transform: insert the KQL Query below
source | extend TimeGenerated = todatetime(extract(@"time=""([^""]+)""", 1, RawData)), level = extract(@"level=([^ ]+)", 1, RawData), msg = extract(@"msg=(.*)", 1, RawData) | project TimeGenerated, level, msg
The KQL query above transforms unstructured logs into a structured format, making them suitable for ingestion into the table created in Step 1, following the required schema.
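To make the transform concrete, here is a Python equivalent of the three `extract()` calls, applied to one raw azcmagent.log line. The sample line is a hypothetical example in the logfmt-style format (time="..." level=... msg=...) that the DCR transform's regexes expect.

```python
import re

def parse_line(raw: str) -> dict:
    """Mirror the DCR transform: pull time, level, and msg out of a raw line."""
    time = re.search(r'time="([^"]+)"', raw)     # KQL: @"time=""([^""]+)"""
    level = re.search(r'level=([^ ]+)', raw)     # KQL: @"level=([^ ]+)"
    msg = re.search(r'msg=(.*)', raw)            # KQL: @"msg=(.*)"
    return {
        "TimeGenerated": time.group(1) if time else None,
        "level": level.group(1) if level else None,
        "msg": msg.group(1) if msg else None,
    }

# Hypothetical raw log line for illustration.
line = 'time="2025-04-02T12:39:42Z" level=debug msg=Running as elevated user'
parsed = parse_line(line)
```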
In the Destination tab, select the Log Analytics Workspace created earlier.
Step 3: Associate the Data Collection Rule with your Arc Server(s)
To associate a Data Collection Rule (DCR) with an Azure Arc-enabled server, refer to the official documentation: Manage data collection rule associations in Azure Monitor – Azure Monitor | Microsoft Learn. For testing purposes, manually associating the DCR with a few Arc servers is perfectly fine. However, when scaling this across multiple servers, it’s recommended to automate the process—for example, using Azure Policy. You can find more information on how to apply this policy here.
When you link the DCR to the Arc server, the Azure Monitor Agent extension is enabled on the server by default, which is a requirement for log collection.
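For automation scenarios, the association boils down to a small request body sent to the dataCollectionRuleAssociations REST API. The sketch below only constructs that body; the resource IDs are placeholders, and authentication and the actual PUT call are omitted.

```python
def build_dcr_association(dcr_id: str) -> dict:
    """Build the request body linking a DCR to an Arc-enabled server."""
    return {
        "properties": {
            "dataCollectionRuleId": dcr_id,
            "description": "Collect azcmagent.log from Arc-enabled servers",
        }
    }

# Hypothetical DCR resource ID for illustration.
association = build_dcr_association(
    "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/rg-hybrid"
    "/providers/Microsoft.Insights/dataCollectionRules/dcr-azcmagent-logs"
)
```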
Step 4: Query data in Log Analytics Workspace
After linking the Data Collection Rule (DCR) to the Azure Arc-enabled server, it may take a few minutes for the Azure Monitor Agent (AMA) to be installed and for the first data to be ingested into Azure. Once the setup is complete, you can begin querying the logs in your Log Analytics Workspace.
To get started:
- Navigate to the Log Analytics Workspace associated with your DCR.
- In the left-hand menu, select Logs to open the query editor and explore the ingested data.
In the query window, enter the following KQL query, replacing the first line with the name of the custom table created in Step 1:
<YourCustomTableName_CL>
| extend ArcVMid = tostring(split(_ResourceId, "/")[-1])
| project TimeGenerated, ArcVMid, level, msg
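For reference, the `extend` line simply extracts the machine name from the full Arc resource ID. The Python equivalent of `tostring(split(_ResourceId, "/")[-1])` is shown below; the resource ID is a hypothetical example.

```python
# Hypothetical _ResourceId value for an Arc-enabled server.
resource_id = (
    "/subscriptions/00000000-0000-0000-0000-000000000000"
    "/resourcegroups/rg-hybrid/providers"
    "/microsoft.hybridcompute/machines/arc-server-01"
)

# Mirrors the KQL: tostring(split(_ResourceId, "/")[-1])
arc_vm_id = resource_id.split("/")[-1]
```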
Conclusion
Centralizing logs from on-premises servers into a Log Analytics Workspace using Azure Arc and the Azure Monitor Agent is a powerful step toward unified observability in hybrid environments. By moving away from isolated .log files and toward structured, queryable data, organizations gain deeper insights, faster troubleshooting, and a more scalable monitoring strategy. This approach not only simplifies operations but also lays the foundation for advanced analytics, automation, and security monitoring across diverse infrastructure.
Potential use cases
Proactive Troubleshooting
Detect and resolve issues with the Azure Arc Connected Machine Agent before they impact workloads, using KQL queries and alerts.
Compliance & Auditing
Maintain a consistent and queryable log history across all hybrid servers to support compliance reporting and audit trails.
Automation & Alerting
Set up automated alerts based on log patterns (e.g., failed connections, agent restarts) to reduce manual monitoring overhead.
Hybrid Cloud Visibility
Gain a single-pane-of-glass view across on-prem and cloud environments, enabling consistent monitoring policies and dashboards.
Limitations
Currently, log collection is limited to the Azure Arc Connected Machine Agent log files.
In future iterations, I plan to expand this solution to include logs from additional sources—such as extension log files—to provide even broader visibility and diagnostic capabilities.