When we deploy Azure Enterprise Scale Landing Zones, we often advise our customers to use a centralised Log Analytics workspace for all their Azure resources and to configure the workspace to use the Resource-context access mode. With this pattern, cloud administrators and security teams are typically granted access at the Log Analytics workspace level, while the application teams who consume the Azure resources do not need to be granted any roles on the Log Analytics workspace.
There are many benefits of using this pattern:
- Log data and table retention can be centrally managed.
- Simplifies the management of AMPLS (Azure Monitor Private Link Scope).
- Simplifies the management of Log Analytics workspace access.
- Allows you to benefit from discounted pricing via Commitment Tiers (there is a minimum ingestion volume requirement for Commitment Tiers, which small workspaces would not qualify for).
- Simplifies log access, monitoring and alerting configuration, since all the logs are in the same location.
However, one of the challenges of using a centralised Log Analytics workspace is how to calculate the data usage per subscription and charge it back. In most billing solutions, customers nominate a common tag associated with an internal cost centre, which is then used for consumption charge back.
A few days ago, I worked with my good friend Alex Verkinderen and we came up with the following queries to calculate the total billable data per subscription. The queries assume the SolutionID tag is used to identify the cost centre for each subscription. They are designed to be used in any Log Analytics workspace and leverage the ability to cross-query Azure Resource Graph: by cross-querying Azure Resource Graph, we can retrieve the subscription name and any tags of interest and join them with the Log Analytics data.
Here are the two queries we developed:
Total ingested volume over last month per subscription
arg("").ResourceContainers
| where type =~ 'microsoft.resources/subscriptions'
| project SubscriptionName = name, subscriptionId, SolutionId = tags['SolutionID']
| join kind=inner hint.remote=right (
find where TimeGenerated between(startofday(ago(32d))..startofday(now())) project _BilledSize, _IsBillable, _SubscriptionId
| where _IsBillable == true
| summarize BillableDataBytes = sum(_BilledSize) by _SubscriptionId
| extend DataIngestedInGB = format_bytes(BillableDataBytes, 3, "GB")
| extend subscriptionId = _SubscriptionId
) on subscriptionId
| sort by BillableDataBytes
| project SubscriptionName, subscriptionId, SolutionId, DataIngestedInGB
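If several subscriptions share the same cost centre, the per-subscription result can also be rolled up per SolutionID tag. The following is a minimal sketch based on the query above (not one of the original two queries); note the tag value is converted to a string so it can be used as a summarize key:
arg("").ResourceContainers
| where type =~ 'microsoft.resources/subscriptions'
| project subscriptionId, SolutionId = tostring(tags['SolutionID'])
| join kind=inner hint.remote=right (
    find where TimeGenerated between(startofday(ago(32d))..startofday(now())) project _BilledSize, _IsBillable, _SubscriptionId
    | where _IsBillable == true
    | summarize BillableDataBytes = sum(_BilledSize) by _SubscriptionId
    | extend subscriptionId = _SubscriptionId
    ) on subscriptionId
// roll up the billable volume per cost centre tag instead of per subscription
| summarize BillableDataBytes = sum(BillableDataBytes) by SolutionId
| extend DataIngestedInGB = format_bytes(BillableDataBytes, 3, "GB")
| sort by BillableDataBytes
| project SolutionId, DataIngestedInGB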
Daily ingestion volume over last month for a specific subscription
arg("").ResourceContainers
| where type =~ 'microsoft.resources/subscriptions'
| project SubscriptionName = name, subscriptionId, SolutionId = tags['SolutionID']
| join kind=inner hint.remote=right (
find where TimeGenerated between(startofday(ago(32d))..startofday(now())) project _BilledSize, _IsBillable, _SubscriptionId, TimeGenerated
| where _IsBillable == true
| summarize BillableDataBytes = sum(_BilledSize) by _SubscriptionId, bin(TimeGenerated, 1d)
| extend DataIngestedInGB = format_bytes(BillableDataBytes, 3, "GB")
| extend IngestionDate = bin(TimeGenerated, 1d)
| extend subscriptionId = _SubscriptionId
) on subscriptionId
| where SubscriptionName =~ 'The-Big-MVP-Sub-2'
| sort by IngestionDate asc
| project SubscriptionName, subscriptionId, SolutionId, DataIngestedInGB, IngestionDate
The billable cost can then be easily calculated from the ingested volume and the price per GB.
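For example, assuming a flat rate of 2.30 per GB (a placeholder figure only; substitute your region's Pay-As-You-Go rate or your Commitment Tier effective rate), the final project line of the first query could be replaced with something like:
| extend PricePerGB = 2.30 // placeholder rate; replace with your actual price per GB
| extend EstimatedCost = round(todouble(BillableDataBytes) / 1024 / 1024 / 1024 * PricePerGB, 2)
| project SubscriptionName, subscriptionId, SolutionId, DataIngestedInGB, EstimatedCost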