July 18, 2025

Microsoft Purview provides security and compliance teams with extensive visibility into admin actions within Security Copilot. It offers enriched user and data insights to identify, review, and manage Security Copilot interaction data in DSPM for AI. Data security and compliance administrators can also use Purview’s capabilities for data lifecycle management, information protection, advanced retention, eDiscovery, and more. These features support detailed investigations into logs to demonstrate compliance within the Copilot tenant.
Prerequisites
Please refer to the prerequisites for Security Copilot and DSPM for AI in the Microsoft Learn Docs.
Key Capabilities and Features
Heightened Context and Clarity
As organizations adopt AI, implementing data controls and a Zero Trust approach is essential to mitigate risks like data oversharing, leakage, and non-compliant usage. Microsoft Purview, combined with Data Security Posture Management (DSPM) for AI, empowers security and compliance teams to manage these risks across Security Copilot interactions.
With this integration, organizations can:
- Discover data risks by identifying sensitive information in user prompts and responses. Microsoft Purview surfaces these insights in the DSPM for AI dashboard and recommends actions to reduce exposure.
- Identify risky AI usage using Microsoft Purview Insider Risk Management to investigate behaviors such as inadvertent sharing of sensitive data or to detect suspicious activity within Security Copilot usage.
These capabilities provide heightened visibility into how AI is used across the organization, helping teams proactively address potential risks before they escalate.
Compliance and Governance
Building on this visibility, organizations can take action using Microsoft Purview’s integrated compliance and governance solutions. Here are some examples of how teams are leveraging these capabilities to govern Security Copilot interactions:
- Audit provides a detailed log of user and admin activity within Security Copilot, enabling organizations to track access, monitor usage patterns, and support forensic investigations.
- eDiscovery enables legal and investigative teams to identify, collect, and review Security Copilot interactions as part of case workflows, supporting defensible investigations.
- Communication Compliance helps detect potential policy violations or risky behavior in administrator interactions, enabling proactive monitoring and remediation.
- Data Lifecycle Management allows teams to automate the retention, deletion, and classification of Security Copilot data—reducing storage costs and minimizing risk from outdated or unnecessary information.
Together, these tools provide a comprehensive governance framework that supports secure, compliant, and responsible AI adoption across the enterprise.
Getting Started
Enable Purview Audit for Security Copilot
- Sign in to your Copilot tenant at https://securitycopilot.microsoft.com/ and, with Security Administrator permissions, navigate to the Security Copilot owner settings and ensure Audit logging is enabled.
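Once audit logging is on, you can spot-check that Security Copilot events are actually landing in the unified audit log, for example by pulling records through the Office 365 Management Activity API and filtering them down. A minimal sketch of that filter; the field names (`RecordType`, `CopilotEventData`, `AppHost`) are assumptions about the record shape, so verify them against your tenant's actual payloads:

```python
# Hedged sketch: filter audit records down to Security Copilot interactions.
# The field names used here are illustrative assumptions, not a documented
# contract -- check a real record from your tenant before relying on them.

def is_security_copilot_event(record: dict) -> bool:
    """Return True when an audit record looks like a Security Copilot interaction."""
    event_data = record.get("CopilotEventData") or {}
    return (
        record.get("RecordType") == "CopilotInteraction"   # assumed record type
        and event_data.get("AppHost") == "SecurityCopilot"  # assumed host value
    )

# Example records, shaped like (but not copied from) real audit entries.
sample = [
    {"RecordType": "CopilotInteraction",
     "CopilotEventData": {"AppHost": "SecurityCopilot"},
     "UserId": "analyst@contoso.com"},
    {"RecordType": "CopilotInteraction",
     "CopilotEventData": {"AppHost": "Word"},
     "UserId": "author@contoso.com"},
]

copilot_events = [r for r in sample if is_security_copilot_event(r)]
print(len(copilot_events))  # 1
```

If no events show up after a reasonable ingestion delay, revisit the owner settings above before moving on to the Purview steps.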
Microsoft Purview
To start using DSPM for AI and the Microsoft Purview capabilities, please complete the following steps to get set up and then feel free to experiment yourself.
- Navigate to Purview (purview.microsoft.com) and ensure you have adequate permissions to access the different Purview solutions as described here.
DSPM for AI
- Select the DSPM for AI solution from the left-hand navigation.
- Go to the Policies or Recommendations tab and turn on the following:
a. “DSPM for AI – Capture interactions for Copilot Experiences”: Captures prompts and responses for data security posture and regulatory compliance from Security Copilot and other Copilot experiences.
b. “Detect Risky AI Usage”: Helps to calculate user risk by detecting risky prompts and responses in Copilot experiences.
c. “Detect unethical behavior in AI apps”: Detects sensitive info and inappropriate use of AI in prompts and responses in Copilot experiences.
- To begin reviewing Security Copilot usage within your organization and identifying interactions that contain sensitive information, select Reports from the left navigation panel.
a. The “Sensitive interactions per AI app” report shows the most common sensitive information types used in Security Copilot interactions and their frequency. For instance, this tenant’s interactions contain a significant amount of IT and IP address information, so it is important to confirm that this sensitive information serves legitimate workplace purposes and does not involve any malicious or non-compliant use of Security Copilot.
b. “Top unethical AI interactions” will show an overview of any potentially unsafe or inappropriate interactions with AI apps. In this case, Security Copilot only has seven potentially unsafe interactions that included unauthorized disclosure and regulatory collusion.
c. “Insider risk severity per AI app” shows the number of high risk, medium risk, low risk and no risk users that are interacting with Security Copilot. In this tenant, there are about 1.9K Security Copilot users, but very few of them have an insider risk concern.
d. To check the interaction details of this potentially risky activity, head over to Activity Explorer for more information.
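The per-app rollups in these reports can also be reproduced from an exported set of interaction records, which is handy for ad-hoc analysis. A sketch under assumed export field names (`App`, `SensitiveInfoTypes` are illustrative, not a documented schema):

```python
from collections import Counter

# Hedged sketch of the "Sensitive interactions per AI app" rollup, computed
# from an assumed Activity Explorer export. Field names are illustrative.

def sensitive_type_counts(records, app="Security Copilot"):
    """Count how often each sensitive information type appears for one app."""
    counts = Counter()
    for rec in records:
        if rec.get("App") != app:
            continue
        counts.update(rec.get("SensitiveInfoTypes", []))
    return counts

export = [
    {"App": "Security Copilot", "SensitiveInfoTypes": ["IP Address", "IT"]},
    {"App": "Security Copilot", "SensitiveInfoTypes": ["IP Address"]},
    {"App": "M365 Copilot", "SensitiveInfoTypes": ["Credit Card Number"]},
]

print(sensitive_type_counts(export))  # Counter({'IP Address': 2, 'IT': 1})
```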
- In Activity Explorer, filter the App to Security Copilot. You can also filter by user risk level and sensitive information type. To identify the highest-risk behaviors, filter for users with a medium to high risk level or those associated with the most sensitive information types.
a. Once you have filtered, you can start looking through the activity details for more information like the user details, the sensitive information types, the prompt and response data, and more.
b. Based on the details shown, you may decide to investigate the activity and the user further. To do so, we have data security investigation and governance tools.
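The triage filter described above (medium/high risk users, or interactions carrying sensitive information types) can be expressed as a simple predicate over exported records. As before, the field names are assumptions about the export shape:

```python
# Hedged sketch of the Activity Explorer triage filter: keep Security Copilot
# events from medium/high risk users, or events carrying sensitive info types.
# Record field names are illustrative assumptions, not a documented API.

HIGH_SIGNAL_RISK = {"Medium", "High"}

def needs_review(rec: dict) -> bool:
    if rec.get("App") != "Security Copilot":
        return False
    risky_user = rec.get("UserRiskLevel") in HIGH_SIGNAL_RISK
    has_sensitive = bool(rec.get("SensitiveInfoTypes"))
    return risky_user or has_sensitive

records = [
    {"App": "Security Copilot", "UserRiskLevel": "High", "SensitiveInfoTypes": []},
    {"App": "Security Copilot", "UserRiskLevel": "None", "SensitiveInfoTypes": ["IP Address"]},
    {"App": "Security Copilot", "UserRiskLevel": "Low", "SensitiveInfoTypes": []},
    {"App": "M365 Copilot", "UserRiskLevel": "High", "SensitiveInfoTypes": []},
]

review_queue = [r for r in records if needs_review(r)]
print(len(review_queue))  # 2
```

Anything that passes the predicate is a candidate for the investigation and governance tools described next.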
Data Security Investigations and Governance
If you find Security Copilot actions in DSPM for AI Activity Explorer to be potentially inappropriate or malicious, you can look for further information in Insider Risk Management (IRM), through an eDiscovery case, Communication Compliance (CC), or Data Lifecycle Management (DLM).
Insider Risk Management
- Once you enable the quick policy in DSPM for AI to monitor risky Copilot usage, alerts will start appearing in IRM. Customize this policy to your organization’s risk tolerance by adjusting triggering events, thresholds, and indicators for detected activity.
- Examine the alerts associated with the “DSPM for AI – Detect risky AI usage” policy, sorting them by severity from high to low. Each alert includes a User Activity scatter plot with insights into the activities preceding and following the user’s engagement with a risky prompt in Security Copilot, which helps the data security administrator decide how to triage the user or alert. After investigating these details and determining whether the activity was malicious or an inadvertent insider risk, you can take appropriate action: issue a user warning, resolve the case, share the case with an email recipient, or escalate the case to eDiscovery for further investigation.
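The high-to-low ordering above is just a severity sort over the alert queue; a trivial sketch with made-up alert records (real alerts come from the IRM portal, not this shape):

```python
# Hedged sketch: order IRM alerts from the "Detect risky AI usage" policy by
# severity, high to low, to build a triage queue. Alert dicts are illustrative.

SEVERITY_RANK = {"High": 0, "Medium": 1, "Low": 2}

alerts = [
    {"id": "A-102", "severity": "Low"},
    {"id": "A-100", "severity": "High"},
    {"id": "A-101", "severity": "Medium"},
]

triage_queue = sorted(alerts, key=lambda a: SEVERITY_RANK[a["severity"]])
print([a["id"] for a in triage_queue])  # ['A-100', 'A-101', 'A-102']
```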
eDiscovery
- To identify, review and manage your Security Copilot logs to support your investigations, use the eDiscovery tool. Here are the steps to take in eDiscovery:
a. Create an eDiscovery Case
b. Create a new search
c. In Search, go to condition builder and select Add conditions -> KeyQL
d. Set the operator to Equal and enter the query: ItemClass=IPM.SkypeTeams.Message.Copilot.Security.SecurityCopilot
e. Run the query
f. Once completed, add the search to a review set (Button at the top)
g. In the review set, view details of the Security Copilot conversation
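The KeyQL condition from step (d) is a single equality over the item class, and review-set items for a Copilot conversation can be grouped back together by conversation. A sketch, where the item-class value comes from the steps above but the review-set record fields (`ConversationId`, `Subject`) are assumptions:

```python
from collections import defaultdict

# The item class targeted by the eDiscovery condition in step (d).
ITEM_CLASS = "IPM.SkypeTeams.Message.Copilot.Security.SecurityCopilot"
keyql = f"ItemClass={ITEM_CLASS}"
print(keyql)

# Hedged sketch: regroup exported review-set items into conversations.
# Field names here are illustrative assumptions about the export.
def group_by_conversation(items):
    convs = defaultdict(list)
    for item in items:
        convs[item.get("ConversationId", "unknown")].append(item)
    return dict(convs)

items = [
    {"ConversationId": "c1", "Subject": "prompt 1"},
    {"ConversationId": "c1", "Subject": "response 1"},
    {"ConversationId": "c2", "Subject": "prompt 2"},
]
conversations = group_by_conversation(items)
print(len(conversations))  # 2
```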
Communication Compliance
- In Communication Compliance, like IRM, you can investigate details around the Security Copilot interactions. Specifically, in CC, you can determine if these interactions contained non-compliant usage of Security Copilot or inappropriate text.
- After identifying the sentiment of the Security Copilot communication, you can take action by resolving the alert, sending a warning notice to the user, escalating the alert to a reviewer, or escalating the alert for investigation, which will create a new eDiscovery case.
Data Lifecycle Management
- For regulatory compliance or investigation purposes, navigate to Data Lifecycle Management to create a new retention policy for Security Copilot activities.
a. Provide a friendly name for the retention policy and select Next
b. Skip the Policy Scope section for this walkthrough
c. Select the “Static” retention policy type and select Next
d. Choose “Microsoft Copilot Experiences” to apply the retention policy to Security Copilot interactions
Billing Model
Microsoft Purview audit logging of Security Copilot activity remains included at no additional cost as part of Microsoft 365 E5 licensing. However, Microsoft Purview now offers a combination of entitlement-based (per-user-per-month) and Pay-As-You-Go (PAYG) pricing models. The PAYG model applies to a broader set of Purview capabilities—including Insider Risk Management, Communication Compliance, eDiscovery, and other data security and governance solutions—based on usage volume or complexity. This flexible pricing structure ensures that organizations only pay for what they use as data flows through AI models, networks, and applications.
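To compare the two pricing shapes for budgeting, the arithmetic is straightforward: entitlement cost scales with licensed users, PAYG cost with consumption. The rates below are deliberately made up; real Purview meters and prices are published by Microsoft and change over time, so substitute actuals:

```python
# Hedged sketch of the two pricing shapes described above.
# All rates are hypothetical placeholders, not Microsoft prices.

def entitlement_cost(users: int, per_user_per_month: float) -> float:
    """Per-user-per-month (entitlement) cost for one month."""
    return users * per_user_per_month

def payg_cost(units_consumed: float, rate_per_unit: float) -> float:
    """Pay-as-you-go cost for one month of metered consumption."""
    return units_consumed * rate_per_unit

# Hypothetical numbers only:
print(entitlement_cost(1900, 2.00))  # 3800.0
print(payg_cost(50_000, 0.05))       # 2500.0
```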
For further details, please refer to this Microsoft Security Community Blog: New Purview pricing options for protecting AI apps and agents | Microsoft Community Hub
Looking Ahead
By following these steps, organizations can leverage the full potential of Microsoft Purview to enhance the security and compliance of their Security Copilot interactions. This integration not only provides peace of mind but also empowers organizations to manage their data more effectively.
Please reach out to us if you have any questions or additional requirements.
Additional Resources
- Use Microsoft Purview to manage data security & compliance for Microsoft Security Copilot | Microsoft Learn
- How to deploy Microsoft Purview DSPM for AI to secure your AI apps
- Learn how Microsoft Purview Data Security Posture Management (DSPM) for AI provides data security and compliance protections for Copilots and other generative AI apps | Microsoft Learn
- Considerations for deploying Microsoft Purview Data Security Posture Management (DSPM) for AI | Microsoft Learn
- Learn about Microsoft Purview billing models | Microsoft Learn