CVE-2026-26133 – M365 Copilot
“When AI assistants expose the wrong information, sensitive data can travel farther and faster than anyone expects.”
CVE-2026-26133 is an information disclosure vulnerability affecting Microsoft 365 Copilot. The issue occurs when Copilot improperly handles certain requests or contextual data, potentially allowing sensitive information to be exposed to unauthorized users. Because Copilot aggregates and processes large volumes of organizational data to generate responses, weaknesses in how data is filtered or validated could result in unintended disclosure of confidential information within an organization.
CVSS Score: 7.1
SEVERITY: High
THREAT:
This vulnerability could allow users to retrieve, through Copilot responses, information they would not normally have access to. Attackers may craft prompts or queries that cause Copilot to expose sensitive organizational data, internal documents, or restricted information. In environments where Copilot has broad access to corporate data sources, the impact could include exposure of intellectual property, confidential communications, or internal business records.
EXPLOITS:
There are no confirmed reports of active exploitation in the wild at the time of disclosure. However, vulnerabilities involving AI assistants and data exposure can be exploited through prompt manipulation or carefully crafted queries designed to bypass intended access controls. Security researchers may develop proof-of-concept techniques demonstrating how such information disclosure could occur.
TECHNICAL SUMMARY:
The vulnerability arises from improper handling or filtering of contextual data within Microsoft 365 Copilot. Copilot generates responses based on content retrieved from various Microsoft 365 services such as documents, emails, and collaboration platforms. If Copilot fails to correctly enforce access boundaries or sanitize contextual data used to generate responses, a user could receive information that originates from sources they should not be permitted to access. This results in unintended disclosure of sensitive organizational information.
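The core requirement described above, that access boundaries be enforced on retrieved context per request, can be illustrated with a minimal sketch. This is not Microsoft's implementation; all names here (Document, allowed_groups, filter_context) are hypothetical and stand in for whatever permission-trimming layer sits between retrieval and response generation.

```python
# Illustrative sketch only: enforcing per-user access boundaries on
# retrieved context before it reaches the model. Every name below is
# hypothetical, not an actual Copilot API.
from dataclasses import dataclass

@dataclass(frozen=True)
class Document:
    doc_id: str
    content: str
    allowed_groups: frozenset  # groups permitted to read this document

def filter_context(candidates, user_groups):
    """Drop any retrieved document the requesting user cannot read.

    The point of this vulnerability class is that the check must run on
    the retrieved set, per request - not only at index time.
    """
    return [d for d in candidates if d.allowed_groups & user_groups]

# A user in the "sales" group never sees the HR document, even if the
# retriever ranked it highly for their query.
docs = [
    Document("d1", "Q3 sales figures", frozenset({"sales"})),
    Document("d2", "HR salary bands", frozenset({"hr"})),
]
visible = filter_context(docs, frozenset({"sales"}))
assert [d.doc_id for d in visible] == ["d1"]
```

If this trimming step is skipped or applied to stale permission data, content from sources the user cannot open can still flow into the generated answer, which is precisely the failure mode this section describes.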
EXPLOITABILITY:
Affected environments include Microsoft 365 tenants using Microsoft 365 Copilot capabilities.
Exploitation may involve a user issuing specially crafted prompts or queries designed to retrieve restricted information from Copilot's contextual knowledge sources. No privileges beyond standard Copilot access appear to be required.
BUSINESS IMPACT:
Information disclosure vulnerabilities in AI-driven systems can have serious implications for organizations. Sensitive corporate data—such as financial information, internal communications, product plans, or customer records—could be unintentionally exposed to users who are not authorized to view it. Such exposure could lead to compliance violations, reputational damage, and potential legal consequences.
WORKAROUND:
If patching or mitigation updates cannot be immediately applied:
- Restrict Copilot access to sensitive repositories where possible.
- Review Microsoft 365 data access permissions and sharing policies.
- Monitor Copilot usage and responses for unusual or excessive data retrieval.
- Apply least-privilege access controls across Microsoft 365 workloads.
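The monitoring recommendation above can be approximated with a simple anomaly check over an audit export: flag users whose Copilot interactions touch an unusually large number of distinct resources in a given window. The record format and threshold below are invented for illustration; adapt them to whatever audit data your tenant actually produces.

```python
# Hypothetical monitoring sketch: flag users whose Copilot activity
# touches many distinct resources - a crude proxy for over-broad
# data retrieval. The (user, resource_id) record format is invented.
from collections import defaultdict

def flag_heavy_retrievers(events, threshold=50):
    """events: iterable of (user, resource_id) pairs from an audit export.

    Returns users who accessed more distinct resources than `threshold`,
    sorted for stable output.
    """
    seen = defaultdict(set)
    for user, resource in events:
        seen[user].add(resource)
    return sorted(u for u, res in seen.items() if len(res) > threshold)

events = [("alice", f"doc{i}") for i in range(60)] + [("bob", "doc1")]
assert flag_heavy_retrievers(events, threshold=50) == ["alice"]
```

In practice the threshold should be baselined per role, since some users legitimately query broad document sets.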
Key Details
- Affected Product: Microsoft 365 Copilot
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- CWE Classification: CWE-77 (Improper Neutralization of Special Elements used in a Command, 'Command Injection')