Microsoft Copilot Spent Two Weeks Reading Confidential Emails It Was Supposed to Ignore

Summary:

- Microsoft's AI assistant, Copilot, contains a bug that lets users bypass the company's Data Loss Prevention (DLP) system, allowing confidential emails and other sensitive information to be shared externally without the company's knowledge or approval.

- A flaw in Copilot's natural-language processing can be exploited to generate emails that appear to come from the user's own account but whose content is drawn from the company's confidential documents.

- This flaw poses a significant risk to businesses that rely on Microsoft's tools, as it could lead to unintended disclosure of sensitive information and potential data breaches.
