
Microsoft recently acknowledged an error causing its AI work assistant Copilot to access and summarise some of its users’ confidential emails by mistake.
The tech giant has pushed Microsoft 365 Copilot Chat as a secure way for workplaces and their staff to use its generative AI chatbot.
But it said a recent issue caused the tool to surface information to some enterprise users from messages stored in their drafts and sent email folders – including those marked as confidential.
Microsoft says it has rolled out an update to fix the issue, and that it “did not provide anyone access to information they weren’t already authorised to see”. However, some experts warned that the speed at which companies compete to add new AI features made these kinds of mistakes inevitable.
Copilot Chat can be used within Microsoft programs such as Outlook and Teams – used for email and chat – to answer questions or summarise messages.
“We identified and addressed an issue where Microsoft 365 Copilot Chat could return content from emails labelled confidential authored by a user and stored within their Draft and Sent Items in Outlook desktop”, a Microsoft spokesperson told BBC News. “While our access controls and data protection policies remained intact, this behaviour did not meet our intended Copilot experience, which is designed to exclude protected content from Copilot access”, they added. “A configuration update has been deployed worldwide for enterprise customers”.
The blunder was first reported by tech news outlet Bleeping Computer, which said it had seen a service alert confirming the issue. It cited a Microsoft notice saying “users’ email messages with a confidential label applied are being incorrectly processed by Microsoft 365 Copilot chat”.
The notice added that a work tab within Copilot Chat had summarised email messages stored in a user’s drafts and sent folders, even when they had a sensitivity label and a data loss prevention policy configured to prevent unauthorised data sharing.
Reports suggest Microsoft first became aware of the error in January 2026. Its notice about the bug was also shared on a support dashboard for NHS workers in England, where the root cause was attributed to a “code issue”.
A section of the notice on the NHS IT support site implies the health service has been affected. But the NHS told BBC News that the contents of any draft or sent emails processed by Copilot Chat would remain with their creators, and that patient information had not been exposed.
Source: BBC News


