
Microsoft Copilot AI Poses Password Security Risk


Introduction to AI-Driven Attacks

AI can be a force for good when it comes to security protections, but increasingly it is also a force for bad. The latter was recently exemplified in a multi-stage, AI-driven attack against Microsoft Teams users. As the name implies, Pen Test Partners is a company that specializes in security consulting, specifically penetration testing. Its consultants are professional hackers who find the same routes into your systems that the most advanced attackers would look to exploit. Because those threat actors are increasingly using AI-powered attacks, it makes sense for red team hackers to do likewise.

Red Team Penetration Testers Use Copilot AI To Hack Microsoft SharePoint

Pen Test Partners took a close look at how Microsoft’s Copilot AI for SharePoint could be exploited, and the results were, to say the least, concerning. An encrypted spreadsheet that SharePoint had, quite rightly, refused to let the hackers open, no matter what method they employed, was broken wide open once they asked the Copilot AI agent to go and get it for them. The agent duly printed the file’s contents, including the passwords that unlocked the encrypted spreadsheet, according to Jack Barradell-Johns, a red team security consultant with the company.

Access to Passwords

I would strongly recommend reading the full report for all the details of how the red team hackers exploited Copilot AI for SharePoint during their engagement, but I want to focus on the access to passwords, as that’s what has really grabbed my attention, and should grab yours as well. Barradell-Johns explained that during the engagement, the red teamers encountered a file named passwords.txt, located adjacent to an encrypted spreadsheet containing sensitive information. Naturally, they tried to access the file. Just as naturally, Microsoft SharePoint said nope, no way. “Notably,” Barradell-Johns said, “in this case, all methods of opening the file in the browser had been restricted.”

Circumventing Download Restrictions

So, what did the red team hackers do? Apply the red team hacking mindset and ask the Copilot AI for SharePoint agent to go and get it instead. “The agent then successfully printed the contents,” Barradell-Johns reported, “including the passwords allowing us to access the encrypted spreadsheet.” The download restrictions that form part of the restricted view protections were circumvented, and the contents of the Copilot chats could be freely copied.

Microsoft Responds To Red Team Copilot AI SharePoint Hacking Report

I reached out to Microsoft, and a spokesperson said: “SharePoint information protection principles ensure that content is secured at the storage level through user-specific permissions and that access is audited. This means that if a user does not have permission to access specific content, they will not be able to view it through Copilot or any other agent. Additionally, any access to content through Copilot or an agent is logged and monitored for compliance and security.” I then contacted Ken Munro, founder of Pen Test Partners, who issued a statement addressing the points Microsoft made.

Pen Test Partners Response

“Microsoft are technically correct about user permissions, but that’s not what we are exploiting here. They are also correct about logging, but again it comes down to configuration. In many cases, organisations aren’t typically logging the activities that we’re taking advantage of here. Having more granular user permissions would mitigate this, but in many organisations data on SharePoint isn’t as well managed as it could be. That’s exactly what we’re exploiting. These agents are enabled per user, based on licenses, and organisations we have spoken to do not always understand the implications of adding those licenses to their users.” And, you’d better believe, if there are any configuration holes, then Copilot AI will find them.

Conclusion

The use of AI-powered attacks by red team hackers has highlighted a significant vulnerability in Microsoft’s Copilot AI for SharePoint. The ability of the Copilot AI agent to circumvent download restrictions and access sensitive information, including passwords, is a concerning issue that needs to be addressed. While Microsoft has responded by stating that user permissions and logging are in place to protect content, Pen Test Partners has pointed out that configuration holes can be exploited by attackers. It is essential for organizations to be aware of these risks and take steps to mitigate them.

FAQs

  • Q: What is Copilot AI for SharePoint?
    A: Copilot AI for SharePoint is a feature that uses artificial intelligence to assist users with tasks and provide information.
  • Q: How did the red team hackers exploit Copilot AI for SharePoint?
    A: The red team hackers asked the Copilot AI agent to access a restricted file, which was then printed, including passwords, allowing them to access an encrypted spreadsheet.
  • Q: What has Microsoft said about the issue?
    A: Microsoft has stated that user permissions and logging are in place to protect content, but Pen Test Partners has pointed out that configuration holes can be exploited by attackers.
  • Q: What can organizations do to mitigate this risk?
    A: Organizations should review their user permissions and logging configurations to ensure that they are adequate and that they understand the implications of adding licenses to their users.
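To make the last mitigation concrete: one practical review step is to export your tenant’s audit log and check whether Copilot agent interactions with sensitive files are actually being captured. The sketch below is a minimal, hypothetical example in Python. The sample entries and their field names (`Operation`, `UserId`, `ObjectId`), and the `"CopilotInteraction"` operation value, are assumptions modeled on common audit-log schemas; your tenant’s export format may differ, so treat this as an illustration of the review idea rather than a ready-made tool.

```python
import json

# Hypothetical sample of exported audit log entries. Field names and
# values are assumptions for illustration; a real export will vary.
SAMPLE_LOG = json.dumps([
    {"Operation": "FileAccessed", "UserId": "alice@example.com",
     "ObjectId": "https://contoso.sharepoint.com/sites/hr/passwords.txt"},
    {"Operation": "CopilotInteraction", "UserId": "bob@example.com",
     "ObjectId": "https://contoso.sharepoint.com/sites/hr/passwords.txt"},
    {"Operation": "CopilotInteraction", "UserId": "bob@example.com",
     "ObjectId": "https://contoso.sharepoint.com/sites/hr/report.docx"},
])

def flag_copilot_access(raw_log: str, sensitive_names: list[str]) -> list[dict]:
    """Return audit entries where a Copilot interaction touched a file
    whose path matches a sensitive name pattern (e.g. 'passwords')."""
    flagged = []
    for entry in json.loads(raw_log):
        # Only look at agent-driven access, not ordinary file opens.
        if entry.get("Operation") != "CopilotInteraction":
            continue
        target = entry.get("ObjectId", "").lower()
        if any(name in target for name in sensitive_names):
            flagged.append(entry)
    return flagged

hits = flag_copilot_access(SAMPLE_LOG, ["passwords", "credentials"])
for h in hits:
    print(f"{h['UserId']} -> {h['ObjectId']}")
```

Running a review like this regularly would surface exactly the pattern the red team abused: an agent, rather than a browser, reaching a file named something like passwords.txt.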