Your Microsoft 365 environment has MFA enabled. Roles are assigned. You’ve done the security basics. On paper, it looks ready for Copilot.

It probably isn’t.

Not because your intentional security configuration is wrong — but because underneath it, there’s a layer of sharing decisions nobody made deliberately. Links that were created and never expired. Guest accounts provisioned for a contractor six months ago and never disabled. SharePoint libraries with default permissions that nobody changed because nobody knew the default was the problem.

Microsoft Copilot respects Microsoft’s permissions model exactly. It won’t surface a file to someone who doesn’t have access. The problem is that in most Microsoft 365 environments, more people have access to more files than anyone intended — and nobody audited it before turning on a tool that can search and summarize everything.


How Oversharing Happens Without Anyone Trying

The most common blind spot I see across Microsoft 365 environments isn’t a misconfiguration someone made deliberately. It’s the configuration nobody changed.

“People in your organization” links are often the org-level default — and many tenants have never changed it. In a tenant that hasn’t been hardened, when a user shares a document in SharePoint or OneDrive, the default link type grants access to any authenticated member of the organization. The user clicks share, copies the link, pastes it into an email, and sends it to one person. What they actually created is a link that any colleague can redeem.

The default permission on those links is Edit, not Read. The employee who shared a budget spreadsheet with their manager didn’t intend to give every colleague who receives that link write access to it. But that’s what the link does.

Here’s how the exposure plays out. That link is in an email. The email gets forwarded — to a broader team, to someone cc’d by mistake, to a distribution list. Any org member who clicks it can redeem it and gain access. Once they do, Copilot can surface that document for them. An employee who was never supposed to see that file asks Copilot a question about budgets. Copilot finds it. Copilot answers with it.

No one breached anything. The permissions worked exactly as configured. That’s the problem.


Guest Accounts Are the Quiet Side of the Same Risk

Shared links are visible when you know to look. Stale guest accounts are quieter.

Every contractor, vendor, and partner who was granted guest access to your Microsoft 365 tenant is still there until someone removes them. Most organizations provision guest accounts when the relationship starts and forget them when it ends. The account stays active. The group memberships stay intact. The SharePoint library permissions set up for that engagement are still assigned.

Copilot with Graph access traverses your SharePoint environment without distinguishing between an active employee and a guest account that hasn’t been used in eight months. If the permissions say accessible, it’s accessible.

The engineers in your organization who manage Microsoft 365 already know this. The audit hasn’t happened because it’s not on the priority list — and it won’t be until leadership makes it one.


What AI-Ready Actually Looks Like

Getting ready for Copilot isn’t a one-time project. It’s a posture shift that requires ongoing discipline. Here’s what the organizations that do it right actually change:

Harden the global sharing configuration. Set the default link type to Specific people. Change the default permission to Read. Microsoft changed the out-of-the-box default to Specific people in July 2024 — but any tenant created before that may still be running the old default, and even new tenants need this verified. These two settings stop new oversharing before it accumulates. They don’t fix what’s already there — but they stop it from getting worse.
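Both settings live at the tenant level and can be set from the SharePoint Online Management Shell. A minimal sketch — assuming you have the module installed, SharePoint admin rights, and substituting your own admin URL for the `contoso` placeholder:

```powershell
# Connect to your tenant's SharePoint admin endpoint
# (contoso-admin is a placeholder for your tenant)
Connect-SPOService -Url https://contoso-admin.sharepoint.com

# Default new sharing links to "Specific people" (Direct)
# instead of "People in your organization" (Internal)
Set-SPOTenant -DefaultSharingLinkType Direct

# Default link permission to Read (View) instead of Edit
Set-SPOTenant -DefaultLinkPermission View
```

Note that `Direct` is the cmdlet's name for the “Specific people” link type. Users can still deliberately choose a broader link when they have a reason to — the point is that the one-click path stops being the overshared one.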

Set link expiration policies. Sharing links should have a maximum lifetime. External links especially. Most tenants have no expiration configured, which means links created three years ago are still live and still granting access.
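Expiration is also a tenant-level setting. A sketch of the relevant `Set-SPOTenant` parameters, with the day counts as illustrative values you’d tune to your own policy:

```powershell
# "Anyone" (anonymous) links stop working after 30 days
Set-SPOTenant -RequireAnonymousLinksExpireInDays 30

# External guest access expires after 90 days unless
# an owner explicitly renews it
Set-SPOTenant -ExternalUserExpirationRequired $true `
              -ExternalUserExpireInDays 90
```

These apply to links and guest access created after the policy is set — which is exactly why the stale-link audit still has to happen separately.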

Audit and disable unused guest accounts. Pull the guest account report, filter for accounts with no sign-in activity in 90 days, and disable them. Then make this a quarterly process, not a cleanup you do once before a Copilot rollout.
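The quarterly pass can be scripted with Microsoft Graph PowerShell. A sketch, assuming the `Microsoft.Graph` module and permissions to read sign-in activity (`AuditLog.Read.All`) — review the output before running the disable step:

```powershell
Connect-MgGraph -Scopes "User.Read.All","AuditLog.Read.All"

$cutoff = (Get-Date).AddDays(-90)

# All guest accounts, including each one's last sign-in
$guests = Get-MgUser -All -Filter "userType eq 'Guest'" `
                     -Property Id,DisplayName,SignInActivity

# Stale = never signed in, or no sign-in within the window
$stale = $guests | Where-Object {
    -not $_.SignInActivity.LastSignInDateTime -or
    $_.SignInActivity.LastSignInDateTime -lt $cutoff
}

# Review the list first
$stale | Select-Object DisplayName,
    @{ n = 'LastSignIn'; e = { $_.SignInActivity.LastSignInDateTime } }

# Disable — not delete — each stale guest
$stale | ForEach-Object {
    Update-MgUser -UserId $_.Id -AccountEnabled:$false
}
```

Disabling rather than deleting keeps the account recoverable if the relationship turns out to be active — and it still cuts off Copilot-mediated access immediately.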

Disable resharing. If you send someone a link, they shouldn’t be able to forward it to someone else and extend that access further. This is often enabled by default.
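Resharing is controlled at the tenant level for guests and at the site level for members. A sketch using the same SharePoint Online Management Shell session, with the Finance site URL as a placeholder:

```powershell
# Guests cannot reshare content that was shared with them
Set-SPOTenant -PreventExternalUsersFromResharing $true

# On a sensitive site, only owners can share at all —
# members lose the ability to extend access
Set-SPOSite -Identity https://contoso.sharepoint.com/sites/Finance `
            -DisableSharingForNonOwners
```

Note that `-DisableSharingForNonOwners` is one-way from the shell: once set on a site, re-enabling member sharing is done through the admin center.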

Require admin consent for third-party applications. Any app requesting Microsoft Graph permissions to read SharePoint or OneDrive data — including AI tools and the third-party model providers now available through Copilot — should require explicit approval before it gets access.
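This is typically configured in the Entra admin center under user consent settings, but it can also be scripted. One approach — a sketch, assuming Graph PowerShell and policy-write rights — is to clear the user-consent grant policies so every app permission request routes to an admin:

```powershell
Connect-MgGraph -Scopes "Policy.ReadWrite.Authorization"

# Empty list = users can no longer consent to apps themselves;
# any app requesting Graph permissions needs admin approval
Update-MgPolicyAuthorizationPolicy -DefaultUserRolePermissions @{
    PermissionGrantPoliciesAssigned = @()
}
```

Pair this with the admin consent workflow in the Entra portal so users have a sanctioned way to request access instead of a dead end.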

Treat least privilege as a standing operating principle, not a project. Every permission decision should start with the question: does this person actually need this access? That mindset doesn’t take hold from the engineering team up. It has to come from leadership down.


The Reason This Can’t Wait

Permission sprawl in Microsoft 365 has always been a governance problem. Most organizations have tolerated it because the practical consequences were manageable — the occasional file accessible to someone who shouldn’t have it, noticed when it caused a problem.

Copilot changes the consequence, not the root cause.

A forgotten “People in your organization” link used to mean one document was more accessible than intended — a risk that materialized only if someone stumbled across it. With Copilot, the risk materializes the moment that link gets forwarded to someone who wasn’t supposed to have it. They redeem it. They have access. Copilot can now answer their questions using content they were never meant to see.

The data exposure risk here isn’t about Microsoft reading your files or an LLM provider training on your content. It’s internal. It’s an employee in one department using Copilot to surface information from another department that was never meant to be shared — because someone sent an email with the wrong default link three years ago.

That’s the AI readiness problem. It isn’t glamorous. It’s an audit and a settings review and a governance conversation that most organizations have been deferring. The right time to have it is before you roll out the tool that makes the consequences visible.


Getting ready for Copilot or trying to understand where your Microsoft 365 permissions actually stand? I’m happy to walk through what a readiness assessment looks like.

Darren Bell
Senior Cloud Architect & IT Leader

Senior Cloud Architect with 10+ years leading technology operations across healthcare, managed services, and regulated industries. Specializing in Microsoft 365, Azure, identity architecture, and IT cost optimization. I write about what it actually takes to build resilient, compliant, and cost-efficient operations.