Anyone else allowing this sort of AI to have access to your Teams environment ?
Is your company using AI at all, or holding off for now ?
Gavin / xpd / FastRaccoon / Geek of Coastguard New Zealand
LinkTree - kiwiblast.co.nz - Lego and more
Support Kiwi music! The People Black Smoke Trigger Like A Storm Devilskin
NZ GEEKS Discord
We have Copilot licensing, so we allow that for users within Teams (Copilot for Microsoft 365 data is kept within your tenant), but I wouldn't be allowing a third-party AI access to our data.
Unless you know exactly how to limit access and what information you are giving it access to, you are setting yourself up for a world of misery. Maybe not today or tomorrow, but eventually.
If you want to implement something like this, you should do a course on how to control it first IMO.
Yeah, I've disabled it for now.
So many users think AI is a shortcut, but I've seen plenty of examples where it's caused more work anyway.
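If anyone wants to check what already has a foothold before deciding either way, a rough Python sketch against the Microsoft Graph API can list the apps installed in a given team. The token, team ID and permission name below are assumptions/placeholders, not anything from a real tenant:

```python
# Rough sketch: list the apps installed in a given team via Microsoft Graph,
# so you can see which third-party apps already have access. Assumes an
# access token with a suitable permission (e.g. TeamsAppInstallation.ReadForTeam.All);
# the token and team ID are placeholders.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<access-token>"   # placeholder
TEAM_ID = "<team-group-id>"       # placeholder

def list_installed_apps(team_id: str) -> list[dict]:
    """Return the apps installed in the team, with their definitions expanded."""
    url = f"{GRAPH}/teams/{team_id}/installedApps?$expand=teamsAppDefinition"
    resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
    resp.raise_for_status()
    return resp.json().get("value", [])

if __name__ == "__main__":
    for app in list_installed_apps(TEAM_ID):
        definition = app.get("teamsAppDefinition", {})
        print(definition.get("displayName"), "-", definition.get("version"))
```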
I would not be granting any third-party access to your Teams/Exchange environment without thoroughly reviewing and evaluating it. Doesn't really matter if it's AI or not, although given the current hype, I would be more concerned with companies rushing half-baked AI solutions to market.
A colleague of ours in Australia was instructed by the CEO to put Copilot into their environment. He strenuously remonstrated against it. No limits were put in place.
It didn't take long for a lowly staff member to work out they could ask questions like 'what is our CEO paid' and the like. It was turned off much faster than it was turned on.
Proper scoping is required. Given the rate of change for AI currently, I wouldn't trust that how it works today, and how it works next week, are the same.
That just means the information was accessible in the first place. Whether AI surfaces the answer quickly, or someone goes manually hunting and finds it, the cause is security and permissions, not AI.
networkn:
A colleague of ours in Australia was instructed by the CEO to put Copilot into their environment. He strenuously remonstrated against it. No limits were put in place.
It didn't take long for a lowly staff member to work out they could ask questions like 'what is our CEO paid' and the like. It was turned off much faster than it was turned on.
Proper scoping is required. Given the rate of change for AI currently, I wouldn't trust that how it works today, and how it works next week, are the same.
We are currently trialling Copilot at work, and that is the example our CIO keeps repeating: people using it to dig out information they normally wouldn't be privy to. No one has been brave enough yet to point out that the information is already published in our annual report and public disclosure documents.
gehenna:
That just means the information was accessible in the first place. Whether AI surfaces the answer quickly, or someone goes manually hunting and finds it, the cause is security and permissions, not AI.
I didn't say AI was the issue, but controlling who can obtain what information from AI is done both inside the AI tools themselves and in the structures the AI has access to.
In the case I mentioned above, the HR folder was accessible to HR and CXOs only, which the pleb most certainly was not, but inside the tool the scope for what a query could return was open by default. I believe, at least in Copilot, it's a relatively recent change that it's off by default.
I wouldn't bet my house on defaults set by MS.
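For anyone wanting to verify this sort of thing themselves, a rough Graph sketch like the one below will dump the permission entries on a folder so you can see who can actually read it before any AI starts drawing on it. The drive ID, folder name and token are hypothetical placeholders, not the actual setup from the case above:

```python
# Rough sketch: list the sharing permissions on a SharePoint/OneDrive folder
# via Microsoft Graph. Drive ID, folder path and token are hypothetical
# placeholders; the token needs read access such as Files.Read.All.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<access-token>"           # placeholder
DRIVE_ID = "<document-library-drive-id>"  # placeholder
FOLDER_PATH = "HR"                        # hypothetical folder name

def folder_permissions(drive_id: str, folder_path: str) -> list[dict]:
    """Return the permission entries on a folder addressed by path."""
    url = f"{GRAPH}/drives/{drive_id}/root:/{folder_path}:/permissions"
    resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
    resp.raise_for_status()
    return resp.json().get("value", [])

if __name__ == "__main__":
    for perm in folder_permissions(DRIVE_ID, FOLDER_PATH):
        granted = perm.get("grantedToV2", {})
        who = granted.get("user", {}) or granted.get("group", {})
        print(perm.get("roles"), "->", who.get("displayName", "sharing link / other"))
```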
networkn:
I wouldn't bet my house on defaults set by MS.
I wouldn't bet my house on anything MS.
networkn:
gehenna:
That just means the information was accessible in the first place. Whether AI surfaces the answer quickly, or someone goes manually hunting and finds it, the cause is security and permissions, not AI.
I didn't say AI was the issue, but controlling who can obtain what information from AI is done both inside the AI tools themselves and in the structures the AI has access to.
In the case I mentioned above, the HR folder was accessible to HR and CXOs only, which the pleb most certainly was not, but inside the tool the scope for what a query could return was open by default. I believe, at least in Copilot, it's a relatively recent change that it's off by default.
I wouldn't bet my house on defaults set by MS.
Copilot uses your own privileges to search information you already have access to. If it's able to find stuff like that, it's been filed incorrectly, and all that's happened is that it's easier to find because something else is doing the legwork. So it's basically removing the obscurity that was otherwise providing the security.
You're absolutely right, and when people talk to me about the likes of Copilot I just have to remind them that they need to be very, very confident that their house is in order before they let an AI run rampant within it. Heck, tenant-wide search is a good place to start: chuck some words into your SharePoint search, then adjust the scope to tenant-wide and see what you can find!
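If you'd rather script that exercise than click through the UI, here's a rough sketch against the Graph search endpoint. It runs under the signed-in user's own permissions, so whatever it returns for a test account is roughly what Copilot can draw on for that account. The keywords and token are placeholders only:

```python
# Rough sketch: run a delegated Microsoft Graph search for a few "interesting"
# keywords while signed in as an ordinary test user. Results reflect that
# user's effective permissions. Token and keywords are placeholders; the
# token needs delegated read access to SharePoint/OneDrive content.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<delegated-access-token>"        # placeholder
KEYWORDS = ["salary", "redundancy", "password"]  # illustrative only

def search_tenant(query: str) -> list[dict]:
    """Search SharePoint/OneDrive content the signed-in user can read."""
    body = {"requests": [{
        "entityTypes": ["driveItem", "listItem"],
        "query": {"queryString": query},
        "size": 25,
    }]}
    resp = requests.post(f"{GRAPH}/search/query", json=body,
                         headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
    resp.raise_for_status()
    hits = []
    for container in resp.json()["value"][0].get("hitsContainers", []):
        hits.extend(container.get("hits", []))
    return hits

if __name__ == "__main__":
    for word in KEYWORDS:
        print(f"--- {word} ---")
        for hit in search_tenant(word):
            resource = hit.get("resource", {})
            print(resource.get("name"), resource.get("webUrl"))
```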
As for third-party AI tools for meeting notes etc., I agree with the earlier commenters: you need to understand what the tool is, and isn't, doing with its access to your stuff. Letting an AI take meeting notes seems to be in hot demand, because people hate taking their own minutes. I see it as laziness personally. I also note the observations from this GP who experimented with AI for taking practice notes; without the extra time spent considering the meeting, their memory of consultations suffered. That leads to time lost down the track, or a more 'surface' recollection rather than genuine memory.
networkn:
but inside the tool the scope for what a query could return was open by default. I believe, at least in Copilot, it's a relatively recent change that it's off by default.
I'd love to see some evidence of that. What you're describing should still be impossible if permissions are correct. You're implying (and I may well be misreading, since I've had the flu for a week) that Copilot ignores permissions and returns a response regardless, and that there's a setting that lets it do that even when permissions are correct. That isn't the case in my testing, and it would be a global shitstorm if it were true. It warrants investigation, but it sounds more like whoever you're talking to didn't have permissions in place and is blaming it on Copilot.
gehenna:
networkn:
but inside the tool the scope for what a query could return was open by default. I believe, at least in Copilot, it's a relatively recent change that it's off by default.
I'd love to see some evidence of that. What you're describing should still be impossible if permissions are correct. You're implying (and I may well be misreading, since I've had the flu for a week) that Copilot ignores permissions and returns a response regardless, and that there's a setting that lets it do that even when permissions are correct. That isn't the case in my testing, and it would be a global shitstorm if it were true. It warrants investigation, but it sounds more like whoever you're talking to didn't have permissions in place and is blaming it on Copilot.
Microsoft is very clear about how Copilot operates, and it matches what I've seen from our pilot group. It's also highlighted that our SharePoint permissions need some work, which I suspect is the case for many orgs. Anything sensitive that Copilot has returned has also been findable via search, due to incorrect permissions.
Yeah, that's what I'm saying. Anything Copilot surfaces should only be due to permissions allowing it (or permissions not being set in the first place). The post I was replying to implies that even if permissions explicitly disallow access, Copilot could bypass them and return the info anyway. I don't see that behaviour, but if others do, I want to know more.