But I Don't Want a Copilot

by Melody Yankelevich

As of press time, Microsoft Copilot has not been generally released, but it promises some amazing capabilities.  I know nothing of its architecture, but the capabilities alone suggest that there may be underlying issues that undermine fundamental security, legal, and other protections.

Please check out the Copilot launch videos to see the features that may be included in the initial rollout.  For the purpose of this article, I will assume that Copilot will soon be able to fulfill a request such as "create a summary of Galaxy Quest based on my Microsoft OneDrive files."  To accomplish this, Copilot will need more than just logical access to all of my stuff.  It will need to do things like decoding, decompression, translation, and transcription, ultimately interpreting my data just as I would have.  That is precisely the problem.  Microsoft is not me.

My initial concerns include:

Excessive Access

Copilot seems to need the ability to see everything, even data that is not related to my request.  After all, it would need to process an entire file in order to determine that it is not associated with Galaxy Quest.  That kind of blanket access may force you to cancel your zero trust initiatives.
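Copilot's retrieval pipeline has not been made public, so treat the following as a thought experiment.  It is a minimal sketch of what any naive "search my files for a topic" helper has to do; the function name and path are my own placeholders, not anything from Microsoft.

    import os

    def find_relevant_files(root, topic):
        """Naive relevance filter.  Every file must be read in full
        before it can be ruled out, so data unrelated to the request
        is still exposed to whatever does the filtering."""
        relevant = []
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    # Full read, even for files about something else entirely.
                    with open(path, encoding="utf-8", errors="ignore") as f:
                        contents = f.read()
                except OSError:
                    continue
                if topic.lower() in contents.lower():
                    relevant.append(path)
        return relevant

    # Even this toy reads every byte of every file just to conclude
    # that most of them have nothing to do with Galaxy Quest.
    print(find_relevant_files("/path/to/onedrive", "Galaxy Quest"))

A real assistant would add decoding, transcription, and so on, but the access pattern would be the same: everything gets touched.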

Legal

In the new Microsoft universe, email messages can turn into a PowerPoint presentation, which turns into a Word quote, which turns into a new customer record in your Customer Relationship Management (CRM) system.  So if there is some kind of legal action, how might you technically comply?  Important information and its metadata could be anywhere, so do you put a legal hold on everything associated with a user?

Regulations such as the Health Insurance Portability and Accountability Act (HIPAA) prohibit unnecessary access to records.  So if I ask Copilot to "look at everything," won't that cause a violation?  And if there is a violation, how would I detect it?

Employer Abuse

Microsoft admits that, as with ChatGPT, "everything is captured in the prompt history."  This has nothing to do with my data.  This enters the realm of behavior monitoring, which some employers are eager to exploit.  Am I creating evidence of my incompetence by asking Copilot stupid questions?  Am I taking too long to solve a particular problem?

Attribution

Copilot promises to work with applications like Salesforce, but Microsoft can't access my Salesforce data.  I haven't heard anything about using typical role-based access controls, so how is this going to be accomplished?  Do they intend to use my interactive connection to Salesforce?  If that is the case, then anything that Copilot does will be attributed to me.
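Microsoft has not explained the mechanism, but if Copilot simply reuses my session, the result would look something like this sketch.  The instance URL, token, and helper name below are hypothetical placeholders; the query endpoint is Salesforce's standard REST API.

    import requests

    # Hypothetical placeholders: my Salesforce instance and my own
    # interactive OAuth access token, handed over to an assistant.
    INSTANCE = "https://example.my.salesforce.com"
    MY_ACCESS_TOKEN = "replace-with-my-interactive-session-token"

    def query_as_me(soql):
        """Runs a SOQL query with my credentials.  Salesforce's audit
        trail will record the assistant's activity as my activity."""
        resp = requests.get(
            INSTANCE + "/services/data/v58.0/query",
            params={"q": soql},
            headers={"Authorization": "Bearer " + MY_ACCESS_TOKEN},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()

    # If an assistant issues this on my behalf, the event log shows
    # me reading every account, whether or not I ever asked for that.
    records = query_as_me("SELECT Id, Name FROM Account")

Delegated access is convenient precisely because it inherits my permissions, and that is also why everything it does lands in the logs under my name.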

Incident Response

If Microsoft can see my data just like I can, then what do I do if I have a data breach?  How can I confirm that Microsoft was not somehow involved?  How might I prove that they were?

Loss of Business

Will customers abandon me due to my use of Microsoft 365, assuming that Microsoft will be privy to all of our interactions?

What Else Is It Doing?

Is Copilot directly answering my question or is it doing other things?  When I ask it to analyze my spreadsheet, is it also looking for signs of criminal activity?

In the end, you have no choice.  From what I have seen so far, Copilot is going to be enabled, and you can't turn it off.

I used to be able to check my email, and even my provider could not see what I was doing.  To perform the same task today, Microsoft requires access to everything that my Active Directory permissions allow.  This sounds like a grab for all of our data, so, Microsoft, please explain how I am misunderstanding the way Copilot works.
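Microsoft has not published the permission set Copilot will request, so the contrast below is my own assumption.  It uses the real msal library and real Microsoft Graph delegated scopes; the client ID is a placeholder.

    import msal

    # Placeholder app registration; not a real client ID.
    app = msal.PublicClientApplication("00000000-0000-0000-0000-000000000000")

    # A traditional mail client needs roughly this much:
    mail_only = ["Mail.Read"]

    # An assistant that can "summarize anything of mine" needs the
    # union of everything my account can touch:
    everything_i_can_see = [
        "Mail.Read",
        "Files.Read.All",
        "Sites.Read.All",
        "Calendars.Read",
        "Chat.Read",
    ]

    # Both calls are ordinary MSAL usage; the difference is the blast
    # radius of the token that comes back.
    narrow = app.acquire_token_interactive(scopes=mail_only)
    broad = app.acquire_token_interactive(scopes=everything_i_can_see)

Until Microsoft explains otherwise, the second list is what "everything my Active Directory permissions allow" looks like in practice.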
