13.02.2025

Microsoft Copilot for M365 and privacy: How to use it securely in your organisation

Microsoft Copilot for M365 is revolutionising the way we work with Office applications, but it also poses significant privacy challenges. Find out how organisations can use the AI assistant in a secure and compliant way, which risks to consider, and what steps you need to take now to ensure data protection compliance.

What is Microsoft Copilot for M365?

Microsoft Copilot for M365 is an AI-powered assistant for Microsoft 365 applications and services (not to be confused with 'Microsoft Copilot', which works independently of M365 and can be used directly from the browser or a standalone app). In addition to Outlook, Microsoft Copilot for M365 supports applications such as Word, Excel and PowerPoint, each of which has its own Copilot. The assistant is designed to help users complete tasks faster and increase productivity; for example, it can create draft texts, presentations or formulas in the respective application.

To provide company-specific answers, Copilot accesses M365 customer data - such as documents stored in SharePoint - in real time via Microsoft Graph. The assistant uses a combination of large language models (LLMs) - such as GPT-4, which Microsoft has licensed from OpenAI - and specific algorithms based on deep learning techniques and large data sets. Copilot combines these language models with content from Microsoft Graph and the Microsoft 365 apps that the user is permitted to access.

In practice, the processing works as follows: the user first enters a prompt in one of the Microsoft 365 applications. Based on the user's permissions, Copilot gathers the user's specific business context - in particular, relevant content that the user is allowed to access - and sends it to the LLM together with the input. The output generated there is checked and post-processed by Microsoft before being returned to the application.
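
The following simplified Python sketch illustrates this flow conceptually. It is not Microsoft's actual implementation or API; all function and variable names are illustrative placeholders. The point is only that the grounding step is trimmed to the content the requesting user is permitted to access.

# Simplified conceptual sketch of the processing flow described above.
# All names are illustrative placeholders - this is not Microsoft's actual code or API.
from dataclasses import dataclass

@dataclass
class User:
    id: str
    readable_content: set  # identifiers of content this user may read

def fetch_permitted_context(user: User, prompt: str) -> list:
    """Grounding step: collect only content the user is allowed to access
    (in the real service via Microsoft Graph, e.g. SharePoint documents)."""
    all_content = {"doc-event-plan": "agenda ...", "doc-hr-salaries": "salaries ..."}
    return [text for doc_id, text in all_content.items() if doc_id in user.readable_content]

def call_llm(prompt: str, context: list) -> str:
    """The prompt plus the permission-trimmed context is sent to the LLM."""
    return f"Draft generated from {len(context)} grounded document(s)"

def post_process(output: str) -> str:
    """Placeholder for the checks applied before the answer is returned to the app."""
    return output

def copilot_request(user: User, prompt: str) -> str:
    context = fetch_permitted_context(user, prompt)  # 1. grounding via user permissions
    raw_output = call_llm(prompt, context)           # 2. LLM generates a draft
    return post_process(raw_output)                  # 3. checked, then shown in the application

print(copilot_request(User("ceo", {"doc-event-plan", "doc-hr-salaries"}), "Draft an invitation"))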

What are the privacy implications of using Microsoft Copilot in the enterprise?

In recent years, Microsoft has repeatedly been the focus of criticism from regulators. In particular, the German Data Protection Conference (Datenschutzkonferenz, DSK), the joint body of the German data protection supervisory authorities, commented on Microsoft 365 and concluded that it was hardly possible to use it in compliance with data protection law at the time.

In our view, it is indeed possible to use M365 - including Copilot - in a compliant manner, provided that the privacy challenges, the contracts and the configuration of M365 are addressed. However, some specific privacy issues arise in the context of Microsoft Copilot for M365.

Main issue: access to company data via user permissions

One of the main privacy issues with Copilot is that the tool can access - and use for its results - any company data that the individual user is permitted to access. The user does not need editing rights for this: even simple read-only permissions allow Copilot to analyse the data in question. In addition, the tool does not take confidentiality labels from Microsoft Information Protection into account, which means that confidential data can also be analysed and included in the output.

This poses a risk of unauthorised disclosure or other unauthorised processing, particularly in relation to employees' personal data. The potential evaluation of company know-how - especially trade secrets - is also sensitive from an economic perspective. The following example illustrates the problem.

Example of problematic data access by Microsoft Copilot

If, for example, a company's management wants to generate an invitation to a company event, Copilot accesses a large pool of data in the management's SharePoint on the basis of its - usually very extensive - access rights. This data may include sensitive information about employees, confidential negotiations and company know-how.

Since Copilot makes no distinction in terms of confidentiality, the tool evaluates all the data for a draft invitation. In the worst-case scenario, the finished invitation starts with an anecdote about an employee's pregnancy, alludes to a planned deal and makes references to poor business figures.

Another privacy issue: Linking Microsoft Copilot and Bing

Linking Copilot to Bing search raises another privacy concern. Within Microsoft 365, Microsoft processes the company's data as a processor, even when Copilot is used. By passing data to the search engine, however, Copilot leaves this sphere: Microsoft processes the data sent to Bing as a controller in its own right under data protection law.

Regulators have repeatedly cited processing under Microsoft's own responsibility as an argument against data protection compliance when controllers use Microsoft 365. According to the authorities, Microsoft does not adequately meet its accountability obligations under Art. 5(2) GDPR, which in their view makes lawful use of the Office suite impossible.

What measures must be taken to use Microsoft Copilot in a privacy compliant manner?

The main focus of data protection compliance with Microsoft Copilot is to limit the scope of the data processed.

Scope of data processing - user permissions

Depending on the use case, large amounts of personal data - possibly including special categories of personal data under Art. 9(1) GDPR - are processed when Copilot is used. Since the data Copilot can access depends on the scope of the user's permissions, permissions management is the key lever for data-minimising use. Copilot configurations should be reviewed and adjusted in the M365 Admin Centre, taking internal company governance into account. As a general rule, no user should have more permissions than necessary; in particular, read and write permissions for confidential data should be granted strictly on a need-to-know basis.
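
As a starting point for such a review, read permissions can also be audited programmatically. The following Python sketch uses the documented Microsoft Graph v1.0 endpoints for drives, items and item permissions to flag files in a SharePoint document library that are shared organisation-wide. It assumes an app registration with Sites.Read.All / Files.Read.All and an already obtained access token; the token and site ID shown are placeholders, not real values.

# Minimal sketch: flag items in a SharePoint document library that are shared
# org-wide, as input for a least-privilege review before rolling out Copilot.
# Assumes an Azure AD app registration with Sites.Read.All / Files.Read.All and
# a valid OAuth access token; TOKEN and SITE_ID are placeholders.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"   # placeholder - obtain e.g. via MSAL
SITE_ID = "<site-id>"      # placeholder SharePoint site id
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def get(url: str) -> dict:
    response = requests.get(url, headers=HEADERS, timeout=30)
    response.raise_for_status()
    return response.json()

# Default document library of the site and its top-level items
drive = get(f"{GRAPH}/sites/{SITE_ID}/drive")
items = get(f"{GRAPH}/drives/{drive['id']}/root/children")["value"]

for item in items:
    permissions = get(f"{GRAPH}/drives/{drive['id']}/items/{item['id']}/permissions")["value"]
    for permission in permissions:
        link = permission.get("link", {})
        # Sharing links scoped to the whole organisation make the content available
        # to every user's Copilot - exactly the over-permissioning described above.
        if link.get("scope") == "organization":
            print(f"Review: '{item['name']}' is shared org-wide, roles: {permission.get('roles')}")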

Scope of use - usage policy

Copilot can be used for a wide variety of purposes and therefore poses a high risk to the rights and freedoms of data subjects. A usage policy can define the permitted uses of the tool and serve as a basis for training employees. For example, prompts should not contain personal data or sensitive know-how.
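
As an illustration of such a rule, a simple pre-prompt check could warn users before obviously personal data is sent to Copilot. This is a purely illustrative Python sketch: the patterns are examples only and nowhere near a complete personal-data detector; in practice, dedicated DLP tooling would perform this task.

# Illustrative sketch of a pre-prompt check a usage policy might mandate: warn the
# user when a prompt appears to contain personal data before it is sent to Copilot.
# The patterns are examples only, not a complete personal-data detector.
import re

PATTERNS = {
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "IBAN": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),
    "date (possible date of birth)": re.compile(r"\b\d{2}\.\d{2}\.\d{4}\b"),
}

def check_prompt(prompt: str) -> list:
    """Return the categories of potentially personal data found in the prompt."""
    return [name for name, pattern in PATTERNS.items() if pattern.search(prompt)]

if __name__ == "__main__":
    prompt = "Draft a reply to max.mustermann@example.com about his contract dated 01.02.1990"
    findings = check_prompt(prompt)
    if findings:
        print("Policy warning - prompt appears to contain:", ", ".join(findings))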

Deletion and retention periods

As a matter of principle, personal data should only be retained for as long as is absolutely necessary. By implementing and enforcing company-specific retention and deletion policies, Copilot users can be made aware of this and the amount of data available for processing can be significantly reduced.

Restricting Bing, plug-ins and third-party services

Because of the significant privacy issues associated with linking Copilot and Bing search, this feature should generally be disabled in a business context. The use of other third-party plug-ins and services for Microsoft Copilot for M365 is also problematic and should be subject to a privacy risk assessment.

Data protection impact assessment for Microsoft Copilot

Due to the extensive processing capabilities and the specific risks of the AI tool, a data protection impact assessment (DPIA) should be conducted before Copilot is used. If the organisation already has a DPIA for Microsoft 365, it can be extended to cover Copilot. A DPIA allows the relevant processing operations to be assessed and subjected to a specific risk analysis. At the same time, the documentation provides a tool for addressing risks to the rights and freedoms of natural persons and can exonerate the company in the event of a dispute.

Our recommendation for compliant use of Microsoft Copilot

Copilot can generally be used in a data protection compliant manner, but user permissions and functions need to be fine-tuned for business use. In addition to technical restrictions via the Admin Centre, users should also be trained in the specifics of data protection law. Before implementation, the use of Copilot should be reviewed as part of a DPIA, or an existing DPIA for Microsoft 365 should be amended accordingly.

We have advised many companies on the use of Microsoft Copilot and know what is important. Simply arrange a no-obligation initial consultation to discuss how our specialist lawyers can assist you with the data protection compliant use of Copilot and M365.
