
Shortly after the launch of ChatGPT, engineers at Samsung's semiconductor division found themselves in a challenging situation. Striving to boost productivity, they explored how generative AI tools like ChatGPT could help streamline their daily tasks. Faced with complex coding problems and tight deadlines, they began using the AI to optimise test sequences and summarise internal meeting notes. What seemed like a practical solution soon led to unintended consequences.

Confidential information, including source code and internal project details, was inadvertently shared with ChatGPT, which by default retains user inputs for model training. That retention poses a significant privacy risk: once submitted, the data sits outside the company's control and may be used to train future models, meaning sensitive details could resurface in responses generated for other users, a serious breach of confidentiality. The repercussions were swift: Samsung imposed an immediate ban on the use of generative AI tools like ChatGPT across the company.

Samsung is not the only tech giant to take action on the use of generative AI tools by employees. According to press reports, Amazon issued a similar warning, instructing workers not to share code or confidential company information with ChatGPT after noticing examples of AI responses that appeared to reference internal Amazon data.

This incident underscores the hidden risks of using generative AI in the workplace without fully understanding the implications for data privacy. Research by McKinsey indicates that over 65% of employees turn to generative AI tools to meet deadlines and improve efficiency, but this convenience can inadvertently lead to significant data exposure.

How Microsoft Ensures GDPR Compliance with Copilot

As AI becomes more embedded in the workplace, organisations need assurances that data privacy is safeguarded. This is where Microsoft's approach with Copilot for Microsoft 365 stands out, providing a level of data privacy that matches the standards already established in Microsoft 365 services.

Through its Enterprise Data Protection (EDP) model, Microsoft has addressed the data privacy challenges of deploying a large language model (LLM)-based assistant in an enterprise setting. For enterprise customers, Microsoft provides assurance that data handled by Copilot meets the same stringent GDPR compliance standards as SharePoint, OneDrive, and Outlook, making Copilot a seamless extension of existing security measures. EDP is applied automatically when users sign in to Copilot with their Microsoft Entra ID.

This means that Copilot is covered by Microsoft's privacy, security, and compliance commitments to Microsoft 365 customers, including GDPR and the European Union (EU) Data Boundary. The EU Data Boundary commits Microsoft to storing and processing customer data within the EU, giving EU and EEA-based customers confidence that their sensitive information is handled in line with regional regulations. The boundary is part of Microsoft's ongoing effort to provide transparency and control over data flows, ensuring that customer data, both personal and organisational, remains within specified geographic limits.

For UK customers, data is typically processed in the home region, although Microsoft may process data in other regions. Microsoft's Data Processing Agreement outlines the commitments and controls to ensure compliance. Furthermore, where data is processed in the United States, data transfer is covered by the UK-US Data Bridge—a government-administered scheme to which Microsoft is a certified participant.

The Copilot service is hosted and operated by Microsoft on its Azure OpenAI Service, which is separate from OpenAI's publicly available services. This isolation ensures that customer data and prompts remain within the enterprise environment, maintaining privacy and security. Microsoft's infrastructure means that data never leaves its Azure cloud, which is critical for enterprises needing assurance that sensitive information stays within compliant boundaries.
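To make that separation concrete, here is a minimal sketch (illustrative only, not Copilot's internal code) of how an application calling Azure OpenAI targets a private, tenant-scoped endpoint rather than the public api.openai.com service behind consumer ChatGPT. The resource name, deployment name, and environment variable are hypothetical placeholders.

```python
import os

from openai import AzureOpenAI  # Azure-flavoured client in the official OpenAI SDK

# Hypothetical enterprise setup: the endpoint points at *your* Azure OpenAI
# resource inside your own tenant, not at the public api.openai.com service.
client = AzureOpenAI(
    azure_endpoint="https://contoso-openai.openai.azure.com",  # placeholder resource
    api_key=os.environ["AZURE_OPENAI_API_KEY"],                # key scoped to that resource
    api_version="2024-06-01",
)

# "model" is the name of a deployment created in that Azure resource, so prompts
# and responses are handled by Microsoft's Azure OpenAI service and are not
# used to train OpenAI's public models.
response = client.chat.completions.create(
    model="gpt-4o-enterprise",  # placeholder deployment name
    messages=[{"role": "user", "content": "Summarise this internal meeting note..."}],
)

print(response.choices[0].message.content)
```

Copilot for Microsoft 365 handles this plumbing for you; the point of the sketch is simply that enterprise traffic terminates at Microsoft-operated Azure endpoints rather than at OpenAI's consumer service.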

Moreover, prompts, data, and responses within Copilot are not used to train the broader models. For enterprise clients, this means that proprietary and customer confidential data used through Copilot remains secure and is never used to improve AI performance outside of the customer’s domain.

For enterprise customers, Microsoft has long provided confidence that their Microsoft 365 data—whether in SharePoint, OneDrive, or Outlook—is managed securely and remains compliant with data privacy regulations, including GDPR. Copilot for Microsoft 365 is aligned to the same standards, meaning that organisations can integrate AI safely without compromising their compliance posture.

This approach gives Microsoft Copilot a significant competitive advantage over alternatives such as Anthropic's Claude or OpenAI's ChatGPT, whose processing can take place outside the EU and whose GDPR compliance may therefore fall short of European requirements. While those organisations do document how they achieve GDPR compliance, verifying it means digging through multiple documents and understanding the relevant GDPR contractual clauses.

A Trusted Partner for Data Privacy

The Samsung story serves as a cautionary tale of what can go wrong, but Microsoft's approach with Copilot for Microsoft 365 represents a good news story. It shows that AI can be implemented securely, aligned carefully to privacy standards, and still deliver significant value without the typical compliance concerns. For enterprises, adopting Microsoft Copilot means embracing innovation with confidence that privacy and compliance are addressed, helping you leverage AI safely and effectively.

At Pivotal Edge AI we work with many clients in regulated industries who have enhanced data privacy and client confidentiality requirements, and we've done the deep dive on Microsoft's Enterprise Data Protection model. To find out more and discover how Microsoft Copilot can meet your data privacy needs, book a free discovery call through the form below.

Ready to find out how AI could transform your organisation? Book a free discovery call

Post by Geoff Davies
Mon, Oct 21, 2024