
AI Data Residency for UK and EU Markets: GDPR, the EU AI Act, UK GDPR, and Schrems II

What UK and EU compliance frameworks require for AI infrastructure: GDPR cross-border transfers, the EU AI Act, UK divergence, and when self-hosted inference is needed.

Morley Media Team · 4/15/2026 · 12 min read

This post covers the UK and EU regulatory landscape for AI data residency as of April 2026. It is a companion to our infrastructure guide on self-hosted AI, which covers the architecture, hardware selection, and operational reality of on-premise inference regardless of jurisdiction. We also have regional guides for US and Canadian markets.

The EU Landscape

The EU's approach to AI data residency is shaped by the intersection of three regulatory frameworks: the GDPR (which governs data protection and cross-border transfers), the EU AI Act (which governs AI systems based on risk classification), and the aftermath of Schrems II (which continues to complicate transfers to the US). Together, these create strong incentives for EU-based processing of AI workloads, even though none of them explicitly mandate data localization.

GDPR and Cross-Border Transfers

The GDPR does not require data to stay within the EU. What it requires is that personal data transferred outside the European Economic Area (EEA) receives protection essentially equivalent to what it receives within the EEA. The mechanisms for achieving this are Standard Contractual Clauses (SCCs), Binding Corporate Rules (BCRs), or adequacy decisions from the European Commission recognizing that a recipient country provides adequate protection.

For AI specifically, two GDPR articles apply the moment personal data touches a cloud AI tool:

Article 28 (Data Processor Obligations). When you send personal data to a cloud AI provider, that provider becomes a data processor acting on your behalf. You are required to have a Data Processing Agreement (DPA) with them covering the specific data types being sent, processing purposes, and security measures.

Articles 44-49 (International Transfer Safeguards). If the AI provider's servers are outside the EU/EEA, you need an appropriate transfer mechanism. Using a US-based AI tool without a valid transfer mechanism is not compliant, regardless of the provider's privacy policy. Cross-border transfer violations sit in the GDPR's highest penalty tier: fines of up to €20 million or 4% of global annual turnover, whichever is higher (Article 83).

The Schrems II aftermath. The Schrems II ruling invalidated the EU-US Privacy Shield in July 2020. The EU-US Data Privacy Framework (DPF), adopted in July 2023, provides a partial solution for transfers to certified US organizations. But the DPF's legal durability is uncertain. Privacy advocates have signaled challenges ("Schrems III"), and the fundamental tension remains: US surveillance law, particularly FISA Section 702, allows collection of non-US persons' data in ways that the CJEU has previously found incompatible with EU fundamental rights.

For AI deployments, the practical implication is that relying solely on the DPF for transfers to US-based AI providers carries legal risk. Many European organizations are choosing to process AI workloads within the EU to eliminate transfer risk entirely, even when the DPF would technically allow the transfer.

The Dutch financial regulator example. The Netherlands' AFM and DNB have issued guidance interpreting Schrems II with particular rigour for financial institutions. They require transfer impact assessments, supplementary measures including AES-256 encryption and pseudonymization, and contractual mechanisms limiting foreign government access. These requirements go beyond baseline GDPR and demand continuous monitoring, vendor transparency, and documented risk acceptance by senior management. Cloud architectures must support data residency controls, immutable audit logs, and the ability to suspend or terminate transfers if protective measures prove insufficient.
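The pseudonymization measure the Dutch regulators call for can be sketched with a keyed hash: each direct identifier is replaced by an HMAC-SHA256 pseudonym, with the key held only inside the EEA so re-identification is impossible without it. This is a minimal illustration using the Python standard library; the key handling, field names, and record shape are hypothetical, and AES-256 encryption of data at rest would be applied separately by the storage layer.

```python
import hmac
import hashlib

def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed pseudonym.

    HMAC-SHA256 keeps pseudonyms consistent across records (so joins
    and deduplication still work), while linking a pseudonym back to
    a person requires the key, which never leaves the EEA.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical example: key would come from an EU-resident KMS in practice.
key = b"example-key-held-in-eu-kms"
record = {"customer_email": "jan.devries@example.nl", "balance_eur": 1042.50}
safe_record = {**record, "customer_email": pseudonymize(record["customer_email"], key)}
```

The point of the keyed construction (as opposed to a plain hash) is that an attacker who obtains the pseudonymized dataset cannot brute-force common identifiers without also obtaining the key.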

The EU AI Act

The EU AI Act entered into force on August 1, 2024, with phased enforcement:

  • February 2025: Prohibited AI practices banned (social scoring, real-time biometric identification in public spaces with narrow exceptions)
  • August 2025: General-Purpose AI (GPAI) model requirements effective. Fifteen GPAI models were notified by January 2026.
  • August 2026: Full high-risk AI system obligations apply
  • August 2027: Embedded AI system rules begin

By Q1 2026, EU member states had issued 50 fines totalling approximately €250 million, primarily for GPAI non-compliance. Ireland handles a disproportionate share of cases due to the location of tech company headquarters there.

Data residency implications. The AI Act does not mandate EU-only data storage. However, the obligations it places on high-risk AI systems, including data governance, technical documentation, and record-keeping, are easier to evidence when processing happens on EU infrastructure, which creates practical incentives for EU-based processing.

Extraterritorial scope. The AI Act applies to any AI system placed on the EU market or whose output is used in the EU, regardless of where the provider is based. UK companies serving EU customers, US companies with EU users, and any organization whose AI systems affect EU residents are in scope.

The Digital Omnibus proposal. The European Commission has proposed amendments to the GDPR through the Digital Omnibus package, including provisions allowing organizations to rely on the "legitimate interest" legal basis for using personal data to train or operate AI systems, and extending the ability to process special category data for detecting and correcting AI bias beyond just high-risk AI system providers. Trilogue negotiations among EU institutions are expected in mid-2026. These changes, if adopted, would ease some compliance burden around AI training data, but would not change the cross-border transfer rules.

⚠️ For organizations deploying AI in the EU: Every cloud AI API call where personal data is sent to a non-EEA server is a cross-border transfer subject to Articles 44-49. This includes prompts containing customer names, email addresses, or any other personal data. If you are using ChatGPT, Claude, or any other US-based AI API with European personal data, you need a valid transfer mechanism (SCCs, DPF certification for the specific provider, or BCRs), a Transfer Impact Assessment, and supplementary measures if the recipient country's protections are deemed inadequate. Self-hosted inference on EU infrastructure eliminates this entire category of compliance work because no cross-border transfer occurs.
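One way to operationalize this rule is a routing guard in front of every outbound inference call: if a prompt appears to contain personal data and the target endpoint is not on an approved EEA list, the call is refused. The sketch below is illustrative only; the endpoint URL, the regex patterns, and the helper names are all hypothetical, and real PII detection needs a dedicated tool rather than two regexes.

```python
import re

# Hypothetical allow-list of self-hosted, EU-resident endpoints.
EEA_ENDPOINTS = {"https://inference.internal.example-eu.net"}

# Crude illustrative patterns only; production systems need proper PII detection.
PII_PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),        # email addresses
    re.compile(r"\b\d{2}[./-]\d{2}[./-]\d{4}\b"),  # dates of birth
]

def contains_personal_data(prompt: str) -> bool:
    """Return True if the prompt matches any known personal-data pattern."""
    return any(p.search(prompt) for p in PII_PATTERNS)

def route_prompt(prompt: str, endpoint: str) -> str:
    """Refuse to send prompts with apparent personal data to non-EEA endpoints."""
    if contains_personal_data(prompt) and endpoint not in EEA_ENDPOINTS:
        raise PermissionError(
            "Prompt appears to contain personal data; sending it to a "
            "non-EEA endpoint would trigger Articles 44-49 obligations."
        )
    return endpoint
```

A guard like this does not replace the legal analysis, but it turns "no personal data in cloud prompts" from a policy document into an enforced default.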

EU Sovereign Cloud Initiatives

Several sovereign cloud offerings have launched or been announced, both from European providers and from EU-incorporated entities of the US hyperscalers.

The critical distinction remains: a regional deployment from a US provider, even a sovereign cloud entity incorporated in the EU, may still face jurisdictional questions if the US parent company retains operational control. The Microsoft France testimony referenced in our infrastructure guide underscores that contractual assurances from US-parent companies about EU data protection have limits.

The UK Landscape

The UK's data protection and AI regulatory environment has diverged from the EU since Brexit, creating a dual-compliance challenge for organizations operating across both jurisdictions.

UK GDPR and the Data (Use and Access) Act 2025

The UK GDPR was created by onshoring the EU GDPR into domestic law through the European Union (Withdrawal) Act 2018. The Data Protection Act 2018 supplements it. The Data (Use and Access) Act 2025 (DUAA), which received Royal Assent on 19 June 2025, introduced the most significant changes to UK data law since Brexit, with most data protection provisions taking effect on 5 February 2026.

Key changes relevant to AI:

Relaxed automated decision-making rules. The DUAA significantly relaxed Article 22 UK GDPR provisions on automated decision-making. Under the new rules, solely automated decisions with legal or significant effects can be made using a wider range of legal bases, including legitimate interests, without requiring explicit consent in all cases. This is more permissive than the EU GDPR position, where ADM remains prohibited unless narrow exceptions apply.

Legitimate interests. The UK now has a list of recognized legitimate interests (Schedule A1) that do not require the full three-part balancing test required under EU GDPR. Direct marketing is explicitly listed as a legitimate interest.

Senior Responsible Individual. The EU's mandatory Data Protection Officer (DPO) has been replaced with a "senior responsible individual" (SRI) with different expertise requirements and protections.

PECR fines increased. Privacy and Electronic Communications Regulations fines have been raised to £17.5 million or 4% of global turnover, up from £500,000. This makes direct marketing compliance (including AI-powered personalization) a board-level risk.

Data residency. UK GDPR does not mandate that data stay within the UK. International transfers require adequate safeguards, similar to the EU framework but using a "data protection test" (whether protection is "materially lowered") rather than the EU's "essential equivalence" standard.

UK-EU Data Adequacy

UK-EU data adequacy was renewed on 19 December 2025 for six years (until December 2031). This means personal data continues to flow freely from the EU to the UK without requiring SCCs or other transfer mechanisms.

However, data adequacy does not exempt UK businesses from EU AI Act obligations. If your AI systems are placed on the EU market, affect EU residents, or produce outputs used in the EU, the EU AI Act applies regardless of adequacy status. UK companies with EU-based clients must comply with both UK GDPR and EU AI Act requirements, and must appoint an authorized EU representative for AI Act purposes.

The adequacy decision could be affected if the UK diverges too far from EU data protection standards. The European Commission monitors UK data protection law, and the EDPB has provided opinions as part of the review process. Organizations should not assume adequacy is permanent.

UK AI Regulation

The UK has no standalone AI legislation in force as of April 2026. AI is regulated through existing laws (UK GDPR, DPA 2018, DUAA 2025, Equality Act 2010) applied by sector regulators including the ICO, CMA, and ASA.

A UK AI Bill was promised in the July 2024 King's Speech but has not materialized. The government has indicated it may appear in the spring 2026 King's Speech (scheduled for 13 May 2026), though this remains uncertain. If introduced, it is expected to be narrow in scope, covering advanced AI models and AI-copyright provisions.

The ICO has published an AI and Biometrics Strategy that focuses enforcement on automated decision-making in employment, transparency, and redress. The government launched a £500 million Sovereign AI Unit to secure domestic computing capabilities.

Without dedicated AI legislation, the UK's approach is more permissive than the EU's on paper, but the ICO is an active enforcer. Organizations that comply with the EU AI Act will generally satisfy UK requirements, but not necessarily vice versa.

Dual Compliance for UK-EU Operations

Organizations operating across both the UK and EU face a dual-compliance requirement. The most significant practical differences are:

  • ADM rules are stricter in the EU than in the UK post-DUAA
  • The EU AI Act imposes risk-classification requirements that have no UK equivalent (yet)
  • The legitimate interests framework is more flexible in the UK than in the EU
  • DPO requirements differ between the two regimes (DPO in EU, SRI in UK)
  • Transfer mechanisms differ (EU "essential equivalence" vs UK "data protection test")

For AI deployments, the practical approach is to design for the stricter standard (EU GDPR + EU AI Act) and apply UK-specific relaxations where they clearly apply and are documented. Building to the EU standard and selectively relaxing for UK-only operations is significantly easier than building to the UK standard and trying to tighten for EU compliance after the fact.
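The "design for the stricter standard, relax where documented" approach can be made concrete as a single policy object that defaults to the EU baseline and applies UK relaxations only for traffic with no EU data subjects in scope. The field names and values below are an illustrative simplification of the differences listed above, not a real compliance library.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CompliancePolicy:
    """Illustrative policy knobs; real deployments need far more granularity."""
    allow_adm_on_legitimate_interest: bool   # solely automated decision-making
    requires_ai_act_risk_classification: bool
    transfer_standard: str

# EU baseline is the stricter default; UK-only operations may relax
# ADM rules under the DUAA and drop AI Act risk classification (for now).
EU_BASELINE = CompliancePolicy(
    allow_adm_on_legitimate_interest=False,
    requires_ai_act_risk_classification=True,
    transfer_standard="essential equivalence",
)
UK_RELAXED = CompliancePolicy(
    allow_adm_on_legitimate_interest=True,
    requires_ai_act_risk_classification=False,
    transfer_standard="data protection test",
)

def policy_for(serves_eu: bool) -> CompliancePolicy:
    """Default to the EU baseline whenever any EU data subject is in scope."""
    return EU_BASELINE if serves_eu else UK_RELAXED
```

Encoding the default this way makes the asymmetry explicit in code review: relaxing to the UK policy is an opt-in decision that must be justified, rather than something that happens by omission.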

Financial Services: Additional Requirements

EU. The European Banking Authority (EBA), EIOPA (insurance), and ESMA (securities) each provide guidance on AI governance, outsourcing, and ICT risk management. The Digital Operational Resilience Act (DORA), applicable from January 2025, imposes requirements on financial entities' ICT third-party service provider management, including AI vendors.

UK. The FCA and PRA apply existing frameworks (including the Senior Managers and Certification Regime) to AI governance. The FCA has published guidance on AI in financial services and has enforcement authority over firms using AI in ways that harm consumers.

Dutch regulators. As noted above, the AFM and DNB apply Schrems II with particular rigour, requiring measures that exceed baseline GDPR for cloud and AI deployments in financial services.

What This Means for Infrastructure Decisions

EU organizations. The combination of GDPR cross-border transfer rules, Schrems II residual uncertainty, and the EU AI Act's documentation and governance requirements creates strong practical incentives for EU-based AI processing. Self-hosted inference on EU infrastructure eliminates transfer risk and simplifies AI Act compliance documentation. For organizations that prefer cloud services, the emerging sovereign cloud offerings from hyperscalers provide a middle ground, but the jurisdictional questions around US-parent companies remain.

UK organizations serving only UK customers. The DUAA's relaxation of ADM rules and the UK's more flexible legitimate interests framework mean that cloud AI APIs are viable with appropriate DPAs and risk assessments. UK data adequacy with the EU simplifies data flows. Self-hosted infrastructure is less driven by regulatory necessity and more by security preferences or cost optimization.

UK organizations serving EU customers. You need to comply with both regimes. The EU AI Act applies to your AI systems if they affect EU residents. EU GDPR applies to personal data of EU data subjects. Design for the stricter standard and document your compliance with both.

Our infrastructure guide covers the technical architecture, hardware selection, security design, and operational reality for self-hosted AI deployments.

Need help with AI compliance for UK or EU markets? We build compliant AI infrastructure for organizations navigating GDPR, the EU AI Act, UK GDPR, and sector-specific requirements. From compliance assessment through architecture, deployment, and ongoing operations, we can help you navigate the regulatory requirements for your specific jurisdiction and data classification. Talk to our team.

Get Expert Help

Tags

data-residency · GDPR · EU-AI-Act · UK-GDPR · Schrems-II · compliance · AI-infrastructure
