EU DMA Expansion: Cloud & AI Gatekeeper Audits [2026]
Bottom Line
The EU did not rewrite the DMA in April 2026. It expanded enforcement in practice by using existing DMA tools to scrutinize cloud platforms, AI access paths, portability, defaults, search data, and privacy controls.
Key Takeaways
- The Commission's April 28, 2026 DMA review kept the law unchanged but elevated cloud and AI as priority targets.
- Three cloud DMA investigations opened on November 18, 2025, including gatekeeper probes into AWS and Azure.
- EU businesses hit 52.7% paid cloud adoption and 20.0% AI adoption in 2025, raising the stakes for lock-in controls.
- Article 17 DMA investigations target 12 months, with preliminary findings expected within 6 months.
- Data egress charges are scheduled to disappear under the Data Act on January 12, 2027.
On April 28, 2026, the European Commission said the Digital Markets Act remains fit for purpose. That sounds like stasis, but the engineering reality is the opposite: the EU is widening enforcement into the stack below consumer apps. Cloud control planes, AI access paths, search-data interfaces, portability tooling, consent flows, and interoperability layers are all moving into a more inspectable regime. For infrastructure teams, the interesting question is no longer whether the DMA reaches foundational systems. It is how your platform proves that it does not weaponize them.
The Lead
Bottom Line
The April 2026 DMA review did not add a brand-new cloud or AI chapter. It operationalized an audit model by applying existing DMA instruments to cloud infrastructure and AI distribution choke points.
The important nuance is procedural. Brussels did not announce a sweeping rewrite of the statute. It said the current framework is strong enough, then immediately pointed to cloud computing and AI services as priority areas for enforcement. That is a classic platform-regulation move: keep the law stable, expand the inspection surface, and force implementation detail into proceedings, compliance reports, consultations, and market investigations.
Three dates matter.
- November 18, 2025: the Commission opened three DMA market investigations into cloud computing services, including designation probes into Amazon Web Services and Microsoft Azure.
- January 27, 2026: the Commission opened specification proceedings tied to Google obligations with an explicit AI dimension, including interoperability and search-data access.
- April 27, 2026: the Commission opened a consultation on draft interoperability measures for Google Android aimed at AI services.
Read together, those actions show what "gatekeeper audit" means in 2026: not a single inspection checklist, but a layered supervision model applied to infrastructure dependencies that can quietly recreate lock-in.
Architecture & Implementation
What counts as an audit under the DMA now
For engineers, the DMA's inspection model is best understood as four overlapping mechanisms.
- Article 15 profiling audit: designated gatekeepers must submit an independently audited description of consumer-profiling techniques within six months of designation and update the public overview annually.
- Annual compliance reporting: gatekeepers publish recurring reports and defend implementation choices in Commission-run workshops.
- Article 17 market investigations: the Commission can assess whether a service should be designated a gatekeeper even when it does not obviously trip the headline thresholds.
- Specification proceedings: the Commission can move from broad obligation language to concrete technical measures on interoperability, access, and data sharing.
This matters because cloud and AI infrastructure rarely fail compliance in one obvious place. The problem shows up in joins between systems: billing plus egress, identity plus defaults, search plus ranking APIs, model deployment plus telemetry, consent plus training data, or mobile OS privileges plus assistant distribution.
The reference architecture regulators are implicitly asking for
If you run a cloud platform, AI platform, operating system, search engine, or developer ecosystem that could face DMA scrutiny, the safest design pattern is an evidence-first control plane. In practice, that means you need machine-verifiable proof across six layers.
- Service inventory: a current map of designated or potentially designatable services, internal dependencies, and user touchpoints.
- Portability layer: export pipelines for customer data, metadata, and workload descriptors in common machine-readable formats.
- Interoperability layer: documented APIs, access policies, protocol adapters, and objective eligibility criteria for third parties.
- Choice-and-default layer: logs showing when defaults can be changed, how many clicks are required, and whether equivalent competitors receive the same OS or UI affordances.
- Privacy provenance layer: records proving when personal data can and cannot be combined, reused, or grounded into AI workflows without consent.
- Evidence retention layer: immutable artifacts for audits, workshops, consultations, and litigation-grade review.
A minimal evidence manifest can look like this:
```json
{
  "service": "search-data-sharing",
  "regime": ["DMA Article 6", "GDPR", "Data Act-adjacent portability"],
  "owner": "platform-governance",
  "artifacts": {
    "api_schema": "v2026-04-15",
    "access_policy": "frand-policy-3",
    "anonymization_test": "run-88421",
    "consent_ruleset": "privacy-graph-12",
    "sla_report": "q2-2026-week-17"
  },
  "review_window_days": 30
}
```

The point is not the format. The point is reversibility. A regulator should be able to ask why one class of competitor got delayed, degraded, overcharged, or denied, and your platform should answer from system evidence instead of policy prose.
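As a sketch of how such a manifest could be checked automatically before a review window opens: the field names follow the example above, while the required-artifact set and the 90-day freshness rule are assumptions, not a regulatory checklist.

```python
import json

# The example manifest from above; in practice this would be loaded from
# evidence storage rather than embedded as a string.
MANIFEST = json.loads("""
{
  "service": "search-data-sharing",
  "regime": ["DMA Article 6", "GDPR", "Data Act-adjacent portability"],
  "owner": "platform-governance",
  "artifacts": {
    "api_schema": "v2026-04-15",
    "access_policy": "frand-policy-3",
    "anonymization_test": "run-88421",
    "consent_ruleset": "privacy-graph-12",
    "sla_report": "q2-2026-week-17"
  },
  "review_window_days": 30
}
""")

# Artifact classes a reviewer would plausibly expect for a data-sharing
# service (an assumption for illustration, not an official list).
REQUIRED_ARTIFACTS = {"api_schema", "access_policy",
                      "anonymization_test", "consent_ruleset"}

def validate_manifest(manifest: dict) -> list[str]:
    """Return a list of problems; an empty list means the manifest passes."""
    problems = []
    for field in ("service", "regime", "owner", "artifacts", "review_window_days"):
        if field not in manifest:
            problems.append(f"missing field: {field}")
    missing = REQUIRED_ARTIFACTS - set(manifest.get("artifacts", {}))
    if missing:
        problems.append(f"missing artifacts: {sorted(missing)}")
    if manifest.get("review_window_days", 0) > 90:
        problems.append("review window longer than 90 days")
    return problems

print(validate_manifest(MANIFEST))  # [] when the manifest is complete
```

Wiring a check like this into CI means an incomplete evidence bundle fails a build long before it fails a consultation.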
This is also where a privacy-safe toolchain becomes operational rather than cosmetic. If your compliance story depends on external sharing of ranking, query, click, or telemetry data, you need repeatable anonymization and masking workflows. TechBytes' Data Masking Tool is the kind of utility that fits directly into test fixtures and disclosure dry-runs before production evidence leaves the building.
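A minimal sketch of a repeatable pseudonymization step for query logs, using a keyed HMAC so masked identifiers stay joinable across datasets; the field names and key handling are illustrative assumptions, not a reference to any specific tool's API.

```python
import hashlib
import hmac

# Illustrative secret; in production this would come from a
# key-management service with rotation, never a source file.
MASKING_KEY = b"rotate-me-quarterly"

def mask_identifier(value: str) -> str:
    """Deterministically pseudonymize an identifier: the same input always
    maps to the same token, so joins across datasets still work, but the
    raw value never leaves the building."""
    return hmac.new(MASKING_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

def mask_query_log(record: dict) -> dict:
    """Mask direct identifiers while keeping analytic fields intact."""
    masked = dict(record)
    for field in ("user_id", "device_id"):
        if field in masked:
            masked[field] = mask_identifier(masked[field])
    return masked

row = {"user_id": "u-1842", "query": "cloud egress fees", "clicks": 3}
print(mask_query_log(row))  # identifier replaced, query and clicks preserved
```

Determinism is the design choice that makes this usable in disclosure dry-runs: two teams masking the same dataset independently produce identical tokens, so evidence remains comparable across runs.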
Benchmarks & Metrics
The policy shift is happening because the underlying market is already infrastructure-heavy. In 2025, 52.7% of EU enterprises used paid cloud services, and 20.0% used AI technologies. That is enough scale for cloud switching, AI placement, and search-data access to become first-order competition issues rather than niche complaints.
| Metric | Latest verified value | Why it matters |
|---|---|---|
| EU enterprises using paid cloud services | 52.7% in 2025 | Cloud lock-in is now a mainstream market design problem. |
| EU enterprises using AI technologies | 20.0% in 2025 | AI access paths are becoming a broad business dependency. |
| Current DMA footprint | 7 gatekeepers, 23 core platform services | The supervisory perimeter is already large enough to shape adjacent infrastructure behavior. |
| Gatekeeper presumption threshold | 45 million monthly end users and 10,000 yearly business users | These are the headline numbers, but cloud probes show the Commission will look beyond them. |
| Article 17 investigation target | 12 months | Infrastructure teams should plan for long-running evidence production, not one-off responses. |
| Article 17 preliminary findings target | 6 months | Internal readiness needs to exist before the formal midpoint. |
| DMA maximum fines | 10% of worldwide turnover, or 20% for repeated infringements | Compliance arguments now sit directly next to board-level risk management. |
| Data Act egress-charge deadline | January 12, 2027 | Cloud portability economics will tighten even if a provider avoids DMA designation. |
What to measure internally
The best DMA benchmark is not a public KPI. It is your ability to prove neutral treatment under load, over time, and across classes of competitors.
- Switching completion time: median and p95 time to export data and move workloads.
- Functional equivalence gap: percentage of shared features preserved after migration to a same-type destination service.
- Default-change friction: clicks, screens, and elapsed time to change search, browser, assistant, or messaging defaults.
- Third-party access latency: time from eligibility approval to usable API or dataset access.
- Anonymization error budget: rejection rate, re-identification test outcomes, and remediation time for shared datasets.
- Consent provenance coverage: fraction of model-training, grounding, and recommendation jobs with auditable lawful-basis metadata.
If those numbers are missing, your real state is not compliant or non-compliant. It is unobservable, which is often worse.
Strategic Impact
Cloud stops competing only on price and performance
For hyperscalers, the biggest change is that portability and interoperability are becoming product attributes with regulatory weight. Historically, customers tolerated awkward exports, proprietary control planes, format friction, or expensive data movement because the alternative was operational pain anyway. The combination of DMA scrutiny and the Data Act changes that bargain.
- Product managers now need a defensible position on why a migration blocker exists.
- Platform teams need service contracts that separate normal usage fees from switching-related costs.
- Architecture reviews need to flag where "differentiation" is actually just disguised dependency.
AI infrastructure becomes part of market-access policy
The AI angle is subtler but potentially bigger. The Commission has already signaled interest in easy default changes, equal AI access to operating systems, search-data sharing for beneficiaries including AI chatbots with search functionality, and restrictions on combining personal data for AI training or grounding without consent. That turns AI infrastructure into a regulated distribution question.
- Operating systems become gatekeepers for assistant placement and privileged device capabilities.
- Search engines become infrastructure suppliers for downstream retrieval and ranking systems.
- Identity and privacy systems become hard compliance dependencies for model training and personalization.
- Developer platforms become evidence pipelines because every access decision may later need justification.
For smaller vendors, that is not just a burden. It is an opening. If dominant platforms must expose cleaner interoperability and data-sharing paths, challengers can spend less energy reverse-engineering access and more on differentiated product quality. That is the competitive thesis underneath the enforcement burst.
Road Ahead
As of April 30, 2026, the next year looks less like a legislative drama and more like a systems-engineering sprint.
- The AWS and Azure cloud investigations will keep pressure on portability, interoperability, and fair access design through late 2026 if they follow the normal Article 17 clock.
- The AI Act already applies GPAI obligations from August 2, 2025, and the broader framework becomes fully applicable on August 2, 2026, creating overlap with DMA evidence demands.
- The Data Act removes switching and data-egress charges from January 12, 2027, which means cloud contract, billing, and migration tooling should already be on the roadmap.
For engineering leaders, the implementation checklist is straightforward.
- Map every place where customers or rivals cross a platform boundary: export, search, identity, ranking, billing, defaults, and mobile OS access.
- Attach objective eligibility logic and machine-readable evidence to each boundary.
- Measure migration quality, not just migration possibility.
- Separate privacy governance for ads, personalization, training, and grounding flows.
- Assume consultations and specification proceedings will ask for runtime proof, not principles.
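The second checklist item, objective eligibility logic with machine-readable evidence attached, could be sketched like this; the criteria, record fields, and applicant names are assumptions for illustration.

```python
import json
import time

# Illustrative published eligibility criteria for a search-data API.
CRITERIA = {
    "verified_business": True,
    "security_review_passed": True,
}

def decide_access(applicant: dict) -> dict:
    """Apply the same published criteria to every applicant and emit an
    evidence record justifying the outcome."""
    failed = [name for name, required in CRITERIA.items()
              if required and not applicant.get(name)]
    decision = {
        "applicant": applicant["name"],
        "granted": not failed,
        "failed_criteria": failed,
        "criteria_version": "2026-04",
        "timestamp": int(time.time()),
    }
    # In a real system this would append to immutable evidence storage,
    # not stdout.
    print(json.dumps(decision))
    return decision

granted = decide_access({"name": "rival-search-llc",
                         "verified_business": True,
                         "security_review_passed": True})
denied = decide_access({"name": "unverified-app"})
```

The useful property is symmetry: a grant and a denial produce the same record shape, so "why was this party treated differently?" is always answerable from the same log.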
The strategic takeaway is simple. The EU's DMA expansion into cloud and AI infrastructure is happening mostly through enforcement mechanics, not headline amendments. That makes it more technical, less theatrical, and much harder to ignore. The winners will be the platforms that can demonstrate neutral behavior as a property of system design rather than as a promise in a compliance memo.
Frequently Asked Questions
What does the EU DMA expansion to cloud and AI actually mean in 2026?
The law itself is unchanged. The Commission is applying existing DMA instruments, including market investigations, specification proceedings, and audited compliance reporting, to cloud platforms and AI access paths as priority enforcement areas.
Are AWS and Azure already DMA gatekeepers?
Not for cloud. The market investigations opened on November 18, 2025 are assessing whether those cloud services should be designated, with preliminary findings targeted within 6 months and a 12-month overall clock.
How does the Data Act interact with DMA cloud enforcement?
It tightens portability economics from the other side: switching and data-egress charges are scheduled to disappear on January 12, 2027, regardless of whether a cloud provider is ever designated under the DMA.
What should platform teams log for a DMA-style audit?
Log eligibility decisions, API-access timestamps, export completeness, migration outcomes, default-change flows, pricing logic, anonymization tests, and consent provenance. If you cannot reconstruct why one party got degraded access, your control plane is not audit-ready.