On 9 October 2025, the European Commission and the European Data Protection Board (EDPB) issued joint guidelines on the interplay between the Digital Markets Act (DMA) and the GDPR, providing crucial clarification on areas of potential conflict. We highlight some issues that might be closer to a solution now that we have the guidelines.
Eliminating ambiguity on consent and legal bases (Art. 5(2) DMA)
One of the most contentious areas of the DMA has been determining what constitutes valid consent in the context of gatekeepers and whether other legal justifications can be used to bypass user choice. The new guidelines help on some core issues.
Issue: the combination of user data for AI training
Large-scale data processing, such as AI training, is directly affected by Article 5(2) DMA and GDPR consent principles. While 'AI training' is not explicitly listed, it arguably constitutes a form of "service development", which is mentioned in the guidelines.
If the AI training model requires combining the same end user’s personal data from one Core Platform Service (CPS) with data from any other service provided by the gatekeeper, this falls squarely under the prohibition on data combination in Article 5(2)(b) DMA. Consequently, the gatekeeper must secure the end user’s granular consent for the distinct purpose of "service development" (e.g. AI training), and this purpose cannot be bundled with others, such as content personalisation or advertising.
But section 2.6 of the guidelines details specific types of data processing that do not trigger the mandatory consent requirement stipulated under DMA Article 5(2). And that may give rise to AI training opportunities.
Issue: processing within a single service (intra-service processing)
Processing personal data that a gatekeeper's CPS obtains directly from interactions with a specific end user, without combining it with personal data from any other gatekeeper service or third parties, falls outside the scope of Article 5(2) DMA.
Issue: cross-use in supporting services
Gatekeepers are not required to obtain consent under Article 5(2)(c) DMA for the cross-use of personal data between a CPS and other gatekeeper services that are provided together with or in support of that CPS. This exemption only applies to personal data that is strictly necessary to provide the interconnected functionality (e.g. using identification details from a CPS for a supporting payment or delivery service). The use must also align with end users’ reasonable expectations.
Even where DMA consent is not required, the gatekeeper must still rely on a valid GDPR lawful basis. Processing for online advertising services generally cannot be justified on the grounds of contractual necessity. The cross-use of a limited set of on-platform personal data (like geography, language, or content topics) in a supporting advertising service might potentially be justified on the grounds of legitimate interest (Article 6(1)(f)), provided it does not involve intrusive measures like profiling or tracking, and it is within reasonable user expectations.
Lawful basis
Under the GDPR, gatekeepers can only process personal data in accordance with an Article 6 GDPR lawful basis (and where an Article 9 exception applies in relation to special data).
Issue: the restriction of lawful grounds
Before these guidelines, gatekeepers sometimes argued that certain data combination activities (e.g. cross-using data across services) were justified by legitimate interest (Article 6(1)(f)) or contractual necessity (Article 6(1)(b)) - an argument that has been rejected under the GDPR and traditional competition law. The guidelines firmly put an end to this argument under the DMA too, echoing the Commission Meta 'pay or OK' decision and the EDPB Opinion for large online platforms on the 'consent or pay' model (Opinion) (see here for more). For the specific processing activities listed in Article 5(2) DMA (combining, cross-using, processing for online advertising services, and signing in to combine data), gatekeepers cannot rely on the performance of a contract or on legitimate interests. They must get consent unless they can rely on a lawful basis under Article 6(1)(c)-(e) of the GDPR.
- Legal obligation (c): This typically covers processing necessary for the gatekeeper’s compliance with legal obligations, such as for network security, service integrity, or fraud detection.
- Vital/public interest (d/e): Reliance on vital interests or public interest tasks is possible but only in very limited scenarios, given the economic and commercial nature of the gatekeeper's activities.
- Data segregation: Personal data processed under these legal duties must be protected against reuse for commercial purposes (e.g. through the segregation of data in separate filing systems).
Issue: the unsuitability of binary paywalls
Despite prior rulings and the Opinion on the use of 'pay or OK' models for online advertising, ambiguity persisted over whether presenting users with a choice between consenting to tracking (for behavioural advertising) or paying a fee could constitute "freely given" consent under the GDPR.
The Opinion, heavily referenced by the guidelines, established that for large online platforms, offering a binary choice between consenting to tracking and paying a fee will not, in most cases, result in valid consent. The guidelines effectively set out a high threshold for "freely given" consent, citing the inherent imbalance of power and the potential for detriment (negative consequences) when gatekeepers leverage their market position, network effects, or social prominence.
Issue: limits on repetitive consent requests
Platforms often used repetitive or slightly modified consent requests to fatigue users into clicking 'Accept.'
The guidelines introduce a definitive limit: once an end user has actively refused or withdrawn consent for a purpose covered by Article 5(2) DMA, the gatekeeper is prohibited from repeating that request for the same purpose within a period of one year. They must refrain from presenting "slightly modified consent requests (e.g. consent requests with different wording) within the same year, that seek to obtain consent for the same processing operations and for essentially the same purposes".
Clarity on data access and divided responsibility (Articles 6(9) and 6(10) DMA)
The guidelines also resolve key ambiguities concerning the sharing of data with third parties - both at the end user's request (portability) and the business user's request (access).
Issue: portability scope — including third-party data
Article 20 GDPR limits the portability right to data concerning the data subject. It was previously unclear whether Article 6(9) DMA obliged gatekeepers to port datasets that include personal data concerning other individuals (e.g. contacts or chat partners).
The guidelines confirm that portability rights under the DMA are broader. The obligation applies to personal data of data subjects other than the end user if that data is provided or generated through the end user’s activity within the CPS. The gatekeeper is legally obliged (Article 6(1)(c) GDPR) to facilitate this transfer, although it must also provide the means to exclude the data of other individuals.
Issue: responsibility for valid business user consent
Article 6(10) DMA grants business users access to end-user data provided the end user consents. A major point of dispute was the division of responsibility for obtaining and validating this consent between the gatekeeper (who provides the service interface) and the business user (who uses the data).
The guidelines establish a clear division of responsibility: the gatekeeper's obligation is to provide the mechanism to enable business users to obtain consent and to keep a record that consent was granted. The gatekeeper is not, however, responsible for assessing or verifying the validity of the consent obtained by the business user. That responsibility, including ensuring the consent is specific, informed, and freely given, lies solely with the business user, subject to monitoring by the competent national Supervisory Authorities.
Practical definition of anonymisation (Article 6(11) DMA)
For online search engines, Article 6(11) DMA requires gatekeepers to share ranking, query, click, and view data with competing search providers, provided that any personal data is anonymised.
Issue: who needs to anonymise the data, and how is it assessed?
It was unclear whether "anonymised" meant ensuring no one in the world could re-identify the user, or whether it focused on the recipient's means of identification.
The guidelines focus the obligation: only the personal data of the end user generating the data needs to be anonymised. Personal data of third parties (e.g. a person mentioned in a query) remains personal data and further processing of this data must comply with the GDPR. Furthermore, the assessment of anonymity must consider all the means reasonably likely to be used by the third party undertaking receiving the data, or by another person, to re-identify the end user.
Anonymisation should be achieved through appropriate technical measures for alteration of the data, complemented by organisational, administrative, and contractual measures to mitigate residual likelihood of identification.
A step in the right direction
The guidelines (which are subject to consultation until 4 December 2025) address several persistent and highly disputed issues relating to the processing of personal data by large platforms designated as gatekeepers under the DMA. For gatekeepers and business users alike, this guidance provides some clarity, although only time and the courts will tell whether the interpretation is sound.