Regulating artificial intelligence involves more than data protection, but data protection remains one of its crucial aspects. The AI Act risks establishing enforcement structures parallel to the data protection authorities (DPAs), which might result in legal uncertainty.

An article by Paweł Hajduk

The issue of overlapping enforcement structures in EU data law is hardly new; it is part of the risks arising from the proliferation of EU legislation in this area. In a number of proposals – for example, the Data Governance Act, the Data Act Proposal, the Digital Services Act, and the European Health Data Space Regulation – the EU legislator has allowed national legislators to designate the authorities that enforce these acts. The same applies to the AI Act (the European Parliament's version is analysed here). This may lead to the designation of different authorities with overlapping competencies. Although national legislators could alleviate this by appointing the same authorities across all these legal acts, such a scenario seems unlikely, as the example of Spain shows.

Digital entanglement

The GDPR and the wider EU data legislation are drafted in general terms, in the spirit of technological neutrality. This layers on top of a systemic reluctance to regulate technical standards in EU statutory law. Consequently, enforcement authorities are compelled to concretise legal norms for individual cases during administrative proceedings. Through soft law, the task of clarifying the meaning of legal norms shifts from the level of statutory law to the level of enforcement authorities. This is an understandable mechanism that prevents statutory law from becoming obsolete. Meanwhile, supervised entities must take account of all the authorities' positions when securing their compliance, since these authorities may impose fines. In this way, the positions of the authorities drive the interpretation of EU data law.

The existence of parallel enforcement structures in the EU data law, including the AI Act and the GDPR, means a risk of double jeopardy, divergent interpretations of the same issues, and dilution of the role of the data protection authorities.

In the AI Act context, leaving the designation of enforcement structures to the Member States has three consequences. The first concerns the scope of the authorities' competencies: there will be no clear boundaries between the enforcement authorities under the AI Act and the remaining enforcement silos, especially the data protection authorities, whose competencies are drafted broadly. Secondly, there are concerns about an adequate cooperation model between the different enforcement silos and the Member States – a recurrent issue in enforcing the GDPR. The third pertains to the organisational cultures of the various authorities: Member States may designate existing authorities from different regulatory cultures (for example, data protection or competition authorities). It can be hypothesised that these authorities will look at the AI Act through the lens of their original areas of interest.

The intersection between the AI Act and the GDPR

The intersection between the AI Act and the GDPR has already been analysed, e.g., in the EDPB-EDPS Joint Opinion on the AI Act Proposal, the CEDPO Guide for DPOs, and an analysis by the Future of Privacy Forum. Therefore, only a few remarks are offered here.

The proposed AI Act, which relies on the same treaty basis as the GDPR, refers to the GDPR repeatedly, including its principles, emphasising that they must be embedded in AI systems (Recital 45a of the AI Act). An example is the separate legal basis established in the AI Act, which exceptionally allows the processing of special categories of personal data to ensure “negative bias detection and correction” in high-risk AI systems (Article 10(5) of the AI Act). This basis is subject to highly evaluative conditions. Hence, the question arises as to which authority will assess them. Further, if authorities separate from the data protection authorities are designated, will their positions differ from those of the data protection authorities?

The GDPR provides for a two-tier enforcement model, with a leading role for the data protection authorities in each Member State, complemented by the advisory and consistency mechanisms of the EDPB and the EDPS. Even in the absence of AI-specific laws, data protection authorities are already taking positions on AI, a prominent example being the Italian Data Protection Authority temporarily banning the chatbot ChatGPT.

In turn, the AI Act’s two-level enforcement model seems convoluted. The AI Act introduces several separate terms: “national supervisory authority,” “national competent authority,” “market surveillance authority,” and “notifying authority.” This is all the more perplexing as the terminology differs between the particular proposals. At the EU level, the European Artificial Intelligence Office (comprising representatives of each Member State’s authority, the Commission, and the EDPS) plays a key role. Simplifying, the primary responsibility will rest with the national supervisory authorities designated by each Member State, while the European Artificial Intelligence Office (the “AI Office”) will be tasked with advisory and cooperation roles.

Will mitigating mechanisms work?

The AI Act acknowledges the risk of overlap, as shown by direct references to the GDPR and caveats that the AI Act is intended to complement the GDPR and other related legislation. Mechanisms are introduced to tackle this challenge, such as the “without prejudice” (towards the GDPR) clauses and cooperation mechanisms between authorities. The question is whether they will work.

“Without prejudice” clause

Both in the recitals (e.g., Recital 2a) and in the operative part of the AI Act [e.g., Article 2(5a), Article 4a(1)(c) and (2), Article 68c(3)], there are provisions indicating that the AI Act remains without prejudice to EU personal data law, meaning that the AI Act does not seek to affect its application. In the procedural dimension, the AI Act should not affect the “tasks and powers of the independent supervisory authorities,” including the data protection authorities (e.g., Recitals 2a and 2b and Article 53(2)).

But what does this mean? Can it be inferred that the AI Act constitutes a lex specialis to the GDPR? It remains unclear. Bania reached similar conclusions regarding the place of the DMA in the existing framework, describing the “without prejudice” clause as a myth. Even if, at the level of the literal wording of both acts, it can be argued that they are complementary, the possible divergences between authorities, technological neutrality, and the inextricable proximity of the regulated areas mean that contradictions in their application should be expected.

Cooperation mechanisms

A way to reduce these divergences could be cooperation mechanisms – both in individual cases and in issuing soft law. The need for cooperation between authorities in EU data law, and the lack of specific rules for it (beyond the treaty-based duty of sincere cooperation), has been highlighted by the CJEU.

Regarding soft law, the national supervisory authorities under the AI Act should consult the competent authorities under EU law, including the data protection authorities, on draft guidance (Article 59(7) of the AI Act). While there is no unequivocal obligation to cooperate in adopting soft law, it is strongly encouraged. At the EU level, the AI Office will include a representative of the EDPS, ensuring that the voice of the data protection authorities is heard, albeit not a deciding one (Article 57b(1) of the AI Act). The AI Office is tasked with ensuring cooperation between authorities at the national level, including the data protection authorities, and at the EU level, including the EDPB and the EDPS (Article 56b of the AI Act). The issuance of soft law instruments will be a vital tool to this end. One can only hope that they remain aligned with the existing recommendations of the EDPB and the EDPS.

In individual cases, the matter is more tangled, considering the procedural challenges and the implications for the parties involved. Primarily, the AI Act does not provide explicit rules of cooperation between the national supervisory authorities and the data protection authorities. Such an obligation can be inferred from the duty of the national supervisory authorities to cooperate with “national public authorities or bodies which supervise or enforce the respect of obligations under Union law protecting fundamental rights” (Article 64(3) of the AI Act), which includes the data protection authorities (Article 51(1) of the GDPR). This may take the form of reporting violations related to fundamental rights to the data protection authorities (Article 62(2) of the AI Act) or cooperating in the evaluation of AI systems presenting a risk at the national level (Article 65(2) and Article 67(1) of the AI Act).

While the efforts of the EU legislator should be recognised, they are insufficient. Firstly, there is no clear indication of when cooperation is necessary. Secondly, it is not evident what the consequences of a lack of cooperation would be. Thirdly, there are no explicit rules for this cooperation. To some extent, a root cause is the EU legislator’s lack of capacity to interfere in national administrative orders due to the principles of conferral and procedural autonomy. Hence, it seems to be an inherent issue in the emergence of the European Administrative Space.

Is the storm coming?

The overlap of enforcement structures between the GDPR and the AI Act will pose a challenge, and the proposed counter-mechanisms seem unsatisfactory. Increasing legal uncertainty means rising compliance costs. This does not mean that the EU should not pay this price where the protection of fundamental rights is at stake; however, the price should be weighed against the backdrop of fierce global competition. A taxonomic issue hides beneath the surface: the question of what constitutes EU data law. It is tied to the broad concept of personal data in the GDPR and the ubiquitous presence of data. There seems to be no silver bullet. The technological advances of recent years are unprecedented, so a proper response will require time and, perhaps, a few hiccups.

Disclaimer: General remarks on the nature of the overlap of legal acts and authorities’ competencies in the EU data law have been presented during my talk titled “A Walk in the Labyrinth. Evolving EU Regulatory Framework for Secondary Use of Electronic Personal Health Data for Scientific Research” at the 18th IFIP Summer School on Privacy and Identity Management 2023 – Sharing (in) a Digital World at the University of Oslo in August 2023. This blog post applies them to the AI Act.

Published under licence CC BY-NC-ND. 


This Blogpost was written by


  • Paweł Hajduk

    Paweł Hajduk is an EU-qualified lawyer (Polish advocate), a Ph.D. Candidate at the Department of Informatics Law at Cardinal Stefan Wyszyński University in Warsaw (Poland) and Data Protection Officer.

