The Data Protection Commission (DPC) is Ireland's supervisory authority for the GDPR. If you're an Irish business deploying AI systems that process personal data, the DPC expects you to have conducted a Data Protection Impact Assessment (DPIA) before those systems go live.
Not after. Not "when we get around to it." Before.
When a DPIA is required in Ireland
Under GDPR Article 35, a DPIA is mandatory when processing is "likely to result in a high risk to the rights and freedoms of natural persons." The DPC has published its own list of processing operations that require a DPIA. AI systems trigger multiple criteria simultaneously:
New technologies. AI is explicitly identified as a novel technology that increases risk. Whether you're using a customer service chatbot, an automated decision engine, or a recommendation system — the technology itself is a trigger.
Automated decision-making. If your AI makes, or materially influences, decisions with legal or similarly significant effects on people — loan applications, insurance quotes, recruitment screening, customer eligibility — Article 22 is engaged and a DPIA is required.
Large-scale processing. An AI chatbot handling thousands of customer conversations daily processes personal data at scale. A fraud detection system monitoring every transaction does the same.
Profiling. If your AI system analyses customer behaviour, segments users, or predicts preferences — that's profiling under GDPR. DPIA required.
Most AI systems I assess match at least two of these. Some match all four.
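To make the screening concrete, here's a minimal sketch of the four trigger criteria above as a checklist. The criterion names are my own labels, not an official DPC schema — the point is that one match is enough:

```python
# Illustrative screening sketch for the four DPC trigger criteria.
# Criterion names are assumptions for illustration, not official terms.

DPC_TRIGGERS = {
    "new_technology": "Uses AI or another novel technology",
    "automated_decisions": "Makes or materially influences decisions about people",
    "large_scale": "Processes personal data at scale",
    "profiling": "Analyses behaviour, segments users, or predicts preferences",
}

def dpia_required(system_traits: set[str]) -> bool:
    """Return True if the system matches any trigger criterion.

    A single match is enough: Article 35 requires a DPIA whenever
    processing is likely to be high-risk, and each criterion alone
    can indicate that.
    """
    return bool(system_traits & DPC_TRIGGERS.keys())

# Example: a customer service chatbot that profiles users at scale
chatbot = {"new_technology", "large_scale", "profiling"}
print(dpia_required(chatbot))  # True
```

If you can't honestly answer "no" to all four, write the DPIA.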
What the DPC specifically expects
The DPC has been more active on AI enforcement than many people realise. The Snap/My AI investigation is the clearest signal — the DPC investigated Snap's AI chatbot specifically for failing to conduct an adequate DPIA before launch.
What the DPC wants to see in your DPIA:
A genuine assessment, not a template. The DPC has been explicit that a DPIA should be a real analysis of your specific processing, not a copy-pasted document with your company name swapped in. If your DPIA reads like it could apply to any business running any AI system, it's too generic.
Data flow mapping. Where does personal data enter the AI system? Where does it go? If you use OpenAI, Anthropic, or any external AI provider, the data crosses borders. Every hop needs documentation.
Lawful basis analysis. Not just "we rely on legitimate interest." A proper balancing test showing you've weighed your business needs against the individual's rights. For AI systems, this means considering the intrusiveness of the processing, not just the business benefit.
Risk assessment with mitigations. What could go wrong? What are you doing about it? The DPC expects you to identify specific risks — model bias, data breaches, incorrect automated decisions — and show what controls are in place.
Consultation with your DPO. If you have a Data Protection Officer (or should have one), they need to have been involved in the DPIA process. The DPC will check.
The EU AI Act adds to this
The EU AI Act's obligations phase in through 2025 and 2026, with the bulk of the high-risk requirements applying from 2 August 2026 on top of GDPR. Ireland has designated 15 national competent authorities for AI Act enforcement.
If your AI system is classified as high-risk under the AI Act (credit scoring, recruitment tools, life and health insurance risk pricing), you need both a GDPR DPIA and AI Act conformity documentation. Note that AI used purely to detect financial fraud is expressly carved out of the credit-scoring category in Annex III. The two are separate requirements, but the analysis overlaps significantly: a well-written DPIA can cover both, assessing GDPR risks and AI Act requirements in one document.
For limited-risk systems (most customer service chatbots), the AI Act mainly requires transparency — tell users they're interacting with AI. Your DPIA should document how you meet this requirement.
What makes an AI DPIA different from a standard DPIA
The UK ICO's DPIA template is a reasonable starting point, but AI systems have complications that standard templates don't address:
Third-party AI providers. If you use Claude, GPT-4, or any cloud AI, you're sending personal data to a third-party processor in another jurisdiction. Your DPIA needs to cover the Data Processing Agreement, the transfer mechanism (usually Standard Contractual Clauses), the provider's data retention policy, and whether they use your data for model training.
Training data. If you fine-tuned a model on personal data, or your RAG system indexes customer documents, that's processing. The DPIA needs to assess the lawful basis for training use, data minimisation in the training set, and the practical impossibility of deleting data embedded in model weights.
Explainability. If the AI makes decisions about people, you need to explain the logic. Your DPIA should document what explanation mechanisms exist, how they work, and whether they're genuinely meaningful or just "the algorithm decided."
Bias and fairness. Has the AI been tested for bias across protected characteristics? What happens when it gets it wrong? Your DPIA should address testing methodology and remediation processes.
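The third-party provider questions above lend themselves to a structured record. Here's a hedged sketch of the facts a DPIA should capture per provider — the field names and checks are illustrative assumptions, not a regulatory schema:

```python
from dataclasses import dataclass

# Illustrative record of the third-party AI provider facts a DPIA
# should document: DPA, transfer mechanism, retention, training use.
# Field names and checks are assumptions, not a regulatory schema.

@dataclass
class ProcessorRecord:
    name: str
    jurisdiction: str
    dpa_signed: bool
    transfer_mechanism: str          # e.g. "SCCs", "adequacy decision"
    retention_policy: str
    used_for_model_training: bool

    def open_issues(self) -> list[str]:
        """Flag gaps that the DPIA must resolve before go-live."""
        issues = []
        if not self.dpa_signed:
            issues.append("No Data Processing Agreement in place")
        if self.jurisdiction != "EEA" and not self.transfer_mechanism:
            issues.append("Cross-border transfer without a mechanism")
        if self.used_for_model_training:
            issues.append("Provider trains on your data: needs lawful basis")
        return issues

# Hypothetical US-based provider with SCCs in place
provider = ProcessorRecord(
    name="ExampleAI",
    jurisdiction="US",
    dpa_signed=True,
    transfer_mechanism="SCCs",
    retention_policy="30 days",
    used_for_model_training=False,
)
print(provider.open_issues())  # []
```

An empty issue list doesn't mean the DPIA is done — it means the provider facts are at least documented and internally consistent.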
Irish-specific considerations
Enterprise Ireland support. If you're an Irish SME, Enterprise Ireland and Local Enterprise Offices offer digitalisation vouchers that may cover the cost of a compliance assessment or DPIA. Worth checking before you pay full price.
Sector regulators. Depending on your industry, your AI system may fall under additional oversight. The Central Bank of Ireland for financial services AI, ComReg for telecoms, the Health Products Regulatory Authority for medical AI. Your DPIA should reference the relevant sector regulator.
The DPC as lead authority. Many global tech companies are headquartered in Ireland, which makes the DPC the lead supervisory authority for GDPR enforcement against them. The DPC's enforcement posture on AI is informed by these high-profile cases — and the standards filter down to Irish SMEs too.
What to do
If you're running AI systems in Ireland and don't have a DPIA:
Write one before anything else. The cost of a DPIA is a fraction of the cost of an enforcement action. The DPC's Snap investigation demonstrated that launching an AI system without a DPIA is itself a compliance failure, regardless of whether the system causes any actual harm.
Map your data flows first. Understand where personal data enters, where it goes, who processes it, and what happens to it. Then write the assessment.
Review when anything changes. Switched AI providers? Update the DPIA. Added a new data source? Update the DPIA. Changed the model? Update the DPIA. It's a living document.
If you're not confident doing this internally — especially for AI systems where the data flows involve external providers and cross-border transfers — that's normal. The intersection of data protection law and AI architecture is specialised.
We include DPIAs in every AI system we build. See our AI Chatbot + Compliance Package — £3,500 or our AI Compliance Consulting services for standalone DPIA work. Not sure what you need? Start with a £500 scoping review.