

Nigeria Data Protection Act 2023: What Your Business Must Do Now

Michael K. Onyekwere · 9 min read

I've spent the last decade working in data protection across financial services — Royal Bank of Scotland, Fidelity, TMF Group. Most of that was GDPR. But the Nigeria Data Protection Act 2023 is the law I keep getting asked about now, because Nigerian businesses are waking up to it and realising they're not ready.

This guide is for people who build things. Not lawyers parsing subsections — founders, CTOs, and compliance leads who need to know what the law actually requires so they can build systems that comply from the start.

Who the NDPA applies to

If your business touches data belonging to people in Nigeria, it applies to you. Specifically, you're in scope if any of the following holds:

  • The person whose data you're processing is in Nigeria
  • Your business is in Nigeria
  • You're offering goods or services to people in Nigeria
  • You're monitoring the behaviour of people in Nigeria

That last one catches a lot of tech companies off guard. Running analytics on how Nigerian users behave? Recommendation engine? Behavioural profiling? You're in scope, even if your servers sit in Dublin.

The principles — quickly

These mirror GDPR but enforcement is Nigerian, which changes the practical calculation. In short:

Have a reason for every piece of data you process. Not "we might need it" — an actual legal basis. More on that below.

Tell people what you're doing with their data. Before you do it. In language they understand. Not buried in a 40-page privacy policy nobody reads.

Only collect what you need. If your chatbot doesn't need someone's date of birth to answer a support question, don't collect it. This sounds obvious but I've seen countless systems hoovering up data "just in case."

Don't keep it forever. Set retention periods. Actually delete things when the period expires. CBN's 5-year AML retention requirement doesn't mean you keep everything for 5 years — just the transaction records they specifically require.
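The retention logic above can be sketched in code. This is a minimal illustration, not a production retention engine: the category names and windows are hypothetical, with the CBN 5-year AML window carved out and everything else on shorter, purpose-based periods.

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention windows. Keep CBN-mandated AML transaction records
# for 5 years; everything else gets a shorter, purpose-based period.
RETENTION = {
    "aml_transaction": timedelta(days=5 * 365),
    "support_chat": timedelta(days=180),
    "marketing": timedelta(days=90),
}

def expired(record, now=None):
    """True if the record has outlived its retention period and should be deleted."""
    now = now or datetime.now(timezone.utc)
    window = RETENTION.get(record["category"])
    if window is None:
        # No documented retention period means no documented reason to keep it.
        return True
    return now - record["created_at"] > window
```

A scheduled job that actually deletes records for which `expired()` is true is what turns a retention policy on paper into compliance in practice.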

Protect it. Encryption, access controls, the basics. If you lose it or someone steals it, there are consequences.

Prove it. This is the one that separates paper compliance from real compliance. Documentation, audit trails, impact assessments. The NDPC doesn't want you to say you're compliant — they want evidence.

Lawful bases — which one to use

Every processing activity needs a legal basis. Six options:

Consent — the one everyone defaults to. Person agrees, you process. Has to be specific and freely given. Can be withdrawn. For AI systems, this is often a bad choice because withdrawing consent mid-processing creates operational chaos. Use it for marketing, not for core service delivery.

Contract — processing is necessary to deliver a service someone signed up for. Customer joins your fintech app, you process their data to provide the service. Usually the strongest basis for customer-facing AI.

Legal obligation — the law requires it. CBN AML monitoring is a clear example. The NDPA recognises that some processing is mandatory.

Vital interests — protecting someone's life. Rare outside healthcare.

Public interest — mostly government. You're probably not using this.

Legitimate interest — you have a genuine business reason that doesn't override individual rights. Fraud detection fits here. But you need a documented balancing test showing you've weighed your interest against the person's privacy. The NDPC guidance on this is still developing, which means the safe play is documenting thoroughly.

For AI: contract and legitimate interest cover most cases. Don't lean on consent unless you genuinely need it.
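One way to keep yourself honest is to record the basis for each activity in code and lint it. The activity names below are hypothetical examples; the check simply flags unrecognised bases and reminds you that consent must survive withdrawal.

```python
# The six NDPA lawful bases.
LAWFUL_BASES = {"consent", "contract", "legal_obligation",
                "vital_interests", "public_interest", "legitimate_interest"}

# Illustrative mapping: each processing activity gets exactly one documented basis.
ACTIVITIES = {
    "customer_onboarding": "contract",
    "ai_support_chatbot": "contract",
    "fraud_detection": "legitimate_interest",
    "aml_transaction_monitoring": "legal_obligation",
    "marketing_emails": "consent",
}

def audit_bases(activities):
    """Flag unknown bases and remind that consent must be withdrawable."""
    warnings = []
    for activity, basis in activities.items():
        if basis not in LAWFUL_BASES:
            warnings.append(f"{activity}: '{basis}' is not a recognised NDPA basis")
        elif basis == "consent":
            warnings.append(f"{activity}: consent must be withdrawable without breaking the service")
    return warnings
```

Running this over your real register as a CI check catches the "consent for everything" anti-pattern before a regulator does.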

What you need to document

The NDPA is a documentation-heavy regime. Here's what you actually need:

Privacy notices. Before you collect data, people need to know who you are, what you're collecting, why, who sees it, how long you keep it, and how they can exercise their data subject rights. If your AI chatbot processes conversation data, the customer needs to see a privacy notice before the first message.
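The notice-before-processing rule can be enforced structurally in a chatbot. A minimal sketch, assuming a session store and a placeholder notice; `answer()` stands in for the real model call, and showing the notice is treated as acknowledgement for illustration only.

```python
PRIVACY_NOTICE = (
    "We process your messages to answer your question. "
    "Retention: 180 days. Full notice: <link>."
)

_acknowledged = set()  # session IDs that have already seen the notice

def answer(text):
    # Stand-in for the real model call.
    return f"(model reply to: {text})"

def handle_message(session_id, text):
    """Serve the privacy notice before any AI processing of conversation data."""
    if session_id not in _acknowledged:
        _acknowledged.add(session_id)
        return PRIVACY_NOTICE
    return answer(text)  # only now does the AI touch personal data
```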

Records of processing. A register of every processing activity — what data, what subjects, what purpose, what legal basis, who receives it, retention periods, security measures. Sounds tedious. It is. But it's also the first thing the NDPC asks for in an investigation.
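The register is easier to maintain if each entry has a fixed shape. A sketch of one entry, with fields mirroring the list above; the values are illustrative, not a template blessed by the NDPC.

```python
from dataclasses import dataclass, asdict

@dataclass
class ProcessingRecord:
    """One entry in a record of processing activities."""
    activity: str
    data_categories: list
    data_subjects: str
    purpose: str
    lawful_basis: str
    recipients: list
    retention: str
    security_measures: list

# Illustrative entry for an AI support chatbot.
chatbot = ProcessingRecord(
    activity="AI support chatbot",
    data_categories=["name", "conversation content"],
    data_subjects="customers",
    purpose="customer support",
    lawful_basis="contract",
    recipients=["LLM API provider (under DPA)"],
    retention="180 days",
    security_measures=["TLS in transit", "encryption at rest", "role-based access"],
)
```

Because it's structured data, the same register can be exported to the spreadsheet the NDPC asks for and linted programmatically.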

Data Protection Impact Assessments. Required when processing is high-risk. Most AI systems qualify — automated decision-making, new technology, large-scale processing. Write the DPIA before you deploy. Not after. After is expensive.

Data Processing Agreements. Any third party that touches personal data on your behalf gets a DPA. Your AI provider (OpenAI, Anthropic, whoever), your cloud host, your analytics platform. The DPA covers what they can do with the data, how they secure it, and what happens when the relationship ends. Using ChatGPT API or Claude API? You need a DPA with them.

The NDPC — who's watching

The Nigeria Data Protection Commission replaced the old arrangement where NITDA handled privacy alongside everything else in IT regulation. The NDPC is dedicated, focused, and building capacity fast.

They can investigate, audit, issue compliance orders, and fine. For data controllers of major importance, fines run up to 2% of annual gross revenue or ₦10 million — whichever is higher. Those numbers are in the Act itself.

Two things you need to know about NDPC compliance:

The Compliance Audit Return. Annual filing through a licensed DPCO (Data Protection Compliance Organisation). If you're a data controller of major importance (more than 2,000 data subjects, regulated sector, sensitive data), you file. The 2026 deadline was March 31.

The Data Protection Officer. If you're classified as a DCMI/DPMI (data controller or data processor of major importance), you need one. Can be an internal hire or outsourced.

AI and the NDPA

This is the part I care about most, because it's where I see the most confusion.

Automated decisions. If your AI makes or materially influences decisions about people — credit approvals, fraud flags, loan pricing, insurance terms — you owe them transparency, an explanation of the logic, a right to human review, and a right to challenge the decision. You don't have to publish your source code. You have to be able to say "the model considers these factors and weighs them roughly like this."
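"The model considers these factors and weighs them roughly like this" can be made concrete as a decision record. A sketch, assuming hypothetical factor names and weights, not a real credit model: each decision carries its approximate factor weights, can produce a plain-language explanation, and can be flagged for human review.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AutomatedDecision:
    """A decision record supporting explanation and human review."""
    subject_id: str
    outcome: str            # e.g. "declined"
    factors: dict           # factor name -> approximate weight
    model_version: str
    reviewed_by_human: bool = False
    made_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def explanation(self):
        """Plain-language summary: which factors mattered most."""
        ranked = sorted(self.factors.items(), key=lambda kv: -abs(kv[1]))
        top = ", ".join(name for name, _ in ranked[:3])
        return f"Decision '{self.outcome}' was driven mainly by: {top}."

    def request_human_review(self):
        # In production this would queue the case for a human reviewer.
        self.reviewed_by_human = True
```

Storing this alongside the decision itself is what makes the right to an explanation answerable months later.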

Training data. Personal data used to train models counts as processing. You need a lawful basis for it. And it should show up in your records of processing. The tricky part: once personal data is embedded in model weights, you can't easily honour a deletion request. Document this limitation upfront.

Cross-border transfers. Every API call to OpenAI sends data to the US. Every AWS backup might land in Ireland. Nigerian personal data leaving Nigeria needs documented safeguards — standard contractual clauses, adequacy decisions (rare), or explicit consent for the transfer. Document every route.
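"Document every route" can be a small registry checked in CI. The destinations and safeguard labels below are examples, not legal advice; the helper flags any route leaving Nigeria with no recorded safeguard.

```python
# Every cross-border route personal data takes, with its safeguard.
TRANSFER_ROUTES = [
    {"system": "LLM API", "destination": "US", "safeguard": "standard contractual clauses"},
    {"system": "cloud backups", "destination": "Ireland", "safeguard": "standard contractual clauses"},
    {"system": "core banking", "destination": "Nigeria", "safeguard": None},  # no transfer
]

def undocumented_transfers(routes):
    """Routes leaving Nigeria with no recorded safeguard: fix before go-live."""
    return [r for r in routes if r["destination"] != "Nigeria" and not r["safeguard"]]
```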

WhatsApp bots. Nigeria runs on WhatsApp. If you're building a WhatsApp AI chatbot, every conversation is personal data processing. Consent or another lawful basis, privacy notice, retention policy. The informality of WhatsApp doesn't make the NDPA go away.

The CBN overlap

Financial services companies get hit from both sides. The NDPA governs your data processing. CBN directives govern your banking operations. They don't always agree.

Data retention conflict. CBN wants 5 years of transaction records for AML. The NDPA says don't keep data longer than necessary. Resolution: keep what CBN specifically requires for 5 years, apply NDPA minimisation to everything else. Document why you're keeping what you're keeping.

The June 2026 AML deadline. CBN mandates automated transaction monitoring. The NDPA requires a DPIA for that kind of large-scale automated processing. You need to satisfy both.

KYC data. CBN requires you to collect a lot of personal data for customer due diligence. The NDPA says minimise. The answer: collect exactly what CBN requires, not a byte more, and document the legal obligation as your lawful basis.

EU AI Act intersection

If your AI system's output reaches EU residents — diaspora customers, European business partners, anyone in the EU — the EU AI Act may also apply. It has extraterritorial reach, same as GDPR.

The overlaps: AI Act requires risk classification, NDPA requires a DPIA. AI Act requires technical documentation, NDPA requires records of processing. Similar exercises, not identical ones. If you're a Nigerian fintech serving diaspora, you're potentially under three regulatory frameworks simultaneously.

We've written about managing the full regulatory stack — NDPA + CBN + GDPR + AI Act — in one programme.

What I'd tell you to do right now

If you're building or running AI systems in Nigeria:

Map your data. What personal data do you hold? Where did it come from? Where does it go? This exercise is unglamorous and nobody wants to do it. It's also the foundation of everything else.
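A first-pass data map doesn't need tooling; a table of field, source, and downstream flows is enough to start. The entries below are illustrative, and the helper answers the question an investigation always starts with: which systems touch this field?

```python
# First-pass data map: for each personal-data field, where it came from
# and where it flows. Entries are illustrative.
DATA_MAP = [
    {"field": "phone_number", "source": "signup form", "flows_to": ["SMS provider", "CRM"]},
    {"field": "bvn", "source": "KYC onboarding", "flows_to": ["identity verification API"]},
    {"field": "chat_transcripts", "source": "support chatbot", "flows_to": ["LLM API", "analytics"]},
]

def systems_touching(data_map, field_name):
    """Every downstream system that receives a given field."""
    return sorted({s for row in data_map
                   if row["field"] == field_name
                   for s in row["flows_to"]})
```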

Check your legal bases. For every processing activity, can you name the lawful basis? If the answer is "consent" for everything, rethink. Consent isn't wrong, it's just fragile for AI.

Write DPIAs for your AI systems. Before deployment. The cost of a DPIA is a fraction of the cost of retrofitting compliance after a complaint.

Get DPAs in order. Every AI provider, every cloud host. Most of them have DPAs ready to sign. The failure is usually that nobody at the company has actually signed them.

Appoint a DPO. Internal or outsourced. If you're above the threshold, this isn't optional.

Engage a DPCO. For your annual CAR filing. Build the relationship now, not the week before the deadline.

The NDPA isn't going away and enforcement is ramping up. The businesses that build compliance into their systems from the start will spend less, stress less, and never have to explain to a regulator why they didn't bother.


Need help with NDPA compliance? Our NDPA Fintech Compliance Programme gets you compliant — from ₦3,500,000. Not sure where you stand? Start with an NDPA Readiness Diagnostic — ₦500,000.


If you need NDPA compliance advice or a compliant AI build, the first step is a written diagnostic. You get a real assessment, not a vague intro call.

NDPA · Nigeria · Data Protection · Compliance · AI