NDPA Compliance for Nigerian Fintechs: What the Act Means for Your AI Systems
If your Nigerian fintech uses AI for credit scoring, fraud detection, KYC verification, or automated lending decisions, you're processing personal data under the Nigeria Data Protection Act 2023. And the NDPC is paying attention.
This isn't theoretical. MultiChoice was fined ₦766 million. Meta was fined $220 million. Over 1,368 organisations are under investigation. The enforcement apparatus is real, and it's growing.
Here's what your AI systems need to comply with, and what most fintechs are getting wrong.
The NDPA and AI: What Applies
The Nigeria Data Protection Act 2023 replaced the old NDPR and established the Nigeria Data Protection Commission (NDPC) as the sole regulatory authority. It applies to any processing of personal data by entities operating in Nigeria, and to the processing of personal data belonging to data subjects in Nigeria.
That covers every fintech using AI. Specifically:
- Credit scoring algorithms that assess loan eligibility
- Fraud detection systems that flag transactions
- KYC verification tools using facial recognition or document AI
- Automated lending decisions
- Customer segmentation using behavioural data
- Chatbots collecting customer information
Section 37: The Automated Decision-Making Rule
This is the section most fintechs haven't read. Section 37 of the NDPA gives individuals the right not to be subject to decisions based solely on automated processing that produce legal effects or significantly affect them.
What counts as "significantly affecting" someone? Denying a loan. Flagging a transaction as fraudulent and freezing an account. Rejecting a KYC application. Setting an interest rate based on algorithmic scoring.
If your AI does any of these, Section 37 requires you to:
- Provide meaningful information about the logic involved
- Give the individual the right to obtain human intervention
- Allow them to express their point of view
- Allow them to contest the decision
"Meaningful information about the logic" is the hard part. You don't need to reveal your algorithm, but you need to explain what factors affect the decision and how. "Our AI decided" is not sufficient. "Your application was assessed based on transaction history, income verification, and repayment patterns, with the following factors weighing against approval" is closer to what's required.
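One way to make that kind of explanation systematic rather than hand-written is to generate it from the model's per-factor contributions. The sketch below is illustrative only: the factor names, weights, and the `explain_decision` helper are invented for the example, and real credit models will expose contributions differently.

```python
# Hypothetical sketch: turn a credit model's per-factor contributions into
# the plain-language explanation Section 37 expects. Factor names and
# weights here are invented; they are not from any real scoring model.

def explain_decision(factors: dict[str, float]) -> str:
    """Summarise which factors weighed for or against approval.

    A negative contribution counts against the applicant,
    a non-negative one counts in their favour.
    """
    against = [name for name, weight in factors.items() if weight < 0]
    in_favour = [name for name, weight in factors.items() if weight >= 0]
    return (
        "Your application was assessed based on: "
        + ", ".join(factors)
        + ". Factors weighing against approval: "
        + (", ".join(against) or "none")
        + ". Factors in your favour: "
        + (", ".join(in_favour) or "none")
        + "."
    )

print(explain_decision({
    "transaction history": 0.4,
    "income verification": 0.1,
    "repayment patterns": -0.6,   # e.g. missed repayments lower the score
}))
```

The point is architectural: if explanations are generated from the same factor data the model actually used, the Section 37 disclosure stays truthful as the model changes.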
What You Need in Place
1. A Lawful Basis for Processing
Under the NDPA, you need a lawful basis for processing personal data. For fintechs, this is usually:
- Contractual necessity — the customer applied for a service and you need to process their data to deliver it
- Legitimate interest — you have a legitimate reason to process the data (fraud prevention, for example), provided it isn't outweighed by the individual's rights and freedoms
- Consent — the customer explicitly agreed, and you can prove it
For AI credit scoring, contractual necessity is the strongest basis. The customer wants a loan; you need to assess their eligibility.
2. A Data Protection Impact Assessment
The NDPA requires a data protection impact assessment (DPIA) for processing that is likely to result in high risk to individuals. AI-powered financial decisions are textbook high risk. Your DPIA should cover:
- What personal data the AI processes (input data, training data, output data)
- The purpose and necessity of automated processing
- Risks to individuals (incorrect decisions, bias, discrimination)
- Safeguards you've put in place (human review, appeal mechanisms, bias testing)
- Data retention and deletion policies
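A DPIA doesn't have to be a static document; keeping it as structured, version-controlled data makes it easier to update when the model changes and to hand to an auditor. A minimal sketch, assuming nothing beyond the checklist above — the `DPIARecord` class and its field names are our invention, not anything prescribed by the NDPA:

```python
from dataclasses import dataclass

# Illustrative sketch: a DPIA for an AI system kept as a structured,
# versionable record. Field names mirror the checklist above; none of
# this structure is mandated by the NDPA itself.

@dataclass
class DPIARecord:
    system_name: str
    personal_data: list[str]   # input, training, and output data
    purpose: str               # why automated processing is necessary
    risks: list[str]           # e.g. incorrect decisions, bias
    safeguards: list[str]      # e.g. human review, bias testing
    retention_policy: str

    def is_complete(self) -> bool:
        """A DPIA with empty data, risk, or safeguard sections is a red flag."""
        return bool(self.personal_data and self.risks and self.safeguards)

dpia = DPIARecord(
    system_name="credit-scoring-v2",
    personal_data=["BVN", "transaction history", "income documents"],
    purpose="Assess loan eligibility at application time",
    risks=["incorrect decline", "bias against thin-file applicants"],
    safeguards=["human review on request", "quarterly bias testing"],
    retention_policy="Delete applicant data 24 months after account closure",
)
print(dpia.is_complete())
```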
3. Engagement with a DPCO
Data Protection Compliance Organisations are licensed by the NDPC to audit organisations' data protection practices. There are currently around 146 licensed DPCOs in Nigeria. If your fintech processes data above the prescribed threshold, you need to engage one.
Here's the problem: almost none of the existing DPCOs specialise in AI systems or fintech-specific compliance. They can audit your general data protection practices, but the AI-specific questions — model bias, automated decision explainability, training data governance — fall outside most DPCOs' expertise.
That gap is what we fill.
4. Technical and Organisational Measures
The NDPA requires appropriate technical and organisational measures to protect personal data. For AI systems, this means:
- Encryption of data at rest and in transit
- Access controls — not everyone in the company should access customer data
- Audit logging — what did the AI process, when, and what was the output
- Model monitoring — is the AI performing as expected? Is it drifting?
- Incident response — if the AI makes a bad decision at scale, what's the process?
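The audit-logging bullet above is worth making concrete. One pattern, sketched here with invented function and field names, is to record what the AI processed, when, which model version ran, and what it returned — while hashing the inputs so the audit trail itself doesn't become a second store of raw personal data:

```python
import hashlib
import json
import logging
from datetime import datetime, timezone

# Illustrative sketch of AI decision audit logging. The log proves what
# was processed and decided without duplicating personal data: inputs
# are recorded only as a hash.

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("ai_audit")

def log_decision(model_version: str, inputs: dict, decision: str) -> dict:
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        # Hash the canonicalised inputs rather than logging them raw.
        "input_hash": hashlib.sha256(
            json.dumps(inputs, sort_keys=True).encode()
        ).hexdigest(),
        "decision": decision,
    }
    audit_log.info(json.dumps(entry))
    return entry

entry = log_decision("credit-scoring-v2", {"applicant_id": "A123"}, "declined")
```

The same entries double as evidence for Section 37 disputes: when a customer contests a decision, you can show exactly which model version produced it and when.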
5. Cross-Border Transfer Safeguards
If your AI uses cloud-based LLMs (ChatGPT, Claude, etc.), customer data is leaving Nigeria. The NDPA requires adequate safeguards for cross-border transfers. You need to document where data goes, why, and what protections are in place.
If your LLM provider processes data in the US or EU, you need contractual clauses addressing this. Most LLM providers now offer data processing agreements — make sure you have one in place and filed.
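Contracts aside, a technical safeguard is to strip obvious identifiers before any customer text leaves Nigeria for a cloud LLM. The sketch below is deliberately simplistic — the regex patterns are rough examples we've made up for illustration, not production-grade PII detection:

```python
import re

# Illustrative sketch only: redact obvious identifiers before text is sent
# to an overseas LLM API. Real deployments need proper PII detection; these
# patterns are simplistic examples.

PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?234\d{10}|\b0[789]\d{9}\b"),  # common Nigerian formats
    "BVN": re.compile(r"\b\d{11}\b"),  # Bank Verification Number length
}

def redact(text: str) -> str:
    """Replace matched identifiers with a labelled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Reach me on 08012345678, BVN 12345678901, ada@example.com"))
```

Redaction reduces the personal data crossing the border, but it doesn't replace the contractual safeguards — you still need the DPA on file.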
The CBN Angle
The Central Bank of Nigeria's regulatory sandbox accepts fintech applications that use AI. But sandbox participation doesn't exempt you from the NDPA. If anything, it draws attention to your data processing practices. The CBN expects sandbox participants to demonstrate responsible innovation — and that includes data protection compliance.
If you're in the sandbox or planning to apply, having your NDPA compliance documented strengthens your application significantly.
The Coming AI Bill
Nigeria is expected to pass a dedicated AI governance bill. The details are still being debated, but early drafts suggest requirements for AI transparency, risk assessment, and accountability that build on the NDPA's foundations. Getting your NDPA compliance right now means you're already most of the way to whatever the AI bill requires.
What Most Fintechs Get Wrong
- Treating the NDPA like a checkbox exercise — it isn't one. The NDPC is actively investigating and fining organisations.
- No explainability for AI decisions — Section 37 requires it. "The algorithm decided" is not an explanation.
- No human oversight for automated decisions — you need a process for human review when requested.
- Ignoring training data governance — where did your model's training data come from? Was it lawfully obtained?
- No DPIA for AI systems — this is usually the first thing an investigator asks for.
- Using a cloud LLM with no DPA — this is a cross-border transfer violation on day one.
How We Work With Nigerian Fintechs
We build AI systems — credit scoring tools, fraud detection, KYC automation, customer service chatbots — and we deliver them with NDPA compliance documentation as standard.
That means the DPIA is done before launch. The cross-border transfer mechanisms are documented. Section 37 explainability is built into the system architecture, not bolted on after an enforcement notice.
We also work with existing AI systems that need compliance remediation. If you've already built the AI but haven't done the compliance work, we can audit what you have and fill the gaps.
Most importantly: we understand both the technology and the regulation. You don't need to explain how your AI works to a lawyer, or explain the NDPA to a developer. We speak both languages.
Get in touch if your fintech needs help.
See our full services and pricing, or read about EU AI Act compliance for SMEs if you also serve EU customers. Our guide on whether you need a DPIA for your AI system covers the assessment process in detail.
Frequently Asked Questions
Does the NDPA apply to AI systems in Nigeria?
Yes. The Nigeria Data Protection Act 2023 applies to any processing of personal data, including automated processing by AI systems. If your fintech uses AI for credit scoring, fraud detection, KYC, or automated lending decisions, the NDPA applies. Section 37 specifically covers automated decision-making and requires safeguards including the right to human intervention.
Do I need a DPCO for my Nigerian fintech?
If you process the personal data of more data subjects than the prescribed threshold, yes. A Data Protection Compliance Organisation (DPCO) is a licensed entity that audits your data protection practices and files compliance reports with the NDPC. There are currently around 146 licensed DPCOs in Nigeria — very few specialise in fintech or AI.
What are the penalties for NDPA non-compliance?
Nigerian regulators have real enforcement powers. The NDPC fined MultiChoice ₦766 million, and Nigeria's consumer protection regulator, the FCCPC, fined Meta $220 million over its data practices. Over 1,368 organisations are currently under investigation. The penalties are significant and enforcement is accelerating. For fintechs handling sensitive financial data with AI, the scrutiny is especially high.
How is the NDPA different from GDPR?
The NDPA is modelled on GDPR principles — lawful basis, data minimisation, purpose limitation, data subject rights. Key differences: the NDPC is the sole regulatory authority (unlike the EU's multiple DPAs), DPCOs play a unique audit role, and specific provisions around national security data differ. If you're GDPR compliant, you're about 70% of the way to NDPA compliance.
What does Section 37 of the NDPA say about automated decisions?
Section 37 gives individuals the right not to be subject to decisions based solely on automated processing — including AI profiling — that produce legal effects or significantly affect them. This directly covers AI credit scoring, automated loan decisions, and algorithmic risk assessments. You must provide meaningful information about the logic involved and the right to human intervention.
Need help with this?
We build compliant AI systems and handle the documentation. Tell us what you need.
Related Articles
EU AI Act
EU AI Act Compliance for SMEs: What You Actually Need to Do Before August 2026
The EU AI Act high-risk deadline hits August 2, 2026. Here's what SMEs need to know — risk classification, requirements, penalties, and how to get compliant without spending a fortune.
AI for Business
AI Agents for Business: What They Are, How They Work, and How to Deploy One
AI agents go beyond chatbots — they take actions, make decisions, and complete tasks autonomously. Here's what business AI agents actually do, what they cost, and how to deploy one without breaking data protection law.
AI Automation
How Much Does an AI Chatbot Cost? Real Pricing Breakdown for 2026
Honest pricing for AI chatbots in 2026. From free tools to custom builds, here's what it actually costs, what affects the price, and what most agencies don't tell you about ongoing expenses.