When Akeem Bankole signed up for a free AI learning programme in 2023, his excitement outweighed his curiosity about the privacy policy. Like most young Nigerians eager to learn digital skills, he clicked “accept” without reading the fine print.
“I always check the box without reading because it’s too much,” says Akeem, who completed courses with ALX and 3MTT. “I trust them with my data because they have a reputation to protect. But honestly, EdTech platforms should be more transparent by making their policies short and simple so users can understand.”
Akeem’s experience reflects a wider pattern. A survey of 40 Nigerian learners conducted for this investigation revealed that over 60% rarely or never read privacy policies before giving consent. Instead, they skim or skip through, driven by trust in the platforms or the urgency to gain access to learning opportunities.
Roqeebah Lawal, another learner, echoes this sentiment: “The majority of users only scan through the privacy policy because it’s bulky. Platforms do share users’ data with third parties, though consent is usually sought. Still, most people just give out their data to gain access to learning.”
Others, like Mishak Mosimabale, an ALX alumnus, raise concerns that the design of these documents discourages careful reading. “It’s like they intentionally make it bulky so people will lazily not read it. Consent should be requested again before sending data to third parties.”
Their experiences highlight a troubling reality: while learners are benefiting from free or subsidised courses, they often have little clarity about how their data is collected, stored, and used in Nigeria’s rapidly expanding AI-driven EdTech sector.
Platforms Speak: Offline Operations vs. AI-Driven Systems
Not all Nigerian EdTech platforms operate in the same way when it comes to handling learner data. Some, like Crystal EdTech’s platform, are still largely offline. Mafyeng Jonathan Dahoro, a representative of this EdTech, explains that although his organisation’s platform has collected personal details from learners since its launch in 2022, everything is stored manually and offline.
“We are not fully online yet,” Dahoro says. “Data is collected during training registrations and stored offline in files. For now, it’s mainly about record-keeping and tracking learners, not advanced AI profiling or marketing.”
This contrasts sharply with larger AI-powered EdTech providers like ALX, Cisco Networking Academy, and Coursera, which operate fully online and rely heavily on digital data for personalisation.
Daniel Osibanjo, founder of another Nigerian EdTech platform named Destination Academy, argues that data is a necessary backbone for innovation. “AI-driven systems require data to personalise learning paths, track engagement, and measure outcomes. Without it, the platforms cannot scale or serve diverse learners.”
Yet, this reliance on data is not without concerns. Learners interviewed noted that they receive personalised recommendations they never asked for, a sign that their data is being analysed beyond basic registration details.
Hope Rubainu, a representative of ALX_Nigeria, defended their practices during an interview. “Data being collected is just to know the number of people we are working with. We don’t send out data to anybody—it’s for the organisation alone. Privacy is valued and upheld. Learners give consent when signing up, and their information is safe with us.”
Rubainu acknowledged that questions around Nigeria’s Data Protection Act (NDPA) 2023 are better left to legal experts but insisted that learners’ information is secure within ALX.

Despite these assurances, experts argue that offline record-keeping and online AI-powered systems both pose risks when safeguards are weak or unclear. While offline systems may be prone to physical breaches, online platforms raise questions about profiling, third-party sharing, and opaque algorithms.
Consent, but Not Informed
While EdTech platforms emphasise innovation, digital rights and legal experts warn that Nigeria’s fast-growing sector may be trading accountability for speed.
Israel Olawunmi, a lawyer and data protection expert, says that many Nigerian EdTech providers do not meet international standards of informed consent.
“Common violations observed in Nigerian EdTech platforms’ privacy policies include vague or overly broad language,” Olawunmi explains. “Users don’t really understand what they are consenting to, and yet, once they click ‘accept’, they’ve permitted data processing. This is a problem because sensitive personal data, including that of minors, is at stake.”
Ebiegberi Zidafamo, another legal expert, raised similar concerns. He argued that while the Nigeria Data Protection Act (NDPA) 2023 was a major step forward, establishing penalties of up to ₦10 million or 2% of annual revenue, its enforcement remains weak.
“The law is there, but the enforcement mechanism is fragile,” Zidafamo said. “Platforms know this, so compliance is more about ticking boxes than true accountability.”
For Paradigm Initiative (PIN), one of Africa’s leading digital rights organisations, the lack of transparency is particularly concerning. Ihueze Nwobilor, who leads their work on EdTech, highlighted how vague policies leave learners exposed. “Consent is not informed if users cannot clearly understand the terms. EdTech providers must do more than provide bulky privacy documents—they need to actively educate users about how their data will be used.”
This gap between legal frameworks and practical protections leaves learners vulnerable. In the survey conducted for this investigation, 62.5% of learners said they never read privacy policies in full, or only skimmed them, while 45% were unsure whether their data was being shared or used without their consent. A further 88% said they would support stricter penalties for misuse of data by EdTech providers.
A report by Luminate emphasises that privacy is not merely a compliance issue but a fundamental right tied to freedom of expression and association.

Learner Experiences – Between Empowerment and Exploitation
Behind every policy and platform are the learners—young Nigerians determined to upskill but often unsure what they are giving up in exchange.
Bankole described the trade-off clearly when he acknowledged that while the platforms have given him access to valuable content, the privacy policies remain a blur. “You don’t even know whether they are selling your data or using it for something else. The truth is, we don’t really have a choice if we want to learn,” he said.
Roqeebah Lawal, an ALX alumna, admitted she only scans through privacy terms. “It’s always bulky, and nobody has the time,” she said. While she trusts that data won’t be shared without consent, she acknowledges that “sharing with third parties often hides in the fine print. The choice is really between giving consent or being locked out.”
For Kabir Azeez, who studies across ALX, 3MTT, and Coursera, the main concern is clarity. “The privacy policy is too much to read and understand. I always check the box and proceed. I think there should be penalties for platforms that use people’s data without consent.”
Others, like Ruth Alesinloye, could not even recall seeing a privacy policy. “I support better transparency,” she said, “so that people can know what’s being done with their data before they use these platforms.”
A survey of 40 Nigerian learners reinforced these personal accounts: nearly 60% said they trust EdTech platforms to protect their data based on reputation, even without reading policies; 88% agreed data collection was “necessary” for platform improvement but demanded clearer explanations, while only 10% felt “the benefits outweighed the risks.”
This mix of trust, resignation, and uncertainty paints a troubling picture: learners gain access to digital skills but surrender personal information without fully understanding the consequences.

Platforms on the Defensive
EdTech platforms defend their practices as responsible and lawful, but responses reveal a split between mostly offline operators and those scaling online.
While Dahoro’s offline-focused EdTech model limits current data risks, it may face compliance challenges once it starts to store data online. In contrast, Osibanjo’s hybrid platform already relies on user data for personalisation yet acknowledges that the sector lags behind best practices.
In both cases, the theme of “consent” remains shaky. Learners give data because they must, not because they fully understand or agree with the terms.
The Legal Lens
The Nigeria Data Protection Act (NDPA) 2023 is the country’s most ambitious attempt to regulate how personal data is collected, stored, and shared. It establishes the Nigeria Data Protection Commission (NDPC), introduces penalties of up to ₦10 million or 2% of annual revenue for violators, and requires organisations to appoint Data Protection Officers (DPOs). But how this law interacts with the booming EdTech space remains unclear.
Israel Olawunmi, a Nigerian-trained attorney specialising in data privacy and protection, pointed out that “in practice, many EdTech platforms in Nigeria tend to treat consent as a formality rather than ensuring that learners fully understand how their data will be used. While the law requires informed consent, platforms often fail to provide detailed information about data processing, leaving learners with little understanding of the implications. This undermines the essence of informed consent and puts platforms at risk of non-compliance.”
Ebiegberi Zidafamo, a lawyer with experience in data protection compliance, stressed the accountability gap. “The law is clear: platforms must spell out how data is collected, stored, and shared. But in practice, many EdTech platforms are not transparent. They often assume that once users click ‘I agree’, they have a blanket licence to use data however they wish.” Zidafamo added that compliance requires more than a written policy—it requires audits, internal controls, and consequences for breaches.
Both experts stressed the same point: Nigeria has strong data laws on paper, but weak enforcement lets EdTech platforms grow without full accountability.
Violations Put Children at Risk
While learners and legal experts raise concerns about consent and accountability, digital rights organisations argue that the issue goes beyond individual platforms—it reflects deeper structural problems in Nigeria’s digital governance.
Paradigm Initiative (PIN), a leading digital rights advocacy group, has consistently highlighted risks tied to EdTech adoption. PIN argues that uninformed consent, where learners tick boxes without understanding the implications, is widespread because “most privacy policies are written in legalistic English, not in age-appropriate, simple language. Many learners are minors, and the policies are neither designed nor delivered in a way they can understand.”
This aligns with findings from a report by Human Rights Watch that “89% of EdTech products investigated engaged in data practices that put children’s rights at risk, undermined or actively violated them.” The report highlights the need for greater regulation and transparency in EdTech data practices.
Similarly, research supported by Luminate emphasises that privacy is not merely a compliance checkbox but a fundamental human right. In contexts like Nigeria, where digital literacy gaps are wide, failing to make privacy policies accessible only deepens inequality. Learners from rural or low-income backgrounds are disproportionately affected, as they lack the time or technical knowledge to navigate complex terms of service.
The warning from these advocates is clear: without stronger accountability, Nigeria risks building its digital future on shaky ethical foundations.

What the Law Requires—and Where Practice Falls Short
The NDPA requires clear consent, transparency, security safeguards, and organisational accountability. Yet privacy policies remain dense and poorly adapted to local users. Consent is often extracted at the point of urgency, just before accessing content, making it neither informed nor meaningful.
As one digital rights expert put it, “consent that is not comprehensible and freely given does not protect users in any meaningful sense.”

What’s at Stake
Learners urgently need digital skills to compete in today’s economy. But in gaining access, they often trade away privacy without real choice. Offline platforms face future risks when records move online, while online platforms confront challenges around profiling, retention, and opaque data sharing.
Legal and digital rights sources converge on the same message: the goal isn’t to halt EdTech innovation, but to align growth with rights—making it possible for learners to benefit from personalisation without surrendering agency over their own data.

This report was produced with support from the Centre for Journalism Innovation and Development (CJID) and Luminate.



