Nyambura Kogi, Chairperson of the Association of Women Commercial Drivers of Kenya, sits on the edge of her couch, impatiently tapping her two phones, a Redmi 12 Pro and a Samsung Galaxy A56, with a sigh. "I hate Gemini, especially because it makes both my phones hang, and there seems to be no way to disable it." Kogi does not recall opting into Artificial Intelligence (AI) assistants. The technology was simply there, bundled into a phone update, learning from her daily habits.
Across Africa, millions are in the same boat. From AI-suggested playlists to personalised news alerts and autocorrect that adapts to their slang, AI assistants, including voice, text, and search assistants, are now embedded in the apps and devices Africans rely on daily. But what feels like convenience masks a more troubling truth: users are rarely asked explicitly whether they consent to this invisible data exchange. And for most, opting out is unclear or nearly impossible.
Consent without choice
The promise of AI is personalisation. But it is powered by data: your data. Every tap, scroll, and voice command potentially becomes a training input. And in many of the world's most popular apps, this data is collected under vague terms or behind dense privacy policies that most users neither read nor fully understand when they "agree".
"As consumers we find ourselves in a situation where we are not in control of AI introduced in our gadgets because those who design the gadgets are the ones who decide whether to put AI tools or not," said Zenzele Ndebele, a director of the Centre for Innovation and Technology (CITE).
Even when tech companies disclose that their AI tools are collecting data, the process often remains opaque. Meta, for example, states that messages sent to its AI assistants "may be used to improve AI." But what does that mean in practice? How long is data stored? Can it be deleted? Can it be sold?
"These are questions users have a right to ask," says Ndebele.
The invisible cost of "free" products and services
Jean-Pierre Murray-Kline, a business technologist based in South Africa, puts it plainly: "If the product is free, you are the product."
He describes AI as a digital mirror, one that reflects our behaviours back to us, but amplified. "It's watching how we speak, what we type, what we search. Then it gives us more of the same, reinforcing habits, biases, even political leanings."
And yet, users often do not know what happens behind the scenes. Many apps do not just gather data; they harvest it continuously, sometimes even when the app is not open.
Candice Grobler, community marketing strategist and founder of Candid Collab, noted: "It's really difficult as a user to really know what these AI assistants in apps are doing with our data. They have terms of service, but they update automatically without always being clear on the real impact."
"Some developers design their apps specifically to avoid triggering permission requests," says Murray-Kline. "If an app does not ask for any permissions at all, that should be a red flag, not a relief."
At its core, AI remains a business-driven tool. While everyone will eventually use it, companies prioritise profitability, optimising AI systems to extract data with little focus on giving users control. Ndebele cautions that while businesses invest heavily in AI, "users must be vigilant about what is being collected, how it is being used, and what is being withheld from them."
AI literacy and smart policy matter in Africa
Africa is one of the fastest-growing mobile markets in the world, yet its users are largely passive participants in the AI economy. Local startups are still developing AI capacity, while global platforms dominate usage and set the rules.
Data from African users fuels global AI systems, but those same users have little control over how that data is used.
AI literacy is crucial: users need to quickly mature their understanding of AI, learning what it is capable of, what data it collects, and which permissions they can give and which they should withhold. "In 2025, AI ignorance is not a defence," Murray-Kline says. "Users need to know their rights, protect their data and demand transparency."
Beyond individual action, Ndebele noted that African governments need to be very serious about data protection laws. "Instead of using data protection laws for their selfish ends, they need to monitor and regulate how tech is collecting data and using it," said Ndebele.
Ndebele urges African governments to move beyond rhetoric. "We need robust, user-centric data laws. And we need tech companies, local and global, to respect African users."
In Nigeria, the National Information Technology Development Agency (NITDA) has begun to address data privacy, but enforcement is patchy. In Kenya, the Data Protection Act is a step forward, but public awareness and implementation remain low. South Africa has established sophisticated data protection laws, notably the Protection of Personal Information Act (POPIA), which provides robust rights and enforcement mechanisms for individuals. However, similar to Kenya, challenges in awareness and compliance persist, with many organisations and consumers still struggling to fully understand or meet their obligations under the law.
AI is here to stay. But how it shapes our lives depends on vigilance, digital literacy, and a willingness to demand better.