Can trust norms within the African moral system support data gathering for Generative AI (GenAI) development in African society? Recent work on GenAI models such as ChatGPT and Midjourney has identified a common problem known as “AI hallucination,” in which a model presents misinformation as fact, with the attendant risk of fostering public distrust in AI performance. In the African context, this paper frames unsupportive data-gathering norms as a contributory factor to issues such as AI hallucination and investigates the following claims. First, it explores the claim that knowledge in the African context exists in both esoteric and exoteric forms, such that incorporating this diverse knowledge as data could give a GenAI tailored for Africa broad accessibility across all contexts. Second, it acknowledges the formidable challenge of amassing the substantial volume of data, including esoteric information, required to develop a GenAI model, and posits that establishing a foundational framework for data collection, rooted in culturally resonant trust norms, has the potential to engender trust between data providers and data collectors. Lastly, it recommends that trust norms in the African context be recalibrated to align with contemporary social progress, while preserving their core values, so as to accommodate innovative data-gathering methodologies for a GenAI tailored to the African setting. This paper thereby contributes to understanding how trust culture within the African context, particularly in the domain of GenAI for African society, can propel the development of Afro-AI technologies.