How can we embed Global South perspectives in GenAI teaching?

By Laura Duckett, 4 December 2025
GenAI can accelerate learning, but if we aren’t careful, it can also reinforce colonial assumptions. Jasmine Mohsen explains how to avoid this
When I began teaching marketing analytics, the excitement around generative AI (artificial intelligence) was palpable. Students were using ChatGPT to summarise theories, build business plans and even craft campaign slogans. Yet behind this technological enthusiasm lay an unease that many of us in global classrooms recognise: whose knowledge are these tools really drawing from?

Most GenAI systems are trained on data that overwhelmingly reflects Western perspectives. They reproduce dominant voices while muting the cultural and linguistic diversity that shapes the global academy. If we integrate these tools uncritically, we risk replacing one colonial hierarchy with another.

Having taught and researched across Egypt, the UK and Belgium, I have seen how knowledge hierarchies persist in subtle ways. During my postgraduate studies in Cairo, I learned that citing Western theorists lent credibility to local research. Years later, in Leeds, my examples from Egyptian consumer culture were treated as “contextual curiosities” rather than contributions to theory. These experiences reminded me that decolonisation is not a slogan; it is a daily pedagogical practice.

Bring back contextual richness

GenAI can process information, but it cannot replicate lived experience. In my current research on women entrepreneurs in Egypt’s tourism sector, many participants use GenAI tools such as automated dashboards or translation software to reach global audiences. Yet these technologies often fail to capture the nuances of Arab femininity, community resilience and the informal economies that sustain small enterprises.

In class, I ask students to interrogate these gaps. When we analyse AI-generated marketing content, we compare it with locally produced materials and discuss what has been “lost in translation”. This exercise not only develops critical thinking but also positions students as co-creators of inclusive knowledge.

Teach AI literacy through a decolonial lens

AI literacy should go beyond technical competence. Students must learn to question where data originates, which voices are excluded and how algorithms shape meaning.

One approach is to integrate data-ethics mapping into coursework. Students trace the “journey” of the data behind a case study, asking who collected it, whose interests it serves, and what assumptions it encodes. This method encourages reflexivity and helps them see technology as a social construct rather than a neutral tool.

At my institution, I plan to bring together learners from our global campuses – London, Dubai, Singapore and Sydney – to explore how regional datasets differ in tone, language and representation. These discussions can be powerful moments of discovery, helping students see that bias is not just a technical glitch, but a mirror of global power dynamics embedded in data itself.

Foster cross-regional collaboration

Decolonisation thrives through relational learning. Designing joint projects that connect students across campuses or countries can challenge monocultural perspectives and reveal how local context shapes interpretation. For example, pairing marketing students in London with peers in Cairo to analyse how GenAI personalises travel advertising could enable each group to act as both teacher and learner. Such exchanges would turn decolonisation from an abstract concept into lived experience, showing students that “global” does not mean “homogeneous”. It would also help graduates develop the intercultural intelligence increasingly essential in diverse workplaces.

Value local expertise in global discussions

Universities often treat Global South knowledge as illustrative “case material” rather than as a source of theory. To reverse this hierarchy, we must platform regional scholarship and invite local practitioners to co-teach or guest lecture. In future modules on sustainability, I aim to feature entrepreneurs from Egypt and the UAE who use GenAI for resource-efficient tourism. Their innovations, such as training chatbots in Arabic dialects to communicate sustainability messages, highlight creative solutions that Western frameworks often overlook. Incorporating such voices would remind students that innovation flows in many directions, not only from north to south.

Rethink assessment and authorship

With GenAI tools now capable of producing essays or marketing plans in minutes, assessment design must reward reflection rather than reproduction. Instead of asking students to define a concept, I encourage them to position themselves within it: “How might your cultural background influence your interpretation of AI ethics? Which voices does your analysis include or exclude?” This reflective approach not only deters plagiarism but also models a key decolonial principle, positionality: the recognition that all knowledge is produced from a particular standpoint.

Beyond tokenism

Decolonising degrees amid the rise of GenAI is not about rejecting technology or adding a few “diverse” readings to the syllabus. It is about recentring human agency and embracing epistemic plurality. The promise of AI lies not in automating insight but in amplifying diversity, provided we teach students to see it that way.

As educators, we stand at a crossroads. We can allow algorithms to decide whose knowledge counts, or we can use them as mirrors to reveal our own biases and possibilities. The future of higher education depends on choosing the latter.

Jasmine Mohsen is a lecturer (assistant professor) at SP Jain London School of Management

