The arrival of generative artificial intelligence (GenAI) tools such as ChatGPT, Gemini or Claude in university classrooms has triggered sharply divided reactions. Some see them as a threat to academic integrity and critical thinking. Others welcome them as powerful new learning tools that provide students with personalised knowledge-building support. Regardless of one’s perspective, GenAI is here to stay. Instead of denouncing it, educators need upskilling so they can guide and mentor students in using GenAI technologies responsibly.
Cutting corners and other challenges of using GenAI in education
Despite its promise of customised learning paths, GenAI comes with well-documented caveats, such as the risk that users submit AI-generated work as their own or skip research and analysis altogether. GenAI can also hurt students’ ability to think creatively by anchoring them to AI-generated suggestions: after reviewing AI outputs, learners may find it harder to think outside those boundaries.
Large language models (LLMs) are prone to hallucinations and the reproduction of outdated knowledge. LLMs predict language based on patterns in data (that is, they recombine existing information rather than generate original or novel content). Another issue is the possibility that GenAI produces convincing but false information and biased content. An extreme example was Microsoft’s AI bot Tay, which was quickly taken offline after its 2016 launch when it began generating offensive and racist tweets.
The educator’s role: fostering students’ AI (prompt) literacy
However, depriving students of the opportunity to develop GenAI proficiency would not solve these problems. To ban AI would simply drive use underground. The real risk is arguably that students are using GenAI incompetently or without any critical framework. If universities are serious about maintaining academic integrity, they must move beyond prohibition and invest in AI literacy. Students need to know how GenAI models work, where and why they fall short, and how to verify what they produce. Teaching users to cross-check information, challenge assumptions and ask thoughtful questions fosters deeper engagement rather than diminishing it. Instructors need to be able to strengthen students’ critical thinking (metacognition) and integrate GenAI into deep learning activities.
Much of the value in GenAI depends on how it is used. A vague question will yield a vague response. Students who refine their questions through iterative prompting often arrive at more insightful conclusions than those relying on AI for quick answers. With proper guidance and reflection, this can become a powerful method for learning. For instance, in a sustainability course, students may use GenAI to analyse carbon emissions data. But they should also be asked to reflect on how emissions relate to their own values, choices and consumer behaviour. The point is not just to learn the technical jargon, such as the difference between Scope 1, 2 and 3 emissions, but to consider its relevance in the real world.
Engaging learners with AI as co-pilot
Combining the use of GenAI tools with effective prompting and Socratic questioning transforms students’ use of technology from passive consumption to active, reflective and critical engagement.
In one of my undergraduate classes, students harnessed Google NotebookLM (a personalised, AI-powered virtual research assistant) to review abstract leadership theories and compared them through interactive AI-generated dialogues. As one student observed, the podcast-style audio summaries provided a valuable learning opportunity: “Eighteen pages of content were explained verbally in just 15 minutes, making it both time-efficient and engaging.” The learning (for example, about the need to adapt Western leadership models to an “Asian” context) did not come from the AI itself, but from the students’ process of questioning, testing and iterating powered by an appropriate instructional design.
In another course, students harnessed ChatGPT to explore robust revenue models for a fictional B2B SaaS (software as a service) start-up. The first outputs were vague. But as students refined their prompts and added context, the responses became more targeted, practical and insightful. This enabled them to select the “right” pricing tier for early-stage SaaS companies tailored to a certain type of customer.
Student engagement is multidimensional: it involves behavioural (active participation), emotional (feeling motivated) and cognitive (thinking critically) processes, each contributing differently to how students connect with learning materials and succeed in applying GenAI tools effectively. Many educators are arguably not adequately prepared to engage and enable students to work with GenAI critically. ChatGPT, for example, might enhance students’ behavioural engagement yet, by encouraging shallow learning, fail to foster the crucial critical (cognitive) component.
At some institutions, teachers might have lingering scepticism about the educational value of AI tools and competing priorities that overshadow the need for them to develop their own AI-ready engagement skills for guiding students. Given the popularity of GenAI among students and the fact that this pivotal shift towards GenAI integration is still in its early stages, instructors need to develop strong(er) engagement skills to effectively guide, motivate and challenge students in using generative AI meaningfully.
Setting boundaries and expectations
While both students and educators should embrace GenAI as a learning and teaching tool, schools must set clear guidelines on what is acceptable. For example, students must understand that writing is a skill that they cannot outsource. Course participants might be allowed to use AI for research or brainstorming but not to generate final submissions without proper attribution.
AI-compatible assessment design must also evolve. Tasks that require personal reflection, oral defence or hyperlocal context are much harder to outsource to GenAI. Educators should adopt project-based assignments that emphasise process over product: iteration, feedback and growth. When students know that they will be assessed on the process-related steps they took to complete a task (instead of on the final output) through submitted drafts, reflections or prompt logs, they are less likely to submit AI-generated work without critical engagement.
A balanced path for Singapore’s education system
Singapore’s education system has long been praised for its rigour, adaptability and future orientation. GenAI offers a chance to build on that foundation, not undermine it. Holistic engagement is more crucial than ever.
Rather than treating GenAI as a threat, it should be integrated into our pedagogy with strategic intent. This means equipping students with the skills to use it critically, ethically and responsibly. It also implies enabling more educators to redesign curricula and process-based assessments for an AI-augmented world.
Thomas Menkhoff is professor of organisational behaviour and human resources (education) in the Lee Kong Chian School of Business at Singapore Management University.