Artificial intelligence (AI) presents opportunities and challenges for higher education teaching and learning. While the impact of GenAI tools such as ChatGPT on academic integrity and critical engagement is of concern, these large language models have the capacity to streamline administrative tasks, spark creativity, enhance assessment methods and guide deeper learning, if used effectively. And with GenAI soon to become ubiquitous in most workplaces, it is vital students understand how to use it well. Building AI literacy, rethinking assignments and exploring how AI intersects with established pedagogical approaches should, therefore, be priorities for university educators. This collection offers practical insights into leveraging AI as a teaching tool that supports greater understanding and critical thinking, helping both students and their lecturers.
A good starting point is understanding how your students are using these tools, which is what Miriam Wun and Nah Yong En of the Singapore Institute of Technology set out to discover by talking to theirs. Read their insider’s guide to how students use GenAI tools.
AI as a teaching assistant
Transform GenAI into a teaching assistant by training tools on relevant course materials and learning outcomes, and by guiding students in how to get the most out of their prompts. These resources offer insight into using AI to give students support and feedback around the clock.
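For readers who want to experiment hands-on, below is a minimal sketch of what grounding an assistant in course materials might look like in code. It is an illustration under stated assumptions rather than a method drawn from the resources that follow: it assumes the OpenAI Python SDK and an API key in the OPENAI_API_KEY environment variable, and the model name, module description and learning outcomes are placeholders to replace with your own.

# Minimal sketch: grounding a GenAI "teaching assistant" in course materials.
# Assumes the OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY
# environment variable. The module description and learning outcomes below are
# illustrative placeholders, not content from any of the articles in this guide.
from openai import OpenAI

client = OpenAI()

COURSE_CONTEXT = """
Module: Introduction to Research Methods (placeholder)
Learning outcomes:
1. Distinguish qualitative from quantitative approaches.
2. Critique the design of a published study.
"""

SYSTEM_PROMPT = (
    "You are a teaching assistant for the module described below. "
    "Answer only with reference to these materials and learning outcomes. "
    "Do not hand students finished answers; ask guiding questions that "
    "push them towards deeper understanding.\n" + COURSE_CONTEXT
)

def ask_assistant(student_question: str) -> str:
    """Send a student question to the grounded assistant and return its reply."""
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name; use whichever model you have access to
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": student_question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask_assistant("What is the difference between a survey and an interview study?"))

The design choice doing the pedagogical work here is the system prompt: it carries the course context and instructs the model to coach rather than answer outright, in the spirit of the pedagogically integrated support described in the resources below.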
Beyond Chat - how AI teaching assistants are transforming student support: Pedagogically integrated AI is like having knowledgeable support that understands course materials and objectives and, most importantly, can guide students towards deeper learning, writes Thorsten Fröhlich of the Walbrook Institute London.
AI as tutor and critic - using tech to personalise education: Artificial intelligence can have practical applications for assessment in higher education, despite the focus on the threats it poses. Michael Butler of King’s College London shares pointers for using AI to support teaching and generate feedback.
Can AI offer everyone a personal tutor 24/7? Generative artificial intelligence can trigger a certain amount of angst, but AI’s potential to support student learning should be explored, write Steve Hill and Quintus Stierstorfer of the Walbrook Institute London.
Use AI to promote critical thinking
There are very real fears that students might use GenAI as a shortcut, preventing them from engaging critically with learning material. However, by encouraging students to treat AI as an assistant and embedding it into learning activities and assignments, educators can use AI outputs to help students evaluate arguments, spot biases and refine their reasoning skills, as these resources explain.
Three ways to use ChatGPT to enhance students’ critical thinking in the classroom: Balancing technology with traditional educational values, and ensuring that AI complements rather than replaces the human element in education, are the keys to maximising AI’s benefits in the classroom, writes Nikolas Dietis of the University of Cyprus.
GenAI can help literature students think more critically: Is ChatGPT destroying critical thinking, or is it allowing us to reconsider how we teach it? Shuri Mariasih Gietty Tambunan of the University of Indonesia explores some ways to empower literature students to use it to deepen their understanding.
Use artificial intelligence to get your students thinking critically: When crafting online courses, teaching critical thinking skills is crucial. Urbi Ghosh of Colorado State University Global shows how generative AI can shape the way educators approach this.
Is critical thinking the answer to generative AI? Designing assessment that tests critical thinking has value and practicality, so the challenge is figuring out questions that flummox the AI without creating wildly difficult problems for students, write Luke Zaphir and Jason M. Lodge of the University of Queensland.
How to build AI literacy
With AI sparking such a monumental transformation, not just in teaching and learning but in the careers that await students after university, AI literacy has become a must-have graduate skill. To close the digital divide, university lecturers need to consider the AI know-how students already have and the skills they will need. Then, as these resources explain, they can guide students to get the most out of these powerful tools, with a clear understanding of AI’s strengths and limitations.
Is AI literacy an information skill? To capitalise on GenAI’s strengths, and understand its limitations, students need to develop their research and critical thinking skills in practical, embedded and subject-specific ways, write Emily Dott and Terry Charlton of Newcastle University.
To demystify AI for your students, use performance: Updating Mary Shelley’s Frankenstein for the AI era helped students understand the opportunities and limitations of the tool in an engaging way. A team from Royal Holloway, University of London describe how they used performance as pedagogy.
The ‘deep learn’ framework - elevating AI literacy in higher education: AI literacy is no longer a futuristic concept; it’s a critical skill for university students. The ‘deep learn’ framework offers a comprehensive approach to enhancing literacy around artificial intelligence and its application in higher education settings, explained here by Birgit Phillips of FH Joanneum University of Applied Sciences.
Promoting ethical and responsible use of GenAI tools: How can we encourage staff and students to use generative AI in ways that do not threaten an institution’s ethics or academic integrity? Kelly Louise Preece offers lessons from the University of Exeter’s approach.
Three ways to develop students’ AI literacy: Is higher education prepared for a future defined by AI, or do we need to do more to align education with technology’s changing landscape? Here are three ways to get your students to engage with it critically, by Chahna Gonsalves of King’s College London and Sam Illingworth of Edinburgh Napier University.
A four-step process to embedding AI literacy in business courses: Business students will need to know how to work with AI tools in their future careers. Prepare them with this four-step process from John Murphy of the University of Adelaide.
How to assess students in an AI age
One of the best-documented concerns about AI in higher education is its impact on academic integrity and plagiarism. The availability of GenAI means rethinking assessment and, some argue, reframing definitions of academic integrity. Traditional models such as essays are being swapped out for assessments designed around reflection and real-time practical tasks such as presentations. As the resources here explain, the focus must move towards examining students’ work in progress rather than their final output.
We have to rethink academic integrity in a ‘post-plagiarism era’: What is the future of plagiarism as a concept in the AI age and what are the implications for assessment? Karen Kenny of the University of Exeter seeks to answer these questions, among others.
The renaissance of the essay starts here: In the age of AI, has long-form writing in higher education reached a dead end? Martin Compton of King’s College London and Claire Gordon of London School of Economics and Political Science discuss the unique aspects of the essay and introduce a manifesto to revitalise it.
AI as a catalyst for assessment innovation: University educators have an opportunity to rethink their approach to assessment, so that artificial intelligence tools support student learning without compromising academic integrity, write Zheng Feei Ma and Antony Hill of the University of the West of England Bristol.
RIP assessment? How can educators make learning and human intelligence visible in the age of GenAI? The University of Bath’s Abby Osborne and Christopher Bonfield outline a model to rethink assessment and reward non-AI knowledge and understanding.
Embed AI in assignment design
Find out how to design assignments that invite the use of GenAI in ways that aid deeper understanding, supporting rather than replacing the learning process. Equally important is knowing when to ban AI use to ensure students engage properly with course materials, as explained below.
The AI genie is out of the bottle – now what? Generative AI is here to stay, so let’s build AI literacy, incorporate AI into assessments and craft solid policies for its use, write Aida Nuranova and Timothy Wawn of Nazarbayev University.
Why I ban AI use for writing assignments: Students may see handwriting essays in class as a needlessly time-consuming approach to assignments, but I want them to learn how to engage with arguments, develop their own views and convey them effectively, writes James Stacey Taylor of the College of New Jersey.
How students’ GenAI skills and reflection affect assignment instructions: The ability to use generative AI is akin to time management or other learning skills that students need practice to master. Vincent Spezzo and Ilya Gokhman of Georgia Tech’s Center for 21st Century Universities offer tips to make sure instructions land equally well, whatever students’ level of AI experience.
Three ways to promote critical engagement with GenAI: However much we fear its impact or despise its outputs, when teaching humanities, the best response is to encourage students to engage with AI critically, says Neville Morley of the University of Exeter.
Where AI fits into pedagogy
How can instructors align AI use with traditional models of teaching? These resources look at educational philosophies including behaviourism, constructivism and critical consciousness, along with how AI intersects with Bloom’s taxonomy. Academics share specific exercises that use AI to strengthen learning and student engagement.
Align AI tools with teaching philosophies - a practical guide: Lucy Gill-Simmen, of Royal Holloway, University of London, provides a practical framework for integrating AI into teaching, while remaining true to your pedagogical principles.
The trouble with Bloom’s taxonomy in an age of AI: When using large language models to create learning tasks, educators should be careful with their prompts if the LLM relies on Bloom’s taxonomy as a supporting dataset. Luke Zaphir and Dale Hansen of the University of Queensland break down the issues.
Class exercises that use ChatGPT to strengthen students’ learning: To foster engagement, comprehension and knowledge retention in the classroom, educators should balance leveraging GenAI tools such as ChatGPT to strengthen learning with preserving their own guiding role, writes Nikolas Dietis of the University of Cyprus.
The two key steps to promoting responsible use of LLMs: Large language models offer opportunities for higher education, but also present challenges. Xiangen Hu of the Hong Kong Polytechnic University explains how to balance both.
Apply the principles of critical pedagogy to GenAI: Artificial intelligence can shape our educational practices – but when we allow this to happen unthinkingly, what do we risk losing? Here, colleagues at the University of Adelaide explore how to stay uncomfortable and ask the critical questions.
Using AI to create engaging educational games for humanities students: Combine AI with gamification and storytelling activities to enhance student engagement, write Dania Arriola Arteaga and Bárbara Regina Granados Guzmán of Monterrey Institute of Technology.
How to use AI to streamline your teaching admin
Much of the work of teaching happens outside the classroom – emails, marking, feedback, class planning, course design – the list goes on. So while many academics are wary of AI, a growing number are using it to lighten their workloads by outsourcing some of the administrative tasks associated with university teaching. Read here about how AI can speed up marking, help deliver feedback, draft emails and summarise course materials, leaving instructors time to focus on more meaningful learning and assessment design.
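As a concrete illustration of the kind of task that can be offloaded, the sketch below drafts first-pass formative feedback on a folder of student abstracts. It is a minimal example under stated assumptions, not a workflow taken from the pieces that follow: the OpenAI Python SDK call, the model name, the rubric wording and the folder layout are all placeholders, and any generated feedback would still need to be reviewed and edited by the instructor before it reaches students.

# Minimal sketch: drafting first-pass formative feedback on student abstracts.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable; the
# rubric, model name and folder layout are illustrative placeholders, and all
# output is a draft for the instructor to review, not feedback to release as-is.
from pathlib import Path
from openai import OpenAI

client = OpenAI()

RUBRIC = (
    "Give concise formative feedback on this draft abstract. Comment on clarity, "
    "structure (background, aim, method, findings, significance) and concision. "
    "Do not rewrite the abstract for the student."
)

def draft_feedback(abstract_text: str) -> str:
    """Return draft feedback for one abstract, to be edited by the instructor."""
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "system", "content": RUBRIC},
            {"role": "user", "content": abstract_text},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    Path("feedback").mkdir(exist_ok=True)
    for path in sorted(Path("abstracts").glob("*.txt")):  # one draft abstract per text file
        feedback = draft_feedback(path.read_text(encoding="utf-8"))
        Path("feedback", path.name).write_text(feedback, encoding="utf-8")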
Reduce your teaching admin burden with AI: How university teachers can use AI to respond to student enquiries, provide feedback and create engaging learning content, as a University of Auckland team explain.
Here are seven AI tools you should be using for your teaching and research: AI can assist with idea generation, data analysis and mind-mapping, among other tasks. Natalie K. D. Seedan of the University of the West Indies lists the tools that should be on any academic’s radar.
We use ChatGPT to give feedback on students’ abstracts: Teaching students whose first language is not English to write concise abstracts helps them develop their academic writing skills, but providing feedback on them can often be laborious. Here, Yu Liu and Shuhao (Jeremy) Zhang of Xi’an Jiaotong-Liverpool University share how you can use ChatGPT to speed up the process.
Will AI revolutionise marking? Artificial intelligence has the potential to improve speed, consistency and detail in feedback for educators grading students’ assignments, writes Rohim Mohammed of University College Birmingham. Here he lists the pros and cons based on his experience.
Is it worth paying for GenAI? How useful is artificial intelligence for syllabus design? Law lecturer Sophia De Arez Cintra of King’s College London compared the free and subscription versions of three generative AI platforms, with surprising results.
Career-focused AI
Find out what AI-related skills students may need for specific careers and how to teach them as part of your course.
Future-proof software engineering students for an AI-dominated world: Software engineering is increasingly being shaped by generative AI. Houda Chakiri of Al Akhawayn University explains how to prepare your students for their future workplace.
These AI tools can help prepare future programmers for the workplace: Rohini Rao of Manipal Academy of Higher Education looks at how educators can teach programming students to use AI tools to enhance productivity.
Essential GenAI skills for marketing students: How students can use AI to generate promotional copy, conduct market research and identify biases, among other applications, shared by lecturers from the University of Bristol.
Leverage large language models to assess soft skills in lifelong learning: Leadership and critical-thinking skills are difficult to measure. Here, Jonna Lee of Georgia Tech’s Center for 21st Century Universities offers case studies that test the idea of integrating large language models into assessment practices as a feedback tool to empower both students and instructors.
For more detailed insights and resources exploring how to use GenAI to improve assessment and equip students for an AI-driven workplace, browse our spotlight guide on AI and assessment in higher education.
Thank you to all the academics and higher education professionals who contributed their professional advice on this wide-ranging topic.
If you would like advice and insight from academics and university staff delivered direct to your inbox each week, sign up for the Campus newsletter.