Artificial intelligence is moving into healthcare practice faster than many curricula are moving to meet it. Yet in higher education, discussion still too often slips into the wrong debate: not how future professionals will work with AI in clinical settings, but whether students are using generative AI to help write assignments. Academic integrity is critically important, but this preoccupation risks obscuring the more pressing question of how the future registrant will work with AI within clinical care delivery.
The question we need to ask is whether healthcare education is preparing students for a professional world in which AI is becoming more visible in documentation, imaging, triage, risk prediction and clinical decision support. Higher education institutions are not just deciding how to respond to a new technology; they are deciding how future registrants will learn to recognise its limits, question its outputs and remain professionally accountable when digital systems shape care.
AI literacy in healthcare should therefore not be treated as an academic integrity sidebar or a technical specialism for a small group of interested students. It is becoming part of professional formation. Healthcare students do not need to become software engineers or data scientists, but they do need to understand what AI is, what it is designed to do, how it is evaluated, where it can fail and what it means to use AI-enabled systems in regulated care environments. They also need an understanding of how such systems move from governance and development into practice. Most importantly, they need to understand that AI use does not lessen their legal, ethical or professional accountability.
What does AI literacy in healthcare look like?
In many cases, the more effective response is not to bolt on a stand-alone AI unit, but to review what is already taught and identify where AI literacy can be integrated in coherent, professionally relevant ways.
Evidence-based practice teaching can include appraisal of studies involving AI-enabled tools, prompting students to think about validation, bias and generalisability. Ethics and law sessions can address transparency, accountability, informed decision-making and the use of data within digital systems. Simulation can explore what happens when a tool’s recommendation supports, complicates or appears to conflict with clinical judgement. Informatics teaching can address data quality, interoperability and digital safety. Rather than making AI the centre of every session, we should ensure students encounter it where it matters in practice.
Some programmes may need additional lectures, workshops or tutorials. But for many, the bigger gain will come from updating existing teaching so that AI literacy complements the wider curriculum rather than competes with it. That is likely to be more sustainable, and it makes it easier for students to understand AI as part of the professional environment they are entering, rather than a “niche topic”.
A scaffolded approach to AI literacy in healthcare education
A spiral curriculum offers a particularly strong way of embedding AI literacy into healthcare education. AI literacy is best introduced early and then revisited across a programme at increasing levels of depth, complexity and professional relevance. That matters because students arrive with very different levels of prior knowledge, confidence and exposure. It also matters because understanding of AI-related issues should develop alongside their broader professional formation.
In practice, early teaching may focus on what AI is, where it appears in health and care, and why professional judgement remains central. Later teaching can return to those foundations in more applied ways, addressing bias, explainability, trust, governance, implementation risk, consent and accountability. In this model, AI literacy becomes a developmental thread running through the curriculum.
Setting expectations with regulatory AI guidance
Professional regulators have an important role here, too. Bodies such as the Nursing and Midwifery Council, the General Medical Council and the Health and Care Professions Council shape expectations for registrants and, indirectly, the priorities of approved programmes. Some have begun to acknowledge the implications of AI for education and practice, but the position remains uneven across professions. Clearer profession-specific guidance would help create greater consistency across higher education providers and establish a firmer baseline for what safe and professionally appropriate AI literacy should include in each field.
How higher education and industry can work together
Staff development is just as important. Many educators trained before AI became as visible in healthcare as it is now and may not feel well prepared to teach it without support. That should be recognised as a curriculum issue, not an individual failing. There is a strong case for closer triangulation between higher education providers, care providers and those involved in designing AI-enabled products used in healthcare. Care providers can offer insight into how AI is being adopted, governed and used in practice. Product developers can help educators understand how systems are intended to function, what assumptions underpin them and where implementation challenges arise. Universities, in turn, must translate that knowledge into teaching that is pedagogically sound, professionally relevant and critically framed.
AI literacy matters because it supports the core purposes of healthcare education. Done well, it helps students remain safe, critical, communicative and accountable in environments where digital systems are becoming more visible. It is not about teaching technology for its own sake, and it should not be reduced to a proxy debate about AI in academic work. It is about preparing future registrants to uphold professional standards in changing clinical conditions.
Andy Barker is a lecturer in nursing at the University of Hull.