
AI as an access tool for neurodiverse and international staff

By Eliza Compton, 15 October 2025
Used transparently and ethically, GenAI can level the playing field and lower the cognitive load of repetitive tasks for admin staff, student support and teachers

University staff are working at full stretch. Class sizes are larger, admin loads are heavier, and student support is more complex. This hits international colleagues and staff with disabilities, including neurodiverse staff, especially hard. When everyday tasks – from navigating institutional systems to written communications – require more time and cognitive effort, the result is a persistent energy tax that undermines well-being, slows innovation and widens inequities in career progression.

Artificial intelligence can reduce this tax. Used transparently and ethically, it acts as a practical adjustment that begins to level the playing field. 

Yet many colleagues hesitate to use AI for routine tasks: they fear being seen as “cheating”, feel shame about appearing less professional or less intellectual, or worry that it presents ethical and data risks. Inconsistent local guidance compounds these concerns. This points to a gap between institutional aspiration for inclusion and the everyday support that staff need to work fairly and efficiently.

Added to this is the fact that barriers facing neurodiverse and international staff are often hidden. Experiences vary, but common challenges include:

  • Cognitive load and executive function: Tasks such as document formatting, inbox triage and timetable changes create context-switching that drains executive function for many neurodivergent colleagues.
  • Language and register friction: For international staff, getting the tone “right” (such as using idiom or institutional voice) for emails, policies, assessment briefs and teaching materials adds hours of micro-editing that have little to do with expertise.
  • Perfection pressure: Expectations for polished outputs such as emails, briefs or slide decks can trigger overcorrection and procrastination – especially when norms are unwritten or vary by team.
  • Time pressure in student support: Drafting clear, compassionate responses under time constraints is cognitively demanding, particularly when translating complex policy into plain language.
  • Stigma and second-guessing: The fear of being seen as “cheating” (or simply as less capable) deters colleagues from making even clearly reasonable adjustments for tasks such as drafting, structuring or accessibility checks.

These are structural barriers, not individual failings, and as such call for structural supports.

Where AI helps without cutting academic corners

When framed as accessibility and quality enhancement, AI can support staff to complete standard tasks with less friction. However, while it supports clarity, consistency and inclusion, generative AI (GenAI) does not replace disciplinary expertise, ethical judgement or the teacher–student relationship. These are ways it can be put to effective use:

  • Drafting and tone calibration: Use AI to generate a first draft for emails, policy summaries or feedback, then edit yourself for accuracy and context. This reduces time on phrasing while preserving judgement.
  • Language scaffolding: For international writers, and for all readers, GenAI can translate notes into English or produce parallel versions (such as a plain-language summary, an accessible bullet list or longer policy detail).
  • Structure and templates: Turn outlines into consistent formats for module guides, assignment briefs and meeting agendas. This lowers the executive-function load and reduces errors.
  • Summarise and prioritise: Extract actionable items from long policies, minutes or student queries to support focus and reduce overwhelm.
  • Accessibility by default: Move inclusion from “extra work” to standard workflow. Use AI to create alt text drafts, check captions, and write reading-order suggestions and colour-contrast prompts.
  • Idea generation for pedagogy: Produce optional examples, case vignettes or formative question banks that staff can customise. This saves ideation time while keeping academic ownership.
  • Translation and cultural mediation: Offer draft translations or explain idiomatic phrases to support cross-cultural communication in teams and with students.

Creating psychological safety: how to support and encourage uptake

Institutions can normalise ethical, accessibility-led AI use through practical steps:

1. Start with safe-brave spaces

Create regular, psychologically safe forums where staff can critically discuss AI, its ethics and its use. Such spaces should allow people to share real AI use cases, dilemmas and drafts, try things together, and give peer feedback without judgement.

2. Provide clear, values-led guidance and support 

Institutions need to put in place simple, example-rich guidance that frames AI as an accessibility and quality tool (not a shortcut), with expectations for human review, transparency where appropriate, and data protection.

3. Offer privacy-safe tools 

Provide institutionally approved AI options with strong data controls, so colleagues do not have to choose between help and risk.

4. Train for specific tasks 

Run short, role-based sessions at institutional or departmental level (such as “From notes to clear email” or “Accessible assignment briefs in 10 steps”) with before-and-after examples and common pitfalls.

5. Seed reusable libraries

Colleagues need a mechanism through which to share editable prompts, templates and checklists for feedback frames, policy summaries, module guides, alt-text scaffolds and plain-language versions.

6. Model transparent practice

Leaders and programme teams should openly acknowledge AI-assisted drafting for admin and accessibility tasks to normalise ethical use.

7. Protect tinkering time 

Institutions should provide micro-allowances or mini-grants to pilot AI for inclusion, with quick write-ups of “what worked/what to avoid” for colleagues.

8. Pair up and mentor 

Set up peer “AI buddies” or small learning circles to troubleshoot live tasks and edge cases.

Practical do-now ideas for individuals

Those new to AI, or unsure about using it, should start with low-stakes text: meeting summaries, routine emails, student information pages. Use prompts that encode your standards: audience, tone, must-include points, accessibility requirements. A reusable prompt might read, for instance: “Draft a 150-word email to first-year students confirming the revised assessment deadline; warm but formal tone; must include the new date and where to get support; plain English, short sentences.” One suggestion is to keep and iterate a personal prompt bank for repeat tasks.

Bear in mind AI’s limitations, and always check names, dates and policy against independent, reliable sources. Treat outputs as drafts, not answers.

Add a discreet note when appropriate: “Drafted with AI assistance and human-edited for accuracy.” This reduces stigma and sets norms.

Document time saved and quality gains. Use this to advocate for workload adjustments or team-level adoption.

Why this matters

When AI is framed as an access tool, not a threat, neurodiverse and international staff can spend less time wrestling with form and more time on substance: teaching, mentoring, research and collaboration. Students benefit from clearer communications, more consistent materials and faster, kinder responses. Teams benefit from fewer errors, more shared templates and reduced burnout.

Institutional positives are already emerging. Where these supports are in place, we see higher confidence among staff to use AI ethically and visibly; improved accessibility in course materials and communications as standard practice; reduced turnaround times for routine admin without loss of quality; and greater equity in who has time and energy for creative work and leadership.

Make accessibility-led AI use a normal, supported part of academic work. Clear guard rails, privacy-safe tools and transparent use are inclusion in action: the same expectations, reached by fairer routes.

Vanessa Mar-Molinero is a senior teaching fellow in academic practice at the University of Southampton.
