An Aladdin’s cave of valuable information is concealed in your virtual learning environment logs. Learner engagement analytics show how your students interact with online content. But how can you extract that crucial data without hiring an analytics expert? Enter GenAI.
Building the prompt
A successful prompt tells the GenAI tool what role to play, what the context is and what you expect it to do. It should start with something like: “You are an expert at learner engagement analytics. You have an amazing reputation for your ability to gain insights from Moodle logs that help educators tailor their delivery to improve student experience. You are also an expert on undergraduate student learning behaviour.”
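If you prefer to script the conversation rather than paste into a chat window, here is a minimal sketch of the same role-plus-context-plus-instruction structure using the OpenAI Python library. The model name and log file name are illustrative assumptions, and any GenAI provider with a chat API could stand in.

```python
from openai import OpenAI

client = OpenAI()  # expects your API key in the OPENAI_API_KEY environment variable

role = (
    "You are an expert at learner engagement analytics. You have an amazing "
    "reputation for your ability to gain insights from Moodle logs that help "
    "educators tailor their delivery to improve student experience. You are "
    "also an expert on undergraduate student learning behaviour."
)

# Read the exported logs as plain text and include them in the message
# (the file name is hypothetical).
with open("moodle_logs.csv") as f:
    log_text = f.read()

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model; use whatever your institution provides
    messages=[
        {"role": "system", "content": role},
        {"role": "user", "content": "Here are my Moodle logs:\n\n" + log_text},
    ],
)
print(response.choices[0].message.content)
```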
You will need to instruct the tool to clean up the data, because virtual learning environment logs include automated system activity as well as student clicks. So add: “Remove any entries that are associated with automated maintenance.”
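For those who like to see what this clean-up involves, here is a rough sketch in Python, assuming a standard Moodle log export in CSV form with an “Origin” column; the file names and column values are assumptions based on typical exports.

```python
import pandas as pd

# Load a typical Moodle log export (file and column names are assumptions).
logs = pd.read_csv("moodle_logs.csv")

# Moodle records where each event came from in an "Origin" column; entries
# tagged "cli" or "restore" come from scheduled maintenance and back-ups
# rather than from students, so drop them.
automated = logs["Origin"].isin(["cli", "restore"])
cleaned = logs[~automated]

cleaned.to_csv("moodle_logs_cleaned.csv", index=False)
print(f"Kept {len(cleaned)} of {len(logs)} log entries")
```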
Now for the instructions. The GenAI tool will need clear direction about what is expected, but at the same time you want it to bring its own views. Remember, we have told it to be an expert learner engagement analytics advisor. Here is a good balance: “Please do a thorough job of analysing these logs, getting as many insights as you can based on what is in them and what you know of undergraduates, and give me a detailed, professional report of those insights. Make any recommendations that come to mind to improve student experience.”
The result will give you detailed insights into how the students are using your virtual learning environment page. You can talk back to the tool as you would to any consultant: “Your analysis is great but too technical for me. Can you rephrase it?”
What to do with the information
There are several further steps you can take to improve your site and get students engaged. The first is to thank your GenAI helper – seriously, it provides the tool with clues as to the usefulness of its output. Then ask: “Based on the analysis you have performed and on your wider expert knowledge, please advise me what I can do to improve the construction of my virtual learning environment. Remember that we are using [your platform here, e.g. Moodle], so there are limits as to what I can do with the site.”
There are some even more useful things you can do. A clickstream analysis shows how students clicked through your site. You will need to give the tool more direction here, because the logs are not a full record of user activity. A suitable prompt might be: “Now I want to do another kind of analysis on these Moodle logs. I would like you to look at individual students’ clickstreams, trying to work out the paths they take. This is made challenging because the logs only show individual clicks, and not things like when the user left the site. So please see what you can do. Ask me any questions that will help me get the best out of the analysis.”
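If you want to check the tool’s path reconstruction, or try it yourself, a sketch along these lines is one way. It assumes the cleaned CSV export has “Time”, “User full name” and “Event context” columns, and it treats a gap of more than 30 minutes between clicks (an arbitrary choice) as the end of one visit and the start of another.

```python
import pandas as pd

# Assumed columns from a cleaned Moodle export: "Time", "User full name",
# "Event context" (the page or activity the click relates to).
logs = pd.read_csv("moodle_logs_cleaned.csv")
logs["Time"] = pd.to_datetime(logs["Time"], dayfirst=True)
logs = logs.sort_values(["User full name", "Time"])

# The logs only record clicks, not when a student left, so treat a gap of
# more than 30 minutes between clicks as the start of a new visit.
gap = logs.groupby("User full name")["Time"].diff() > pd.Timedelta(minutes=30)
logs["visit"] = gap.groupby(logs["User full name"]).cumsum()

# Each entry of `paths` is one visit: the ordered list of pages clicked.
paths = logs.groupby(["User full name", "visit"])["Event context"].apply(list)
print(paths.head())
```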
If you want a great visual that not only gives you insight into student clicking patterns but will really impress your colleagues, ask for a Sankey diagram.
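Should you want to draw the Sankey diagram yourself rather than ask the tool for one, a sketch like this would do it, assuming the plotly library and per-visit paths of the kind reconstructed above (the example paths below are made up).

```python
from collections import Counter
import plotly.graph_objects as go

# Example visit paths; in practice, use the reconstructed paths from the
# clickstream sketch above.
paths = [
    ["Course home", "Lecture 1 slides", "Quiz 1"],
    ["Course home", "Quiz 1", "Feedback forum"],
    ["Course home", "Lecture 1 slides"],
]

# Count how often students moved from one page straight to another.
transitions = Counter()
for path in paths:
    for a, b in zip(path, path[1:]):
        if a != b:
            transitions[(a, b)] += 1

pages = sorted({page for pair in transitions for page in pair})
index = {page: i for i, page in enumerate(pages)}

fig = go.Figure(go.Sankey(
    node=dict(label=pages),
    link=dict(
        source=[index[a] for a, b in transitions],
        target=[index[b] for a, b in transitions],
        value=list(transitions.values()),
    ),
))
fig.show()
```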
Am I having an effect?
Having used the tool’s information to guide an intervention, you will want to measure its effects. The simplest way is to upload the logs afresh once the intervention has had time to affect student behaviour, and use a prompt like the one above, modified to explain fully what you want to do. Let the tool know in detail what changes you made and what you want to achieve. For example: “You will notice in the logs that on 30th September, I added an H5P activity and made several other changes to the virtual learning environment page to encourage students to use the feedback resources more. Please do a thorough analysis of the logs, including Sankey diagrams, both before and after my intervention on 30th September. Is there evidence that these interventions successfully moved student behaviour towards my goal?”
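If you would like to sanity-check the tool’s before-and-after comparison, a quick sketch like this splits the cleaned logs around the intervention date and compares activity counts; the file name, column names and year are assumptions.

```python
import pandas as pd

logs = pd.read_csv("moodle_logs_cleaned.csv")  # assumed file and columns
logs["Time"] = pd.to_datetime(logs["Time"], dayfirst=True)

intervention = pd.Timestamp("2024-09-30")  # the year here is illustrative
before = logs[logs["Time"] < intervention]
after = logs[logs["Time"] >= intervention]

# Compare how often each part of the site was used before and after the change.
summary = pd.DataFrame({
    "before": before["Event context"].value_counts(),
    "after": after["Event context"].value_counts(),
}).fillna(0)
print(summary.sort_values("after", ascending=False).head(10))
```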
Note that you will be uploading student information, so take data protection into account. Some institutions have an account with a GenAI provider, such as Copilot, which has built-in privacy protection.
GenAI is unpredictable and differences between sessions can make long-term analyses difficult. But you can partly overcome that. Ask the tool to provide a prompt you can use in the future to perform a similar analysis.
A fruitful partnership
Getting the best out of GenAI depends on casting it in the right role. Once that is done, you can interact with it just as if it were a real, human expert. This enables a powerful partnership, strategically combining the strengths and expertise of each player: yours and the tool’s. You have to work with GenAI, rather than expect it to do everything.
Stephen D Buckingham is a reader in biomedical sciences education at Queen Mary, University of London.