
Add GenAI RAG categories to Moodle at the click of a button

Written by Dr. Richard Floyd, Digital Learning Facilitator (ISS)

Each assignment in Moodle needs to have a Red, Amber, or Green category for the use of generative AI. The University has provided some guidance for staff covering what these categories mean and how this should be communicated to students.

To make it easier for staff to share this information with students, we have now added a set of custom components to the text editor in Moodle. This will allow you to choose your category and automatically insert the official icon and text into your assignment description.

How to add an AI RAG category to your assignment description

  1. Edit your assignment settings.
  2. In the description field, click the three dots to expand the toolbar, then select the Components for Learning (C4L) option (the Lego brick). [Screenshot: the TinyMCE text editor in Moodle with the ‘Components for Learning’ (C4L) Lego brick icon circled in red.]
  3. Under the Custom tab, select the relevant category and then click Save. [Screenshot: the Custom tab for components in the C4L section of the TinyMCE editor, with the ‘RAG AI – Red’ option selected.]
  4. The component will then appear in your assignment description. [Screenshot: an example of an assignment in Moodle with the Generative AI category applied in the Description section of the settings.]

If you require any further support, get in touch with a Faculty Learning Technologist or a Digital Learning Facilitator.

Getting started with Gen AI

Written by Andy Holgate, Library Digital Skills Developer

Generative AI really is a disruptive technology, and it’s already altering the way we produce content across society: entertainment, education, marketing, and even software engineering.
There are good and bad uses, especially in education, but it is a technology that we can embrace and use well, and more than anything else I want to stress WE DON’T HAVE TO BE SCARED OF GEN AI!

There is so much misinformation out there on the subject (and yes, ironically, most of it has probably been created by Generative AI). I don’t know about you, but for me it was difficult knowing where to start. I had so many questions, I’d heard so many things, and I had no clue where to go for answers.

Well, OK, that’s not entirely true. Unless you were living under a rock, you will have been aware that back in 2023 the University issued guidelines on the subject: Principles for the Educational Application of Generative AI, in which the University states: “Lancaster University promotes a culture of Generative AI (Gen AI) awareness, criticality and expertise. We encourage effective and responsible use of Gen AI technologies in learning and teaching, fostering student and staff digital and academic literacies.” This was further expanded on with guidance on various types of AI, how they can be used, the pitfalls of using them, and so on. That further guidance is available here: Using AI in your learning and assessment – it’s a really good starting point.

So, let me explain a little more about how I found myself stood at the foot of a sheer rock face called Gen AI. I work in professional services, the library to be exact, and part of my role is to help colleagues increase their digital skills. Back in 2023 I was presented with the request: “We need to run a training session on Gen AI for the library staff. Introducing it, how it could be used etc…” I’m not going to lie: I looked at my manager and, behind my smile, I was thinking “Where do I start?”.

So, I ran some internet searches, read a few blogs and articles, and started to get an idea of what it was all about. It turns out it wasn’t as alien as I thought: I’d been using forms of it for years in Google Translate, MS Office products, library databases, and that app on my phone that lets me edit photos in lots of weird and wonderful ways. Seeing that familiarity suddenly made it less scary.

The tool that really brought my knowledge up to speed was LinkedIn Learning. The University had bought a subscription in 2023, giving all students and staff access to over 22,000 training videos and courses. Now, cards on the table: I have to admit that part of my role is promoting the platform at the University, so maybe I am slightly biased, BUT hey, I’m not on commission! I looked at LinkedIn Learning and discovered that there was a lot of material available on the subject, so I started watching videos to learn more. I was pressed for time and working to a deadline, so I deliberately chose videos rather than full courses, meaning I could watch a three-minute video from an expert explaining just what ChatGPT was. I watched quite a few on various subjects such as Google Bard, Canva, the ethics of AI, brainstorming with Gen AI, searching with Gen AI and a whole lot more. I had massively increased my awareness in a short amount of time, all while using a professional, reliable source. Now, you have the opportunity to do the same.

Working with others across the University, we have created a LinkedIn Learning pathway called A beginner’s guide to generative AI.
This is split into sections, and the first few consist of the short videos I mentioned above; you don’t have to watch them all, and you can dip in and out as you please. At the start there is a general introduction which formed part of the training session I ran for library staff – it’s the basics in very plain English, written the way I would want to be taught. Next come the videos, which really will introduce you to most aspects of Gen AI and where it stands as of early 2024.
Now, don’t be put off when you see that the pathway says it contains 22 hours of learning: the first 24 videos (the short introduction ones) come in at under 90 minutes in total. The final section of the pathway is courses, so if you want to climb further up that AI rock face you can, but understandably not everyone needs to.

I hope that all makes sense, and if you’ve made it this far, thanks for reading. So go and have a look at the pathway, quell any fears you may have, and embrace the technology (there’s no harm in questioning it either). I’m old enough to remember the dawn of the modern internet (not the original military one), and people feared it; today it’s woven into the very fabric of modern life. Gen AI is the next thread in that rich tapestry.

If you have any questions or comments about this blog post or the LinkedIn Learning pathway, please get in touch with me.

Disclaimer: the opinions expressed in this blog are my own and do not necessarily reflect those of the University.

Ethics and AI: Some ‘takeaways’ from the ALT Winter Summit 2023

On Tuesday 12th December, I attended the ALT (Association for Learning Technology) Winter Summit. The theme of the conference was Ethics and Artificial Intelligence (AI). We heard from the following speakers:

  • Helen Beetham, Researcher and Consultant
  • Mary Jacob, Aberystwyth University
  • Olatunde Durowoju, Liverpool Business School
  • Dr Tarsem Singh Cooner, The University of Birmingham

There was also a student panel at the end which looked at the student experience of AI and their ideas for moving forward with using AI in a higher education context.

The speakers covered topics such as ethical problems and concerns with the use of AI, and accessibility and AI; there was also a case study showing how AI was used in a session with social work students at the University of Birmingham. It was a really enjoyable day and I learned a lot!

I’d like to share three ‘takeaways’ that I had from the ALT Summit.

Firstly, many of us are aware of some of the issues surrounding the use of AI. For example, the use of AI tools takes a huge amount of energy; there are issues around equity and student access to paid-for services and tools; there is an impact on student learning; we have also heard in the news that a number of larger companies have sacked their ethical advisors; and there is much more besides! However, one thing I have had experience of in my own work, and that was raised at the Summit, was the sheer amount of secrecy behind the training of artificial intelligence models. How are these models trained exactly? Who is training them? What impact does this have? We know about the issues around encoded bias in the models, and we know that some of the people involved in this training process are underpaid, but I definitely want to find out more!

Dr. Olatunde Durowoju spoke on ‘Achieving Inclusive Education in AI’ and wrote an article on this in 2023. His talk was really interesting and covered some of the positive aspects of using AI and how it can benefit students, such as those with neurodiversity, those learning English as a second language, or those who have a disability. Examples were given of the use of AI in higher education and how it can help students improve their English language proficiency, help address a cultural gap, help reduce student anxiety around the quality of the work they produce, and assist students who may have diminished capacity due to other responsibilities (e.g. caregivers). Some of the examples given included using AI to summarise text to help gain a better understanding of it, improving the quality of writing through prompt engineering, using chatbots to help with out-of-hours engagement, and using AI to pluralise perspectives on a specific topic.

Finally, the student panel gave a real insight into how students are using AI currently, how they perceive its future use, and what they would like to see from their institutions moving forward. Students on the panel were from universities such as Sheffield Hallam University, Edinburgh Napier University and the University of Kent. They were also at different stages of their university studies, so there was a good range of perspectives. Two of the things that students said they wanted were:

  1. More consistent guidance on the acceptable use of AI in their studies, both across the departments in their institutions and across UK institutions in general. They believed this would help students be less confused about when and how they can use it.
  2. More education from their institutions on the problems of using AI, such as perpetuating bias and academic integrity concerns.

Thank you to ALT for another wonderful conference and to the speakers for sharing their knowledge and expertise. I will definitely be exploring some of the issues and opportunities that have been mentioned in more detail.

AI tools: A Christmassy Showcase

The 12 Days of AI is a self-directed online course run by the University of the Arts London. Participants can learn about a new AI tool each day. I’ve signed up via Eventbrite, and places are still available if you want to join in. It is free!

Each day, participants receive an email that directs them to a ‘daily task’. You are then introduced to an AI tool and given a quick rundown of how it works. You can then follow the instructions to familiarise yourself with the tool. There are other resources shared on the website too, including articles, demos and more!

So far this course has introduced the following tools:

  1. Hour One – a tool to create videos. An example of this tool is showcased on the ‘What is 12 Days of AI?’ webpage.
  2. ChatGPT – the one we have probably all heard of! Participants are given some tasks to prompt this tool to generate information, and there is some guidance on ‘Prompt Engineering’.
  3. Claude.ai – a tool that can summarise text. You can create an account and test out this tool on an article or some text that you have permission to upload.

I’ll be working my way through these tools and hopefully some of you will be able to join in too!