Click to view interview with Anne-Marie Langford, UON Learning Development tutor on uses of GenAI

In this short video UON Learning Development tutor Anne-Marie Langford discusses her work employing generative AI to produce sample passages of academic writing for analysis and refinement in development workshops.

Anne-Marie notes that the use of AI-generated text can prompt students to critique academic writing, encouraging them to develop higher-order thinking skills. This proves particularly valuable in scrutinising the shortcomings of generative AI, which can be useful for identifying and presenting knowledge but is less adept at applying, analysing and evaluating it.

While recognising the time-saving potential of chatbots such as ChatGPT and their uses in enhancing student learning, she underscores the limitations of GAI in academic writing and referencing. Anne-Marie emphasises the importance of students adopting a critical, ethical and well-informed approach to using generative AI, urging them to cultivate their own critical voices and refine their skills.

By incorporating text from generative tools into her sessions, Anne-Marie exemplifies the advantages of modelling critical use of generative AI with students.

In 2023 the University appointed its first student Digital Skills Ambassador (DSA), the purpose of the role being to allow students to get digital skills support from other students. While it’s often assumed that most people are now confident and competent users of digital systems, especially young people (the so-called ‘digital natives’), the reality is that some students come to university without the basic digital skills they need to flourish on their courses. The University of Northampton is rightfully proud of the excellent digital facilities that support teaching and learning here, but being mindful of the pernicious effects that the digital divide can have in education, chose to create the student DSA role in order not to leave any student in the digital darkness. To understand a little more about what it means to be a DSA, we interviewed the current incumbent, Faith Kiragu, and asked them to explain in their own words how the role works.

1. Can you tell me a little about you and your role? How does the support work?

“As the Digital Skills Ambassador, my role primarily revolves around providing support and guidance to fellow students on various aspects of digital skills, with a focus on Microsoft Office Packages, NILE (Northampton Integrated Learning Environment), the student Hub, LinkedIn Learning, and related queries.

Students can seek my help by booking appointments through the Learning Technology platform. Upon visiting the platform, they fill out a form detailing their query briefly. After submission, they receive a confirmation email containing the details of their appointment. Additionally, to ensure they do not miss their session, students receive reminders a day before their scheduled appointment time. During the session, I address their queries, provide guidance, and offer practical assistance to help them navigate through any challenges they may encounter with digital tools and platforms. My aim is to empower students with the necessary digital skills to enhance their academic journey and future career prospects.”

2. What are the common support requests and how do you support these?

“The most common support requests I receive are related to navigating NILE, submitting assignments, accessing online classes on Collaborate, and Microsoft PowerPoint tasks like adding images and textboxes.

To support these requests, I provide personalized guidance during the one-on-one appointments. I offer step-by-step demonstrations, share relevant resources such as LinkedIn Learning, and address specific queries to ensure students feel confident in handling these on their own. Additionally, I offer troubleshooting assistance and encourage students to practice these skills independently to enhance their proficiency over time.”

3. Is the support used by students across all courses, or some areas more than others?

“Yes, I have noticed that more students from health-related courses seek digital skills support compared to other courses, with Public Health being the course from which I have encountered the most students. Students from the Business and Law Faculty come a close second.”

4. Do you have any (anonymous) examples of how you have helped students with their problems?

“A student asked for help with accessing their online classes on Collaborate via NILE. During our appointment, I guided them through the process of navigating to the correct module on NILE, locating the scheduled Collaborate session, and joining the virtual classroom. By the end of the session, the student could successfully participate in their online class without further difficulties.

Another student sought help creating a presentation on Microsoft PowerPoint, specifically needing guidance on how to add images and textboxes effectively. I provided a step-by-step demonstration of inserting images into slides, resizing and positioning them, and formatting textboxes for adding content and captions. Additionally, I shared tips on utilising PowerPoint’s features for enhancing visual appeal and maintaining a cohesive layout throughout the presentation. I also supported the student in accessing LinkedIn Learning, and the student left the session equipped with the skills and confidence to complete their assignment using PowerPoint effectively.”

5. What do you think are the main benefits to students who have received support? 

“The support I offer to students entails providing guidance and assistance with various digital tools and platforms, including NILE, Microsoft PowerPoint, and Collaborate. Through personalized appointments, students receive practical help in navigating these systems. This support not only enhances their digital skills but also boosts their confidence in engaging with coursework effectively. As a result, students experience improved academic performance and save valuable time by overcoming challenges efficiently. Furthermore, the support empowers students to take ownership of their learning journey, fostering independence and lifelong learning skills. Overall, the support provided equips students with the necessary resources and confidence to succeed academically in today’s digital-centric educational landscape.”

6. What have you learnt from your time in the role?

“In my role as the Student Digital Skills Ambassador, I have learned invaluable lessons that have enriched both my technical and interpersonal skills. Effective communication has been paramount as I translate complex technical information into accessible guidance for students with varying levels of digital literacy. Adaptability has been key as I tailor support to accommodate diverse learning styles and preferences. Through addressing queries, I have honed my problem-solving abilities while cultivating patience and empathy for students’ individual challenges. Additionally, this role has emphasized the importance of continuous learning, prompting me to stay updated on emerging technologies and digital trends. Overall, my experience has deepened my understanding of digital tools and platforms while enhancing my ability to support others in their learning journey, fostering a collaborative and empowering environment for student success.”

Click to view video - Fashion GAI

This short film features three BA Fashion, Textiles, Footwear & Accessories students discussing their experiences using Generative AI (GAI) in their projects. The students demonstrate diverse applications of GAI, highlighting how they tailor the technology to their individual creative needs.

The film features Subject Head Jane Mills, who discusses the potential of AI to support students, and outlines the introduction of a new AI logbook – designed to provide a framework for students to confidently explore and utilize GAI for brainstorming and research purposes.

Click to view David Meechan’s abridged talk from the Vulcan Sessions on 26/01/24.

In this condensed talk from the Vulcan Sessions on 26/01/24, Senior Lecturer in Education David Meechan discusses the opportunities and considerations of using AI in education.

He introduces the concept of Generative Artificial Intelligence (GAI) as a diverse and constantly evolving field without a consistent definition among scholars, and shares personal examples of how GAI can support students by scaffolding their learning and reducing the initial cognitive load through the creation of basic first drafts.

David expresses, ‘I’m a big believer in experiential learning, providing children, and now students, with experiences they can build on.’ Therefore, he advocates for the use of GenAI tools, which offer ‘varied, specific, and potentially creative results, revolutionising education and supporting lifelong learning.’

Emphasising the importance of the ethical use of AI tools in education, he argues for engagement with a wide range of GenAI tools to prepare students for navigating future changes in the education and technological landscape.

Jane Mills talk at Vulcan Sessions (abridged) 5 mins - Click to view.

In this short film Jane Mills delves into the realm of text-to-image Generative AI models, experimenting with platforms such as Stable Diffusion and Midjourney. Initially encountering what she described as “odd and distorted” images, she highlights the evolving landscape of Generative AI images during this period.

“In 2023 the images started to look better,” Jane explains, noting a significant breakthrough as these AI models began capturing intricate details; as a fashion specialist, she pays particular attention to facial features, colour palettes, fabric textures and embellishments.

By May 2023, AI integration became a reality in the discipline of Fashion teaching. Jane champions the fusion of human creativity with machine efficiency, enabling designers to conceptualise runway shots, intricate patterns, and expressive collages.

Highlighting the importance of designing detailed prompts, Jane illustrates how specifying techniques, mediums, and styles can lead to incredible results, ranging from watercolour cityscapes to photorealistic textures.

Generative AI serves as a powerful tool that provides fresh perspectives, preparing students for the ever-evolving fashion industry. This approach facilitates faster design processes, hones skills, and meets industry demands.

“It’s an assistive tool, a collaborator that empowers human imagination. As students gain valuable experience using this transformative technology, they’re not just designing the future of fashion; they’re shaping the way we think about its creation,” she emphasised.

Play AI
Click to view this case study video – link opens in new tab.

In this short film, Theatre Director Matt Bond delves into the intricacies of his pioneering theatre experiment, “PlayAI,” a collaborative venture with the AI tool ChatGPT.

Building on the success of his groundbreaking work at Riverside Studios in London in April 2023, this project challenges the traditional boundaries of playwriting by immersing itself in the realms of exploration and experimentation with Artificial Intelligence.

Over a transformative four-week period, Bond collaboratively engaged with UON BA Acting students to craft a new play that delves into profound themes. These themes encompass the nuanced emotions surrounding redundancy and belonging in the age of Artificial Intelligence, the complexities of forging relationships with digital avatars, and the conflicting dynamics between idealism and capitalism within a futuristic digital ‘metaverse’ society.

The film provides valuable insights as four BA acting students share their perspectives on how they have embraced AI technology as a powerful catalyst for innovation and exploration.

Moreover, the impact of the project transcends the realm of performance. It becomes evident that the students, in their exploration of key AI concepts, have not only expanded their digital literacies but have also delved into the ethical boundaries of AI. Their involvement reflects a meticulous and comprehensive approach to working with AI, showcasing a profound commitment to understanding and navigating the intricate facets of this transformative technology.

Click to open this video in a new window.

This January, UON hosted the Winter Game Jam, a four-day games development event where students battled it out with a brand new challenge: the Rexy Wheel, a digital controller designed for training video camera professionals.

In this short film, Games lecturer Vikaas Mistry discusses how Game Jam is about giving UON students experiences that will prepare them for new developments in the game industry, and to think outside of the box. 

Rob Portus, the Rexy Wheel’s inventor, shares how exciting it was to see what the students achieved: ‘in just four days, they all created games that with a few tweaks have the potential to be marketed commercially.’


Dr Cleo Cameron (Senior Lecturer in Criminal Justice)

In this blog post, Dr Cleo Cameron reflects on the AI Design Assistant tool which was introduced into NILE Ultra courses in December 2023. More information about the tool is available here: Learning Technology Website – AI Design Assistant

Course structure tool

I used the guide prepared by the University’s Learning Technology Team (AI Design Assistant) to help me use this new AI functionality in Blackboard Ultra courses. The guide is easy to follow, with useful steps and images to help the user make sense of how to deploy the new tools. Its point that AI-generated content may not always be factual, and will require assessment and evaluation by academic staff before the material is used, is important and well made.

The course structure tool is impressive on first use. I used the keyword ‘cybercrime’ and chose four learning modules with ‘topic’ as the heading and selected a high level of complexity. The learning modules’ topic headings and descriptions were indicative of exactly the material I would include for a short module.

I tried this again for fifteen learning modules (which would be the length of a semester course) and used the description, ‘Cybercrime what is it, how is it investigated, what are the challenges?’ This was less useful, and generated module topics that would not be included on the cybercrime module I deliver, such as ‘Cyber Insurance’, and a repeat of both ‘Cybercrime, laws and legislation’ and ‘Ethical and legal implications of cybercrimes’. So, on a smaller scale, I found it useful for generating ideas, but on a larger, semester-length modular scale, unless more description is entered, it does not seem to be quite as beneficial. The auto-generated learning module images for the topic areas are very good for the most part, though.

AI & Unsplash images

Once again, I used the very helpful LearnTech guide for this functionality. To add a course banner, I selected Unsplash and used ‘cybercrime’ as a search term. The Unsplash images were excellent, but the scale was not always great for course banners. The first image I tried did not convey the sense of a keyboard and padlock; however, the second image was more successful, and it displayed well as the course tile and banner on the course. Again, the tool is easy to use, and has some great content.

• Ultra course with cybercrime course banner

I also tried the AI image generator, using ‘cybercrime’ as a search term/keyword. The first set of images generated were not great and did not seem to bear any relation to the keyword, so I tried generating a second time and this was better. I then used the specific terms ‘cyber fraud’ and ‘cyber-enabled fraud’, and the results were not very good at all – I tried generating three times. I tried the same with ‘romance fraud’, and again, the selection was not indicative of the keywords. The AI generated attempt at romance fraud was better, although the picture definition was not very good.

Test question generation

The LearnTech guide informed the process again, although having used the functionality on the other tools, this was similar. The test question generation tool was very good – I used the term ‘What is cybercrime?’ and selected ‘Inspire me’ for five questions, with the level of complexity set to around 75%. The test that was generated was three matching questions to describe/explain cybercrime terminologies, one multiple choice question and a short answer text-based question. Each question was factually correct, with no errors. Maybe simplifying some of the language would be helpful, and also there were a couple of matched questions/answers which haven’t been covered in the usual topic material I use. But this tool was extremely useful and could save a lot of time for staff users, providing an effective knowledge check for students.

Question bank generation from Ultra documents

By the time I tried out this tool I was familiar with the AI Design Assistant and I didn’t need to use the LearnTech guide for this one. I auto-generated four questions, set the complexity to 75%, and chose ‘Inspire me’ for question types. There were two fill-in-the-blanks, an essay question, and a true/false question which populated the question bank – all were useful and correct. What I didn’t know was how to use the questions that were saved to the Ultra question bank within a new or existing test, and this is where the LearnTech guide was invaluable with its ‘Reuse question’ in the test dropdown guidance. I tested this process and added two questions from the bank to an existing test.

Rubric generation

This tool was easily navigable, and I didn’t require the guide for this one, but the tool itself, on first use, is less effective than the others in that it took my description word for word without a different interpretation. I used the following description, with six rows and the rubric type set to ‘points range’:

‘Demonstrate knowledge and understanding of cybercrime, technologies used, methodologies employed by cybercriminals, investigations and investigative strategies, the social, ethical and legal implications of cybercrime and digital evidence collection. Harvard referencing and writing skills.’

I then changed the description to:

‘Demonstrate knowledge and understanding of cybercrime, technologies used, methodologies employed by cybercriminals, investigations and investigative strategies. Analyse and evaluate the social, ethical and legal implications of cybercrime and digital evidence collection. Demonstrate application of criminological theories. Demonstrate use of accurate UON Harvard referencing. Demonstrate effective written communication skills.’

At first generation, it only produced five of the six required rows. I tried again and it generated the same thing with only five rows, even though six was selected. It did not seem to want to separate out the knowledge and understanding of investigations and investigative strategies into its own row.

I definitely had to be much more specific with this tool than with the other AI tools I used. It saved time in that I did not have to manually fill in the points descriptions and point ranges myself, but I found that I did have to be very specific in the description about what I wanted in the learning outcome rubric rows.

Journal and discussion board prompts

This tool is very easy to deploy and actually generates some very useful results. I kept the description relatively simple and used some text from the course definition of hacking:

‘What is hacking? Hacking involves the break-in or intrusion into a networked system. Although hacking is a term that applies to cyber networks, networks have existed since the early 1900s. Individuals who attempted to break-in to the first electronic communication systems to make free long distance phonecalls were known as phreakers; those who were able to break-in to or compromise a network’s security were known as crackers. Today’s crackers are hackers who are able to “crack” into networked systems by cracking passwords (see Cross et al., 2008, p. 44).’

I used the ‘Inspire me’ cognitive level, set the complexity level to 75%, and checked the option to generate discussion titles. Three questions were generated that cover three cognitive processes:

• Discussion prompts auto-generated in an Ultra course

The second question was the most relevant to this area of the existing course; the other two were slightly more advanced, and students would not yet have covered this material (nor have work-related experience in this area). I decided to lower the complexity level to see what would be generated on a second run:

• Discussion prompts auto-generated in an Ultra course

Again, the second question – to analyse – seemed more relevant to the theory-based cybercrime course than the other two questions. I tried again and lowered the complexity level to 25%. This time two of the questions were more relevant to the students’ knowledge and ability for where this material appears in the course (i.e., in the first few weeks):

• Discussion prompts auto-generated in an Ultra course

It was easy to add the selected question to the Ultra discussion.

I also tested the journal prompts and this was a more successful generation first time around. The text I used was:

‘“Government and industry hire white and gray hats who want to have their fun legally, which can defuse part of the threat”, Ruiu said, “…Many hackers are willing to help the government, particularly in fighting terrorism. Loveless said that after the 2001 terrorist attacks, several individuals approached him to offer their services in fighting Al Qaeda.” (in Arnone, 2005, 19(2)).’

I used the cognitive level ‘Inspire me’ once again and ‘generate journal title’ and this time placed complexity half-way. All three questions generated were relevant and usable.

• Journal prompts auto-generated in an Ultra course

My only issue with both the discussion and journal prompts is that I could not find a way to save all of the generated questions – it would only allow me to select one, so I could not save all the prompts for possible reuse at a later date. Other than this, the functionality, usability and relevance of the auto-generated discussion and journal prompts were very good.

In the heart of the Waterside Campus, a new art installation by Senior Digital Marketing Lecturer and creative artist Kardi Somerfield is rewriting the rules of engagement, merging art and education to create a unique learning experience and visual identity for the newly refurbished Waterside bar. We recently had the opportunity to meet with Kardi Somerfield, to discuss her work. 

Kardi’s work stands as an extraordinary tribute to Northampton, stretching three meters in height and an impressive nine meters in length. It encapsulates the very essence of Northampton. Boasting over 200 distinct locations and nearly 300 characters, this monumental piece symbolizes the heart and soul of the town. The installation, at its core, epitomizes inclusivity in our local community.

Creating a work of these dimensions came with its own set of challenges. Transitioning from drawing on a digital screen to delivering a huge-format vinyl involved creating a vast Photoshop file with over 1000 layered elements including buildings, characters, and wildlife. 

One of the most intriguing aspects of Kardi’s creation is its interactive dimension. By integrating QR codes, she created a digital-physical bridge, allowing visitors to interact with the artwork in unique ways. This innovative artwork blends digital and analog technologies and transcends the visual spectacle to become a powerful pedagogical tool, particularly for storytelling within the realm of education.

Click here to watch the interview on Kaltura Player.

Interview with Kardi Somerfield - Interactive Northampton Wall

Robin Crockett (Academic Integrity Lead – University of Northampton) has run a small-scale study investigating two AI detectors with a range of AI-created assignments and has shared some of the initial results.

He used ChatGPT to generate 25 nominal 1000-word essays: five subjects, five different versions of each subject. For each subject, he instructed ChatGPT to vary the sentence length as follows: ‘default’ (i.e. no instruction about sentence length), ‘use long sentences’, ‘use short sentences’, ‘use complex sentences’, ‘use simple sentences’.

The table below shows the proportion of each assignment that was detected as AI-generated by two different products: Turnitin and Copyleaks.

| Turnitin | Essay 1 | Essay 2 | Essay 3 | Essay 4 | Essay 5 |
|----------|---------|---------|---------|---------|---------|
| Default  | 100% AI | 100% AI | 76% AI  | 100% AI | 64% AI  |
| Long     | 0% AI   | 26% AI  | 59% AI  | 67% AI  | 51% AI  |
| Short    | 0% AI   | 31% AI  | 82% AI  | 27% AI  | X       |
| Complex  | 33% AI  | 15% AI  | 0% AI   | 63% AI  | 0% AI   |
| Simple   | 100% AI | 0% AI   | 100% AI | 100% AI | 71% AI  |

| Copyleaks | Essay 1 | Essay 2 | Essay 3 | Essay 4 | Essay 5 |
|-----------|---------|---------|---------|---------|---------|
| Default   | 100% AI at p=80.6% | 100% AI at p=83.5% | 100% AI at p=88.5% | 100% AI at p=81.3% | 100% AI at p=85.4% |
| Long      | ~80% AI at p=65-75% | 100% AI at p=81.5% | ~95% AI at p=75-85% | 100% AI at p=79.1% | 100% AI at p=80.6% |
| Short     | ~70% AI at p=66-72% | 100% AI at p=76.9% | 100% AI at p=87.3% | ~85% AI at p=77-79% | 100% AI at p=78.4% |
| Complex   | 100% AI at p=72.9% | 100% AI at p=81.0% | ~90% AI at p=62-73% | 100% AI at p=77.7% | 0% AI |
| Simple    | 100% AI at p=83.6% | ~90% AI at p=73-81% | 100% AI at p=95.2% | ~90% AI at p=76-82% | 100% AI at p=84.9% |

X = “Unavailable as submission failed to meet requirements”.

0% -> complete false negative.

Robin noted:

Turnitin highlights/returns a percentage of ‘qualifying’ text that it sees as AI-generated, but no probability of AI-ness.

Copyleaks highlights sections of text it sees as AI-generated, each section tagged with the probability of AI-ness, but doesn’t state the overall proportion of the text it sees as AI-generated (hence his estimates).
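As a rough illustration of how the Turnitin figures above might be summarised, here is a short Python sketch (the values are transcribed from the table; the incomplete ‘Short’ row keeps only its four reported percentages):

```python
from statistics import mean

# Turnitin 'qualifying text flagged as AI' percentages, transcribed from the
# table above. The 'Short' row is incomplete (one submission was unavailable),
# so it lists only the four reported values.
turnitin = {
    "default": [100, 100, 76, 100, 64],
    "long":    [0, 26, 59, 67, 51],
    "short":   [0, 31, 82, 27],
    "complex": [33, 15, 0, 63, 0],
    "simple":  [100, 0, 100, 100, 71],
}

for condition, scores in turnitin.items():
    # A 0% score is a complete false negative, per the note above.
    print(f"{condition:8s} mean={mean(scores):5.1f}%  false negatives={scores.count(0)}")
```

On these figures, the mean detection rate falls from 88% for default output to roughly 22% when ChatGPT was told to ‘use complex sentences’, which gives a sense of how sensitive detection can be to simple prompting instructions.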

Additional reading: Jisc blog on AI detection
