In this condensed talk from the Vulcan Sessions on 26/01/24, Senior Lecturer in Education David Meechan discusses the opportunities and considerations of using AI in education.
He introduces the concept of Generative Artificial Intelligence (GAI), a diverse and constantly evolving field without a consistent definition among scholars, and shares personal examples of how GAI can help support students by scaffolding their learning and reducing initial cognitive load through the creation of basic first drafts.
David expresses, ‘I’m a big believer in experiential learning, providing children, and now students, with experiences they can build on.’ Therefore, he advocates for the use of GenAI tools, which offer ‘varied, specific, and potentially creative results, revolutionising education and supporting lifelong learning.’
Emphasising the importance of the ethical use of AI tools in education, he argues for engagement with a wide range of GenAI tools to prepare students for navigating future changes in the education and technological landscape.
In this short film Jane Mills delves into the realm of text-to-image Generative AI models, experimenting with platforms such as Stable Diffusion and Midjourney. Initially encountering what she described as “odd and distorted” images, she highlights the evolving landscape of Generative AI images during this period.
“In 2023 the images started to look better,” Jane explains, noting a significant breakthrough as these AI models began capturing the intricate details she scrutinises as a fashion specialist: facial features, colour palettes, fabric textures and embellishments.
By May 2023, AI integration became a reality in the discipline of Fashion teaching. Jane champions the fusion of human creativity with machine efficiency, enabling designers to conceptualise runway shots, intricate patterns, and expressive collages.
Highlighting the importance of designing detailed prompts, Jane illustrates how specifying techniques, mediums, and styles can lead to incredible results, ranging from watercolour cityscapes to photorealistic textures.
Generative AI serves as a powerful tool that provides fresh perspectives, preparing students for the ever-evolving fashion industry. This approach facilitates faster design processes, hones skills, and meets industry demands.
“It’s an assistive tool, a collaborator that empowers human imagination. As students gain valuable experience using this transformative technology, they’re not just designing the future of fashion; they’re shaping the way we think about its creation,” she emphasised.
In this short film, Theatre Director Matt Bond delves into the intricacies of his pioneering theatre experiment, “PlayAI,” a collaborative venture with the AI tool ChatGPT.
Building on the success of his groundbreaking work at Riverside Studios in London in April 2023, this project challenges the traditional boundaries of playwriting by immersing itself in the realms of exploration and experimentation with Artificial Intelligence.
Over a transformative four-week period, Bond collaboratively engaged with UON BA Acting students to craft a new play that delves into profound themes. These themes encompass the nuanced emotions surrounding redundancy and belonging in the age of Artificial Intelligence, the complexities of forging relationships with digital avatars, and the conflicting dynamics between idealism and capitalism within a futuristic digital ‘metaverse’ society.
The film provides valuable insights as four BA acting students share their perspectives on how they have embraced AI technology as a powerful catalyst for innovation and exploration.
Moreover, the impact of the project transcends the realm of performance. It becomes evident that the students, in their exploration of key AI concepts, have not only expanded their digital literacies but have also delved into the ethical boundaries of AI. Their involvement reflects a meticulous and comprehensive approach to working with AI, showcasing a profound commitment to understanding and navigating the intricate facets of this transformative technology.
The new features in Blackboard’s February upgrade will be available on Friday 2nd February. This month’s upgrade includes the following new/improved features to Ultra courses:
Forms/surveys
The February upgrade will enable staff to use forms/surveys in Ultra courses. The forms tool is very similar to the Ultra test tool, but with a slightly different set of question types, including a Likert scale question. Forms are intended to be used where staff want to collect information from students but where there is no correct answer and, usually, no requirement to grade or provide feedback on the response. There is an option to grade a form response and to add feedback, but by default forms are automatically marked as complete upon submission.
To use a form in an Ultra course, select the + (plus) button in the course content area, and choose ‘Form’.
The following question types are supported in forms:
- Essay question (i.e., an open/free response question)
- Likert question
- Multiple choice question
- True/false question
In addition, staff can use ‘Add text’ to add sections of text that do not require a response from students (e.g., to explain more about a particular section of the form), and can use ‘Add Page Break’ to split up longer forms. ‘Add local file’ can be used where staff want to upload a file for use by students when completing the form.
Current limitations of forms
Please note that in this initial release of forms, it is not possible to collect responses anonymously. Note also that there is no single-select multiple choice question in forms as there is in tests, because the ‘Add Multiple Choice Question’ tool works differently in forms and tests: when setting a multiple choice question in a form, students will be able to select more than one option, including all options. Additionally, the Likert question type only allows scales of 3, 5, or 7 points.
Data protection considerations
Forms should not normally be used to collect and store personal or confidential information about students. If you are considering using forms for this purpose, advice should be sought from the University’s data protection officer beforehand, or, if using forms for research, approval should be sought from one of the University’s ethics committees.
Course activity report improvements
Getting an overview of student engagement in Ultra courses is very quick and easy compared to how it was in Original courses. Engagement data in Ultra is also considerably more reliable than it was in Original, as Ultra takes account of access via mobile devices, which was not the case with Original courses.
Following the February upgrade, the course activity page on Ultra courses will include missed assessment deadlines, alongside the number of days since each student last accessed the course and the number of hours each student spent in the course, and staff can set flags for these in the alert settings.
In the following screenshot, the alert settings have been set up to flag students who have missed one or more deadlines, and who have not accessed the Ultra course for more than seven days.
Missed due dates and non-submissions
Please note that in the course activity page, missed due dates should not be taken to mean non-submissions, as an assignment that is submitted late is counted as a missed due date. However, non-submitters can be viewed and quickly contacted in Ultra courses via the student progress report for the assignment (see links below).
- More information about the course activity page is available from: Blackboard Help – Course Activity Report
- Information about how to view and message non-submitters of Turnitin assignments is available from: Learning Technology Team, Ultra Workflow 1: Turnitin – Identifying and contacting non-submitters
- Information about how to view and message non-submitters of Blackboard assignments is available from: Learning Technology Team, Ultra Workflow 2: Blackboard Assignment – Identifying and contacting non-submitters
More information
As ever, please get in touch with your learning technologist if you would like any more information about the new features available in this month’s upgrade: Who is my learning technologist?
This January, UON hosted the Winter Game Jam, a four-day games development event where students battled it out with a brand new challenge: the Rexy Wheel, a digital controller designed for training video camera professionals.
In this short film, Games lecturer Vikaas Mistry discusses how Game Jam is about giving UON students experiences that will prepare them for new developments in the game industry, and to think outside of the box.
Rob Portus, the Rexy Wheel’s inventor, shares how exciting it was to see what the students achieved: ‘in just four days, they all created games that with a few tweaks have the potential to be marketed commercially.’
The new features in Blackboard’s January upgrade will be available between Friday 5th and Monday 8th January. The January upgrade includes the following new/improved features to Ultra courses:
- AI Design Assistant: Authentic assessment prompt generation
- AI Design Assistant: Generate rubric improvements
- Total & weighted calculations in the Ultra gradebook
AI Design Assistant: Authentic assessment prompt generation
Currently, staff are able to auto-generate prompts for Ultra discussions and journals. The January upgrade will add the ability to auto-generate prompts for Ultra assignments too. Following the upgrade, when setting up an Ultra assignment, staff will see the ‘Auto-generate assignment’ option in the top right-hand corner of the screen.
Selecting ‘Auto-generate assignment’ will generate three prompts, which staff can refine by adding additional context in the description field, selecting the desired cognitive level and complexity, and then re-generating the prompts.
Once selected and added, the prompt can then be manually edited by staff prior to releasing the assignment to students.
AI Design Assistant: Generate rubric improvements
Following feedback from users, the January upgrade will improve auto-generated rubrics. The initial version of the AI Design Assistant’s auto-generated rubrics did not handle column and row labels properly, and the distribution of percentages/points across the criteria was inconsistently applied; both issues will be addressed in this upgrade.
Total & weighted calculations in the Ultra gradebook
The Ultra gradebook currently allows for the creation of calculated columns using the ‘Add Calculation’ feature. However, the functionality of these calculated columns makes the creation of weighted calculations difficult, e.g., when generating the total score for two pieces of work where one is worth 60% of the mark and the other 40%. At present, this would have to be done in a calculated column by using the formula AS1 x 0.6 + AS2 x 0.4, like so:
However, weighting problems can be further compounded if the pieces of work are not all out of 100 points, which can often be the case when using computer-marked tests. Following feedback about this issue, the January upgrade will bring in an ‘Add Total Calculation’ option, which will allow staff to more easily generate an overall score for assessments with multiple sub-components. The new ‘Add Total Calculation’ column will simply require staff to choose the assessments which are to be used in the calculation, and specify how they are to be weighted. Using the same example as above, the calculation would look like so:
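For readers who want to check a weighted total by hand, the arithmetic can be sketched in a few lines of Python. This is only an illustration of the calculation described above (normalise each score to a percentage, then apply the weights); the function name, scores, and point totals are invented for the example, and this is not Blackboard’s internal implementation:

```python
def weighted_total(scores, max_points, weights):
    """Combine assessment scores into one overall percentage using weights."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights should sum to 1"
    total = 0.0
    for score, maximum, weight in zip(scores, max_points, weights):
        # Normalise to a percentage first, so items marked out of
        # different point totals (e.g. a test out of 40) are comparable.
        total += (score / maximum) * 100 * weight
    return total

# AS1: 75/100 weighted at 60%; AS2: 30/40 weighted at 40%.
# Both are 75% individually, so the weighted total is also 75.0.
print(weighted_total([75, 30], [100, 40], [0.6, 0.4]))  # → 75.0
```

The second assessment being marked out of 40 rather than 100 is exactly the situation where a simple `AS1 x 0.6 + AS2 x 0.4` formula goes wrong, and where the new ‘Add Total Calculation’ option should help.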
More information
To find out more about all of the AI Design Assistant tools available in NILE, full guidance is available at: Learning Technology – AI Design Assistant
And as ever, please get in touch with your learning technologist if you would like any more information about the new features available in this month’s upgrade: Who is my learning technologist?
Learning Technology (based in LLSS) recently hosted a day of training workshops on the use of the H5P (HTML5 package), a content creation platform that is integrated into NILE to allow educators to create interactive online e-learning content, such as interactive videos, quizzes, drag and drop, flash cards and interactive presentations.
The day-long event was spearheaded by E-Learning/Multimedia Resources Developer Anne Misselbrook. Trond Skeie, an H5P Account Manager, led masterclasses, while academic staff presented compelling case studies that demonstrated the platform’s effectiveness in enhancing teaching and learning.
Trond set out the H5P philosophy: “to make it easy for anyone to create interactive content”. In the masterclass sessions he demonstrated how to use a number of the platform’s most popular content types, such as quizzes, interactive video, drag-and-drop and presentations.
In the Showcase session several University of Northampton faculty members shared their experiences of using H5P.
Yvonne Yelland and Biannca Harris, Lecturers in Nursing, demonstrated how they had used H5P to create a highly effective branching scenario with videos to help their students learn about how to approach a crisis situation.
“I’m one of the mental health nursing lecturers, and we’ve been using H5P to teach some of our first years all about mental health and how that’s experienced by patients. We used a branching scenario, and we got some actors in, and we filmed a scenario of a patient on a ward, and we used that to help students to look at the different alternative approaches to a crisis situation”. Yvonne Yelland.
Branching activity feedback from the students
Yvonne and Bianca provided students with a survey to gather feedback about their experience of the branching activity; an extract of their comments is below.
“Very clear detailed information”.
“It’s a good way of learning”.
“It was interactive and interactive learning can be fun”.
H5P Interactive books
Jenny Devers, Senior Lecturer in Occupational Therapy, provided her H5P Interactive Books for Anne to showcase. The books include a video, a Padlet, text to read, questions, and a text-entry activity where students key in and save their work to a Word document.
Jenny Devers’ feedback from a staff perspective:
“Staff find this (H5P) really helpful as it tracks who has completed the work and we can access that work…that’s really helpful for self-directed work”. Jenny Devers.
Live tool The Chase
Mosavar Farahani presented a real-time live classroom activity ‘The Chase’, where staff attending in person grouped into teams to answer the quiz questions.
“I have used quizzes, especially the Chase which has made them (the students) compete together, so as a competition tool, it would be something that makes it (learning) more fun”. Mosavar Farahani.
Reflections on the day
The workshops were an opportunity for faculty staff to learn how to use some of the content types available on the platform and to connect with other educators who are using H5P.
Quotes from colleagues who attended: “I think what I found most useful apart from having a play with some of the tools was seeing how it is being used at UON. I was especially inspired/impressed with the branching video that nursing put together, I thought it was so effective and a great representation of how something so simple can be used in such an impressive way”. Kelly Lea, Learning Technologist.
“I enjoyed the H5P workshop day on the 15th of November. It was especially interesting attending the H5P Showcase event in the afternoon, and seeing examples of how colleagues from around the University are making good use of H5P with their students”. Rob Farmer, Learning Technology Manager.
Further information
If you would like to be enrolled on to the H5P Community support Organisation on NILE, please contact Anne Misselbrook. To book a place on the H5P virtual session, use the LibCal booking link here.
Dr Cleo Cameron (Senior Lecturer in Criminal Justice)
In this blog post, Dr Cleo Cameron reflects on the AI Design Assistant tool which was introduced into NILE Ultra courses in December 2023. More information about the tool is available here: Learning Technology Website – AI Design Assistant
Course structure tool
I used the guide prepared by the University’s Learning Technology Team (AI Design Assistant) to help me use this new AI functionality in Blackboard Ultra courses. The guide is easy to follow with useful steps and images to help the user make sense of how to deploy the new tools. Pointing out that AI-generated content may not always be factual and will require assessment and evaluation by academic staff before the material is used is an important point, and well made in the guide.
The course structure tool on first use is impressive. I used the key word ‘cybercrime’ and chose four learning modules with ‘topic’ as the heading and selected a high level of complexity. The learning modules’ topic headings and descriptions were indicative of exactly the material I would include in a short module.
I tried this again for fifteen learning modules (which would be the length of a semester course) and used the description, ‘Cybercrime what is it, how is it investigated, what are the challenges?’ This was less useful, and generated module topics that would not be included on the cybercrime module I deliver, such as ‘Cyber Insurance’, and repeated both ‘Cybercrime, laws and legislation’ and ‘Ethical and legal implications of cybercrimes’. So, on a smaller scale, I found it useful for generating ideas, but on a larger, semester-long scale, unless more description is entered, it does not seem to be quite as beneficial. The auto-generated learning module images for the topic areas are, for the most part, very good though.
AI & Unsplash images
Once again, I used the very helpful LearnTech guide to use this functionality. To add a course banner, I selected Unsplash and used ‘cybercrime’ as a search term. The Unsplash images were excellent, but the scale was not always great for course banners. The first image I tried did not convey the sense of a keyboard and padlock; however, the second was more successful, and it displayed well as the course tile and banner. Again, the tool is easy to use, and has some great content.
I also tried the AI image generator, using ‘cybercrime’ as a search term/keyword. The first set of images generated were not great and did not seem to bear any relation to the keyword, so I tried generating a second time and this was better. I then used the specific terms ‘cyber fraud’ and ‘cyber-enabled fraud’, and the results were not very good at all – I tried generating three times. I tried the same with ‘romance fraud’, and again, the selection was not indicative of the keywords. The AI generated attempt at romance fraud was better, although the picture definition was not very good.
Test question generation
The LearnTech guide informed the process again, although having used the functionality on the other tools, this was similar. The test question generation tool was very good – I used the term ‘What is cybercrime?’ and selected ‘Inspire me’ for five questions, with the level of complexity set to around 75%. The test that was generated was three matching questions to describe/explain cybercrime terminologies, one multiple choice question and a short answer text-based question. Each question was factually correct, with no errors. Maybe simplifying some of the language would be helpful, and also there were a couple of matched questions/answers which haven’t been covered in the usual topic material I use. But this tool was extremely useful and could save a lot of time for staff users, providing an effective knowledge check for students.
Question bank generation from Ultra documents
By the time I tried out this tool I was familiar with the AI Design Assistant and I didn’t need to use the LearnTech guide for this one. I auto-generated four questions, set the complexity to 75%, and chose ‘Inspire me’ for question types. Two fill-in-the-blanks questions, an essay question, and a true/false question populated the question bank – all were useful and correct. What I didn’t know was how to use the questions saved to the Ultra question bank within a new or existing test, and this is where the LearnTech guide was invaluable, with its guidance on using ‘Reuse question’ in the test dropdown. I tested this process and added two questions from the bank to an existing test.
Rubric generation
This tool was easily navigable, and I didn’t require the guide for this one, but the tool itself, on first use, is less effective than the others in that it took my description word for word without a different interpretation. I used the following description, with six rows and the rubric type set to ‘points range’:
‘Demonstrate knowledge and understanding of cybercrime, technologies used, methodologies employed by cybercriminals, investigations and investigative strategies, the social, ethical and legal implications of cybercrime and digital evidence collection. Harvard referencing and writing skills.’
I then changed the description to:
‘Demonstrate knowledge and understanding of cybercrime, technologies used, methodologies employed by cybercriminals, investigations and investigative strategies. Analyse and evaluate the social, ethical and legal implications of cybercrime and digital evidence collection. Demonstrate application of criminological theories. Demonstrate use of accurate UON Harvard referencing. Demonstrate effective written communication skills.’
At first generation, it only generated five of the six required rows. I tried again and it generated the same thing with only five rows, even though six was selected. It did not seem to want to separate out the knowledge and understanding of investigations and investigative strategies into its own row.
I definitely had to be much more specific with this tool than with the other AI tools I used. It saved time in that I did not have to manually fill in the points descriptions and point ranges myself, but I found that I did have to be very specific about what I wanted in the learning outcome rubric rows with the description.
Journal and discussion board prompts
This tool is very easy to deploy and actually generates some very useful results. I kept the description relatively simple and used some text from the course definition of hacking:
‘What is hacking? Hacking involves the break-in or intrusion into a networked system. Although hacking is a term that applies to cyber networks, networks have existed since the early 1900s. Individuals who attempted to break-in to the first electronic communication systems to make free long distance phonecalls were known as phreakers; those who were able to break-in to or compromise a network’s security were known as crackers. Today’s crackers are hackers who are able to “crack” into networked systems by cracking passwords (see Cross et al., 2008, p. 44).’
I used the ‘Inspire me’ cognitive level, set the complexity level to 75%, and checked the option to generate discussion titles. Three questions were generated that cover three cognitive processes:
The second question was the most relevant to this area of the existing course, the other two slightly more advanced and students would not have covered this material (nor have work related experience in this area). I decided to lower the complexity level to see what would be generated on a second run:
Again, the second question – to analyse – seemed the most relevant to the more theory-based cybercrime course than the other two questions. I tried again and lowered the complexity level to 25%. This time two of the questions were more relevant to the students’ knowledge and ability for where this material appears in the course (i.e., in the first few weeks):
It was easy to add the selected question to the Ultra discussion.
I also tested the journal prompts and this was a more successful generation first time around. The text I used was:
‘“Government and industry hire white and gray hats who want to have their fun legally, which can defuse part of the threat”, Ruiu said, “…Many hackers are willing to help the government, particularly in fighting terrorism. Loveless said that after the 2001 terrorist attacks, several individuals approached him to offer their services in fighting Al Qaeda.” (in Arnone, 2005, 19(2)).’
I used the cognitive level ‘Inspire me’ once again and ‘generate journal title’ and this time placed complexity half-way. All three questions generated were relevant and usable.
My only issue with both the discussion and journal prompts is that I could not find a way to save all of the generated questions – it would only allow me to select one, so I could not save all the prompts for possible reuse at a later date. Other than this issue, the functionality, usability, and relevance of the auto-generated discussion and journal prompts were very good.
On Friday 8th December, 2023, three new NILE tools will become available to staff: the AI Design Assistant, AI Image Generator, and the Unsplash Image Library.
AI Design Assistant
AI Design Assistant is a new feature of Ultra courses, and may be used by academic staff to generate ideas regarding:
- Course structure
- Images
- Tests/quizzes
- Discussion and journal prompts
- Rubrics
The AI Design Assistant only generates suggestions when it is asked to by members of academic staff, and cannot automatically add or change anything in NILE courses. Academic staff are always in control of the AI Design Assistant, and can quickly and easily reject any AI generated ideas before they are added to a NILE course. Anything generated by the AI Design Assistant can only be added to a NILE course once it has been approved by academic staff teaching on the module, and all AI generated suggestions can be edited by staff before being made available to students.
AI Image Generator & Unsplash Image Library
Following the upgrade on 8th December, wherever you can currently add an image in an Ultra course you will be able to search for and add images from the Unsplash image library, as well as upload images from your computer. And, in many places in Ultra courses, you will be able to add AI-generated images too.
First Thoughts on AI Design Assistant
Find out more about what UON staff think about AI Design Assistant in our blog post from Dr Cleo Cameron (Senior Lecturer in Criminal Justice): First Thoughts on AI Design Assistant
Other items included in the December upgrade
The December upgrade will see the ‘Add image’ button in a number of new places in Ultra courses for staff, including announcements (upload from device or Unsplash), and Ultra tests and assignments (upload from device, Unsplash, and AI-generated images). However, please note that images embedded in announcements will not be included in the emailed copy of the announcement; they will only be visible to students when viewing the announcement in the Ultra course.
Ultra rubrics will be enhanced in the December upgrade. Currently these are limited to a maximum of 15 rows and columns, but following the upgrade there will be no limit on the number of rows and columns when using an Ultra rubric.
More information
To find out more about these new tools, full guidance is already available at: Learning Technology – AI Design Assistant
You can also find out more by coming along to the LearnTech Jingle Mingle on Tuesday 12th December, 2023, between 12:30 and 13:45 in T-Pod C (2nd Floor, Learning Hub).
As ever, please get in touch with your learning technologist if you would like any more information about these new NILE features: Who is my learning technologist?
“The Smart Import AI engine is a highly advanced tool that is learning at an impressive rate.” Svein-Tore Griff With, CEO, H5P. Quote used with permission from H5P.
H5P (HTML5 package) software has been available to staff at the University of Northampton since August 2022.
In June 2023 H5P released a feature called Smart Import, which uses Artificial Intelligence technology to assist in the creation of teaching content. This is an extension to the suite of H5P tools currently available at the University.
A two-week trial of Smart Import took place at the University of Northampton (UON) in August 2023.
We looked for a range of staff to participate in the trial as ‘testers’ to assess whether Smart Import has value for UON; in total, 15 staff took part. As a prerequisite for using H5P Smart Import, all participants were asked to read the agreement text before the trial began. Participants also received a paragraph of text from H5P covering matters such as copyright in the material provided.
Staff taking part in the trial were asked to perform one or more of the tasks listed below:
- Provide a small chunk of preferably ‘open education resource’ learning material.
- Provide prompt text of over 500 characters.
- Provide a YouTube video link.
- Provide a website link which you include in your learning resources.
- All the above tasks.
Using the resource or text provided, the Smart Import function used AI technology to very quickly generate a range of interactive learning resources, including Interactive Book, Dialog Cards, Quiz, Drag the Words, Crossword and more.
The ‘testers’ were asked for feedback on their experience of using H5P Smart Import; some quotes are shown below:
The speed at which the content types were produced, and the variety of the content types was quite incredible.
Kelly Lea, Learning Technologist, LLSS.
Smart Import AI engine is an advanced tool that make creating interactive activities very quick and save the lecturer a lot of time.
Mosavar Farahani, Senior Lecturer Biomedical Science.
I’ve just used a PDF (that I found online) to create an interactive book on creating content for neurodivergent students. I have to say I’m absolutely astounded by how well H5P did this. It took less than 10 mins to create, and the content is really engaging and interesting. (I can see this being a huge benefit to students).
Richard Byles, Learning Technologist, LLSS.
As well as comments on the speed of producing interactive online content using the Smart Import feature, comments about its educational value were received.
I think the AI generated book, could easily be used to help students with revision and assignment related prompts.
David Meechan, Senior Lecturer in Education, FHES
I tried it for a webpage and find it very useful and informative for my teaching/assessments/quiz sessions, especially the dialog cards, question set, crossword, drag the words, etc. options.
Rahul Mor, Senior Lecturer in Business Systems and Operations, FBL
I liked the fact it could be split up into different components and copied elsewhere. Good that it provides a citation URL.
Helena Beeson, Learning Development Tutor, LLSS
The package makes thinking of interactive activities simple.
Kate Swinton, Learning Development Tutor, LLSS
Observations include:
Rob Howe, Head of Learning Technology LLSS states: “The review process is really important, as everything else tends to work from it. This is an important step for tutors”.
Liz Sear, Senior Lecturer in Health and Social Care FHES adds: “I think it is a matter of learning how to use this to best advantage and being mindful of checking the resources made“.
Jim Atkinson, HR Staff Development Trainer and E-Learning developer says: “I think with training and tips on how to get the most out of the tool, it would be an excellent tool to use“.
H5P in NILE
The H5P content generated by Smart Import AI technology is provided in NILE in the same way as manually generated H5P content.
H5P Smart Import benefits
The automatically generated Interactive Book provides a structured online e-learning package with multiple activities including quiz questions. Alongside the Interactive Book other content types can be generated such as Dialog cards, Accordion, Question set etc. This automation speeds up work.
Rob Howe states: “Allow tutors to quickly achieve outputs which would have taken far longer than without the Smart Import”.
What happens next?
The university is currently considering whether to license Smart Import.
Further information about H5P Smart Import can be found by visiting the site below.