Currently viewing the tag: "Blackboard"

The new features in Blackboard’s June upgrade will be available on Friday 7th June. This month’s upgrade includes the following new/improved features to Ultra courses:

Post announcements immediately

Following feedback from staff, June’s upgrade will make posting announcements in Ultra courses easier and more intuitive. Currently, an announcement has to be saved and closed before it can be posted from the announcements panel. However, this sometimes causes confusion, as the save button does not actually post the announcement. June’s upgrade will add the ‘Post’ button directly into the panel in which the announcement is created, allowing staff to compose and post the announcement in the same panel. The option to compose the announcement, save it, and post it later will still be available though.

• Upgraded announcements screen with ‘Post’ button highlighted


Print Ultra tests and assessments – staff only

The June upgrade will introduce a print button which will allow a PDF copy of an Ultra test or assignment to be generated, or for a printable copy of a test or assignment to be sent to a printer. The print button is available to staff only, and there are no plans to make the print button available to students.

Please note that there are certain limitations with the initial release of the print functionality; specifically, questions drawn from a question pool are not included in the printed copy.

• Ultra test with print button highlighted


More information

As ever, please get in touch with your learning technologist if you would like any more information about the new features available in this month’s upgrade: Who is my learning technologist?

The new features in Blackboard’s May upgrade will be available on Friday 2nd May. This month’s upgrade includes the following new/improved features to Ultra courses:

Improved gradebook navigation

There are currently three different views of the Ultra gradebook which are available under two gradebook sub-tabs, one of which has a grid/list selector. These views are:

  • Markable items (list view)
  • Markable items (grid view)
  • Students

As this can make navigating the gradebook rather confusing, May’s upgrade will provide each gradebook view with its own sub-tab as follows:

  • Markable items (list view) will become Markable items
  • Markable items (grid view) will become Marks
  • Students will remain as it is

• Ultra gradebook current view (top) and new view (bottom)


Supporting multiple performance criteria in release conditions

Staff can currently set performance-based release conditions for items in Ultra courses against only a single performance criterion (i.e., a gradable item for which students have a grade in the Ultra gradebook). For example, if there are three tests in a course and staff want to make an item available only to students who have scored above a certain threshold in each test, this is not possible. As a workaround, staff can use cascading release criteria, e.g., releasing the second test when the first is passed, the third when the second is passed, and the content item when the third test is passed. However, cascading release criteria are not always what is desired, and they can be complicated to set up.

Following the May release, staff will be able to set release conditions against multiple performance criteria, allowing staff to selectively release content in Ultra courses to students who have fulfilled multiple conditions. Where staff want to continue using a series of cascading release conditions they can continue to do so, but will no longer have to use them as a workaround. When setting up performance-based release conditions, staff can set content to be released to students who have scored n points/% or higher, or can specify custom ranges.

Release conditions can be especially useful for staff who want to automatically release content to students based upon their assessment scores. For example, releasing one set of content designed to support students who have failed an assessment, another set to students who have passed the assessment, and more challenging content to students who have performed exceptionally well.
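As a rough illustration of how multiple criteria combine, the following Python sketch treats each criterion as a score range that must all be satisfied. All names, thresholds, and scores here are hypothetical; in Blackboard this is configured through the Ultra interface, not code.

```python
# A minimal sketch of combining multiple performance-based release
# conditions: content is released only when the student satisfies
# every criterion. Item names, thresholds, and scores are illustrative.

def meets_criteria(scores, criteria):
    """Return True if every (item, minimum, maximum) score range is met."""
    return all(minimum <= scores.get(item, 0) <= maximum
               for item, minimum, maximum in criteria)

# Release support content only to students who scored between 40% and
# 59% on each of three tests (custom ranges).
criteria = [("Test 1", 40, 59), ("Test 2", 40, 59), ("Test 3", 40, 59)]
print(meets_criteria({"Test 1": 45, "Test 2": 52, "Test 3": 58}, criteria))  # True
```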

• Setting release conditions with multiple performance-based criteria

More information about using release conditions in Ultra courses is available from: Blackboard Help – Content Release Conditions


More information

As ever, please get in touch with your learning technologist if you would like any more information about the new features available in this month’s upgrade: Who is my learning technologist?

The new features in Blackboard’s April upgrade will be available on Friday 5th April. This month’s upgrade includes the following new/improved features to Ultra courses:

Anonymous discussions

Following feedback from staff, April’s upgrade will allow staff to set up Ultra discussions so that students can post and reply to posts anonymously. After the upgrade, the option to allow anonymous responses and replies will be available in the ‘Discussion Settings’ panel.

• Discussion settings with ‘Allow anonymous responses and replies’ highlighted

Please note that selecting ‘Allow anonymous responses and replies’ does not mean that all replies and responses will be anonymous; rather, it means that students and staff can choose to post anonymously if they want to. To post anonymously, the ‘Post anonymously’ checkbox will need to be selected. Once posted, the anonymity of a post cannot be changed – i.e., an anonymous post cannot be de-anonymised by the person who posted it, and a non-anonymous post cannot be changed to anonymous.

• Discussion post prior to posting with ‘Post anonymously’ selected


AI Design Assistant: Select course items/context picker enhancements

Following last month’s upgrade which introduced the context picker (the ‘Select course items’ tool) for auto-generated test questions, April’s upgrade introduces the option to select course items when auto-generating learning modules, assignments, and discussion and journal prompts.

The purpose of the ‘Select course items’ tool is to allow staff to specify exactly which resources should be used when auto-generating content. If ‘Select course items’ is used, the auto-generated content will be based only upon the items selected. Where no course items are selected, auto-generated content will be based upon the course title.

• AI Design Assistant tool with ‘Select course items’ (context picker) highlighted

You can find out more about the AI Design Assistant and how to use it at: Learning Technology Team: AI Design Assistant


Duplicate test/form question option, plus change to default test question value

The April upgrade introduces the ability for staff to duplicate test and form questions. Additionally, following the upgrade the default point value for newly created test questions will be changed from 10 points to 1 point.

• Test question with ‘Duplicate’ option selected


Likert form questions include options for 4 and 6 points, as well as 3, 5, and 7

The February 2024 upgrade introduced the ‘Forms’ tool to Ultra courses. One of the question types available in forms is a Likert question; however, the original release only included options for staff to select Likert scales with 3, 5, or 7 points. April’s upgrade will add options to choose scales with 4 or 6 points.

• Form with Likert question and scale range selected


More information

As ever, please get in touch with your learning technologist if you would like any more information about the new features available in this month’s upgrade: Who is my learning technologist?

The new features in Blackboard’s March upgrade will be available on Friday 8th March. This month’s upgrade includes the following new/improved features to Ultra courses:

AI Design Assistant – Context picker for test question auto-generation

Following the March upgrade, when using the auto-generate question tool in Blackboard tests, staff will be able to use the new ‘Select course items’ option to specify exactly which resources the AI Design Assistant auto-generate tool should use when generating questions. Prior to this, any auto-generated questions would be based on the course title (i.e., the module name).

The ‘Select course items’ option may be especially useful for staff wanting to create multiple tests, each based on one or more specific content items in the course, or for staff wanting to create a longer test built up from multiple auto-generated questions from different sections of the course, thus ensuring that the test includes questions drawn from across the entire course.

• Selecting ‘Auto-generate question’ from within a Blackboard test
• Refining the context of the question auto-generation by choosing ‘Select course items’
• Selecting the items for inclusion

More information about the AI Design Assistant is available from: Learning Technology Team – AI Design Assistant


‘No due date’ option for Blackboard assignments, tests, and forms

After the March upgrade, staff will no longer have to specify a due date when setting up a Blackboard assignment, test, or form. Please note that this change does not affect Turnitin assignments, which will continue to require a due date to be specified.

• Blackboard assignment with ‘No due date’ selected


Gradebook item statistics

The March upgrade provides staff with the option to select a column in the Ultra gradebook and access summary statistics for any graded item. The statistics page displays key metrics including:

  • Minimum and maximum value;
  • Range;
  • Average;
  • Median;
  • Standard deviation;
  • Variance.

The number of submissions requiring grading and the distribution of grades are also displayed.
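As an illustration, the metrics listed above can be reproduced with Python’s statistics module. The marks below are made-up example data, and Blackboard does not state whether it uses population or sample measures; the population versions here are one plausible reading.

```python
# Reproducing the gradebook statistics page's summary metrics for a
# hypothetical set of marks.
import statistics

marks = [42, 55, 61, 68, 68, 74, 80]

print("Minimum:", min(marks), "Maximum:", max(marks))
print("Range:", max(marks) - min(marks))      # 38
print("Average:", statistics.mean(marks))     # 448 / 7 = 64
print("Median:", statistics.median(marks))    # 68
print("Std dev:", statistics.pstdev(marks))   # population standard deviation
print("Variance:", statistics.pvariance(marks))
```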

• Accessing the statistics page from a Gradebook item with the Gradebook in grid view


More information

As ever, please get in touch with your learning technologist if you would like any more information about the new features available in this month’s upgrade: Who is my learning technologist?

The new features in Blackboard’s February upgrade will be available on Friday 2nd February. This month’s upgrade includes the following new/improved features to Ultra courses:

Forms/surveys

February’s upgrade will enable staff to use forms/surveys in Ultra courses. The forms tool is very similar to the Ultra test tool, but offers a slightly different set of question types, including a Likert scale question. Forms are intended to be used where staff want to collect information from students but where there is no correct answer and, usually, no requirement to grade or provide feedback on the response. There is, however, an option to grade a form response and to add feedback; by default, forms are automatically marked as complete upon submission.

To use a form in an Ultra course, select the + (plus) button in the course content area, and choose ‘Form’.

• Accessing a form in an Ultra course

The following question types are supported in forms:

  • Essay question (i.e., an open/free response question)
  • Likert question
  • Multiple choice question
  • True/false question

In addition, staff can use ‘Add text’ to add sections of text that do not require a response from students (e.g., to explain more about a particular section of the form), and can use ‘Add Page Break’ to split up longer forms. ‘Add local file’ can be used where staff want to upload a file for use by students when completing the form.

• Question types in Ultra forms

Current limitations of forms
Please note that in this initial release of forms, it is not possible to collect responses anonymously. Note also that there is no single-select multiple choice tool in forms as there is in tests, as the ‘Add Multiple Choice Question’ tool works differently in forms and tests. When answering a multiple choice question in a form, students will be able to select more than one option, including all options. Additionally, the Likert question type only allows scales of 3, 5, and 7 points to be used.

Data protection considerations
Forms should not normally be used to collect and store personal or confidential information about students. If you are considering using forms for this purpose, advice should be sought from the University’s data protection officer beforehand, or, if using forms for research, approval should be sought from one of the University’s ethics committees.


Course activity report improvements

Getting an overview of student engagement in Ultra courses is very quick and easy compared to how it was in Original courses. Engagement data in Ultra is also considerably more reliable than it was in Original, as Ultra takes account of access via mobile devices, which was not the case with Original courses.

Following the February upgrade, the course activity page in Ultra courses will include missed assessment deadlines, alongside the existing information about the number of days since each student last accessed the course and the number of hours each student has spent in it, and staff can set flags for these in the alert settings.

In the following screenshot, the alert settings have been set up to flag students who have missed one or more deadlines, and who have not accessed the Ultra course for more than seven days.

• Course activity page in Ultra courses

Missed due dates and non-submissions
Please note that in the course activity page, missed due dates should not be taken to mean non-submissions, as an assignment that is submitted late is counted as a missed due date. However, non-submitters can be viewed and quickly contacted in Ultra courses via the student progress report for the assignment (see links below).


More information

As ever, please get in touch with your learning technologist if you would like any more information about the new features available in this month’s upgrade: Who is my learning technologist?

The new features in Blackboard’s January upgrade will be available between Friday 5th and Monday 8th January. The January upgrade includes the following new/improved features to Ultra courses:

AI Design Assistant: Authentic assessment prompt generation

Currently, staff are able to auto-generate prompts for Ultra discussions and journals. The January upgrade will add the ability to auto-generate prompts for Ultra assignments too. Following the upgrade, when setting up an Ultra assignment, staff will see the ‘Auto-generate assignment’ option in the top right-hand corner of the screen.

• Ultra assignment with ‘Auto-generate assignment’

Selecting ‘Auto-generate assignment’ will generate three prompts, which staff can refine by adding additional context in the description field, selecting the desired cognitive level and complexity, and then re-generating the prompts.

• Auto-generated assignment prompts

Once selected and added, the prompt can then be manually edited by staff prior to releasing the assignment to students.


AI Design Assistant: Generate rubric improvements

Following feedback from users, the January upgrade will improve auto-generated rubrics. The initial version of the AI Design Assistant’s auto-generated rubrics did not handle column and row labels properly, and this will be improved in the January upgrade. Also improved in the January upgrade will be the distribution of percentages/points across the criteria, which were inconsistently applied in the initial release.

• An auto-generated rubric


Total & weighted calculations in the Ultra gradebook

The Ultra gradebook currently allows for the creation of calculated columns using the ‘Add Calculation’ feature. However, the functionality of these calculated columns makes the creation of weighted calculations difficult, e.g., when generating the total score for two pieces of work where one is worth 60% of the mark and the other 40%. At present, this would have to be done in a calculated column by using the formula AS1 x 0.6 + AS2 x 0.4, like so:

• Using the ‘Add Calculation’ option in the Ultra gradebook to generate an overall grade for two pieces of work weighted at 60% and 40%

However, weighting problems can be further compounded if the pieces of work are not all out of 100 points, which can often be the case when using computer-marked tests. Following feedback about this issue, the January upgrade will bring in an ‘Add Total Calculation’ option, which will allow staff to more easily generate an overall score for assessments with multiple sub-components. The new ‘Add Total Calculation’ column will simply require staff to choose the assessments which are to be used in the calculation, and specify how they are to be weighted. Using the same example as above, the calculation would look like so:
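The arithmetic behind such a weighted total can be sketched as follows. The scores, weights, and normalisation step are illustrative assumptions, not Blackboard’s actual implementation:

```python
# A sketch of the weighted-total arithmetic behind an 'Add Total
# Calculation' column, including the normalisation needed when items
# are not marked out of 100 (common with computer-marked tests).

def weighted_total(items):
    """items: iterable of (score, max_points, weight); weights sum to 1."""
    return sum((score / max_points) * 100 * weight
               for score, max_points, weight in items)

# AS1 weighted at 60% (marked out of 100), AS2 at 40% (a test out of 20).
print(round(weighted_total([(72, 100, 0.6), (15, 20, 0.4)]), 1))  # 73.2
```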

• Using the ‘Add Total Calculation’ option in the Ultra gradebook to generate an overall grade for two pieces of work weighted at 60% and 40%


More information

To find out more about all of the AI Design Assistant tools available in NILE, full guidance is available at: Learning Technology – AI Design Assistant

And as ever, please get in touch with your learning technologist if you would like any more information about the new features available in this month’s upgrade: Who is my learning technologist?

Dr Cleo Cameron (Senior Lecturer in Criminal Justice)

In this blog post, Dr Cleo Cameron reflects on the AI Design Assistant tool which was introduced into NILE Ultra courses in December 2023. More information about the tool is available here: Learning Technology Website – AI Design Assistant

Course structure tool

I used the guide prepared by the University’s Learning Technology Team (AI Design Assistant) to help me use this new AI functionality in Blackboard Ultra courses. The guide is easy to follow, with useful steps and images to help the user make sense of how to deploy the new tools. It also makes the important point that AI-generated content may not always be factual and will require assessment and evaluation by academic staff before the material is used.

The course structure tool on first use is impressive. I used the key word ‘cybercrime’ and chose four learning modules with ‘topic’ as the heading and selected a high level of complexity. The learning modules’ topic headings and descriptions were indicative of exactly the material I would include for a short module.

I tried this again for fifteen learning modules (which would be the length of a semester course) and used the description, ‘Cybercrime what is it, how is it investigated, what are the challenges?’ This was less useful, and generated module topics that would not be included on the cybercrime module I deliver, such as ‘Cyber Insurance’ and a repeat of both ‘Cybercrime, laws and legislation’ and ‘Ethical and legal Implications of cybercrimes’. So, on a smaller scale, I found it useful to generate ideas, but on a larger semesterised modular scale, unless more description is entered, it does not seem to be quite as beneficial. The auto-generated learning module images for the topic areas are very good for the most part, though.

AI & Unsplash images

Once again, I used the very helpful LearnTech guide to use this functionality. To add a course banner, I selected Unsplash and used ‘cybercrime’ as a search term. The Unsplash images were excellent, but the scale was not always great for course banners. The first image I tried did not quite capture the sense of a keyboard and padlock; however, the second image was more successful, and it displayed well as the course tile and banner on the course. Again, the tool is easy to use, and has some great content.

• Ultra course with cybercrime course banner

I also tried the AI image generator, using ‘cybercrime’ as a search term/keyword. The first set of images generated were not great and did not seem to bear any relation to the keyword, so I tried generating a second time and this was better. I then used the specific terms ‘cyber fraud’ and ‘cyber-enabled fraud’, and the results were not very good at all – I tried generating three times. I tried the same with ‘romance fraud’, and again, the selection was not indicative of the keywords. The AI generated attempt at romance fraud was better, although the picture definition was not very good.

Test question generation

The LearnTech guide informed the process again, although having used the functionality on the other tools, this was similar. The test question generation tool was very good – I used the term ‘What is cybercrime?’ and selected ‘Inspire me’ for five questions, with the level of complexity set to around 75%. The test that was generated was three matching questions to describe/explain cybercrime terminologies, one multiple choice question and a short answer text-based question. Each question was factually correct, with no errors. Maybe simplifying some of the language would be helpful, and also there were a couple of matched questions/answers which haven’t been covered in the usual topic material I use. But this tool was extremely useful and could save a lot of time for staff users, providing an effective knowledge check for students.

Question bank generation from Ultra documents

By the time I tried out this tool I was familiar with the AI Design Assistant and I didn’t need to use the LearnTech guide for this one. I auto-generated four questions, set the complexity to 75%, and chose ‘Inspire me’ for question types. There were two fill-in-the-blanks, an essay question, and a true/false question which populated the question bank – all were useful and correct. What I didn’t know was how to use the questions that were saved to the Ultra question bank within a new or existing test, and this is where the LearnTech guide was invaluable with its ‘Reuse question’ in the test dropdown guidance. I tested this process and added two questions from the bank to an existing test.

Rubric generation

This tool was easily navigable, and I didn’t require the guide for this one, but the tool itself, on first use, is less effective than the others in that it took my description word for word without a different interpretation. I used the following description, with six rows and the rubric type set to ‘points range’:

‘Demonstrate knowledge and understanding of cybercrime, technologies used, methodologies employed by cybercriminals, investigations and investigative strategies, the social, ethical and legal implications of cybercrime and digital evidence collection. Harvard referencing and writing skills.’

I then changed the description to:

‘Demonstrate knowledge and understanding of cybercrime, technologies used, methodologies employed by cybercriminals, investigations and investigative strategies. Analyse and evaluate the social, ethical and legal implications of cybercrime and digital evidence collection. Demonstrate application of criminological theories. Demonstrate use of accurate UON Harvard referencing. Demonstrate effective written communication skills.’

At first generation, it only generated five of the six required rows. I tried again and it generated the same thing with only five rows, even though six was selected. It did not seem to want to separate out the knowledge and understanding of investigations and investigative strategies into its own row.

I definitely had to be much more specific with this tool than with the other AI tools I used. It saved time in that I did not have to manually fill in the points descriptions and point ranges myself, but I found that I did have to be very specific about what I wanted in the learning outcome rubric rows with the description.

Journal and discussion board prompts

This tool is very easy to deploy and actually generates some very useful results. I kept the description relatively simple and used some text from the course definition of hacking:

‘What is hacking? Hacking involves the break-in or intrusion into a networked system. Although hacking is a term that applies to cyber networks, networks have existed since the early 1900s. Individuals who attempted to break-in to the first electronic communication systems to make free long distance phonecalls were known as phreakers; those who were able to break-in to or compromise a network’s security were known as crackers. Today’s crackers are hackers who are able to “crack” into networked systems by cracking passwords (see Cross et al., 2008, p. 44).’

I used the ‘Inspire me’ cognitive level, set the complexity level to 75%, and checked the option to generate discussion titles. Three questions were generated that cover three cognitive processes:

• Discussion prompts auto-generated in an Ultra course

The second question was the most relevant to this area of the existing course, the other two slightly more advanced and students would not have covered this material (nor have work related experience in this area). I decided to lower the complexity level to see what would be generated on a second run:

• Discussion prompts auto-generated in an Ultra course

Again, the second question – to analyse – seemed the most relevant to the more theory-based cybercrime course than the other two questions. I tried again and lowered the complexity level to 25%. This time two of the questions were more relevant to the students’ knowledge and ability for where this material appears in the course (i.e., in the first few weeks):

• Discussion prompts auto-generated in an Ultra course

It was easy to add the selected question to the Ultra discussion.

I also tested the journal prompts and this was a more successful generation first time around. The text I used was:

‘“Government and industry hire white and gray hats who want to have their fun legally, which can defuse part of the threat”, Ruiu said, “…Many hackers are willing to help the government, particularly in fighting terrorism. Loveless said that after the 2001 terrorist attacks, several individuals approached him to offer their services in fighting Al Qaeda.” (in Arnone, 2005, 19(2)).’

I used the cognitive level ‘Inspire me’ once again and ‘generate journal title’ and this time placed complexity half-way. All three questions generated were relevant and usable.

• Journal prompts auto-generated in an Ultra course

My only issue with both the discussion and journal prompts is that I could not find a way to save all of the generated questions – it would only allow me to select one, so I could not save all the prompts for possible reuse at a later date. Other than this, the functionality, usability, and relevance of the auto-generated discussion and journal prompts were very good.

On Friday 8th December, 2023, three new NILE tools will become available to staff: the AI Design Assistant, AI Image Generator, and the Unsplash Image Library.

AI Design Assistant

AI Design Assistant is a new feature of Ultra courses, and may be used by academic staff to generate ideas regarding:

  • Course structure
  • Images
  • Tests/quizzes
  • Discussion and journal prompts
  • Rubrics

The AI Design Assistant only generates suggestions when it is asked to by members of academic staff, and cannot automatically add or change anything in NILE courses. Academic staff are always in control of the AI Design Assistant, and can quickly and easily reject any AI generated ideas before they are added to a NILE course. Anything generated by the AI Design Assistant can only be added to a NILE course once it has been approved by academic staff teaching on the module, and all AI generated suggestions can be edited by staff before being made available to students.

• Auto-generating ideas for the course structure
• Auto-generating ideas for a discussion prompt
• Auto-generating ideas for test questions

AI Image Generator & Unsplash Image Library

Following the upgrade on the 8th of December, wherever you can currently add an image in an Ultra course you will be able to search for and add images from the Unsplash image library, as well as uploading images from your computer. And, in many places in Ultra courses, you will be able to add AI-generated images too.

• Selecting the image source (upload from device, Unsplash, or AI-generated)
• Searching for an image using Unsplash
• Generating an image using AI Design Assistant

First Thoughts on AI Design Assistant

Find out more about what UON staff think about AI Design Assistant in our blog post from Dr Cleo Cameron (Senior Lecturer in Criminal Justice): First Thoughts on AI Design Assistant

Other items included in the December upgrade

The December upgrade will see the ‘Add image’ button in a number of new places in Ultra courses for staff, including announcements (upload from device or Unsplash), and Ultra tests and assignments (upload from device, Unsplash, and AI-generated images). However, please note that images embedded in announcements will not be included in the emailed copy of the announcement; they will only be visible to students when viewing the announcement in the Ultra course.

Ultra rubrics will be enhanced in the December upgrade. Currently these are limited to a maximum of 15 rows and columns, but following the upgrade there will be no limit on the number of rows and columns when using an Ultra rubric.

More information

To find out more about these new tools, full guidance is already available at: Learning Technology – AI Design Assistant

You can also find out more by coming along to the LearnTech Jingle Mingle on Tuesday 12th December, 2023, between 12:30 and 13:45 in T-Pod C (2nd Floor, Learning Hub).

As ever, please get in touch with your learning technologist if you would like any more information about these new NILE features: Who is my learning technologist?

The new features in Blackboard’s November upgrade will be available from the morning of Friday 3rd November. This month’s upgrade includes the following new features to Ultra courses:

Ability to change ‘Mark using’ option without updating the Turnitin assignment due date

A source of frustration with Turnitin and Ultra courses has been that once the assignment due date has passed it is not possible to change the ‘Mark using’ option. This means that if the UnderGrad Letter or PostGrad Letter schema has not been selected when setting up the assignment, it cannot be selected during the marking process without moving the due date to the future, which in most cases is not an advisable course of action. This has meant that in cases where the correct marking schema has not been selected, a mapped Gradebook item has to be created to show the numeric grades as letters. This restriction has now been removed, and ‘Mark using’ can be changed without changing the due date.

• View of the Ultra gradebook showing the same assignment with marks displayed first as numbers and second as letters

Staff can update the ‘Mark using’ option in the gradebook by selecting the assignment in the column header and choosing ‘Edit’.

• Editing an assignment’s options via the Ultra gradebook

Please note that once the desired ‘Mark using’ option has been selected and saved, the gradebook will continue to show the original mark until the page is refreshed.


Improved image insertion tool

Following November’s upgrade, the Ultra rich text editor will have a dedicated button for image insertion. Previously, images were inserted using the attachment button. As well as being more intuitive, the new image insertion tool will allow images to be zoomed into, and to have the aspect ratio adjusted prior to insertion. Once inserted, images can be resized by using the grab handles on the inserted image.

• New image insertion tool in the Ultra rich text editor
• Options to zoom into and adjust the aspect ratio of images using the new Ultra image insertion tool
• Resizing an image

Back to top ↑

Improvements to matching questions in tests

Building on last month’s upgrade, which improved multiple choice questions in Ultra tests, the November upgrade improves matching questions in tests.

When using matching questions, the options to select partial and negative credit and to allow a negative overall score are now easier to select. Partial and negative credit is on by default, and credit is automatically distributed equally across the answers.

• Improved matching questions in Ultra tests

The ‘additional answers’ option (i.e., answers for which there is no corresponding prompt) has been renamed ‘Distractors’ to more accurately reflect its function in matching questions.

• Adding distractors (previously called additional answers)

In the above example, students would have seven possible answers to match to four prompts, and would score 25% of the total value of the question for each correct match.
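The scoring rule in the example above can be sketched as follows. This is an illustrative model only, not Blackboard’s actual code: the function name and parameters are our own, and it simply assumes that credit is split equally across the prompts (distractors carry no credit of their own), with the partial/negative credit options described above.

```python
# Illustrative sketch of equal partial/negative credit for a matching
# question. Not Blackboard's implementation; names are hypothetical.

def matching_score(num_prompts, correct, points,
                   negative_credit=False, allow_negative_total=False):
    """Score a matching question.

    num_prompts: number of prompts to match (distractors excluded)
    correct: how many of the student's matches are correct
    points: total points available for the question
    """
    per_match = points / num_prompts          # e.g. 4 prompts -> 25% each
    score = correct * per_match
    if negative_credit:
        # Each incorrect match deducts one share of credit
        score -= (num_prompts - correct) * per_match
        if not allow_negative_total:
            score = max(score, 0.0)           # floor at zero unless allowed
    return score

# Four prompts, three correct matches, 10-point question:
print(matching_score(4, 3, 10))   # -> 7.5
```

So in the four-prompt example, each correct match is worth 25% of the question’s points regardless of how many distractors are shown alongside the genuine answers.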

Please note that images, video, and mathematical formulae can also be used in matching questions, as well as in most other types of question in Ultra tests.

More information about using matching questions is available at: Blackboard Help – Matching Questions

More information about setting up tests in Ultra courses is available at: Blackboard Help – Create Tests

Back to top ↑

Improvements to journals navigation

After the November upgrade, the ‘Marks and Participation’ option in Ultra journals will be available via the tab navigation on the left-hand side of the screen, providing consistency of navigation with Ultra discussions and assignments.

• Journals in Ultra courses – old view (top) and new view (bottom)

Back to top ↑

More information

As ever, please get in touch with your learning technologist if you would like any more information about the new features available in this month’s upgrade: Who is my learning technologist?

The new features in Blackboard’s October upgrade will be available from the morning of Friday 6th October. This month’s upgrade includes the following new features to Ultra courses:

Send reminder from gradebook

New in the October upgrade is the addition of a ‘Send Reminder’ tool in the Ultra gradebook. However, please be aware that this tool is at a very early stage of development and, due to its limited functionality, staff are advised not to use it for the time being.

Instead of using the send reminder tool, staff are advised to continue using the current process of contacting non-submitters via the student progress tool, as reminders sent via this method will always be sent as emails as well as Ultra messages.

In its current form, the new send reminder tool cannot send reminders to students as emails; students will receive an Ultra message only. Additionally, reminders cannot be sent for assignments which have release conditions applied (release conditions are automatically applied to all Turnitin assignments). Send reminder messages cannot be customised by staff, and students who receive a reminder will receive the following Ultra message:

Important Notification: Your assessment has not been submitted.
Your assessment named ‘[Name of Assessment]’ has not been submitted.
If you have already received mitigating circumstances or an extension for this assessment, you can ignore this message.
Important information about late submissions, extensions, and mitigating circumstances:
In accordance with University policy, you can submit assessments up to one week late, but your grade will be capped to a bare pass. Extensions are available through your module leader if you have unforeseen circumstances that prevent you from meeting an assessment deadline. The maximum extension period is two weeks, and grades for assessments which have an extension will not be capped. Please note that late submissions and extensions are only available at the first submission point.
If unforeseen circumstances mean that you will need longer than two weeks to submit your assessment, you may be able to apply for mitigating circumstances. If your application for mitigating circumstances is successful this will defer submission of your assessment to the resit submission point, so you can submit to this for an uncapped grade. If a mitigating circumstances application is approved at the resit submission point, it is recognition of extenuating circumstances at that time, but there is no further opportunity to resubmit the assessment.
More information, help and support:
More information about late submissions, extensions, and mitigating circumstances is available from: University of Northampton Guide to Mitigating Circumstances and Extensions.
If you need any other support regarding your assessment, please contact the module leader for help as soon as possible.
The deadline for the above-named assessment is shown below:
Due date: [Assessment due date]

The send reminder tool can be accessed in the Ultra gradebook, from both list and grid view:

• Sending reminders via the gradebook in list view
• Sending reminders via the gradebook in grid view

However, at the current time the assessment submission point must be set to ‘Visible to students’ for the message to be sent. If the submission point is set to ‘Hidden from students’ or ‘Release conditions’, staff will receive an error message when trying to send the reminder.

• Assessment submission point set to ‘Visible to students’

Once the send reminder tool reaches a sufficient level of functionality, we will update our guidance accordingly.

Back to top ↑

Partial credit auto-distribution for correct answers in multiple choice questions

Following the October upgrade, when setting up multiple option test questions with partial and negative credit allowed, percentages will be automatically allocated and equally distributed between the correct options. However, these can be overwritten should staff prefer to weight the distribution unequally.
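The auto-distribution described above can be sketched as a simple equal split. This is an illustrative model only (Blackboard does not publish its exact rounding behaviour): the function name is our own, and it assumes percentages are rounded to two decimal places with any rounding remainder placed on the last correct option so the total stays at exactly 100%.

```python
# Illustrative sketch of equal credit distribution across correct options.
# Not Blackboard's implementation; the rounding rule here is an assumption.

def distribute_credit(num_correct):
    """Split 100% equally across the correct options, rounding to two
    decimal places so the percentages still total exactly 100."""
    share = round(100 / num_correct, 2)
    percentages = [share] * num_correct
    # Put any rounding remainder on the last option
    percentages[-1] = round(100 - share * (num_correct - 1), 2)
    return percentages

print(distribute_credit(4))  # -> [25.0, 25.0, 25.0, 25.0]
print(distribute_credit(3))  # -> [33.33, 33.33, 33.34]
```

As the upgrade allows, staff could then overwrite any of these values to weight the distribution unequally.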

• Multiple option test question with partial credit selected

More information about setting up tests in Ultra courses is available from: Blackboard Help – Create Tests

Back to top ↑

Delegated marking option for Ultra assignments

Delegated marking is already available for Turnitin assignments. After the October upgrade, staff will also be able to create marking groups for Ultra assignments, so that each marker sees only the submissions they are responsible for marking.

Please note that this first version of delegated grading only supports assignment submissions from individual students. Tests, group assessments, and anonymous submissions are not supported at this time.

The option to allow delegated marking for Ultra assessments is available in the assignment settings.

• Assignment settings with delegated marking highlighted

After selecting the delegated grading option, select the appropriate group set. Staff can assign one or more members of staff to each group in the group set. If multiple markers are assigned to the same group, they will share the grading responsibility for the group members. Staff assigned to a group of students will only see submissions for those students on the assignment’s submission page, and they can only post grades for their assigned group members. Any unassigned staff enrolled in the course will see all student submissions on the assignment’s submission page, and will be able to post grades for all students.

Back to top ↑

More information

As ever, please get in touch with your learning technologist if you would like any more information about the new features available in this month’s upgrade: Who is my learning technologist?