Authors: Elizabeth Palmer (Learning Designer, University of Northampton), Sylvie Lomer (Lecturer in Education, University of Manchester) and Ivelina Bashliyska (3rd Year Undergraduate Student and Assistant Researcher).
With thanks to Nadine Shambrooke and David Cousens for support with transcription and coding.
The University of Northampton has taken an institutional approach to learning and teaching through the widespread adoption of Active Blended Learning (ABL) as its new ‘normal’. To find out more please visit: https://www.northampton.ac.uk/ilt/current-projects/waterside-readiness/
However, student engagement has been highly variable, which has created a number of challenges for staff. Semi-structured qualitative focus groups were undertaken with 201 undergraduate students across all year groups and faculties during the academic year 16/17, building on a pilot study of 24 students in academic year 15/16. These focus groups sought to uncover students’ own perceptions and experiences of ABL, in order to unpick the reasons behind varying patterns of engagement and to glean student insight into the factors that inhibit or encourage engagement with ABL.
The study has revealed a number of key factors which students identify as having significant impact on their engagement. Key success factors include effective pedagogical design, in particular establishing a clear and explicit relationship between online and face to face components of modules, and scaffolding the development of digital skills and literacies in the process of establishing online tasks. A strong relationship between staff and students is also critical, where students trust in the decisions and motivations of staff. This is signalled by following up on online tasks, providing feedback where relevant, and explicitly discussing the value of online tasks to module learning outcomes and employability skills. A key finding is that students’ conceptions of learning, teaching & knowledge impact on their engagement with ABL, and are not necessarily compatible with ABL principles. These factors are complex, interdependent and have varying loci of control. Staff can take a number of measures to increase the likelihood of student engagement, although certain factors remain ultimately within the agency of students. Understanding these issues is critical to the success of ABL.
The following artefacts provide the results of the study to date:
The Quick Overview:
• Where students need to carry out online surveys, and where academic staff do not have a preference as to which tool the students use, we recommend eSurv: http://esurv.org
• A tutorial video explaining how to use eSurv is also available here: http://bit.ly/esurv-tutorial
One area where students sometimes come unstuck with their research projects is when they try to extract data from the free online survey tool they have used. While it is often easy to create a simple online survey for free, and easy for a limited number of respondents to take part in the survey, it is not always so easy for the researcher to access their data.
There are a large number of free online survey tools available, and choosing the most appropriate one is not always easy. In almost all cases, accessing the full functionality of the survey tool is not free. For example, the free version of the survey tool may be limited by the number and type of questions available (a maximum of ten questions, for example, and only basic question types). It may also be limited to a maximum number of responses (fifty responses per survey, for example). Another common restriction is to limit access to the survey data, and not to allow the researcher to download the data for analysis in a statistical package. While all these restrictions can be overcome by paying a monthly subscription to the survey tool provider, students often feel rather cheated when they find out that it will cost them, in some cases, £60 to download their data for analysis in SPSS. They often feel especially annoyed when they find out that, had they chosen a different tool, they could have had free access to their data.
As part of a recent University of Northampton URB@N project, Paul Rice, Phil Oakman, Clive Howe and Rob Farmer decided to find out whether there was a genuinely free online survey tool out there somewhere. And they decided to make things more difficult by trying to find one that was also easy to use and that stored data in a way that was compliant with the UK Data Protection Act. The good news is that they found one!
If you would like to find out more then you can read all about it in their paper published in the journal MSOR Connections: https://journals.gre.ac.uk/index.php/msor/article/view/311
Background to the Exchange
Aside from the opportunity to network, my aims in attending the exchange were to examine two main areas – how technology can support the process of innovation, and the potential for incorporating Systems Thinking and Design Thinking into the design of materials and even courses. This document summarises my experience and the four lessons I have learned.
Technology and Innovation
Two items on the agenda were particularly relevant here. The MICA Social Design Lab ran during one afternoon – this was a social space designed to encourage interaction between delegates and facilitate discussions, given the question ‘How might we advance social innovation in Higher Education?’
Given the rather spartan conference room environment, the range of fun, brightly coloured physical items used to record, connect and visualise responses was attractive, and facilitators were easy to identify. But while idea capture was strong, collection and dissemination were somewhat weaker. Personally, I never encountered any analysis or results from the exercise, though these may simply have passed me by. The physical location hampered the exercise too – delegates could too easily pass by, and without their physical presence the exercise was reduced in value. Could technology have supported this process better? Yes, I am convinced it could. At the very least, video or photographic capture needs to be on hand to ensure that contributions can still provoke ideas and actions after the event, along with a clear mechanism to access it. Ultimately, technologies to engage participants, then capture and disseminate material, are essential features of an environment that truly wishes to engage stakeholders. How often has a pile of flip chart paper – containing several person-hours of contributions at enormous cost – lingered in the corner of my office?
Lesson #1: Low tech is fun and has its place, but technology to engage in, capture and share group deliberation is essential if the exercise is to make a real difference in a design process.
I attended a session entitled ‘Are we succeeding and how would we know?’, where three case studies were discussed in respect of their attempts to measure success. Drew Bewick of the University of Maryland discussed the use of a ‘Return on Engagement’ grid – very much along the lines of a rubric – to measure the operational value, strategic value and risks of projects on a scale of one to five, while recording the resources used, activities, outputs and impact at the same time.
Lizzie Pollock, from Brown University, discussed the measurement of the learning outcomes for individuals being assessed as part of their Social Innovation Fellowship. The items for inclusion included empathy, creative thinking, critical thinking and entrepreneurial ‘grit’. She was still struggling with ways to evidence and measure these attributes – the Torrance Test, for example, was tried, but rejected on the grounds that it was too broad. Brown are also now beginning to consider – like Maryland – impact, including enterprise survival rates and generated revenue.
John Isham, of Middlebury College, had done some interesting work on the three impact areas of the project itself, the student(s) concerned and the campus, emphasising the inter-relation of all three. He identified a weakness in project management skills amongst participants in projects and was conscious that just ‘building stuff’ is an inadequate measure of success. Students were beginning to be involved with evaluating other students’ projects, but this was at a fairly early stage.
Two points struck me here in particular – the lack of pre-determined project management structures or tools can be a barrier both for students who have little or no experience of managing a project and supervisors who have no ‘dashboard’ view of the progress of a project or its outcomes. Secondly, we seem locked into a ‘new year, fresh start’ approach to developing social innovation projects and ignore the lessons of the previous year.
Lesson #2: A project management system – simple and free to use – is needed to support students and their mentors/supervisors/assessors.
Lesson #3: Evaluation of previous social innovation ventures by students before they start their own, would be a valuable learning experience for them and provide data for the hosting institution.
Systems and Design Thinking
Unfortunately, both sessions related to these topics – ‘Systems Thinking for Leading Changemakers’ and ‘Can Everyone be a Designer? Provocations in the Pedagogy of Design Thinking’ – failed to fully meet my expectations, the latter being a discussion about a process I didn’t understand! Mary Anne Gobble’s summary article (Gobble 2014) has assisted me to a great extent on the topic of Design Thinking. Whether you believe this to be fad or fact, the importance of taking the “beneficiary’s” perspective into account during the design phase of any social innovation would seem to be a critical success factor.
Lesson #4: Empathy is not just a desirable personal attribute; it is a critical success factor in the design process.
Systems Thinking seems to sit uncomfortably in social innovation design, being apparently more suited to translating the messiness of real life into computer software. However, there are clear connections here to the knotty problem of measuring success – by establishing the ‘units’ that exist within a process flow and their rates of change (along with auxiliary variables), we can begin to pinpoint objective measures of success. Overall, I couldn’t see how a non-specialist could apply these techniques easily, though David Castro did provide some interesting resources and links (including free modelling tools such as InsightMaker) that I may well explore further.
Clearly there was a lot more that I got out of the visit, some of which is captured at http://ashokaun15.weebly.com/. I made an excellent contact in Waterloo, Canada, who is sharing with me her experience of embedding Flipboard and the Tophat student response system into teaching, and I met a wide range of contacts from around the world. Many of the delegates leave you speechless at the problems they are seeking to overcome and the relentless enthusiasm they still have to press on. Wrangling with a few NILE issues pales into insignificance compared with trying to develop a system to support 100,000 students in Indian rural schools with no Internet connection!
But as Wray Irwin pointed out before I left, you would be surprised just how far ahead we are in the field of social innovation compared with most. Developing the support infrastructure for prospective social innovators and evaluating our successes and failures more effectively will push us ahead further still.
Gobble, M. M. (2014) ‘Design Thinking’, Research-Technology Management, 57(3), pp. 59–61.
Many thanks to Tim Curtis for inviting me to attend, Rob Howe and Chris Powis for allowing me to go and the ‘awesome’ support of my fellow delegates in Washington.
(a copy of the fully hyper-text linked version of this document can be found at http://1drv.ms/1FEGoad)
Dr. Naomi Holmes (School of Science and Technology) piloted the use of low-stakes continuous weekly summative e-assessment with a cohort of level 5 (2nd year) students. Biggs and Tang (2011) state that it is assessment, not the curriculum, that determines how and what students learn. Learning needs to be aligned with assessment as much as possible to increase engagement, even if the result is that the student is “learning for the assessment”, and therefore for accreditation. With this in mind, low-stakes weekly assessments were used to help support learning (formative) and lead to accreditation (summative). Results show that both physical and virtual engagement with this (optional) module, and students’ learning and understanding of the subject, increased because of this method of assessment.
Written by Rebecca Heaton
On 10th July an art and ICT Teach Meet event and art exhibition was held at the Northampton Contemporary Art Gallery, showcasing the work of students and local school teachers who have been influenced by the University’s Innovation Fund projects Stem to SteAm and Technology Outdoors, supported by the School of Education, colleagues from UCEE and the LearnTech team.
The event took the form of an artistic ‘happening’, bringing the outside inside to celebrate a year’s worth of work surrounding the projects. Everyone involved had an enjoyable evening, and participants could take part in a number of workshops – light trails, animation and batik – whilst artist Emma Davis created collaborative work in response to the event. To explore and share the evening, take a look at the Storify created.
Prizes were awarded to many teachers at the event with Bridgewater school winning a set of resources to support iPad use in the curriculum donated by Rising Stars. The school developed a whole school project ‘Bridgecraft’ aligned with the Stem to SteAm agenda. The teachers presenting praised how the university supported and inspired developments in their practice by providing project websites, CPD network groups and media days as part of the innovation projects.
“Northampton 2018: Planning, Designing and Delivering Student Success”
The University of Northampton’s Institute of Learning and Teaching in Higher Education is to host a one-day Learning and Teaching conference, entitled Northampton 2018: Planning, Designing and Delivering Student Success. The event will provide an opportunity to celebrate research from within the institution. More details…
On the 31st October, 2013, Rob Howe was invited to Education 3.0 in Moscow to present on the strategic development of Learning Technology at The University of Northampton.
Rob was the only speaker from the UK and joined speakers from leading Russian institutions – FEFU, KAI, LGU, MISiS and Ural Mining – presenting to an audience of Rectors, Provosts and Heads of IT.
The forum was part of a major education exhibition, hosted by the Ministry of Education of the Russian Federation, Government of Moscow and the Moscow Department of Education. The goal was to enhance the access to, and quality of, education through public-private partnership, promoting international best practices that will boost the personal and professional development of individuals.
The Russian government has earmarked £178 million to enable its leading national universities to break into the top 100 in the global league tables. Vladimir Putin, Russia’s president, announced plans that would see at least five of the country’s universities enter the top 100 by 2020. Education 3.0 was part of a number of events which are encouraging institutions to be more innovative in their outlook.
There was a real interest in the Learning Technology work and developments at Northampton, including our Changemaker status… and, surprisingly, a question about the Northampton clown during the drinks reception! As a result of the presentations, Northampton has already had requests for greater collaboration and potential sharing of resources between ourselves and Russian institutions.
Thomas Cochrane and Vickel Narayan from AUT University in Auckland New Zealand have piloted the use of an intentional community of practice model to transform lecturer CPD through the embedding of mobile web 2.0 technologies (http://goo.gl/eEQLZ / DOI: 10.3402/rlt.v21i0.19226). Their research over two iterations of the course has significant implications for transforming how lecturing staff approach their role, moving from a heavily pedagogical approach through andragogy to heutagogy. Heutagogy (student-directed learning) requires lecturers to undergo a reconceptualization of their role and to take advantage of the mobility offered by the various Web 2.0 tools (including Twitter, blogs, wikis, Skype) with their own learning experiences being scaffolded through sustained engagement and support; these latter two elements proving essential to their success.
Although it differs from the five-stage model of e-learning offered by Gilly Salmon (http://www.gillysalmon.com/five-stage-model.html), Cochrane and Narayan’s approach is not new per se, just not so widely reported in academic circles. For example, they argue that “heutagogy … need not be the domain of postgraduate research students only”, and having attempted a similar approach myself when teaching HNC law I would agree. I would also reflect, however, that moving away from a didactic approach to a place where individual learners control their own learning journey requires a willingness to relinquish that control and to permit a transformation of the teacher’s role into that of co-learner and facilitator.
Conceptualising students as transformative agents of change is not new, and the ability to take advantage of new technologies like the iPad has real potential to see learning move up Bloom’s taxonomy to a place where creativity is not only more possible, but also more likely and even encouraged.
Cochrane and Narayan’s redesigned CPD course is actually similar to the Moderating Online Groups (MOG) / Collaborative Learning Experience Online (CLEO) CPD course co-ordinated by the Institute of Learning and Teaching here at Northampton, in that it encourages staff to take advantage of the benefits offered by new technologies and to incorporate them in the classroom by allowing them to experience using those technologies as a student. However, Cochrane and Narayan necessarily have the opportunity to provide lecturers with real opportunities to implement their learning and experiment with Web 2.0 in their own learning environments, as their course runs over six weeks as opposed to the six hours of MOG/CLEO. It will be interesting to see whether their model can be implemented at Northampton, resulting in a deeper embedding of Web 2.0 in our practice, particularly in more theoretical and academic programmes, rather than the vocational programmes which formed the majority of the subjects taught by the New Zealand staff.
LearnTech have been trialling the Swivl for a while, but I had my first opportunity to try it out in anger at the CAMHS Children and Young People’s Mental Health Conference at the Sunley Management Centre on 3rd July 2013.
The organisers were particularly interested in capturing video (and particularly audio) of two keynote addresses, but had limited facilities and budget. Panopto – which is installed on the Sunley lectern – was considered, but the limited range and mobility of the webcam was deemed too restrictive for a guest speaker, and there was a risk that the presenter would move out of microphone range. It was subsequently discovered that the first presenter had an important piece of video that viewers of the recording would not have been able to see in detail. This could subsequently have been edited into the Panopto capture, but would have added complexity.
The Swivl system is designed to track and record video from an iPhone or iPod. The base unit tracks the position of the presenter using the ‘necklace’, which also contains a microphone. It is a virtually ‘one button’ system which does not distract the presenter. The device can tilt too, but this feature wasn’t required on this occasion.
A fully charged iPod could be expected to video for around 90 minutes, which is adequate for most purposes. The front or back facing cameras can be used. The latter is higher quality, but circumstances often dictate that the front is used for visibility – as a presenter you can confirm where you are in frame from time to time. High quality means a larger file too – the low-quality front camera on an iPod Touch will generate 2GB of video in 90 minutes which is not trivial for rapid processing.
In this instance we were using Swivl as a robotic camera operator – speakers had no prior experience or training, just a short briefing on using the tracker necklace. They proved very good at managing the necklace, but were unaware of the impact of their position on the camera’s perspective. A dramatic reduction in lighting during the first keynote speech had a significant impact, whereas an experienced self-presenter would have appreciated the issue and rectified it.
Although the free Swivl software offers direct upload to YouTube over wifi, the size of the video file made that impractical. Files were transferred to PC and uploaded to Kaltura, where they could be ‘topped and tailed’. File size makes this a little slow, but it is perfectly possible to get an hour’s presentation ready for public streaming in 2-3 hours.
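The figures above (2GB of video in 90 minutes, and an impractically slow direct upload) can be sanity-checked with a quick back-of-the-envelope calculation. The sketch below assumes decimal gigabytes and a hypothetical 5 Mbps wifi uplink – actual numbers will vary with codec, quality setting and network:

```python
def video_bitrate_mbps(size_gb: float, minutes: float) -> float:
    """Average bitrate (megabits per second) implied by a recording's size."""
    bits = size_gb * 1e9 * 8      # decimal GB -> bits
    seconds = minutes * 60
    return bits / seconds / 1e6   # -> Mbps

def upload_minutes(size_gb: float, uplink_mbps: float) -> float:
    """Minutes needed to transfer the file over a link of the given speed."""
    bits = size_gb * 1e9 * 8
    return bits / (uplink_mbps * 1e6) / 60

bitrate = video_bitrate_mbps(2, 90)   # roughly 3 Mbps
transfer = upload_minutes(2, 5)       # roughly 53 minutes on a 5 Mbps uplink
```

In other words, a 90-minute recording at this quality needs the best part of an hour to push over a typical venue uplink, which is why transferring the file to a PC and uploading to Kaltura afterwards was the pragmatic choice.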
The lessons learned? As a video blogger’s tool the Swivl is superb. Using it as a robotic camera operator works in ideal circumstances, but if there is no opportunity to brief presenters it is very likely to fail. Its particular strength is as an ad-hoc mobile audio capture device – for, at the very worst, the audio stream can easily be separated from the video. And the audio quality is remarkably good. It doesn’t capture displayed slides particularly well, but these could be edited in if required.
The first keynote presentation (after minimal editing) can be seen here : http://tinyurl.com/cahmskey1
Finally, use the power adapter or have plenty of pairs of AA batteries around – the base uses a lot of power (the AAA batteries in the necklace seem to last a long time – the app has a battery meter). Take more than one iPod/iPhone to avoid running out of space or power. It is a pity the base unit power supply does not charge the iPod at the same time.
More details on the equipment at www.swivl.com . The unit costs around £180. LearnTech are happy to loan the equipment, help train users and supervise pilot exercises to establish if this equipment is right for you – however, we are not an event videoing service!
Catherine Fritz demonstrated the concept of flipped teaching – moving assignments into the classroom and delivering lectures as self-paced and scheduled events.
Lectures can be paused by the student to enable research to take place, and give students struggling with vocabulary the chance to look up a word. The lecture is also a much more powerful revision tool. Class work can be more active and collaborative as a result.
The University provides a number of applications to host flipped lectures – Panopto is probably the most suitable, but Kaltura video or NILE-based tools like Xerte are also possible delivery mechanisms. In this case Catherine described how PowerPoint can be used to create slides supported with audio. Her presentation contained a step-by-step guide to doing so.
PowerPoint proved an effective alternative, particularly when access to Panopto is not available. In some respects it is simpler to use than Panopto – amending text on a slide is very easy to do. However, long presentations can result in quite large files, which are a problem for some distance learners. Dividing these lectures into sections may well be necessary. As with all asynchronous delivery, support for questions and discussion needs to be available to students alongside the recordings. This will require monitoring, and often moderation, from the tutor.
Overall, this presentation is an excellent example of innovative teaching making use of simple technology, and is well worth consideration as an approach. Many thanks to Catherine for producing what is effectively a multimedia instruction manual!
Since the Expo, a new version of Panopto for the iPad has been launched which offers a much better recording experience for tutors and an attractive and useful viewing platform for students. It is free to download from the App Store. Ensure you connect to northampton.hosted.panopto.com and log in using NILE.