Archive for the ‘under development’ Category
Where the JISC Assessment and Feedback Programme Strand A projects are focused on identifying and introducing new assessment and feedback processes and practices, the Strand B Evidence and Evaluation projects are reviewing the impact and implications of innovations and transformations that have already taken place, and exploring how these can be extended to new contexts and environments. These eight projects cover a broad range of approaches and will provide invaluable insight into the value of such changes for institutions, learners and staff.
The EFFECT: Evaluating feedback for elearning: centralised tutors project at the University of Dundee will examine the success of esubmission and their TQFE-Tutor system, a centralised email account, blog and microblog supporting their online PDP programme for tertiary tutors. The project aims to significantly improve response times and teaching quality through the use of this centralised system, as well as providing opportunities for peer interaction and collaboration for both students and staff. As well as rigorously evaluating the impact of the programme, EFFECT will produce valuable guidance on how to adapt the system for implementation by other institutions and courses.
Student-Generated Content for Learning (SGC4L) at the University of Edinburgh is evaluating the impact of PeerWise, a freely available web based system which not only allows students to author questions for sharing with their peers but also supports extensive social discussion tools around that content, providing a greatly enhanced learning experience that students have responded to enthusiastically. The project will emphasise the production of generic information that will be applicable across a wide range of subject areas and institutional contexts.
OCME: Online Coursework Management Evaluation, based at the University of Exeter, is rolling out and evaluating a fully paperless coursework management system, involving the integration of Moodle and Turnitin. This will inform the development of best practice guidelines for the deployment and management of online assessment.
The problems of assignment bundling and timeliness of feedback are being considered by the Evaluation of Assessment Diaries and GradeMark at the University of Glamorgan project. This project will be evaluating the use of student diaries to highlight to both staff and students when assignments are due to encourage good time management and planning. The project is also looking at the use of GradeMark, an online grading system available as part of Turnitin, to provide timely, personalised and high quality feedback on assignments.
Evaluating the Benefits of Electronic Assessment Management (EBEAM) at the University of Huddersfield is also looking at the impact of Turnitin and GradeMark on student satisfaction, retention, progression and institutional efficiency. This project will benefit from their early adoption of the systems to provide detailed insights and recommendations for their implementation in a wide range of subject areas and across different institutions.
The University of Hertfordshire’s Evaluating Electronic Voting Systems for Enhancing Student Experience (EEVS) project will provide an exhaustive examination of the use, benefits and best practice around the use of electronic voting systems in formative and summative assessment.
The eFeedback Evaluation Project (eFEP) builds on The Open University’s extensive experience in providing distance learning to explore the use of written and spoken feedback in modern language studies. The value of such feedback in face-to-face courses will be examined through deployment in similar classes at the University of Manchester. Detailed reports on the value of different feedback techniques together with training resources for staff and students will provide valuable advice for other institutions considering adopting such approaches.
The University of Westminster’s Making Assessment Count Evaluation (MACE) project builds on the success of the Making Assessment Count project which will be familiar to those who attended our joint event with them this February. This project will not only evaluate the impact of the MAC eReflect self-review questionnaire within Westminster, but also pilot implementations at six other institutions including the transformation of the MAC SOS (Subject, Operational and Strategic) feedback principles into Moodle and GradeBook at City University, London. By demonstrating the effectiveness of the system in a wide variety of subject areas and institutional contexts, the project will provide a valuable resource for those considering adopting the system.
JISC has a long tradition of providing support and encouragement for innovative assessment activities, recognising the crucial role assessment plays in education and the significant concerns about the current state of university assessment and feedback repeatedly revealed by the National Student Survey, which provided the stimulus for the National Union of Students’ recent high-profile Feedback Amnesty campaign.
Their latest work in this area is focused on a substantial programme of projects funded under the three strands of the current Assessment and Feedback Programme, covering institutional change, evidence and evaluation, and technology transfer. The twenty projects that successfully bid for funding under this programme address a wide range of assessment and feedback processes and educational contexts, illustrated by Grainne Hamilton’s excellent Storified account of the programme start up meeting earlier this month. These projects are focused on using technology to increase the quality and efficiency of assessment and feedback practice on a large scale. Crucially, there is a strong element of sharing running throughout the programme, both in supporting the transfer of technology to new institutions, and in sharing outcomes and learning from previous work to help support future practice - literally feeding forward to the future.
Strand A focuses on institutional change, with the eight projects funded reviewing and revising assessment and feedback processes and practices, and using technology to support major changes and best practice in their chosen area.
Feedback and Assessment for Students with Technology (FASTECH) is working with the Higher Education Academy-funded Transforming the Experience of Students through Assessment (TESTA) project to implement technology-enhanced assessment and feedback processes in a large number of degree programmes at Bath Spa and Winchester Universities. The project will learn about the approaches teachers and learners take to assessment and how technology can be used to affect this. In addition, a range of resources and support will be made available to support practitioners in transforming individual and institutional practice.
COLLABORATE: working with employers and students to design assessment enhanced by the use of digital technologies at the University of Exeter is focused on employability issues, and on ensuring that assessment is designed with students’ future career prospects firmly in mind. The project is structured around a series of collaborations: with employers, with programme and institutional teams, and with students and recent graduates, redesigning assessment to ensure that it is pedagogically sound and realistic in preparing students for life beyond graduation.
FAST (Feedback and Assessment Strategy Testing) led by Cornwall College is facing the intriguing issue of embedding technology-supported assessment in a geographic area which lacks full broadband roll out and with students whose ability to physically visit the college campus is limited by poor transport links or employment. A small-scale pilot on a small cohort in a single campus will be followed with full scale roll out in a Personal and Employability Development module studied by over 700 students on 43 different courses in seven different campuses. The information on technical and support issues encountered and methods adopted to overcome them will be disseminated to the wider community, as will model CPD packages for potential adoption in other institutions.
InterACT at the University of Dundee is also dealing with a distinctive student cohort. Providing continuing education and CPD for practising doctors, their courses are entirely distance taught and, crucially, the timing of assessment submission is entirely at the discretion of the individual student. This leads to issues around timeliness of feedback and feed forward, which may have an impact on learner satisfaction and, subsequently, on retention. This project will examine the ways in which a range of technologies such as blogs, FriendFeed, VOIP, webinars and chat tools can enable personalised, timely and focused feedback that encourages reflection and engagement, and enhances the student experience.
The timeliness and effectiveness of feedback is also a focus of the Integrating Technology-Enhanced Assessment Methods (iTEAM) project at the University of Hertfordshire, which is exploring the ways in which electronic voting systems and increased embedding of QuestionMark Perception can be used to provide prompt personalised feedback. A student dashboard will be developed to integrate information from EVS, QMP, the institution’s Managed Learning Environment (MLE) and other relevant sources to provide a central point for information on a student’s performance across all subjects, enabling personal tutors to provide meaningful and personalised support and students to better understand their own learning behaviours.
The Institute of Education’s Assessment Careers: enhancing learning pathways through assessment project will explore the construction of assessment frameworks, incorporating multi-stage assessment and structured feedback and feed forward. There is a strong emphasis on assessment as a holistic process rather than a series of single, stand-alone events, with assessment instances part of a larger learning journey rather than marking the end point of a phase of study. The framework will be piloted in a number of Masters modules, with learner and tutor experiences then informing the model as it is scaled up for use on an institutional level.
TRAFFIC: TRansforming Assessment + Feedback for Institutional Change at Manchester Metropolitan University builds on MMU’s work in the JISC Curriculum Design and Delivery programme to implement an institution-wide assessment transformation programme. The project will undertake an extremely thorough review of assessment across the institution, explore ways in which technology can enhance and support assessment and feedback processes, and provide a very rich range of resources for the broader community.
eAFFECT: eAssessment and Feedback for Effective Course Transformation is the culmination of a range of activities examining assessment and feedback processes undertaken recently at Queen’s University Belfast. The project will examine staff and student approaches to assessment, in particular addressing concerns about workload, learning styles and how technology can support transformation in assessment processes. A practice-based website and extensive supporting documentation will help individual practitioners and institutional change managers apply the lessons of this project to their own contexts.
Meaningful work placements and graduate employability have always been an important part of university education and preparation for a professional future in certain disciplines, and are arguably even more so today in a climate of limited employment opportunities, with high university fees and loans positioning students as customers investing in their future careers. Certain subject areas enjoy good relationships with industry, providing industrial placements to give students real-world experience in their future fields, while local companies benefit from the expertise and cutting edge knowledge these students can bring to the workplace. Universities and colleges similarly benefit from this ongoing engagement with industry, ensuring their courses remain relevant and meaningful.
Shrinking university staff numbers have increased workloads, limiting the time staff have to spend assisting individual students in seeking suitable placements and opportunities for work-based learning. In any case, reliance on university staff is not necessarily the best way in which students can prepare themselves for seeking suitable, fulfilling employment on graduation, or establish fruitful relationships with potential employers.
The Sharing Higher Education Data (SHED) project attempts to address these issues through the delivery of a ‘matchmaking’ service for students and employers, which will both facilitate communication between them and enable students to plan their learning paths in the light of the expectations and requirements of their chosen profession. Sample case studies included in the student and employer information sheets about the service help illustrate the range of ways in which SHED can benefit both user groups while increasing interaction between academia and industry.
SHED uses the popular Mahara open source eportfolio tool to allow students to develop their profiles, and, vitally, provides them with strict control over what information is made publicly viewable by potential recruiters. Students can also view common employer search terms within their particular field in order to better understand the employment market in that area and to support the review and revision of their profiles to enhance their employability. The integration of the XCRI information model and specification (eXchanging Course Related Information) provides a common framework for describing and sharing course information, while Leap2A and InterOperability provide support for the sharing of eportfolio and competence information.
As a partnership between the Centre for International ePortfolio Development at the University of Nottingham and Derby College, SHED will also be able to demonstrate how the system can be used across a number of different institutions without compromising privacy while maximising opportunities for placement and project work and professional development. Although small-scale and local to begin with, it is intended that the system be scalable to include many institutions, subject areas and locations, and provide both a valuable service for students and employers and insight into regional and national trends in industry and development.
When I was studying English at university, one of the more engaging and intriguing sites of discussion and debate was the margins of printed texts. These are the ultimate asynchronous discussions, taking place over decades in some cases, rarely revisited by their participants once they’d left their comment on previous comments. It was fascinating to encounter often very different perceptions on both primary and secondary texts, and they encouraged me to reflect on my own interpretations and arguments as well as articulating them in the form of comments added to those already there. These serendipitous discoveries definitely enhanced my learning experience, providing the opportunity to discuss texts and solidify my understanding significantly beyond that provided by limited tutorial time and the very few other opportunities for debate available. Similarly, I encouraged my students to write on their books to increase engagement with the texts they read and legitimise their interpretations and opinions, although that was often met with askance looks that clearly said, ’sod that, I’m selling them later.’
So I was very interested to learn about the eMargin project, which is developing an online collaborative textual annotation resource as part of the JISC Learning and Teaching Innovation Grants funding round six. The eMargin system allows a range of annotation activities for electronic editions of texts, encompassing notes and comments on individual sections, highlighting, underlining and so on, all personalisable to support different tastes and access requirements. What takes this beyond the usual functionality offered by ebook readers is the ability to share these annotations with class-mates and students from other institutions, enabling their use as educational resources by design rather than chance. Teachers are able to control the degree of exposure of annotations in line with institutional policies on student IPR, and the system may be developed further to allow students to control which comments they wish to share and which to keep private, allowing them to use the same system for personal study as well as class work. By providing an easy means for sharing ideas, together with a wiki feature for building and capturing consensus, this system will be of value in all disciplines, not just English Literature where it is being developed.
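The kind of shared annotation layer described above can be pictured as annotations anchored to spans of a text, each with a visibility flag its author controls. The following Python sketch is purely illustrative — eMargin’s actual implementation and data model are not described in this post, so every name and field here is an assumption.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a shared-annotation data model. eMargin's real
# design is not public in this post; names and fields are illustrative.

@dataclass
class Annotation:
    author: str
    start: int            # character offset where the annotation begins
    end: int              # character offset where it ends
    kind: str             # e.g. "note", "highlight", "underline"
    body: str = ""        # comment text; empty for pure highlights
    shared: bool = False  # visible to the class, or kept as a private note

@dataclass
class AnnotatedText:
    text: str
    annotations: list = field(default_factory=list)

    def add(self, ann: Annotation):
        self.annotations.append(ann)

    def visible_to_group(self):
        """Only the annotations their authors chose to share."""
        return [a for a in self.annotations if a.shared]

doc = AnnotatedText("It was the best of times, it was the worst of times")
doc.add(Annotation("alice", 0, 26, "highlight", shared=True))
doc.add(Annotation("bob", 27, 51, "note", "irony established early", shared=False))
print(len(doc.visible_to_group()))  # 1 -- only alice's shared highlight
```

The per-annotation `shared` flag corresponds to the student-controlled privacy the post mentions as a possible future development; teacher-level exposure controls would sit a layer above this.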
The project team, Andrew Kehoe and Matt Gee of the Research and Development Unit for English Studies at Birmingham City University, are developing the system through a number of iterations in the light of feedback from teachers and learners, and engaging participants in other institutions and other disciplines to demonstrate its versatility. The team is also exploring the possibility of integrating eMargin with VLEs, and its potential as an eassessment tool; it may also have value in tracking the development of learners’ ideas in order to reduce opportunities for plagiarism.
The VWVLE project, or Supporting Education in Virtual Worlds with Virtual Learning Environments to give it its full name, has been funded as part of the JISC Learning and Teaching Innovation Grants round 5 to examine the wide range of emerging pedagogical opportunities offered through the integration of virtual worlds and web-based virtual learning environments.
Led by the University of the West of Scotland, with partners including Imperial College London, The Open University and the University of Ulster, the project builds on the considerable experience and expertise the project team have developed through their work on SLOODLE and the use of games for learning within virtual environments. SLOODLE (Simulation Linked Object Oriented Dynamic Learning Environment) provides seamless integration between the virtual world Second Life and Moodle, the popular open source VLE. Pilot courses will see students in engineering, computing and medicine explore aspects of the core question of how web-based virtual learning environments can effectively support learning and teaching in virtual worlds, particularly focusing on personalisation and reuse of content, and gaming in VWs, and demonstrating the applicability of such technologies across different institutional and disciplinary contexts.
A number of outputs will be produced, including guidance for practitioners, a range of extensions or plug-ins for Moodle/SLOODLE, and a guide to producing reusable content in virtual worlds which will attempt to address some of the issues that present a significant barrier to the easy and effective exchange of such resources. The emphasis on the integration of VWs and games with educational systems such as VLEs will both highlight the pedagogic benefits of such integration and attempt to clarify and address the challenges of doing so. By making explicit the range of technologies and support resources relied upon by educators working with VWs, and identifying and sharing good practice, the project can make a real impact on practice in this area and future activities.
A couple of months ago, JISC released an Invitation to Tender for a QTI v2.1 implementation and profiling support project. A consortium of experts produced the successful bid, bringing together some of the leading experts on QTI in UK HE, and the project formally kicked off this week. It concludes in mid-September this year.
The consortium is led by the University of Glasgow, and includes experts from the University of Edinburgh and Kingston University, contributions from the IMS QTI working group chairs and tool developers, independent consultants Sue Milne, Graham Smith and Dick Bacon, and input from us here at JISC CETIS. QTI experts at the University of Southampton are advisors to the project.
A project blog has been set up which will provide a central point for dissemination to the wider QTI community. Information on how to get involved with the QTI interoperability testing process is also available there.
The project aims include:
- Contributing to the definition of the main profile of QTI 2.1;
- Implementation of the main profile in at least one existing open source test rendering/responding system;
- Providing support in the use of QTI 2.1 and the conversion of other question and test formats to QTI 2.1 to those developing assessment tools and authoring questions;
- Providing a publicly available reference implementation of the QTI main profile that will enable question and test item authors to test whether their material is valid, and how it renders and responds.
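To give a sense of the material the reference implementation will be checking, here is a minimal QTI 2.1 single-choice item, with Python’s standard library used to confirm it is at least well-formed XML. The item content is invented for illustration, and a well-formedness check is of course far weaker than the validation against the QTI 2.1 schema that the project’s reference implementation would perform.

```python
import xml.etree.ElementTree as ET

# A minimal QTI 2.1 single-choice assessmentItem (content is illustrative).
qti_item = """<?xml version="1.0" encoding="UTF-8"?>
<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
    identifier="choice-demo" title="Capital cities"
    adaptive="false" timeDependent="false">
  <responseDeclaration identifier="RESPONSE" cardinality="single"
      baseType="identifier">
    <correctResponse><value>ChoiceA</value></correctResponse>
  </responseDeclaration>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="true"
        maxChoices="1">
      <prompt>What is the capital of France?</prompt>
      <simpleChoice identifier="ChoiceA">Paris</simpleChoice>
      <simpleChoice identifier="ChoiceB">Lyon</simpleChoice>
    </choiceInteraction>
  </itemBody>
</assessmentItem>"""

# Well-formedness only: a real validator would check the item against
# the QTI 2.1 XSD, which ElementTree alone cannot do.
root = ET.fromstring(qti_item)
ns = "{http://www.imsglobal.org/xsd/imsqti_v2p1}"
print(root.tag == ns + "assessmentItem")  # True
```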
Follow the project blog for future developments!
The Peer Evaluation in Education Review (PEER) project based here at the University of Strathclyde is one of five projects funded in the JISC Learning and Teaching Innovation Grants programme round 5. Running from 1 June 2010 to 30 June 2011, the project explores a range of issues around the implementation and delivery of peer assessment within higher education.
PEER is led by David Nicol and Catherine Milligan, building on the highly influential Re-engineering Assessment Practices in Higher Education (REAP) project. The interplay between the two projects is clear from the extensive information available through the umbrella site that links the two, offering a wealth of information and resources around assessment, peer assessment and feedback. The website is constantly under development, so is well worth regular revisiting to see the latest developments.
The project’s first phase involves the development of a framework for peer review and a detailed evaluation of existing peer review software. A range of tools was evaluated in relation to a list of desirable features, and outcomes from this exercise are being added to the website for future reference. The second phase involves a series of small scale pilots in a range of subject areas and institutions: the project team are also very interested in hearing from others piloting peer review software for potential inclusion within this research activity. The final phase will see the development of a number of resources including guidelines on implementing peer review within a course of study and a literature review.
Unlike some LTIG projects, technical development activities are limited to those necessary to integrate those systems chosen for the pilot phase with the institutional LMS, Moodle. Both the PeerMark functionality within Turnitin, and Aropa, developed by the University of Auckland, New Zealand, will be tested during the pilots.
Charlie Balch at Arizona Western College has been working on a simple, web-based question system for networked computers and handheld devices. AskClass.net currently functions as a poll tool, allowing the creation of multiple-choice, single-answer questions that can be used in a variety of educational settings and as a consensus building tool. It’s still very much under development, so while it’s not possible at the moment to specify a correct answer, this and other functionality may emerge in the future depending on user feedback and viability. A simple marking system would make this an extremely useful tool for both classroom and remote teaching support.
AskClass is written in ASP with a MS Access backend, although Balch is considering making future versions in PHP or Java with an XML backend to increase portability. Other applications may be integrated with the tool over time. This is intended as a learning aid, therefore security and multiple voting are not addressed in this version.
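The core logic of a tool like this — one multiple-choice, single-answer question with a running tally — is small, which is part of its appeal. The sketch below is a hypothetical Python rendering of that flow, not AskClass’s actual code (which, as noted, is ASP with an Access backend).

```python
from collections import Counter

# Hypothetical sketch of an AskClass-style poll: one multiple-choice,
# single-answer question with a running tally. Not the real ASP code.

class Poll:
    def __init__(self, question, choices):
        self.question = question
        self.choices = list(choices)
        self.votes = Counter()

    def vote(self, choice):
        if choice not in self.choices:
            raise ValueError(f"unknown choice: {choice}")
        # Like the current AskClass version, nothing here prevents a
        # participant from voting more than once.
        self.votes[choice] += 1

    def results(self):
        return {c: self.votes[c] for c in self.choices}

poll = Poll("Best revision strategy?",
            ["flashcards", "past papers", "group study"])
for v in ["past papers", "past papers", "flashcards"]:
    poll.vote(v)
print(poll.results())
# {'flashcards': 1, 'past papers': 2, 'group study': 0}
```

Adding the marking system the post wishes for would only need a `correct` attribute on the poll and a comparison in `vote` — which suggests why such functionality may plausibly emerge in a later version.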
Contact details for requesting source code and suggesting features are available on the site’s FAQ.
Eric Shepherd of Questionmark has been developing an assessment maturity model ‘that provides a way for organisations’ executives and managers to monitor, chart and improve their use of assessments’, and has recently begun to formalise this in an online model.
The model separates the entire assessment process into three key areas (development, delivery and reporting), each comprised of six measures (stakeholder satisfaction, security, strategic alignment, maturity, data management and communication). Shepherd also identifies four phases of assessment maturity which can help identify the needs and requirements of an organisation (ad hoc, managed, refined and aligned). Each of these elements is being, or will be, expanded and further developed as the model itself matures, with ongoing development being focused on the project wiki.
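The shape of the model — three areas, each scored against the same six measures, with the result mapped onto one of four phases — can be sketched in a few lines. The scoring scale and phase boundaries below are invented for illustration; Shepherd’s model does not prescribe them here.

```python
# Illustrative sketch of the structure described above: three key areas,
# each scored on six measures, averaged and mapped onto the four phases.
# The 0-3 scale and the mapping are assumptions, not Shepherd's own.

AREAS = ("development", "delivery", "reporting")
MEASURES = ("stakeholder satisfaction", "security", "strategic alignment",
            "maturity", "data management", "communication")
PHASES = ("ad hoc", "managed", "refined", "aligned")

def phase(scores):
    """Map per-area, per-measure scores (0-3) to an overall maturity phase."""
    values = [scores[a][m] for a in AREAS for m in MEASURES]
    avg = sum(values) / len(values)
    return PHASES[min(int(avg), 3)]

# Example: an organisation scoring 1 on every measure in every area.
scores = {a: {m: 1 for m in MEASURES} for a in AREAS}
print(phase(scores))  # managed
```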
There is also a great deal of useful information available on the site, such as learning resources to help assessment managers understand their own processes’ maturity and helpful links to relevant material.
The model is still in development but should already be of value to users and will continue to develop over time.
Thanks to Steve Lay for the heads up!