More and more, faculty and instructors are thinking differently about assessment. They are seeing it as a continuous rather than an occasional process, and as a collaborative activity focused on verifying knowledge, competence and capabilities rather than on ranking learners within a group. Technology is enabling a different way to practice assessment.
As views change about what assessment is intended to achieve and how it might be done, the following ten developments provide the resources to make that change possible.
1. Assessment-Based Credentials
Western Governors University awards its degrees on the basis of competency-based assessment, not course work. Others are now following its lead. The University of Wisconsin, for example, offers flex degrees in commerce, technology and nursing in which the student needs only to complete assessments of learning to gain the credential. Colleges and universities in New Hampshire, Ohio, Michigan, and Arizona now also offer such credentials.
2. MOOCs for Degrees, Certificates and Diplomas
FutureLearn – the world’s third largest massive open online course (MOOC) provider with over 11 million learners worldwide – now offers undergraduate degrees and graduate diplomas and degrees.
Through partnerships with Deakin University (Australia), Heriot-Watt University (Scotland) and Coventry University (UK), among others, thirty-four degree programs in a variety of subjects are now available. Across all MOOC providers, there are now over fifty degrees available on demand, with many more anticipated. The model varies by provider: some offer the course work for free but charge for assessment, while others charge fees for a variety of components of the learning.
3. On Demand Assessment
Enabled by machine intelligence and artificial intelligence systems, institutions are allowing students to call for assessment on demand – both assessment for learning and assessment of learning.
The Kentucky Community and Technical College System, for example, has on-demand courses available 365 days a year. Students call for a course and complete a pretest before they begin studying; if they are successful on the pretest, they can call for the final assessment for that course at any time.
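The mechanics of such a model can be illustrated with a simple sketch. The example below is hypothetical (the pass mark and function name are assumptions, not KCTCS's actual rules): a student who demonstrates mastery on the pretest, or who has completed the course materials, can have the final assessment unlocked at any time.

```python
from datetime import datetime

PASS_MARK = 0.8  # assumed pretest threshold, for illustration only


def request_final_assessment(pretest_score: float, study_completed: bool) -> str:
    """Decide whether an on-demand final assessment can be unlocked now."""
    if pretest_score >= PASS_MARK or study_completed:
        return f"Final assessment unlocked at {datetime.now().isoformat()}"
    return "Complete the course materials (or retake the pretest) before requesting the final."


print(request_final_assessment(pretest_score=0.85, study_completed=False))
print(request_final_assessment(pretest_score=0.40, study_completed=False))
```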
4. Automated Assessment Generation
A number of machine intelligence and artificial intelligence systems are now available that can develop assessment and test items automatically.
Such systems include Varafy and TAO (open source). They require sample items and an assessment rubric, and then generate hundreds of thousands of versions of those items in a variety of formats, including long and short essays, graphic-based assessments, multiple-choice questions and scenario-based assessments.
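To give a sense of how such generation works, the sketch below is a hypothetical, minimal example (not how Varafy or TAO are implemented): a parameterised sample item is expanded into many variants, each paired with its answer key, which is the basic idea behind building large item banks automatically.

```python
import random


def generate_items(template: str, n: int, low: int = 2, high: int = 20) -> list[dict]:
    """Generate n variants of a parameterised numeric item, each with an answer key."""
    items = []
    for _ in range(n):
        a, b = random.randint(low, high), random.randint(low, high)
        items.append({
            "stem": template.format(a=a, b=b),  # the question text shown to students
            "answer": a * b,                    # key used for automated marking
            "format": "numeric-response",
        })
    return items


bank = generate_items("A room measures {a} m by {b} m. What is its area in square metres?", n=5)
for item in bank:
    print(item["stem"], "->", item["answer"])
```

Real systems go much further, varying distractors, difficulty and item format according to the rubric supplied by the instructor.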
5. Automated Marking of Student Assignments
Tools to support the task of marking, such as eMarking Assistant, Rubric-O-Matic and GradeAssist, have been available to instructors for some time. But machine intelligence and artificial intelligence systems are now also capable of evaluating student work.
Programs such as MarkUs and Oto were developed to assess learners' computer science programming work (as well as to detect cheating), but there are now many more systems that can grade and review assignments, both formative and summative, using instructor specifications and rubrics.
For example, IntelliMetric uses past evaluations of long-form essays to create a rubric and framework for evaluating new assignments. It then uses these frameworks to assess the student's work, providing feedback on both its content and its structure. e-rater is a similar machine intelligence engine, which also weighs key features of the student's writing and provides feedback.
Automated grading of multiple-choice and standard test items has been taking place for some time, but neural network-based technologies are now being applied to extend such assessment to a broader range of item types, including short essays and graphs.
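The general approach can be sketched in a few lines, though the sketch below is deliberately simplistic and hypothetical (it is not IntelliMetric's or e-rater's actual method): previously marked essays are used to fit a scoring model over simple surface features, and that model then predicts a grade for a new submission.

```python
import numpy as np


def features(essay: str) -> list[float]:
    """Crude surface features; real engines use far richer linguistic analysis."""
    words = essay.split()
    sentences = [s for s in essay.replace("!", ".").replace("?", ".").split(".") if s.strip()]
    return [
        float(len(words)),                                        # length
        len(set(w.lower() for w in words)) / max(len(words), 1),  # vocabulary diversity
        len(words) / max(len(sentences), 1),                      # average sentence length
    ]


# Invented examples standing in for the historical marked essays an engine learns from.
marked = [
    ("Short answer with little detail.", 2.0),
    ("A longer answer that develops the argument, offers evidence and draws a conclusion.", 4.0),
]
X = np.array([features(text) for text, _ in marked])
y = np.array([grade for _, grade in marked])
weights, *_ = np.linalg.lstsq(X, y, rcond=None)  # fit a simple linear scoring model

new_essay = "An answer of moderate length that states a claim and gives one example."
print(f"Predicted grade: {float(np.array(features(new_essay)) @ weights):.1f}")
```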
6. Video-Based Assessment of Competencies
Does a pencil-and-paper assessment, or an instructor or practitioner review of a demonstrable skill or competency, pass the test of being legally defensible? The courts, when asked to make this judgement, have often said no, especially in relation to health care professionals.
With video-based demonstrations of competence assessed by trained assessors, and with each assessment verified against time-stamped video linked to specific statements of competence, courts are able to confirm that a particular person demonstrated, on one or more occasions, the specific skill and competency set out in the competency statement.
One such product, Valid-8, provides this service and is used by the United Kingdom's National Health Service, Rolls-Royce and the British Army, as well as many other employers. It is soon to be deployed at scale for apprenticeship training in Canada.
If competency can be specified as a standard for a trade, profession or occupation – for example, using the Polaris® Competency Model – then demonstrating a competency, capturing the work on video, assessing that work against the agreed competency rubric and having that assessment verified by a third party provides a basis for public and legal assurance that the person has the skills and competencies they claim.
7. Adaptive Assessment
All major learning management systems (LMS) have adaptive learning and assessment engines that adjust learning content based on student performance on an assessment. As the student progresses through a course, they complete tests, quizzes and assignments, which are marked automatically by the LMS. Using instructor-designed sequences (or, increasingly, machine intelligent algorithms), the LMS then determines which content the learner needs to master before they can progress to the next “lesson” or competency.
The use of analytics and algorithms is key to this work – past student behaviour (that of previous students who took the same course, or of the particular student) determines what happens next for the student. Some emerging systems enable the LMS to determine not only what the student should work on next (content), but also which style of learning is most suited to that student – e.g. video, audio, graphic, text, game or simulation – based on the pattern of their use of learning resources and the resulting mastery of knowledge or skills.
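At its simplest, the adaptive logic is a rule that maps the latest assessment result to the next activity. The sketch below is a minimal, hypothetical sequencing rule (the thresholds and activity names are assumptions; production LMS engines use much richer learner models and analytics).

```python
def next_activity(lesson: str, quiz_score: float, preferred_format: str = "text") -> dict:
    """Choose the learner's next activity from their latest quiz score (0.0-1.0)."""
    if quiz_score < 0.5:
        # Well below mastery: revisit the same lesson in the learner's preferred format.
        return {"action": "remediate", "lesson": lesson, "format": preferred_format}
    if quiz_score < 0.8:
        # Close to mastery: targeted practice on the material answered incorrectly.
        return {"action": "practice", "lesson": lesson, "format": "quiz"}
    # Mastery demonstrated: unlock the next lesson or competency in the sequence.
    return {"action": "advance", "lesson": "next lesson", "format": preferred_format}


print(next_activity("Lesson 3: Photosynthesis", quiz_score=0.65))
```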
Many learning game designs use adaptive learning. Various groups of researchers have developed, refined and validated the ALGAE (Adaptive Learning Game Design) model for this work – a comprehensive model that combines game design theories and practices, instructional strategies, and adaptive learning models and algorithms into a single framework.
8. Peer-to-Peer Assessment
Because of the scale of MOOCs – some have over 100,000 learners enrolled – and the growth of online learning, faculty members are making increasing use of peer-to-peer assessment to support their students' learning.
Students provide written feedback, and sometimes grades, on assignments completed by their peers. How this is done varies a great deal, but there are now technologies that support this work, such as CrowdGrader, PeerScholar and PeerWise. Each of these technologies is subtly different, but each provides a basis for peer-based learning, support and assessment. In CrowdGrader, for example, the instructor creates an assignment to which students respond; students then grade one another's work, and the instructor assigns the final grades.
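A central design question for these tools is how several peer grades are combined into one provisional mark. The sketch below shows one simple, common approach (taking the median so that an unusually harsh or generous reviewer cannot dominate); it is an illustration of the general idea, not the actual algorithm used by CrowdGrader, PeerScholar or PeerWise.

```python
from statistics import median


def aggregate_peer_grades(peer_grades: dict[str, list[float]]) -> dict[str, float]:
    """Combine each submission's peer grades into a single provisional mark.

    The median damps outlier reviews; the instructor still assigns the final grade.
    """
    return {student: round(median(grades), 1) for student, grades in peer_grades.items()}


# Invented example data: three peer reviews per submission.
submissions = {
    "alice": [78, 82, 95],
    "bikram": [60, 64, 30],  # the unusually low review is damped by the median
}
print(aggregate_peer_grades(submissions))
```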
9. e-Portfolios
Higher education institutions in the United Kingdom, and many elsewhere in the world, have determined that students need much more than an official transcript – they need a portfolio of their work, including projects, exemplars, case studies, videos, testimonials and assignments, which they can share with potential employers, colleagues or others. Such portfolios are also used for personal development planning and program planning. The UK's JISC developed a guide to e-portfolios, which is now mandatory in the UK.
Most LMSs have e-portfolio tools built in, but there are also stand-alone e-portfolio systems, such as PebblePAD, EPET and RAPID, which several institutions have deployed.
One particular form of e-portfolio links strongly to assessment. RIIPEN, developed in Canada, is a platform that enables an instructor to partner with a colleague in industry or a profession to develop work-relevant assignments, which are marked, and feedback provided, by both the instructor and the private-sector partner. The resulting assignment, feedback and testimonial are placed in the student's e-portfolio within RIIPEN and shared with relevant organizations chosen by the student. Using its “sister” product, Prollster, RIIPEN is also able to assess the “soft skills” students use and display in work-based projects or related activities.
10. Transnational Qualification Frameworks
There are now a great many international educational and trade agreements that seek to enable the mobility of learners and of graduates of college and university programs. For example, the Comprehensive Economic and Trade Agreement (CETA) (unofficially, the Canada-Europe Trade Agreement) contains a streamlined process for the mutual recognition of professional qualifications, which focuses on the development of mutual recognition agreements (MRAs) between professional bodies. Skills and competencies, especially as assessed through professional competency assessments and examinations, form a key component of such MRAs.
Amongst the most comprehensive educational agreements is the Transnational Qualifications Framework (TQF) created by the small states of the Commonwealth. This agreement, signed by some thirty-one countries, provides small states with more up-to-date procedures and guidelines and a referencing tool for aligning qualifications in individual countries with an agreed international framework. The TQF functions as a translation device, making qualifications more readable, transferable and transparent, which, in turn, helps learners and workers move between countries or change jobs.
What matters here is that lapses in the integrity of assessment across jurisdictions do not weaken these agreements – the real quality control relates to competency and capability assessment, as well as to course design and the quality of instruction.
A Renaissance in Assessment
Assessment is experiencing a renaissance as we explore ways in which authentic assessment and feedback can be used to enable learning. With COVID-19, new approaches to assessment are being widely explored, as the limited practicality of remotely proctored examinations makes many traditional approaches difficult if not impossible. Instructors are discovering the effectiveness of some of the approaches outlined here.