The benefits of online assessments have been discussed for well over a decade. Back in 2007, a JISC report stated:
"[Online assessment] is much more than just an alternative way of doing what we already do.
A growing body of evidence... indicates that well-designed and well-deployed diagnostic and formative assessments can foster more effective learning for a wider diversity of learners."
— JISC, Effective Practice with e-Assessment
Since the report’s publication, digital provision has expanded greatly at higher education institutions. VLEs such as Moodle, Blackboard and Canvas are ubiquitous; teaching labs may come equipped with iPads at every station; and video streaming and lecture capture technologies are transforming remote learning.
However, digital infrastructure alone is not sufficient for truly effective online assessment. A foundation of solid pedagogical underpinnings is needed. Here are five strategies to enhance the effectiveness of your online assessment practices.
1. Align constructively
John Biggs, who developed the concept of constructive alignment, wrote:
“The key is that all components in the teaching system - the curriculum and its intended outcomes, the teaching methods used, the assessment tasks - are aligned to each other… The learner finds it difficult to escape without learning appropriately.”
— John Biggs, Aligning teaching for constructing learning
As a teacher, you want students to learn the intended learning outcomes (ILOs). As a student, you will spend your time learning what you believe you will be assessed on, regardless of the ILOs.
Constructive alignment acknowledges this dissonance and aims to reduce it by ensuring all three components (ILOs, teaching methods, assessment) are aligned to the same goal - your students’ learning.
Of course, this is not unique to online assessment, but it’s important to consider the variety of tools available. Some tools are more flexible and customisable than others, and these are much more likely to support constructive alignment.
When considering the adoption of a new form of online assessment, check its flexibility for constructive alignment. Can it be adapted to suit your current ILOs and teaching methods? Or, will you have to adjust these two to suit the assessment?
2. Instil student confidence
One of the final points in JISC’s e-assessment report reflected:
“We need to explore how best to develop the confidence of all concerned in learning and teaching in the efficacy and appropriateness of computers in assessment.”
— JISC, Effective Practice with e-Assessment
Indeed, in our survey of Variety in Chemistry Education / Physics Higher Education Conference (ViCEPHEC) delegates, “confidence” was identified as one of the key areas for development in first-year chemistry and physics lab sessions. This likely extends to assessment too.
First-year students have a barrage of new experiences to contend with. In the first few weeks they must learn to navigate a new campus or city, meet hundreds of new people and adapt to potentially dozens of new ways of interacting with their university work.
Students are often adaptable and willing to try new things, but we shouldn’t expect them to intuitively understand everything, especially in the first term.
For each assessment, consider whether it needs to be summative or whether a formative approach would work, and how high you want the stakes to be for that task.
Online assessments often have the benefit of repeatability, so consider whether you want to offer shorter “trial” assessments to get students used to a new format, especially if the main assessment is summative.
Whatever the format, make sure requirements are clearly signposted and common “tripping points” are acknowledged and clarified as soon as possible.
3. Personalise your feedback
In discussions of student satisfaction with their course, feedback nearly always comes up.
The “holy trinity” of feedback is personalised, immediate and consistent. Before online assessment, achieving all three at once was rarely possible: there was always a balancing act between quick, general feedback and detailed feedback that was individually tailored.
While one-to-one teacher-student feedback will always be the gold standard, as cohort sizes continue to expand it isn’t always practical or possible on a regular basis.
Intelligently designed online assessment can help. Many tools, even basic Moodle quiz questions, offer a range of feedback options. You can provide feedback for students who answer correctly, specific tips for students who answer incorrectly in a variety of pre-determined ways, and general feedback for all students.
More bespoke packages may give you even more flexibility - be sure to ask about customised feedback when researching options.
To master this, you will initially need to be quite creative in anticipating how students might trip up. Sample data from previous years can also be very useful, if you have access to it.
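As a rough illustration of this pattern, the Python sketch below maps a handful of anticipated incorrect answers to targeted tips, with a general fallback for everything else. The question, answers and messages are all hypothetical, and this is not Moodle’s actual feedback mechanism - just the underlying idea.

```python
# A minimal sketch of the feedback pattern described above: one message for the
# correct answer, targeted tips for anticipated wrong answers, and a general
# fallback for everything else. The question and messages are hypothetical.

# Hypothetical question: "What is the pH of a 0.01 M HCl solution?"
CORRECT_ANSWER = "2"

# Tips for incorrect answers you anticipate in advance.
ANTICIPATED_ERRORS = {
    "-2": "Check the sign: pH = -log10[H+], so a 0.01 M acid gives a positive pH.",
    "0.01": "That is the concentration itself; remember to take -log10 of it.",
    "12": "That is the pOH here; HCl is an acid, so the pH should be low.",
}

GENERAL_FEEDBACK = "Revisit the definition of pH and try the worked example from week 3."


def feedback_for(answer: str) -> str:
    """Return the most specific feedback available for a submitted answer."""
    answer = answer.strip()
    if answer == CORRECT_ANSWER:
        return "Correct! pH = -log10(0.01) = 2."
    # Specific tip if the mistake was anticipated, otherwise the general message.
    return ANTICIPATED_ERRORS.get(answer, GENERAL_FEEDBACK)


if __name__ == "__main__":
    for submitted in ["2", "-2", "7"]:
        print(f"{submitted!r}: {feedback_for(submitted)}")
```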
4. Harness the power of learning analytics
Learning analytics is the gathering, processing and reporting of data generated by students interacting with their digital learning environments. The goal is to understand and improve the experience of the learners.
In the report From Bricks to Clicks, the Higher Education Commission stated that learning analytics had:
“...enormous potential to improve the student experience at university, by allowing the institution to provide targeted and personalised support and assistance to each student.
— HEC, From Bricks to Clicks
Online assessments are critical for this goal. If students are depositing their work in a physical collection box, it’s difficult to know whether they completed it in good time or at the last minute. Which questions did students struggle with most, and what were the most common incorrect answers? You may get a general sense, but the resolution of the data is much higher online.
Looking towards the future, a “Big Data” approach to learning analytics seems imminent. With this in mind, it is worth familiarising yourself with the possibilities now. There are things you can do at the local level, even if time is short.
For example, Moodle’s gradebook gives information like bar-graphs of grade distribution, time spent on each task, and the average grade received by students for each question, among other things. Functionality can also be extended with plug-ins, LTIs or external content.
This can help optimise your online assessments for a wider cohort, or identify topics which might need further exploration to ensure sufficient depth of understanding.
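If you would rather work with the raw numbers yourself, even a small script can answer questions like those above. The sketch below assumes a hypothetical CSV export of quiz responses - the file name and column names are illustrative, not a real Moodle export format - and reports the average score and most common incorrect answer for each question.

```python
# A rough sketch of local-level learning analytics. It assumes a hypothetical
# CSV export of quiz responses with columns: question, response, score, max_score
# (the file name and column names are illustrative, not a real Moodle format).

import csv
from collections import Counter, defaultdict

scores = defaultdict(list)            # question -> list of scores awarded
wrong_answers = defaultdict(Counter)  # question -> counts of incorrect responses

with open("quiz_responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        question = row["question"]
        score = float(row["score"])
        scores[question].append(score)
        if score < float(row["max_score"]):
            wrong_answers[question][row["response"]] += 1

for question in sorted(scores):
    average = sum(scores[question]) / len(scores[question])
    common = wrong_answers[question].most_common(1)
    if common:
        answer, count = common[0]
        note = f'most common incorrect answer: "{answer}" ({count} students)'
    else:
        note = "no incorrect answers"
    print(f"{question}: average score {average:.2f}; {note}")
```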
5. Check access and accessibility
Online assessments offer students a chance to work around distance, disability or illness more effectively, and the option to complete coursework at a time and place that suits their needs. However, this benefit can only be realised if the requirements and deadlines are fair and clearly laid out to students.
In addition, even the greatest online assessment in the world is not going to help students learn if they cannot access it or understand how to use it.
Given appropriate training, students may have little issue adopting a new piece of technology. But if the resources are not intuitive, there may be teething problems.
Preparing new assessments in good time, so you can familiarise yourself with them before release, means you will be better placed to help your students when they have questions.
Within your student cohort, you will likely have a diversity of digital capabilities. To help ensure online material is broadly accessible, the Web Content Accessibility Guidelines (WCAG 2.0) provide a set of recommendations.
These include keyboard navigation, text alternatives for images, considerations for time-based media and accessible use of language. You can check your online assessment against them at any time.
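For a quick, partial check of one of these points - text alternatives for images - a short script like the sketch below can flag img tags with no alt attribute in a saved HTML page of your assessment. The file name is an assumption, and this is an aid to review rather than a substitute for a full WCAG audit.

```python
# A quick, partial check against one WCAG recommendation (text alternatives for
# images) using only the standard library. Point it at a saved HTML page of your
# assessment; the file name below is an assumption.

from html.parser import HTMLParser


class MissingAltChecker(HTMLParser):
    """Collects <img> tags that have no alt attribute at all."""

    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # An empty alt="" is valid for purely decorative images, so only
            # flag images where the attribute is missing entirely.
            if "alt" not in attrs:
                self.flagged.append(attrs.get("src", "<unknown source>"))


with open("assessment_page.html", encoding="utf-8") as f:
    checker = MissingAltChecker()
    checker.feed(f.read())

for src in checker.flagged:
    print(f"Image with no alt attribute: {src}")
print(f"{len(checker.flagged)} image(s) need reviewing for text alternatives.")
```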
Takeaway and reflection
Introduced appropriately, online assessments can support constructive alignment and instil confidence in students at all stages of their course.
Furthermore, utilising technology for assessment can offer something extra by way of feedback quality, learning analytics and increased accessibility.
Have you taken steps recently to enhance your online assessment strategies? If so, has it been successful, and what metrics are you using to find out? If not, what do you think your main “sticking point” is, and which aspect from the list above might have the biggest impact?
We’d love to hear your thoughts on LinkedIn or Twitter, or by getting in touch directly.
———
Note: An earlier version of this article was published in October 2017.