Report

Rethinking assessment

From fixes to foresight: Jisc and Emerge Education insights for universities and startups. Has the pandemic helped to move us to an assessment system that is more relevant, adaptable and trustworthy?

Foreword

Chris Cobb, chief executive, the Associated Board of the Royal Schools of Music (ABRSM):

"Who could have predicted in 2019 that the word “proctoring” would become part of the common parlance of universities in 2020 or that decades of assessment practice would be jettisoned overnight in favour of open book exams. We’ve all journeyed a long way in the past twelve months and, as we start to contemplate a relaxation on COVID-19 restrictions, it will be interesting to see what modifications to assessment persist.

"This rapid drive to digitise assessment has raised opportunities and challenges in equal number. Coherent decipherable text from keyboard entry must be the biggest gain for examiners but what of the learner? Less stressful? Greater authenticity and relevance to contemporary real-life circumstance? A more level playing field for those with disability? 

"Critics might say that the approach has compromised integrity and left the door open to cheating and fraud but comparisons suggest there has been no grade inflation. Others might question how we can have confidence against identity fraud and collusion when WhatsApp groups proliferate and ingenious students find ever more sophisticated ways to circumvent locked browsers.

"Is visual proctoring the only way forward? What of the intense bandwidth required by proctoring in regions of the world where high bandwidth cannot be relied upon (like south-west London!) or the high cost and time commitment to monitor the video streams?

"How can an institution protect against cheating? More to the point, is it too soon to determine whether employers and others have confidence in the new approach? Some professional bodies have been particularly unmoving on issues of proctoring so it seems that there remains some scepticism of how robust complete open-book approaches are.

"New concerns have also emerged such as digital poverty – or simply poverty in the case of not having a suitable quiet space and desk in which to take an exam at home – and new responses such as the advantages of on-demand examinations enabling equipment to be shared and assessing a candidate when they’re ready to be examined and not when the institution is ready to examine them.

"The shift to open book assessment has also surfaced the importance of academic writing skills to gain confidence and avoid susceptibility to essay mills. It is no longer enough to be confident in the subject matter, students must also be confident in being able to articulate and develop arguments succinctly and to cite correctly without resorting to peer support. Instead of essays, alternative formats such as videos and presentations might also be considered. As well as better supporting disability, these formats might also provide greater authenticity to workplace environments and be less prone to cheating.

"All these challenges and more will inevitably be met with further advances in technology and pedagogic guardrails to support skills, detect circumvention and provide ever more authenticity to an ever changing external context.

"I trust that you find this summary of recent experience, framed around the themes of relevance, adaptability and trustworthiness, to be a timely reminder of the lessons to be learned for the future."

Introduction

In February 2020, Jisc published the future of assessment. It set five targets for the following five years to progress assessment towards being more authentic, accessible, appropriately automated, continuous and secure. The ‘end of exams’ was floated as an idea but the overall slow pace of change in the sector when it came to assessment meant that few saw the pen and paper exam’s reign being toppled imminently. But less than a month later, we did indeed see the ‘end of exams’ – at least the traditional variety – come to pass in many universities.

A follow-up report, assessment rebooted, published in May 2020, looked at the ‘quick fixes’ universities introduced to enable digital assessment, and the challenges and trade-offs they faced between scale and security, trust and equity. The report also set out three goals looking forward to 2030. Assessment should become more:

  • Relevant
    Enabling universities to go beyond traditional forms of assessment, dictated by the practical limitations of analogue exams, to build systems that are relevant to contemporary needs and reflective of the learning process, and to make use of innovative assessment methods that are impractical to deliver without digital tools
  • Adaptable
    Effective in addressing the needs of a growing and diverse student population, a range of providers and any number of geographies
  • Trustworthy
    Based on solid foundations of academic integrity, security, privacy and fairness

Now, a year later, how far have we come in meeting those goals? Is assessment any more relevant, adaptable and trustworthy than it was in May 2020? Are any of these changes going to last – and what needs to happen to continue the assessment revolution?

Has assessment become more relevant?

'Sitting down for three hours and handwriting on a piece of paper is medieval. It does not sit well with the capabilities that we have already or could have. Exams have to go digital.'

Klaus-Dieter Rossade, director of assessment programme, the Open University

And go digital it did in March 2020, opening up new possibilities for new kinds of assessment. Among all the varieties of exam offered – from open book essays and multiple choice online quizzes to digitally proctored traditional exams – some universities, some departments and even some individuals grasped the opportunity to make changes not only to the mode of assessment but to its very nature.

Take the environmental science department at Brunel University London, which ripped up the rulebook on what constitutes an assessment, making it more relevant into the bargain. The usual three-hour exam was replaced with a task that asked the students to address a complex question, drawing on a wide range of information from across the course, ‘showing their working’ and uploading references with highlighted annotations – and they had seven hours to tackle the problem. 

“They were transported into what it might be like to come to work one morning and your boss says, ‘Ditch everything, by the end of the day, I want you to have produced a report on this issue’.
"You might think that the end product would be some incredibly long reports, which would add to the marking load, but part of the task was to make sure that the work was also concise and effectively communicated. It tested higher-order thinking skills and quality rather than quantity.”
Mariann Rand-Weaver, vice-provost (education), Brunel University London

By moving away from the traditional essay or exam, this kind of more authentic assessment not only parallels the tasks and deadlines a student might experience when they enter the post-university workplace but also encourages the learner to integrate and apply knowledge and skills. It develops deeper, more integrative personal learning and understanding.

Bringing more authenticity to economics exams

At the University of Stirling, an economics lecturer had long been frustrated by the limitations of the traditional exam. The impossibility of following the usual rules during a pandemic offered him the opportunity to, finally, make some changes.

“I was never entirely happy saying, ‘Well, let's go away and do a written exam for two hours and that's the pinnacle of everything that you've done in terms of your learning on this module or this programme.’ But that's what we did.
“In economics, if you’re not in an exam hall setting, doing the calculations by hand, from memory, in a cold sterile environment, then it's not proper economics. And then students go into the workplace environment and that doesn't ever happen. They're using economics with others and creatively with other forms of skill-based learning. So what are we actually testing?”
Paul Cowell, lecturer in economics and deputy associate dean of learning and teaching, Stirling Management School, University of Stirling

In the case of Cowell’s students, they were now being tested on real-world problems and their flexibility and creativity in addressing them. For his first-year students, it was applying strategic interactions to the pandemic, using their choice of medium, from blog posts and digital posters to videos, covering anything from panic buying, hoarding and price gouging to vaccine uptake and sharing. As Cowell describes it, assessment should give students the tools to understand what's going on. Crucially, it’s “assessment for learning, not assessment of learning”.

Reflecting real-world medical skills in open book exams

On a wider scale, Imperial College London’s school of medicine set the first digital ‘open book’ exam delivered remotely for 280 final-year students. Shohaib Ali was one of them and found the experience enlightening. 

“The exam has made me realise that medical school finals test a higher level of learning than just remembering information. If being a doctor just meant googling key information, they would have been out of a job a long time ago. We have actually learnt so much during this degree, which is easy to forget. The ability to synthesise information to come up with a management plan, or to interpret complex data, are skills I’ve somehow learnt. It took an open book exam for me to realise this.
“In this ever digitalised world, the skills of what makes a doctor have shifted, and our exams should reflect this. No parts of society will be the same after COVID-19, and medical education is no different.”
Shohaib Ali, student, Imperial College London’s school of medicine

A turning tide

But is this more than a flourishing of different experiments? Does it add up to wholescale change? Back in February 2020, the future of assessment report noted that “there are pockets of good practice and innovation within institutions across the UK... There are many individuals within universities and colleges who recognise the issues and are experimenting with innovative tools and apps to effect change from the bottom up.”

Certainly, there are now more of these pockets but they do not – yet – represent a wholesale reinvention and redesign of assessment. However, the tide may be turning. 

According to Gravity Assist, the report of OfS chair Sir Michael Barber’s digital teaching and learning review:

“Some universities told us they had already decided that the days of exams in the lecture hall were over and were putting plans in place to ensure they did not revert to the closed-book, handwritten, essay-style exams as the main form of assessment. Practically, we heard that these assessment exercises were often huge paper-based exercises which had not changed for a long time, whereas digital assessment was much slicker and easier to mark. From an educational perspective, many reflected that they had overused summative assessment and needed to redesign their approach to take advantage of the possibilities presented by digital technology for formative and continuous assessment.”

In autumn 2020, Jisc, Universities UK (UUK) and Emerge Education published the results of a comprehensive survey of HE leaders, lecturers and students. It illustrates that while leaders feel overwhelmed by the challenges, they believe change is here to stay and see great potential in the use of technology to create flexibility, break down geographic barriers and extend the institution’s reach.

What do students think?

Students, meanwhile, are open to the idea of flexible learning’s continuance, post-pandemic. The latest Pearson and Wonkhe research into students’ experiences of online and blended learning (February 2021) found that the great majority want some aspects to continue, including recorded lectures, online access to support services such as wellbeing and careers, and online tutorials or check-ins with tutors.

Polling conducted for the Barber review showed that 61% of undergraduate students would like their assessments to be delivered online or through a combination of online and in-person delivery, once the pandemic is over. The student academic experience survey from Advance HE and the Higher Education Policy Institute (HEPI) notes that results across a number of areas point towards assessment being one of the areas that saw a particularly clear improvement in 2020.

It’s also borne out by a recent survey of Open University students undertaken by Maria Aristeidou at the OU’s Institute of Educational Technology (publication forthcoming).

It found that 74% of the students surveyed liked the idea of online exams and 17% did not (9% not applicable). Of those surveyed, female students were more likely than male students to support online exams and there was no difference between age groups or disciplines. The main reasons students gave for preferring online exams were travel concerns (money, time, stress, parking), uncomfortable/unfamiliar exam centres and mental health (anxiety).

The main concerns about online exams were to do with having an inappropriate environment at home (children, distractions, neighbours, technical concerns), worries about the value of their degree and cheating.

For Paul Cowell at Stirling, the student response vindicates his unorthodox approach to assessing economics:

“Of course, you're not going to please everyone all the time and there are still things to iron out, especially around the integrity of assessments online and collusion. But the more that you engage students in doing what they want to do, it becomes easier to get them on board with it. I'd say overall, the feedback's been really, really positive. They can see what we're trying to do with them.”

Enthusiasm from the student body for more relevant assessment also offers possibilities for universities to focus on relevant assessment as a differentiator. As Andy McGregor, Jisc’s director of edtech, suggests:

“It would be really interesting to see if any universities started to decide to try and differentiate themselves along these lines: ‘we are the institution of authentic assessments. We've thought a lot about our assessments. They are closely aligned with the real world. You won't come here and end up with RSI from writing a three-hour exam paper. You will come here and end up with skills that you will use throughout your time in the workplace’.”

Has assessment become more adaptable?

A well-designed digital assessment, intended to be usable by everyone to the greatest possible extent, benefits all students by allowing them to produce their best work while minimising costly and inefficient workarounds and adaptations for particular needs. Accessibility is also a legal requirement for UK universities under the 2018 public sector web accessibility regulations.

Adaptable assessment is more inclusive and accessible to all, meeting students where they are and adapting to their individual circumstances. Has the move to digital assessment helped or hindered? 

Disabled students

'Moving online removed some of the barriers that could have been removed previously, but without the theoretical understanding of disablement or disability that is an essential part of the process. It was taken out of context and so we didn't actually fix the problem. Instead, we’ve shifted who's being affected rather than working together comprehensively to solve the issue.'

Piers Wilkinson, accessibility and inclusion consultant, and Disabled Students’ Commissioner

For disabled students, the picture is nuanced. Generally, there has been a positive reaction in terms of the effect on some – but not all – disabled students. According to the Disabled Students' Commission, the most notable feedback provided by disabled students on blended learning more generally was that the flexibility and support they had been requesting for years (and had previously been told was not possible) had now been implemented by their provider in a short space of time as a result of the pandemic. For many disabled students, the increased flexibility was widely welcomed and helped to overcome existing barriers such as attendance.

However, the rapid shift from predominantly face-to-face interactions to virtual ones did not benefit all disabled students, and some reported issues with the blended learning approach and difficulties learning online. The concerns were nuanced and often differed by impairment type, highlighting the need not to treat disabled students as a homogeneous group and to recognise that support requirements differ in complexity.

The Barber review highlights a survey of 348 disabled students, conducted by the University of Sheffield, that found that almost all respondents who had been set ‘from home’ exams considered these a better way of completing assessments than sitting a more standard exam on campus. 

According to a Sheffield student:

“They have allowed 24-hour time slots for exams, which has been very helpful. As a dyslexic student it takes me a lot longer to read and process written information and so this time period has allowed me to show my knowledge to the best of my ability without having to panic about a two-hour time limit.”

The real meaning of 'reasonable adjustments'

However, Piers Wilkinson argues that this benefit is not the case for all disabled students, and that the shift to open book, longer exams has brought to the fore a deep misunderstanding of ‘reasonable adjustments’:

“It fails to understand that the 25% extra time was a band-aid for a barrier that wasn't removed in the original assignment or exam situation. The adjustment is a blunt mitigation, and it wasn't rethought when online assessments changed in the pandemic and all students were given extra time. For the disabled students who still face a barrier, it's not a time-related barrier for them. It's a barrier that caused them to take more time. That barrier is still there, so giving them 24 hours and making it open book – when the barrier was chronic pain, a morphine opiate fog, the content or design of the format, or being better at vocalising your contributions rather than writing them – makes little difference, and makes it harder to argue for alternative formats or additional adjustments.”

At the University of Cambridge, the students’ union’s disabled students’ group is actively campaigning for the continuation of digital assessment post-pandemic, arguing that “Diversified assessment such as coursework and 24+ hour exams have been revolutionary for disabled students. We risk slipping back into the old default of three-hour, handwritten, closed-book exams if we can't show that student support is there”. 

Easing anxiety

For some disabled students, the simple fact of being able to do an assessment from their own home has made a significant difference to their comfort, reducing exam hall anxiety. This was particularly the case for some neurodiverse students and those with mental health disabilities. 

“When a disabled student experiences physical discomfort, difficulty focusing, or severe anxiety in an exam hall setting, separate accommodation may be suggested as a reasonable adjustment. This can be difficult to implement smoothly when rooms and invigilators are limited, particularly when extra time is factored in. This also involves removing the student, not fixing the situation.
“The option to sit assessments at home in a familiar and controlled environment, or to be assessed in alternative ways, will free many disabled students to fully demonstrate their knowledge and skills. A more flexible approach to assessment does not mean less rigour. Creative approaches to assessment that draw upon principles of universal design for learning will embed accessibility by default, while removing some of the administrative burden, and a lot of the stress, that disabled students experience around exams.”
Kellie Mote, Jisc subject specialist: strategy (assistive technology)

Uncovering the issue of accessibility and third-party platforms

However, the accessibility of platforms used for assessment is becoming an increasing concern. It has emerged that some third-party platforms used by universities are not set up to enable the assistive technology required by some students. This not only undermines accessibility for students, it also opens universities up to the risk of legal action under the Public Sector Bodies (Websites and Mobile Applications) (No. 2) Accessibility Regulations 2018.

Piers Wilkinson is critical of the culture of “prioritising the possibility of someone devaluing the assignment by cheating over a disabled student's access, even when there's no evidence that a student is cheating or would cheat.”  

He gives the example of locked-down virtual desktops being used for exams that made it impossible to use the screen readers many disabled students rely on. He also highlights the case of a university that contacted him requesting advice on making an image of some text accessible for blind and visually impaired students, or students who use a screen reader, in the context of a language test where the university wanted to avoid the use of Google Translate. He says:

“My advice back then, and my advice still today is: just design the exam to be testing whatever linguistic knowledge it is you actually want to test, not knowledge that can be replaced by an inaccurate translation tool.
“We had – and we still have – an excellent opportunity to get this right, but it involves understanding the nature of disablement and the experience for disabled students, and working that into how we design exams. Fundamentally, it should come down to choice. Choice of the way in which you want to express yourself to prove that you've understood the subject. And that benefits all students.”

Productive partnerships

Productive partnerships with the third-party providers of platforms can also make a difference. The Open University (OU) has been working with UNIwise for three years and is finding the collaboration highly beneficial, as Klaus-Dieter Rossade, director of assessment programme at the OU, explains:

“UNIwise isn't just a company that provides a solution, we are jointly developing their product. That's partly to meet our needs, which may be specific to distance education, but it also improves their product for everyone. For example, they worked on quite a lot of new accessibility features in conjunction with us because we have very, very high standards in that area. So, this is not buying off the shelf. Nor is it about producing it in-house. This is very much about working in partnership with an external provider and then designing a new solution. And that's a very promising relationship.”

Digital poverty

Adaptable assessment must also accommodate students’ differing access to the devices, connectivity and space required to complete a digital assessment effectively away from campus: digital poverty is a key issue that has come to the fore during the pandemic.

In the UK, according to OfS polling, 52% of students said their learning was impacted by slow or unreliable internet connection, with 8% ‘severely’ affected; 71% reported lack of access to a quiet study space and 18% were impacted by lack of access to a computer, laptop or tablet. Digital divides will also affect some international students studying remotely in their home countries. Assessment is not adaptable if it requires students to have two devices in order to implement digital proctoring, for example, or to run software that demands high bandwidth.

The Associated Board of the Royal Schools of Music (ABRSM), which runs a large number of international exams, has been experimenting with an ‘on-demand’ model, which makes it easier for candidates to borrow devices from friends or family in order to take their exam. As Helen Coleman, chief operating officer at ABRSM, explains:

“At ABRSM we decided to move from an event-based test model to an on-demand model because there were clear benefits for both our candidates and our operations. Significantly, an on-demand model means that the candidate can ‘test when ready’ rather than only when ABRSM is able to provide a test opportunity.
"The frequency of test availability in our on-demand model also helps address some digital poverty issues as candidates can make use of shared or borrowed technology facilities in schools, libraries, or with friends and family and no longer have to travel significant distances to take their exam at a specified time. From an operational perspective, the on-demand model helps with demand management so we can ensure we have enough people to support our candidates throughout their test journey.”

Demographic differences

Data is beginning to trickle through about the effects of the move to digital open book assessment on different groups of students. Brunel University London, for example, uses the WISEflow digital assessment platform integrated with its student records system, which enables sophisticated analysis of outcomes for students.

While there has been no grade inflation as a result of the changes, Brunel has discovered that students who come in with BTEC qualifications – and, in particular, black students with BTEC qualifications – benefited from the changes in assessment and did significantly better in terms of degree outcome than in previous years. Degree outcomes were not improved for white, black or Asian students with A-level qualifications.

“That's down to changing the assessments,"

asserts Mariann Rand-Weaver.

"Instead of sitting in a sports hall for three hours and having to rely on memory, students were able to use any resources available to complete the task – something that is probably more akin to what those with BTECs would have done previously. We still have a gap in awards between those entering with BTECs and those entering with A-levels but that gap is narrowing as a result of the change in assessment.”

Adaptable also means that assessment is appropriate for students in different geographies and different time zones.

Has assessment become more trustworthy? 

What does more relevant and adaptable assessment mean for academic integrity – and how can the security of exams be assured? 

Concerns about cheating and collusion have heightened with the move to digital assessment. There has been a rise in the use of digital proctoring, especially for exams with high requirements from professional, statutory and regulatory bodies (PSRBs). 

For Rachel Schoenig, chief science officer at online proctoring organisation Examity, it comes down to integrity:

“Part of what online proctoring does is help to protect the value of the degree that someone is earning. And I think that should be something that we all embrace. Whether we are employers looking to hire someone or learners who are going through a particular programme, we are all relying on these degrees or credentials or certifications that we might earn to represent something about our learning and our capabilities.”

Case study: Live online proctoring at the Royal College of Physicians and Surgeons of Glasgow

Before the pandemic, all exams set by the Royal College of Physicians and Surgeons of Glasgow (RCPSG) were face to face, whether written exams in halls in the UK, written exams in British Council centres for international candidates or clinical exams in hotels or clinics. Kirsty Fleming, deputy head of assessment at RCPSG, remarks:

“We often flew people around the country, and indeed around the world, to deliver those exams.”

With the urgent need to shift to online exams quickly, to keep new doctors and surgeons completing their training in the middle of a pandemic, RCPSG chose to work with Fry’s Practique platform and combine it with Examity, which links smoothly with Practique for live proctoring.

RCPSG’s international exams can involve large numbers of candidates over multiple time zones with staggered starts, so keeping any move to a completely new system as simple as possible was crucial. Examity’s live proctoring service offered a couple of key benefits. Fiona Winter, director of education and assessment at RCPSG, said:

“We chose live proctoring to give the security that if something went wrong during the exam the candidate wasn't left looking at a screen with nobody at the other end. They could have a conversation or use a chat box and the issue could be resolved there and then rather than it being left hanging and their exam attempt being annulled. Plus, the system is a single login, which is really important when you're trying to get instructions out to people all around the world who are often doing an exam in their second language. It needed to be straightforward."

To help the new system run smoothly, the assessment team put a significant amount of time and effort into communicating with candidates, using webinars, voiceovers and video clips to reassure them and allay any fears. It paid off in terms of candidate satisfaction. 

“Anecdotally, we found that once people got over the initial hurdle of, ‘How does this all work? Is it going to work? Is there going to be a technology problem? Is somebody watching me?’, our candidates were more relaxed in their own environment. There wasn't that pressure of being in an exam hall where the person next to you seems to be writing much quicker than you are, or they've got a squeaky desk, or the invigilator keeps wandering past with their keys jangling in their pockets. It was much more, ‘I'm in my home environment. I know where I am. I'm just concentrating on me’,” explains Fiona.

There were also financial and logistical benefits for some candidates. Kirsty said:

“One of the things we've talked about a lot as a college is the cost of training to young doctors. Previously, some candidates would need to fly to an exam centre in a different country. As they're not having to travel, they've saved financially. They're also taking less time out of work. They were really appreciative of that.”

Some candidates were concerned about a perceived lack of control, particularly around the technological aspects, such as the internet failing locally or software breaking. But, as Fiona points out, there were always unknowns, such as trains suffering points failures on the way to exam halls or flight delays, and “these are just different unknowns”. Preparation and mitigation are key: “we all used to take four pens to an exam in case one broke. So it's not that outrageous to think that you might have a second laptop in your house and you might have it nearby, just in case.”

For the assessment team, an unexpected change has been the amount of time and effort required after the exam to review the videos of candidates who did have technology issues, deal with any infractions and make judgement calls. It’s a far cry from the old days of collecting up papers and a brief invigilator’s report. 

However, for Fiona, these are minor hurdles on the road to an assessment system that is saving candidates time and money, giving RCPSG the flexibility to consider additional exam locations and, ultimately, proving to be a more responsible, sustainable system.

“It's absolute madness when you think about it in the cold light of day that somebody would previously have got on a plane, flown to another country to sit in an exam hall with a pencil and complete a multiple choice answer sheet. Considering sustainability, climate change, and candidate experience, we have to ask if this is the best way to operate in the future. Assuming online exam security is robust, I would like to see written exams continue online, and I think we'll continue to use proctoring because I think that gives that certainty and guides the candidate,” she concludes.

To find out more about Examity’s approach to online proctoring, read the Q&A with Examity CEO Jim Holm.

Clear communication with students: a success factor

The importance of communication with students was also highlighted by CoSector University of London. It recently delivered, in partnership with Janison, several sets of high-stakes online exams using automated proctoring (which differs from the live/human online proctoring offered by Examity) for two UK higher education institutions: economics exams in December 2020 for Royal Holloway (140 students) and law exams and Bar exams for City, University of London in March and April 2021 (more than 2,000 exams taken in total).

Both projects were delivered at relatively short notice and, although new to the institutions and their staff and students, were completed successfully. All the assessments were carried out on the Janison platform and were fully proctored using Janison’s Proctortrack AI proctoring solution. The ability for the assessment platform and automated proctoring to cope with poor connectivity ensured successful completion of the assessments despite some students having limited and intermittent connectivity. 

Critically for the Bar exams, the continuous video proctoring was essential to meet the online security standard required by the Bar Standards Board. 

“In both cases the critical overall success factor was clear communication with students and good joined-up support for students ahead of and during assessment with the institution, CoSector and Janison working closely together.”
Mark Newton, managing director, University of London CoSector 

ABRSM has also been trialling digital proctoring, but using a ‘record and review’ model, which it feels is less “intrusive” for candidates. According to Helen Coleman, chief operating officer:

“Since moving our music theory exams online we have proctored almost 25,000 exams. Proctoring provides ABRSM, our regulator, and our candidates with assurance of the integrity of the test taken and their outcomes. However, proctoring has brought with it challenges as it is a new process for our operations and has additional time and delivery costs.”

In contrast, Brunel University London has been experimenting with facial recognition for identity purposes, through its WISEflow platform, which takes pictures of the examinee at the start of the exam, with their ID card, and then at random points throughout the exam, finally offering a score at the end to indicate that the person taking the exam is indeed who they say they are. 

A question of design

However, for Mariann Rand-Weaver at Brunel University London, which saw no grade inflation after moving to open book exams, with no rise in the number of first class degrees awarded, the key lies in good question design rather than preventing access to resources or using digital proctoring. 

Jisc’s Andy McGregor agrees.

“There's got to be more than the desire to try and constantly make exams completely foolproof or cheat proof. That seems to be a losing battle, because cheating technology is always going to win. If it is an arms race, it's either going to keep pace or always be slightly ahead.
“If universities can find time to spend on assessment redesign, then new approaches to assessment could solve some of the security problems, as well as some of the other problems that exist around assessment.”

The integrity balancing act

The Academic Integrity Collective, a group of students' union officers concerned with the effect that contract cheating, and essay mills in particular, is having on students and the higher education sector, concurs. Rather than focusing on increasingly high-tech measures to prevent cheating, it points to a number of reasons why students cheat, including university assumptions that students have study skills, a lack of investment in academic skills leading to lack of confidence, a lack of student knowledge regarding consequences and a lack of staff support on academic integrity. It also notes the impact of over-assessment: 

“Students constantly speak to us about how many assessments they have, all due in around the same period, sometimes the same day... We often meet with students who cannot carry out their assessments due to the lack of accessibility. This secondary issue has never been more noticed than during the past year of the pandemic, where students struggled in digital poverty. We need to ensure we are not creating an environment where students find themselves with no choice but to utilise unfair means.”

It’s calling for universities to ensure assessments are accessible, appropriate and not excessive, to educate all students on academic integrity every year, and to invest in sustainable academic skills support for all.

It’s a balancing act, as Examity’s Rachel Schoenig explains:

“If you want a good reliable score, then you need to have good test content and you need to have good security. But if either one of those is gone, then whatever that degree or credential is purporting to represent, it won't be as trustworthy. A lot of the shifts that we've seen as we have started to incorporate different delivery methodologies, different proctoring methodologies and different testing methodologies have meant we have had to re-evaluate what we need to have in place from a security perspective.”

Conclusion

'One thing that we've learned from the pandemic is that there's a lot of creativity within us. We can do things differently, as a sector, as individuals. We need to make sure that we take the best things from that rather than reverting back - just because we can get everyone back in the exam halls again doesn’t mean we should.'

Paul Cowell, University of Stirling

The pandemic has offered a real opportunity for universities to reimagine assessment, to make it more relevant, adaptable and trustworthy. There has been, if not a widespread explosion of experimentation, at least an openness to explore the possibility of making assessment more authentic and relevant, testing knowledge and skills in a more realistic, contextualised and motivating way.

It has necessarily become more adaptable for most, if not all, students, while acknowledging the detrimental impact of digital poverty. And the trustworthiness of exams has been tested like never before, with the rise of digital proctoring at one end of the spectrum and the realisation that assessment design can itself help to make assessment more secure at the other. 

Where will assessment go from here? According to Jisc’s Andy McGregor:

“The assessment conversation has moved on three years in the space of one year because the situation has forced people to reconsider exams. A lot of people have realised that they can actually do without them. While I think exams will come roaring back with a vengeance, because they are an expected part of education, I also think a lot of people will conclude, actually we can do better, we can take different approaches and they can benefit everyone.”

In this ‘bridging year’ between last year’s frantic emergency response and next year’s possible return to ‘normal’, there is an exciting opportunity for a wholesale rethinking and reimagining of assessment across the higher education sector – a redesign of assessment that builds in authenticity and relevance by transforming assessments, not simply translating them for the digital mode. Such a redesign process builds in adaptability by using universal design for learning principles, so that no student is at a disadvantage through no fault of their own. And it builds in security through more thoughtful design that makes cheating ineffective – it enables collaboration rather than trying, and failing, to prevent collusion.

However, this transformation comes with a significant overhead. Each individual academic needs to spend time redesigning their assessments. Each department needs to reconfigure its policies for exams and the structure of its courses. Each university needs to devise overarching policies and approaches to this new form of assessment. This takes resources, at a time when universities are still dealing with a crisis situation. Grasping the opportunity to reimagine assessment now will require senior management to declare it a priority.

As Paul Cowell notes,

“There's a role for institutions themselves to play here because we know that making a piece of assessment more authentic, more problem-based, more unique, requires more input, both in the design of assessments and in the marking of assessments. It's very easy to design multiple choice quizzes, and just to let them mark themselves if you've got 400 students. So you can understand, if you're constrained with time, then it might be easier to say, ‘Well, let's go down the route of just trying to ensure academic integrity’, but there's a lot of creative and authentic assessment that could be unlocked if we supported staff in doing it.” 

Some of that support will need to focus on digital skills development (although, as Jisc’s learning and teaching reimagined surveys showed, the confidence of teaching staff in their digital skills increased significantly between March 2020 and September 2020, from 49% to 74%).

Technology will also play a role, whether that’s end-to-end assessment platforms such as UNIwise’s WISEflow, the use of AI to support marking or, in the future, tools to provide reassurance on the integrity of work through pattern matching and forensic linguistics. Tools that help develop more authentic, real-world assessments at scale would be valuable.

"If we redesign exams, if we want to make them as meaningful and as useful as possible, we need to have the closest partnership with students. We need to talk about what might be possible and find out what would be useful. And we need to do that with employers, too."
Klaus-Dieter Rossade, director of assessment programme, the Open University

Ultimately, if any change to assessment is going to be more fundamental and more sustainable than the developments we have seen over the past year, those changes need to be based on a firm foundation of stronger partnerships: partnerships with employers to understand better how to make assessment more relevant; with third-party product providers to ensure that their solutions are accessible as well as adaptable; and, most importantly of all, with the students who work towards and take the exams.

Q&A with Examity 

Examity was founded to meet the needs of assessment providers, colleges, and employers looking to protect test integrity. Since 2013, Examity has partnered with hundreds of organisations worldwide to provide cost-effective and flexible online proctoring. Jim Holm, CEO of Examity, answers our questions:

Can you give a quick overview of Examity proctoring?

Examity's role is to help preserve the value of academic quality assurance online – grades, degrees or professional credentials for certification and licensure – and we do that through three distinct activities. The first is identity management – we help organisations ensure that the candidate is who they say they are. The second is to help identify behaviours that present risk and be a deterrent to those behaviours. The third area is that we help organisations protect the intellectual property of the exam itself, so that the content isn’t exposed in a way that might necessitate creating new questions every time an exam is released.

Deloitte has named you the fastest-growing edtech company in the US for the second year running. What has led to that success?

The reasons started before COVID-19, with a desire to give educational and professional students greater convenience in taking exams. Instead of the hassle and expense of a bricks-and-mortar test centre, perhaps in a costly city location, organisations were trying to figure out ways to provide the experience virtually, extending the capabilities of testing. That provided a nice rate of growth.

Then COVID-19 and the inability to use physical locations meant a move from convenience to need, continuing the growth pattern. That need is reducing again but we believe that the convenience of moving away from bricks and mortar will continue to be a major driver of growth.

What do you see as areas of future development?

There's growth in each of our three activity areas, and in evolving flexibility for institutions in what they can do in each area.

Some educational institutions may want to keep identity management very light – perhaps no record of IDs, just a login to the LMS and personal recognition by the professor. But a professional certification programme might want to verify robustly that the candidate is exactly who they say they are.

On the risk front, Examity never makes the judgment call as to whether or not someone has done something inappropriate. Our role is to flag areas of risk. Then we refer them back to the educational institution to determine whether or not there was a problem with integrity or the exam experience or the test environment. We want to get better at identifying the behaviour of candidates while they're testing to see if we can continue to improve our predictability of events or activities that create risk. 

And we want to build up additional tools to help us protect the intellectual property of exams. We'll do that through scanning the web for questions, looking at new ways to identify the copying of questions by students or ways to reference questions and answers that other people may have given them.

How do you balance the roles of people and technology?

Technology is evolving rapidly, with increasing ways to augment people’s decision-making, but we don't believe it’s robust enough yet. As an example, when one of our proctors is remotely watching a candidate take an exam, they might see or hear something they question. But we also have technology-driven tools behind the scenes providing additional hints, so proctors know what they should pay attention to. That combination of a person and technology can be better than either one independently.

Do you think there'll be a time when technology will overtake people?

We use both and we're testing both – but a technical system as sophisticated as human judgment is years away. Every person is different, with different behaviours, habits and traits. Each one of those could trigger a system but we can't know exactly what causes those triggers to occur. We're still ways away from a system sophisticated enough to know if me twisting my head, like this, is me reading something off-screen or is just a habit. If I look up habitually when I think, should that flag a cheating experience? I don't think systems can make judgment calls like that yet.

How do you respond to negative perceptions of proctoring among students – issues of data security? Concerns about ‘creepiness’?

There’s a balance between data privacy and our ability to protect the value of the credential, which is primarily why we’re there. How do you measure a student’s performance without giving them recourse to behaviour? Professors would have to design exams very differently to do that. But the way exams are designed today, proctoring is the deterrent to inappropriate behaviour.

We take privacy very seriously. We talk every day about how we can reduce the footprint of the information we need in order to protect the exam experience. And creepiness? We've done surveys and we know that roughly 70% of our students are very satisfied with the overall exam experience. They like the flexibility, they really enjoy what goes on.

There’s a very small percentage who feel uncomfortable and we’re always looking at ways to reduce that discomfort. How do we make it easier to get into an exam? How do we make sure our proctors are communicating with the student about exactly what's happening? How do we personalise proctors so that they can have a bit of a relationship with the candidate?

When institutions move back to testing virtually for convenience, rather than necessity, many students will still benefit from the convenience of taking an exam anywhere. When virtual testing is an option, they’ll be more comfortable with the trade off between privacy and convenience.

Thanks and acknowledgements

Emerge Education and Jisc would like to thank all the contributors to this long read for their time and expert insight. 

In particular, we would like to thank Chris Cobb, who chaired the working group for this report, and everyone who so kindly spared the time to be interviewed for a case study during an exceptionally busy time for all involved in higher education. Thank you.

  • Maria Aristeidou, lecturer in technology enhanced learning, the Open University
  • Sue Attewell, head of edtech, Jisc
  • Nadia Bentoua, product lead, Jisc
  • Helen Coleman, chief operating officer, ABRSM
  • Paul Cowell, lecturer in economics and deputy associate dean of learning and teaching at the Stirling Management School, University of Stirling
  • Kirsty Fleming, deputy head of assessment, the Royal College of Physicians and Surgeons of Glasgow
  • Rebecca Galley, director of academic services, the Open University
  • Paul Glover, chief operating officer, Fry
  • Jim Holm, CEO, Examity
  • Andy McGregor, director of edtech, Jisc
  • Kellie Mote, subject specialist: strategy (assistive technology), Jisc
  • Mark Newton, managing director, University of London CoSector
  • Mariann Rand-Weaver, vice-provost (education), Brunel University London
  • Klaus-Dieter Rossade, director of assessment programme, the Open University
  • Rachel Schoenig, chief science officer, Examity
  • Piers Wilkinson, accessibility and inclusion consultant and Disabled Students’ Commissioner
  • Fiona Winter, director of education and assessment, the Royal College of Physicians and Surgeons of Glasgow
  • Mark Woodcock, director, business development, Examity

Report sponsor

Thank you to report sponsor Examity.

The content of this report is independent of any particular solution provider.

Authors and editorial team

Michelle Pauli, Michelle Pauli Editorial

Our project partners

  • Emerge Education

About the author

Michelle Pauli
Michelle Pauli Editorial