Member story

How UCL is redesigning assessment for the AI age

UCL is reimagining assessment and feedback in response to the rise of generative AI and the opportunities and challenges it brings.

At the start of the year, when the initial furore over the impact of generative AI on assessment integrity in universities exploded, one document was referred to over and over again. UCL’s guidance for students on how to use tools such as ChatGPT effectively and ethically stood out as a beacon of calm amid the panic.

Since then, UCL has remained in the vanguard of universities exploring how best to address the opportunities and challenges of generative AI in assessment. For Marieke Guy, head of digital assessment at UCL, tackling the topic head on is crucial.

“We need to expose students to AI to make sure that they understand how AI works. The very first step is for academics to talk to their students about AI. Let's not brush it under the carpet. We need to start having those discussions and understanding it.”

She gives the example of an academic in UCL’s Bartlett School of Environment, Energy and Resources who has been integrating AI into his assessments. He paired the AI assessment task with a survey the students completed before and after the assignment, asking about their attitudes to generative AI. Before the task, students were very polarised in their views – “it was either going to save the world or destroy it!” – but afterwards they were much more critical in their thinking, with a better understanding of how AI works, its limitations and its advantages.

These kinds of informed discussions are key and, at UCL, they include both staff and students through a range of fora and activities: from the AI expert group, which is considering four theme areas (academic skills, assessment design, regulations, and opportunities for the future, such as learning and teaching using AI), to student ‘AI interns’ and the Student Changemaker programme.

There are also innovative ways of exploring the assessment space, such as a piece of work using Lydia Arnold's Assessment Top Trumps to create resources on designing assessment for an AI-enabled world.

Members of the expert group are also taking a careful look at all the AI tools available.

“We've been working on a categorisation of tools which is really interesting. It looks at things like ethical values, the datasets, the benefits, so that students can understand the implications of using different tools.”

This thoughtful approach extends to considering how different departments and faculties will need to adapt their assessment approaches in the new context of AI; it’s certainly not a one-size-fits-all policy when it comes to AI and assessment.

Scanning the horizon

UCL’s digital assessment advisors have undertaken landscape reports of faculties, recognising that while e-portfolios and peer assessment approaches might work for the humanities, more traditional forms of assessment might still be needed in some STEM subjects, where knowledge-based learning is more fundamental and core skills are required. Marieke’s team will be thinking about how best to support the different disciplines over the coming year.

Ultimately, it is all about ‘assessing for a world beyond assessment’ – setting students up in the best possible way for their future working lives. For Mary McHarg, activities and engagement officer at Students' Union UCL, there is no doubt that this will be a world in which AI is deeply embedded:

“Assessment needs to be able to take into account the technology that students are going to have access to in the wider world and the skills they will need for their careers. AI is becoming more and more integrated into everyday tools. Universities are going to have to acknowledge that fact and teach their students about AI tools as part of their regular education.

"In this same way that we are taught how to use a word processor or how to use a video editing software, we need to learn how to effectively use a tool like ChatGPT and understand its limitations.”

As a result, at UCL, students are a solid part of the ongoing dialogue, learning about and co-developing what good looks like around assessment and feedback.

However, it’s an area that’s not without challenge. In a university with 43,000 students that runs around 520,000 student-to-assessment instances, consistency is inevitably an issue. Mary comments:

“Depending on if you are in one department, in one faculty versus another, you may have assessments weekly, continuously throughout the year, and get very limited feedback. Or you may have everything in one massive, hours-long online exam at the end.

“There are such crazily different experiences, which makes it hard to tackle that disparity and make it so that everybody is experiencing the same good level of assessment and feedback. That's one of the big challenges that UCL is having to face right now in this space – levelling out the playing field so that everyone experiences the same thing no matter what they study.”

Within that diversity of frequency of assessment is also a diversity of tools being used, which is an area Marieke is keen to understand better.

“Our tools landscape is something that's very much up for discussion. We're hoping to look at that ecosystem and maybe align where we can, to make sure that we understand which tools are best for a particular assessment purpose.

“People might be using the wrong tool for a particular assessment approach, and we'd really like to make sure that we share a bit more information about the pros and cons of different tools in the space.”

Other challenges include digital capability for both staff and students. For Marieke’s team, the time spent dealing with Professional, Statutory and Regulatory Bodies (PSRBs) and accreditation processes is a perennial issue, along with the challenge of scaling up all those interesting, unusual assessment approaches that take more work and effort. Marieke adds:

“I feel that's where my team comes in. We don't have all the answers but we're doing some amazing work with faculties to get to the heart of the questions, to understand the needs in different spaces. You need that expert group who can have those conversations at the coal face and understand the type of challenges the academics working with the students are facing.”

Advice to senior leaders

Marieke and Mary suggest:

  1. Be big, brave and bold, and communicate a clear vision
  2. Listen to students and their worries for their futures
  3. Enable staff and students to have potentially difficult conversations around assessment
  4. Provide resources and support so that students can truly be partners and staff are given the space and energy to participate in those discussions
  5. Encourage staff to take up CPD opportunities
  6. Invest in experts in digital assessment and trust them

You can listen to Marieke and Mary discuss redesigning assessment on our Beyond the Technology podcast.
