“Persistent Performance among University Students: Evidence from High-Stakes Exams on Digital Platforms” (PERSIST) is a national project led by our fellow Jonas Radl and funded by a grant from the Spanish Ministry of Science and Innovation (PID2020-117525RB-I00, 2021-2024).
The proposal centers on novel data collection from online exams carried out at Universidad Carlos III de Madrid in the academic years 2019/20 and 2020/21. In the last evaluation period, in mid-2020, around 15,000 exams were run on the university’s digital learning platform “Aula Global”. Based on the open-source system “Moodle”, the platform offers the option to randomize the order of questions in multiple-choice tests, which allows us to compute students’ exam persistence – the degree to which students sustain their performance during an exam. We propose to match these data to the university’s database of students’ socio-demographic characteristics.
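To illustrate how such randomized question orders could be used, the sketch below estimates an average within-exam performance decline from item-level response logs. It is only one possible specification, not the project’s actual estimation strategy, and the file and column names (exam_id, student_id, question_id, position, correct) are hypothetical placeholders rather than the real Aula Global/Moodle export schema.

```python
# Illustrative sketch only: data layout and column names are assumptions,
# not the actual Moodle export format.
import pandas as pd
import statsmodels.formula.api as smf

# One row per student-question attempt; 'position' is the randomized order
# in which the question was shown, 'correct' is coded 0/1.
responses = pd.read_csv("exam_item_responses.csv")

# Normalise position to [0, 1] so slopes are comparable across exams of
# different lengths.
responses["rel_position"] = responses.groupby(["exam_id", "student_id"])[
    "position"
].transform(lambda p: (p - p.min()) / (p.max() - p.min()))

# Linear probability model: question fixed effects absorb item difficulty
# (identifiable because positions are randomized across students), and the
# coefficient on rel_position captures the average decline in the probability
# of a correct answer from the first to the last question.
fit = smf.ols("correct ~ rel_position + C(question_id)", data=responses).fit()
print(fit.params["rel_position"])
```

Student- or group-specific persistence measures would follow from interacting the position term with student characteristics or running the regression separately by student.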
Personality traits such as persistence, perseverance and grit relate to an individual’s ability to sustain effort over time and are argued to be highly relevant for socio-economic achievement. Yet the valid measurement of these “non-cognitive skills” has proven difficult, particularly because of biases arising from self-reports in surveys. A recent methodological innovation therefore measures students’ persistence while they take the well-known PISA test, i.e. the relative decrease in performance over the two-hour test, controlling for the difficulty of each question. On average, performance drops by about 7% over the course of the test, with significant differences between students, demographic groups and countries.
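For concreteness, one plausible way to formalize this decline measure (our illustrative notation, not necessarily the exact specification used in the PISA-based studies) is:

```latex
% y_{iq}: indicator that student i answers question q correctly
% pos_{iq}: position at which question q appears in student i's test
% \delta_q: question fixed effect capturing item difficulty
y_{iq} = \alpha_i + \beta \, pos_{iq} + \delta_q + \varepsilon_{iq}
```

Here a negative β captures the within-test performance decline, and the reported average drop of about 7% corresponds to the predicted change in correctness between the first and the last question.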
However, the key limitation of this methodology is the lack of stakes in the PISA tests: students’ performance does not affect their grades, and they never even learn how well they did in the test. It remains unclear whether the observed drop in performance reflects low motivation or whether it is representative of individual behavior in other settings. External validity is also questionable because the PISA evaluation is a rare event in which students’ behavior may also depend on how the test is presented on the ground. In Spain specifically, the results of the last PISA reading test were not released due to anomalies in its implementation. University exams, by contrast, are a more natural setting in which students have strong incentives to perform and a lot at stake.
By exploiting the pervasive use of high-stakes exams on the university’s digital platform, especially since the onset of the COVID-19 pandemic, the project aims to address three objectives:
(1) uncover how socio-demographic characteristics – and especially gender – affect students’ persistence in high-stakes exams;
(2) establish how students’ exam persistence has changed during the COVID-19 pandemic, depending on their prior educational performance;
(3) evaluate how parameters of digital educational design (e.g. the timing, duration and stakes of exams) affect students’ ability to sustain effort under pressure.
The availability of large-scale data on students’ exam persistence holds immense untapped potential to push the knowledge frontier. Moreover, many universities use the same digital learning platform, so our approach could be scaled up efficiently using exam records from other educational institutions. The novel application of the test-persistence method to digital platforms will therefore advance our understanding of educational achievement and support evidence-based policies to enhance equality of opportunity.