“Persistence in Student Performance: Evidence from University Exams on Digital Platforms” (PERSIST) is a national project led by our colleague Jonas Radl and funded by a research grant from the Spanish Ministerio de Ciencia e Innovación (PID2020-117525RB-I00, 2021–2025). It is a collaboration with the Department of Economics, with Jan Stuhler serving as Co-PI. Patricia Lorente is a predoctoral researcher within the project. Other project members included William Foley (Universidad Carlos III de Madrid) and Francesca Baronchelli (Università della Svizzera italiana).
The project focuses on the novel collection of online exam data conducted at Universidad Carlos III de Madrid during the 2019/2020 academic year. In the summer of 2020, approximately 15,000 exams were taken through the university’s digital platform “Aula Global.”
The database shows great heterogeneity both in student profiles and in the characteristics of the exams administered, including, for example, variations in the weight assigned to each exam. This diversity provides ample empirical opportunities to analyze how student performance is determined in assessments.
Additionally, the “Moodle” platform includes an option to randomize the order of questions in multiple-choice exams. This feature allows us to add another analytical dimension beyond performance. Thanks to the random order of questions, we are able to estimate students’ test persistence, that is, the ability to maintain consistent performance throughout the test. Personality traits such as persistence, perseverance, and determination reflect an individual’s ability to sustain effort over time and are believed to be highly relevant for socioeconomic achievement. However, accurately measuring these traits is difficult, especially due to biases in survey responses. This novel measure of student persistence, via performance decline throughout the exam, has been applied in the well-known PISA assessment, where a 7% drop in performance has been observed during the course of the test, with significant differences across individual students, demographic groups, and countries.
Nonetheless, a main limitation of this methodology is the question of its external validity: the PISA assessment is an unusual event in which student behavior may depend on the specific implementation, and in which students have no real incentives to exert effort. By contrast, university exams are a more natural setting, where students have strong incentives to perform their best. A key strength of this project is that our data allow us to study student persistence in exams with real incentives.
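The logic of the persistence measure can be illustrated with a small simulation. Because each student sees the questions in an independently randomized order, question difficulty is uncorrelated with question position in expectation, so the average decline in correctness across positions identifies a persistence (performance-decline) effect. The sketch below is purely illustrative, with made-up parameter values (it is not the project’s actual estimation code); it recovers an assumed 7% decline via a simple regression of correctness on normalized position.

```python
import numpy as np

rng = np.random.default_rng(0)

n_students, n_questions = 1_000, 40
base_p = 0.75        # baseline probability of a correct answer (assumed)
total_drop = 0.07    # assumed total decline over the exam, mirroring the ~7% PISA figure

# Question-specific difficulty shifts; randomized ordering makes these
# independent of the position at which a question is shown.
difficulty = rng.normal(0.0, 0.05, n_questions)

positions, outcomes = [], []
for _ in range(n_students):
    order = rng.permutation(n_questions)  # each student gets a random question order
    for pos, q in enumerate(order):
        p = base_p - difficulty[q] - total_drop * pos / (n_questions - 1)
        positions.append(pos)
        outcomes.append(rng.random() < p)

x = np.array(positions, dtype=float) / (n_questions - 1)  # position normalized to [0, 1]
y = np.array(outcomes, dtype=float)

# OLS of correctness on normalized position: minus the slope estimates
# the total performance decline over the course of the exam.
slope, intercept = np.polyfit(x, y, 1)
estimated_decline = -slope
print(f"estimated decline over the exam: {estimated_decline:.3f}")  # close to 0.07
```

With randomized ordering, the slope is not confounded by the fact that instructors might otherwise place harder questions at the end; without randomization, a decline in correctness could reflect question placement rather than waning effort.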
Combining these Moodle data with the university’s administrative data—which provide sociodemographic and academic characteristics of the students—allows us to pursue the following research objectives:
- Explore ways to improve exam administration at the university, based on the project’s findings regarding the effect of the time interval between exams on student performance, in terms of both efficiency and equity. We also aim to explore the heterogeneity of these effects across student characteristics. This heterogeneity may reflect differences in baseline competence, study habits, or the ability to make effective use of longer study periods.
- Study gender differences in performance and persistence in test-taking. We will also examine existing gaps between STEM and non-STEM subjects and consider how these gaps vary depending on the weight of the exam in the final course grade.
- Understand the interaction between students’ prior performance and exam characteristics (such as difficulty), question properties (such as relative difficulty), and exam administration (such as time limits). These analyses aim to shed light on why the phenomenon of performance decline may even be observed in high-stakes university exams. This offers a new perspective in the study of persistence in testing, highlighting the importance of interactions between student characteristics, exam properties, and test administration conditions when assessing this skill.
The availability of large-scale data on students’ exam persistence holds immense, untapped potential for research. Moreover, many universities use the same digital platform, which could make it possible to expand the analysis in the future. Studying student performance alongside this novel methodology of persistence in exams will allow us to advance our understanding of educational achievement and provide empirical evidence for public policy aimed at improving equal opportunities.
The project officially concluded in spring 2025 and received a positive evaluation; however, the team continues to work on the results.
Working papers:
Baronchelli, Francesca; Foley, William; Lorente, Patricia; Radl, Jonas & Stuhler, Jan (2025): “Preparation Time and Exam Performance: Heterogeneous Effects by Student Background Characteristics”, Preprint, Handle: https://hdl.handle.net/10016/46652
Lorente, Patricia (2025): “Exploring Gender Differences in Persistence: Analyzing Test-Taking Behaviours in High-Stakes Assessments”, Preprint, Handle: https://hdl.handle.net/10016/46653