Imagine a world where getting answers wrong during practice actually propels students forward in their learning journey. New insights from Carnegie Mellon University’s School of Computer Science suggest just that, revealing how minor tweaks in online tutoring platforms can enhance student perseverance.
Researchers at the university examined how simple design modifications, such as changes in text and color, could motivate students to persist beyond their errors. These small changes could potentially transform how students perceive and respond to mistakes.
“Being wrong tends to be a very strong signal that people interpret as, ‘This is not working, I’m going to stop,’” said Paulo Carvalho, an assistant professor in the Human-Computer Interaction Institute (HCII). He emphasized the opposite: errors often yield valuable feedback that can aid learning, yet students frequently misinterpret them as a sign of failure.
Collaborating with the South African nonprofit Siyavula, which operates an AI-based intelligent tutoring system, the CMU team explored how these design adjustments could improve persistence. The collaboration was facilitated by HCII Associate Professor Amy Ogan through her Learning Sciences for Innovators program.
Their findings were detailed in the paper “Will They Try Again? A Large-Scale RCT on Scaffolds that Support Persistence in an Intelligent Tutoring System,” which earned an honorable mention at the Association for Computing Machinery Conference on Human Factors in Computing Systems (CHI 2026).
Michael Asher, a project scientist at HCII, explained how psychology-inspired designs led the team to introduce two key interventions: a written prompt and a visual nudge. After a student submitted an incorrect response, the prompt encouraged them to keep trying, while the nudge highlighted the “Try an exercise like this again” button in bright orange, much like familiar default options in human-computer interaction.
“We move through the world making decisions and we process tons of information, so we rely on a lot of shortcuts,” Asher mentioned. He noted that default options are influential in guiding decisions, which is a principle widely used in nudge-based interventions.
In a randomized controlled trial involving around 160,000 students tackling 17 million practice problems on Siyavula’s platform, the team found that the prompt and nudge increased student persistence by 2% and 9%, respectively. When combined, these interventions led to an 11% increase in persistence.
“What we found here — and this is something that was only possible because we were able to work with a really large sample to test this precisely — is that these two interventions actually stack on top of each other really nicely,” stated Asher. The study demonstrated the potential of these interventions in fostering resilience in learners.
In addition to Asher, Ogan, and Carvalho, the research team included HCII doctoral student Yumou Wei and the Siyavula Foundation’s Adam Reynolds. More details are available in their research paper on the CHI 2026 website.