Abstract

Educational interventions are often shown to be effective in lab or pilot RCTs, but subsequently fail to retain their treatment impact when applied at scale. This dissertation consists of three field experiments, each evaluating the impact of an educational intervention. The common thread is that each intervention I evaluate has at least one element that makes it more feasible to scale than similar interventions.

In the first chapter, I conduct a large-scale field experiment on learning by teaching. While previous interventions show evidence of “learning by teaching” in lab settings, this study tests the impact in a field setting over an 8-week period. Classrooms are randomly assigned to have students (1) create explanation videos, (2) complete passive practice problems, or (3) serve as a control condition. The explanation treatment improved short-run scores by 0.17 SD and long-run grades by 0.07 SD relative to the practice-problem group. Notably, while both treatment groups improved relative to control, only the explanation treatment improved performance on novel problems, suggesting that explaining concepts enhances one's ability to understand deeply and to generalize.

In the second chapter, I evaluate the impact of an in-school tutoring program. While schools aim to deliver tutoring that is both “high dosage” and “small group,” budget constraints make it infeasible to provide frequent small-group tutoring at scale. In this paper, I test the relative importance of group size (quality) versus tutoring frequency (quantity). Students at a middle school were randomly assigned to (1) a control condition, (2) in-school math tutoring twice a week in 2-student groups, or (3) tutoring three times a week in 3-student groups. Importantly, the total cost per student is the same in both treatment conditions. I find that the 2-student group tutoring led to a significant improvement in math skills (0.23 SD), whereas the equal-cost, more frequent tutoring in 3-student groups did not lead to a significant improvement.

In the third chapter, co-authored with Ariel Kalil and Susan Mayer, we evaluate an intervention to increase attendance at preschool parent engagement events. We designed the intervention around a financial incentive combined with two tools from behavioral economics: loss framing and reminder messages. Treatment parents were given a loss-framed $25-per-event incentive to attend 8 events sponsored by their preschools, as well as weekly text message reminders about the events. Relative to similar RCTs, our smaller financial incentive is more feasible to implement at scale. We find no extensive-margin treatment effect: the intervention did not increase the fraction of parents who attended at least one event. However, we find a 32% intensive-margin treatment effect. While behavioral tools can help already-involved parents engage more with preschools, they are not enough to reach disengaged parents. This study was recently accepted for publication in Applied Economics.
