Publications

2012
Jenkins, L., Wisdom, M., & Glover, S. (2012). Increasing College-Going Rates in Fulton County Schools: A Summer Intervention Based on the Strategic Use of Data. Purchase case study on Harvard Education Press website

This case study, published by Harvard Education Press, describes how to use data to challenge assumptions, reveal student needs, address these needs programmatically, and evaluate results. It shows a team of data specialists and educators working together, across institutional and departmental boundaries, to determine why some high school seniors who intend to go to college after graduation do not enroll in the fall. Together, they develop, implement, and evaluate a summer counseling intervention program called Summer PACE to ensure that more students enroll seamlessly in college.

Cascio, E. U., & Staiger, D. O. (2012). Knowledge, Tests, and Fadeout in Educational Interventions. Publisher's Version

Educational interventions are often evaluated and compared on the basis of their impacts on test scores. Decades of research have produced two empirical regularities: interventions in later grades tend to have smaller effects than the same interventions in earlier grades, and the test score impacts of early educational interventions almost universally “fade out” over time. This paper explores whether these empirical regularities are an artifact of the common practice of rescaling test scores in terms of a student’s position in a widening distribution of knowledge. If a standard deviation in test scores in later grades translates into a larger difference in knowledge, an intervention’s effect on normalized test scores may fall even as its effect on knowledge does not. We evaluate this hypothesis by fitting a model of education production to correlations in test scores across grades and with college-going using both administrative and survey data. Our results imply that the variance in knowledge does indeed rise as children progress through school, but not enough for test score normalization to fully explain these empirical regularities.
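As a hedged illustration of the rescaling logic (our sketch, not the paper’s model): suppose an intervention raises a student’s knowledge by a fixed amount \(\delta\), and the distribution of knowledge in grade \(g\) has standard deviation \(\sigma_g\) that grows as students progress. The effect measured in normalized test-score units is then

\[
\text{effect}_g = \frac{\delta}{\sigma_g},
\]

which shrinks as \(\sigma_g\) rises, so apparent fadeout can occur even if the knowledge gain \(\delta\) is fully retained.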

Download full report
Hill, H. C., Charalambous, C. Y., Blazar, D., McGinn, D., Kraft, M. A., Beisiegel, M., Humez, A., et al. (2012). Validating Arguments for Observational Instruments: Attending to Multiple Sources of Variation. Educational Assessment, 17, 1-19.

Measurement scholars have recently constructed validity arguments in support of a variety of educational assessments, including classroom observation instruments. In this article, we note that users must examine the robustness of validity arguments to variation in the implementation of these instruments. We illustrate how such an analysis might be used to assess a validity argument constructed for the Mathematical Quality of Instruction instrument, focusing in particular on the effects of varying the rater pool, subject matter content, observation procedure, and district context. Variation in the subject matter content of lessons did not affect rater agreement with master scores, but the evaluation of other portions of the validity argument varied according to the composition of the rater pool, observation procedure, and district context. These results demonstrate the need for conducting such analyses, especially for classroom observation instruments that are subject to multiple sources of variation.

Download full report
Tyler, J. H., Jacob, B. A., Dougherty, S. M., Hanson, H. J., Fullerton, J. B., & Herlihy, C. M. (2012). Are Practice-Based Teacher Evaluations and Teacher Effectiveness Linked in TNTP’s "Performance Assessment System (PAS)"? Center for Education Policy Research at Harvard University.

The CEPR report, “Are Practice-Based Teacher Evaluations and Teacher Effectiveness Linked in TNTP’s Performance Assessment System (PAS)?” examines the evaluation system for first-year Louisiana teachers trained by TNTP, a national nonprofit organization focused on improving teacher performance.  The authors conclude that there is a modest positive relationship between teachers’ PAS scores and actual student achievement growth in math and reading.  The analysis also suggests that, with some technical improvements, the PAS could become an even better predictor of student academic outcomes.

Press Release
Download full report
2011
(2011). SDP College-Going Diagnostic for Fulton County Schools. Strategic Data Project.

Fulton County Schools (FCS) partnered with SDP to produce the SDP College-Going and Human Capital Diagnostics for the district. The diagnostics are meant to demonstrate how districts can capitalize on existing data to understand their current performance, set future goals, and strategically plan responses. The College-Going Diagnostic report illuminates students’ enrollment over time and compares these patterns across a variety of student characteristics and academic experiences.

Fulton County Schools College-Going Diagnostic Report
Fulton County Schools College-Going Board of Education PowerPoint Presentation
(2011). SDP Human Capital Diagnostic for Fulton County Schools. Strategic Data Project.

Fulton County Schools (FCS) partnered with SDP to produce the SDP College-Going and Human Capital Diagnostics for the district. The diagnostics are meant to demonstrate how districts can capitalize on existing data to understand their current performance, set future goals, and strategically plan responses. The Human Capital Diagnostic report investigates teacher effectiveness with the intention of informing district leaders about patterns of teacher effectiveness and identifying areas for policy change that could leverage teacher effectiveness to improve student achievement.

Download full report
(2011). SDP College-Going Diagnostic for Fort Worth Independent School District. Strategic Data Project.

Fort Worth Independent School District (FWISD) collaborated with SDP to create the SDP College-Going Diagnostic to examine the district’s college-going enrollment and persistence rates.  The diagnostic is designed to identify potential areas for action to increase students’ levels of academic achievement, preparedness for college, and postsecondary attainment.

Download full report
Kane, T. J., Jacob, B., Rockoff, J., & Staiger, D. O. (2011). Can You Recognize an Effective Teacher When You Recruit One? Education Finance and Policy, 6(1), 43-74. Publisher's Version

The authors administered an in-depth survey to new math teachers in New York City and collected information on a number of non-traditional predictors of effectiveness: teaching-specific content knowledge, cognitive ability, personality traits, feelings of self-efficacy, and scores on a commercially available teacher selection instrument. They find that a number of these predictors have statistically and economically significant relationships with student and teacher outcomes. The authors conclude that, while there may be no single factor that can predict success in teaching, using a broad set of measures can help schools improve the quality of their teachers.

Read the full report
Papay, J., West, M., Fullerton, J., & Kane, T. (2011). Does Practice-Based Teacher Preparation Increase Student Achievement? Early Evidence from the Boston Teacher Residency.

Center researchers John Papay, Martin West, Jon Fullerton, and Thomas Kane investigate the effectiveness of the Boston Teacher Residency (BTR) in their working paper Does Practice-Based Teacher Preparation Increase Student Achievement? Early Evidence from the Boston Teacher Residency.  BTR is an innovative practice-based preparation program in which candidates work alongside a mentor teacher for a year before becoming a teacher of record in Boston Public Schools.

Download full report
Taylor, E. S., & Tyler, J. H. (2011). The Effect of Evaluation on Performance: Evidence from Longitudinal Student Achievement Data of Mid-career Teachers. Publisher's Version

The effect of evaluation on employee performance is traditionally studied in the context of the principal-agent problem. Evaluation can, however, also be characterized as an investment in the evaluated employee’s human capital. We study a sample of mid-career public school teachers in which we can consider these two types of evaluation effects separately. Employee evaluation is a particularly salient topic in public schools, where teacher effectiveness varies substantially and where teacher evaluation itself is increasingly a focus of public policy proposals. We find evidence that quality classroom-observation-based evaluation and performance measures can improve mid-career teacher performance both during the period of evaluation, consistent with the traditional predictions, and in subsequent years, consistent with human capital investment. However, the estimated improvements during evaluation are less precise. Additionally, the effect sizes represent a substantial gain in welfare given the program’s costs.

Download full report
Download summary
Kane, T. J., Taylor, E., Tyler, J., & Wooten, A. (2011). Identifying Effective Classroom Practices Using Student Achievement Data. The Journal of Human Resources, 46(3), 587-613.

This paper combines information from classroom-based observations and measures of teachers’ ability to improve student achievement as a step toward addressing the challenge of identifying effective teachers and teaching practices. The authors find that classroom-based measures of teaching effectiveness are related in substantial ways to student achievement growth. The authors conclude that the results point to the promise of teacher evaluation systems that would use information from both classroom observations and student test scores to identify effective teachers. Information on the types of practices that are most effective at raising achievement is also highlighted.

Download full report
Hill, H., & Herlihy, C. (2011). Prioritizing Teaching Quality in a New System of Teacher Evaluation.

Teachers are the most important school-level factor in student success, but as any parent knows, not all teachers are created equal. Reforms to the current, quite cursory teacher evaluation system, if done well, have the potential to remove the worst-performing teachers and, even more important, to assist the majority in improving their craft. However, the US educational system often cannibalizes its own innovations, destroying their potential with a steady drip of rules, regulations, bureaucracy, and accommodations to the status quo. Because that status quo sets an unacceptably low bar for teaching quality, missing this opportunity now means that new generations of students may suffer through mediocre, or worse, classrooms.

Prioritizing Teaching Quality in a New System of Teacher Evaluation
Angrist, J. D., Cohodes, S. R., Dynarski, S. M., Fullerton, J. B., Kane, T. J., Pathak, P. A., & Walters, C. R. (2011). Student Achievement in Massachusetts' Charter Schools.

Researchers from the Harvard Graduate School of Education, MIT, and the University of Michigan have released the results of a new study that suggests that urban charter schools in Massachusetts have large positive effects on student achievement at both the middle and high school levels. Results for nonurban charter schools were less clear; some analyses indicated positive effects on student achievement at the high school level, while results for middle school students were much less encouraging.

View the Press Release

View the PowerPoint Presentation

Read the full report
2009
Abdulkadiroglu, A., Angrist, J., Cohodes, S., Dynarski, S., Fullerton, J., Kane, T., & Pathak, P. (2009). Informing the Debate: Comparing Boston's Charter, Pilot, and Traditional Schools.

Whether using the randomized lotteries or statistical controls for measured background characteristics, we generally find large positive effects for Charter Schools, at both the middle school and high school levels. For each year of attendance in middle school, we estimate that Charter Schools raise student achievement by .09 to .17 standard deviations in English Language Arts and .18 to .54 standard deviations in math relative to students attending traditional schools in the Boston Public Schools. The estimated impact on math achievement for Charter middle schools is extraordinarily large. Increasing performance by .5 standard deviations is the same as moving from the 50th to the 69th percentile in student performance. This is roughly half the size of the black-white achievement gap. In high school, the estimated gains are somewhat smaller than in middle school: .16 to .19 standard deviations in English Language Arts; .16 to .19 in mathematics; .2 to .28 in writing topic development; and .13 to .17 in writing composition with the lottery-based results. The estimated impacts of middle school and high school Charters are similar in both the “observational” and “lottery-based” results.
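As an illustrative check of the percentile conversion above (our sketch, not part of the report), assuming approximately normally distributed scores, a student at the mean who gains 0.5 standard deviations moves to the percentile

\[
\Phi(0.5) \approx 0.69,
\]

where \(\Phi\) is the standard normal cumulative distribution function, i.e., roughly the 69th percentile.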

Download full report
Technical Appendix
2008
Kane, T. J., & Staiger, D. O. (2008). Estimating Teacher Impacts on Student Achievement: An Experimental Evaluation.

The authors used a random-assignment experiment in Los Angeles Unified School District to evaluate various non-experimental methods for estimating teacher effects on student test scores. Having estimated teacher effects during a pre-experimental period, they used these estimates to predict student achievement following random assignment of teachers to classrooms. While all of the teacher effect estimates considered were significant predictors of student achievement under random assignment, those that controlled for prior student test scores yielded unbiased predictions and those that further controlled for mean classroom characteristics yielded the best prediction accuracy. In both the experimental and non-experimental data, the authors found that teacher effects faded out by roughly 50 percent per year in the two years following teacher assignment.

Download full report
Download Summary
Cantrell, S., Fullerton, J., Kane, T. J., & Staiger, D. O. (2008). National Board Certification and Teacher Effectiveness: Evidence from a Random Assignment Experiment.

The National Board for Professional Teaching Standards (NBPTS) assesses teaching practice based on videos and essays submitted by teachers. For this study, the authors compared the performance of classrooms of elementary students in Los Angeles that were randomly assigned to NBPTS applicants and to comparison teachers. They conclude that students assigned to highly rated applicants outperformed those in the comparison classrooms by a larger margin than students assigned to poorly rated applicants. Moreover, the estimates with and without random assignment were similar.

Download full report
