Resources

Resources by Type

Research Report

Kane, T. J., Blazar, D., Gehlbach, H., & Greenberg, M. (2020). Can Video Technology Improve Teacher Evaluations? An Experimental Study. Education Finance and Policy, 15(3), 397–427.

Teacher evaluation reform has been among the most controversial education reforms in recent years. It also is one of the costliest in terms of the time teachers and principals must spend on classroom observations. We conducted a randomized field trial at four sites to evaluate whether substituting teacher-collected videos for in-person observations could improve the value of teacher observations for teachers, administrators, or students. Relative to teachers in the control group who participated in standard in-person observations, teachers in the video-based treatment group reported that post-observation meetings were more “supportive” and they were more able to identify a specific practice they changed afterward. Treatment principals were able to shift their observation work to noninstructional times. The program also substantially increased teacher retention. Nevertheless, the intervention did not improve students’ academic achievement or self-reported classroom experiences, either in the year of the intervention or for the next cohort of students. Following from the literature on observation and feedback cycles in low-stakes settings, we hypothesize that to improve student outcomes schools may need to pair video feedback with more specific supports for desired changes in practice.

Chin, M., Kane, T., Kozakowski, W., Schueler, B., & Staiger, D. (2017). Assessing the Impact of the Newark Education Reforms. Center for Education Policy Research at Harvard University.

Aided by $200 million in private philanthropy, city and state leaders launched a major school reform effort in Newark, New Jersey, starting in the 2011–2012 school year. In a coinciding National Bureau of Economic Research (NBER) working paper, we assessed the impact of those reforms on student achievement growth, comparing students in Newark Public Schools (NPS) district and charter schools to students with similar prior achievement, similar demographics, and similar peers elsewhere in New Jersey. This report presents the key findings.

Hill, H. C., Kraft, M. A., & Herlihy, C. (2016). Developing Common Core Classrooms Through Rubric-Based Coaching. Center for Education Policy Research at Harvard University.

The project team is still awaiting student test data to complete the evaluation, but this brief provides a short update on survey results. Compared with students of control teachers, students of MQI-coached teachers reported that their teachers asked more substantive questions and required more use of mathematical vocabulary. Students in MQI-coached classrooms also reported more student talk in class. Teachers who received MQI coaching tended to find their professional development significantly more useful than control teachers did, and were also more likely to report that their mathematics instruction improved over the course of the year.

Kane, T. J. (2016). Let the Numbers Have Their Say: Evidence on Massachusetts' Charter Schools. Center for Education Policy Research at Harvard University.

In Massachusetts, the charter school debate has centered on four concerns:

  • that the achievement of the high-scoring charter schools is due to selective admission and retention policies and not the education that the charter schools provide,
  • that charter schools are underserving English language learners and special education students,
  • that charter schools are disciplining students at higher rates in order to drive troublesome students back to traditional schools, and
  • that charter schools are undermining traditional public schools financially.

This report summarizes the evidence pertaining to these four concerns.

West, M. R., Morton, B. A., & Herlihy, C. M. (2016). Achievement Network’s Investing in Innovation Expansion: Impacts on Educator Practice and Student Achievement.

Achievement Network (ANet) was founded in 2005 as a school-level intervention to support the use of academic content standards and assessments to improve teaching and learning. Initially developed within the Boston charter school sector, it has expanded to serve over 500 schools in nine geographic networks across the United States. The program is based on the belief that if teachers are provided with timely data on student performance from interim assessments tied to state standards, if school leaders provide support and create structures that help teachers use that data to identify student weaknesses, and if teachers have knowledge of how to improve the performance of students who are falling behind, then they will become more effective at identifying and addressing gaps in student learning. This will, in turn, improve student performance, particularly for high-need students.

In 2010, ANet received a development grant from the U.S. Department of Education’s Investing in Innovation (i3) Program. The grant funded both the expansion of the program to serve up to 60 additional schools in five school districts and an external evaluation of that expansion. The Center for Education Policy Research (CEPR) at Harvard University partnered with ANet to design a matched-pair, school-randomized evaluation of the program’s impact on educator practice and student achievement in schools participating in its i3-funded expansion.

Hurwitz, M., Mbekeani, P. P., Nipson, M., & Page, L. C. (2016). Surprising Ripple Effects: How Changing the SAT Score-Sending Policy for Low-Income Students Impacts College Access and Success. Educational Evaluation and Policy Analysis.

Subtle policy adjustments can induce relatively large “ripple effects.” We evaluate a College Board initiative that increased the number of free SAT score reports available to low-income students and changed the time horizon for using these score reports. Using a difference-in-differences analytic strategy, we estimate that targeted students were roughly 10 percentage points more likely to send eight or more reports. The policy improved on-time college attendance and 6-year bachelor’s completion by about 2 percentage points. Impacts were realized primarily by students who were competitive candidates for 4-year college admission. The bachelor’s completion impacts are larger than would be expected based on the number of students driven by the policy change to enroll in college and to shift into more selective colleges. The unexplained portion of the completion effects may result from improvements in nonacademic fit between students and the postsecondary institutions in which they enroll.

SDP Partner Diagnostic

Resources by Focus Area

Teacher Effectiveness

Postsecondary Access & Success

School Improvement & Redesign