Researchers have identified many characteristics of teachers and teaching that contribute to student outcomes. However, most studies investigate only a small number of these characteristics, likely underestimating the overall contribution. In this paper, we use a set of 28 teacher-level predictors drawn from multiple research traditions to explain teacher-level variation in student outcomes. These predictors collectively explain 28% of teacher-level variability in state standardized math test scores and 40% in a predictor-aligned math test. In addition, each individual predictor explains only a small, relatively unique portion of the total teacher-level variability. The first finding highlights the importance of choosing predictors and outcomes that are well aligned, and the second suggests that the phenomena underlying teacher effects are multidimensional.
The purpose of this study is to investigate three aspects of construct validity for the Mathematical Quality of Instruction classroom observation instrument: (1) the dimensionality of scores, (2) the generalizability of these scores across districts, and (3) the predictive validity of these scores in terms of student achievement.
As many states are slated to soon use scores derived from classroom observation instruments in high-stakes decisions, developers must cultivate methods for improving the functioning of these instruments. We show how multidimensional, multilevel item response theory models can yield information critical for improving the performance of observational instruments.
Education agencies are evaluating teachers using student achievement data. However, very little is known about the comparability of test-based or "value-added" metrics across districts and the extent to which they capture variability in classroom practices. Drawing on data from four urban districts, we find that teachers are categorized differently when compared within versus across districts. In addition, analyses of scores from two observation instruments, as well as qualitative viewing of lesson videos, identify stark differences in instructional practices across districts among teachers who receive similar within-district value-added rankings. Exploratory analyses suggest that these patterns are not explained by observable background characteristics of teachers and that factors beyond labor market sorting likely play a key role.
A growing economics literature reveals that small and subtle policy adjustments can induce relatively large “ripple effects.” We contribute to this literature by evaluating a College Board initiative, launched in the fall of 2007, which increased the number of free official SAT score reports afforded to low-income students and changed the time horizon over which these free score sends could be used. By resetting the default number of free SAT score reports from four to eight for SAT fee-waiver recipients, the College Board hoped to increase the number of college applications submitted by these students and to improve their college match. Using a difference-in-differences analytic strategy, we show that low-income students took advantage of this policy and were roughly 10 percentage points more likely to send eight or more score reports. We find that this policy achieved its intended goal of increasing college access and that it also favorably impacted college completion rates. Specifically, we estimate that inducing a low-income student to send one more score report, on average, increased on-time college attendance by nearly 5 percentage points and five-year bachelor’s completion by slightly more than 3 percentage points. The policy impact was driven entirely by students who, based on SAT scores, were competitive candidates for admission to four-year colleges.
In 2011, the Strategic Data Project (SDP) began a partnership with the Wake County Public School System (WCPSS). As part of this partnership, SDP collaborated with WCPSS to analyze patterns of high school students’ on-track status, graduation, college enrollment, and college persistence. This set of high-leverage, policy-relevant analyses constitutes the SDP College-Going Diagnostic.
This toolkit provides practical guidance for education practitioners on using video observations to help teachers accelerate their development. Inside you will find four sections to help you start video observations in your school community. Each section includes a discussion of important lessons from the Best Foot Forward project, a study of digital video in classroom observations, and adaptable tools for implementation.
The Strategic Data Project (SDP) and the Kentucky Department of Education (KDE) collaborated on the SDP College-Going Diagnostic—a set of policy-relevant analyses that track Kentucky public school students’ graduation from high school through enrollment and persistence in college. This interactive report highlights the key findings from this research collaboration and is designed to facilitate exploration across important student characteristics.
In an effort to promote college enrollment and degree completion, the state of Tennessee has invested in a student-centric, technology-based blended learning model of high school mathematics instruction: the Seamless Alignment and Integrated Learning Support (SAILS) program.
The SAILS program provides high school seniors likely to require math remediation in college with coursework equivalent to college-level developmental education classes. Eligible students who complete the program are able to satisfy math requirements for high school graduation and, upon postsecondary matriculation, to enroll directly in credit-bearing coursework toward a college degree.
Researchers at the Center for Education Policy Research at Harvard University and Vanderbilt Peabody College of Education are partnering with the SAILS Program and Tennessee state leadership to conduct an evaluation of SAILS. Using a range of quantitative and qualitative research methods, the study will examine the impact of participation in SAILS on students’ short- and long-term outcomes and investigate the mechanisms by which the program may promote students’ postsecondary success.
This case illustrates how the work of leaders and analysts in the Delaware Department of Education (DDOE) and the agency’s partnership with the Strategic Data Project (SDP), a program of the Center for Education Policy Research at Harvard University, created momentum for statewide policy change. By exploring Delaware leaders’ use of data and analytics to challenge assumptions and inform the development of better policies and practices, the case illustrates the importance of leadership, analytic and technical competency, and strategic partnerships when leading education reform. The case specifically highlights the power of human capital analytics to diagnose the current status of Delaware’s educator pipeline, from preparation through development and retention, and how effectively communicating with these analyses built coalitions of support and drove a culture of data use at both the state and district level.
The Strategic Data Project (SDP) collaborated with the state of Delaware to illuminate patterns related to three critical areas of policy focus for the state: the recruitment, placement, and success of new and early career teachers; teacher impact on student learning; and teacher retention and the stability of the state’s teacher workforce.
The Strategic Data Project (SDP) partnered with the Cleveland Metropolitan School District (CMSD) to expand the use of data to inform policy and management decisions within the district. As part of this partnership, SDP collaborated with CMSD to analyze the high school graduation and college-going outcomes of CMSD students.
SDP launched a College-Going Diagnostic research collaboration with the Tennessee Department of Education (TDOE) as part of a larger partnership between the two organizations. In defining the scope of work for this project, TDOE policymakers were particularly interested in investigating how students’ transition from high school to postsecondary education differed across schools, regions, and student subgroups. The SDP College-Going Diagnostic examines the extent to which Tennessee high school students faced specific barriers to postsecondary enrollment, such as inadequate academic preparation for college-level coursework and limited college access.
The Strategic Data Project (SDP) designed the College-Going Diagnostic to inform state and district leaders about high school graduation, college enrollment, and persistence rates; and to identify potential areas for action to increase student achievement in high school, preparedness for college, and postsecondary attainment. In 2012, the New York State Education Department (NYSED) and SDP launched an initiative to assess student attainment in New York State public schools, and identified a set of questions to understand how first-time ninth-grade students in New York State public schools progress through high school and into college. The interactive graphics supporting the findings in this brief are designed to promote deeper engagement with the analysis through data exploration. View the interactive report on the SDP website
The Strategic Data Project (SDP) collaborated with the Colorado Department of Education (CDE) and the Colorado Education Initiative (CEI) to conduct SDP’s Human Capital Diagnostic—a series of high-leverage, policy-relevant analyses related to the state’s educator workforce. SDP’s Human Capital Diagnostic investigates questions on five critical topics related to teachers and teacher effectiveness: recruitment, placement, development, evaluation, and retention.