The Strategic Data Project (SDP) and the Kentucky Department of Education (KDE) collaborated on the SDP College-Going Diagnostic—a set of policy-relevant analyses that track Kentucky public school students from high school graduation through college enrollment and persistence. This interactive report highlights the key findings from this research collaboration and is designed to facilitate exploration across important student characteristics.
Recent investigations into the education production function have moved beyond traditional teacher inputs, such as education, certification, and salary, focusing instead on observational measures of teaching practice. However, challenges to identification mean that this work has yet to coalesce around specific instructional dimensions that increase student achievement. I build on this discussion by exploiting within-school, between-grade, and cross-cohort variation in scores from two observation instruments; further, I condition on a uniquely rich set of teacher characteristics, practices, and skills. Findings indicate that inquiry-oriented instruction positively predicts student achievement. Content errors and imprecisions are negatively related, though these estimates are sensitive to the set of covariates included in the model. Two other dimensions of instruction, classroom emotional support and classroom organization, are not related to this outcome. Findings can inform recruitment and development efforts aimed at improving the quality of the teacher workforce.
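The identification strategy described above can be illustrated with a minimal sketch. The example below uses simulated data and assumed variable names (it is not the paper's actual specification or dataset): student achievement is regressed on observed instruction scores while fixed effects absorb school, cohort, and grade differences, so the coefficients are identified off within-school, between-grade, and cross-cohort variation.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical simulated data: achievement, two observed dimensions of
# instruction, and a prior test score. All names are illustrative.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "school": rng.integers(0, 20, n),
    "grade": rng.integers(4, 9, n),
    "cohort": rng.integers(2010, 2014, n),
    "inquiry": rng.normal(0, 1, n),   # inquiry-oriented instruction score
    "errors": rng.normal(0, 1, n),    # content errors/imprecision score
    "prior_score": rng.normal(0, 1, n),
})
df["score"] = (0.10 * df["inquiry"] - 0.05 * df["errors"]
               + 0.6 * df["prior_score"] + rng.normal(0, 1, n))

# School, cohort, and grade fixed effects restrict the identifying
# variation to within-school, cross-cohort, between-grade differences.
model = smf.ols(
    "score ~ inquiry + errors + prior_score"
    " + C(school) + C(cohort) + C(grade)",
    data=df,
).fit()
print(model.params[["inquiry", "errors", "prior_score"]])
```

With the simulated effect sizes, the estimated coefficient on the inquiry-oriented score recovers a positive relationship, mirroring the direction of the abstract's findings; the real analysis additionally conditions on the rich set of teacher covariates noted above.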
In an effort to promote college enrollment and degree completion, the state of Tennessee has invested in a student-centric, technology-based blended learning model of high school mathematics instruction: the Seamless Alignment and Integrated Learning Support (SAILS) program.
The SAILS program provides high school seniors likely to require math remediation in college with coursework equivalent to college-level developmental education classes. Eligible students who complete the program are able to satisfy math requirements for high school graduation and, upon postsecondary matriculation, to enroll directly in credit-bearing coursework toward a college degree.
Researchers at the Center for Education Policy Research at Harvard University and Vanderbilt Peabody College of Education are partnering with the SAILS Program and Tennessee state leadership to conduct an evaluation of SAILS. Using a range of quantitative and qualitative research methods, the study will examine the impact of participation in SAILS on students’ short- and long-term outcomes and investigate the mechanisms by which the program may promote students’ postsecondary success.
This case illustrates how the work of leaders and analysts in the Delaware Department of Education (DDOE) and the agency’s partnership with the Strategic Data Project (SDP), a program of the Center for Education Policy Research at Harvard University, created momentum for statewide policy change. By exploring Delaware leaders’ use of data and analytics to challenge assumptions and inform the development of better policies and practices, the case illustrates the importance of leadership, analytic and technical competency, and strategic partnerships when leading education reform. The case specifically highlights the power of human capital analytics to diagnose the current status of Delaware’s educator pipeline, from preparation through development and retention, and how effectively communicating with these analyses built coalitions of support and drove a culture of data use at both the state and district level. Download the case study [SDP website]
New observation instruments used in research and evaluation settings assess teachers along multiple domains of teaching practice, both general and content-specific. However, this work infrequently explores the relationship between these domains. In this study, we use exploratory and confirmatory factor analyses of two observation instruments - the Classroom Assessment Scoring System (CLASS) and the Mathematical Quality of Instruction (MQI) - to explore the extent to which we might integrate both general and content-specific views of teaching. Importantly, bi-factor analyses that account for instrument-specific variation enable more robust conclusions than in the existing literature. Findings indicate that there is some overlap between instruments, but that the best factor structures include both general and content-specific practices. This suggests new approaches to measuring mathematics instruction for the purposes of evaluation and professional development.
The Strategic Data Project (SDP) designed the College-Going Diagnostic to inform state and district leaders about high school graduation, college enrollment, and persistence rates, and to identify potential areas for action to increase student achievement in high school, preparedness for college, and postsecondary attainment. In 2012, the New York State Education Department (NYSED) and SDP launched an initiative to assess student attainment in New York State public schools and identified a set of questions to understand how first-time ninth-grade students in New York State public schools progress through high school and into college. The interactive graphics supporting the findings in this brief are designed to promote deeper engagement with the analysis through data exploration. View the interactive report on the SDP website
The Strategic Data Project (SDP) partnered with the Cleveland Metropolitan School District (CMSD) to expand the use of data to inform policy and management decisions within the district. As part of this partnership, SDP collaborated with CMSD to analyze the high school graduation and college-going outcomes of CMSD students.
SDP launched a College-Going Diagnostic research collaboration with the Tennessee Department of Education (TDOE) as part of a larger partnership between the two organizations. In defining the scope of work for this project, TDOE policymakers were particularly interested in investigating how students’ transition from high school to postsecondary education differed across schools, regions, and student subgroups. The SDP College-Going Diagnostic examines the extent to which Tennessee high school students faced specific barriers to postsecondary enrollment, such as inadequate academic preparation for college-level coursework and limited college access.
The Strategic Data Project (SDP) collaborated with the state of Delaware to illuminate patterns related to three critical areas of policy focus for the state: the recruitment, placement, and success of new and early career teachers; teacher impact on student learning; and teacher retention and the stability of the state’s teacher workforce.
The Strategic Data Project (SDP) collaborated with the Colorado Department of Education (CDE) and the Colorado Education Initiative (CEI) to conduct SDP’s Human Capital Diagnostic—a series of high-leverage, policy-relevant analyses related to the state’s educator workforce. SDP’s Human Capital Diagnostic investigates questions on five critical topics related to teachers and teacher effectiveness: recruitment, placement, development, evaluation, and retention.
The Strategic Data Project (SDP) partnered with the Colorado Department of Education (CDE) and the Colorado Education Initiative (CEI) to investigate whether Colorado public school students who are academically behind their peers are disproportionately placed with novice teachers.
The Strategic Data Project (SDP) partnered with the Colorado Department of Education (CDE) and the Colorado Education Initiative (CEI) to investigate statewide trends in students’ high school graduation and their enrollment and persistence in college. This brief summarizes several of the key findings from this research collaboration.
The SDP Strategic Use of Data Rubric is a resource to provide direction and support to education organizations in their efforts to transform their use of data. The rubric establishes a common language and framework to more clearly illustrate what effective data use at the system level can look like. Learn more and download the rubric [SDP website]
An important assumption underlying meaningful comparisons of scores in rater-mediated assessments is that measurement is commensurate across raters. When raters differentially apply the standards established by an instrument, scores from different raters are on fundamentally different scales and no longer preserve a common meaning and basis for comparison. In this study, we developed a method to accommodate measurement noninvariance across raters when measurements are cross-classified within two distinct hierarchical units. We formulated cross-classified graded response models with random item effects and used random discrimination and threshold effects to test, calibrate, and account for measurement noninvariance among raters. By leveraging empirical estimates of rater-specific deviations in the discrimination and threshold parameters, the proposed method allows us to identify noninvariant items and empirically estimate and directly adjust for this noninvariance within a cross-classified framework. Within the context of teaching evaluations, the results of a case study suggested substantial noninvariance across raters and that establishing an approximately invariant scale through random item effects improves model fit and predictive validity.
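The noninvariance problem described above can be made concrete with a small simulation. The sketch below (illustrative only; the parameter values and rater deviations are assumed, not taken from the study) implements the graded response model for a single item and shows that when raters carry their own shifts in the discrimination and threshold parameters, the same lesson's latent quality maps to different expected scores depending on who rates it.

```python
import numpy as np

def grm_probs(theta, a, thresholds):
    """Graded response model for one polytomous item.
    P(Y >= k | theta) = logistic(a * (theta - b_k)); returns the
    probability of each category 0..K."""
    b = np.asarray(thresholds, dtype=float)
    p_ge = 1.0 / (1.0 + np.exp(-a * (theta - b)))  # P(Y >= 1), ..., P(Y >= K)
    p_ge = np.concatenate(([1.0], p_ge, [0.0]))
    return p_ge[:-1] - p_ge[1:]

# Hypothetical rater-specific deviations (random item effects): each
# rater applies the shared item parameters with their own perturbations
# to the discrimination (a) and a common shift to the thresholds (b).
rng = np.random.default_rng(1)
base_a, base_b = 1.5, np.array([-1.0, 0.0, 1.0])
raters = [(base_a + rng.normal(0, 0.3),
           base_b + rng.normal(0, 0.4)) for _ in range(3)]

theta = 0.5  # the same lesson's latent quality for every rater
for i, (a_r, b_r) in enumerate(raters):
    probs = grm_probs(theta, a_r, b_r)
    expected = probs @ np.arange(len(probs))
    print(f"rater {i}: expected score = {expected:.2f}")
```

The proposed method runs this logic in reverse: by estimating the rater-specific deviations as random effects within the cross-classified model, it can flag noninvariant items and adjust scores back onto a common scale.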
This report examines Albuquerque Public Schools (APS) students’ high school performance, college enrollment, and college persistence patterns, and compares these patterns across a variety of student characteristics and academic experiences. To conduct the analyses, researchers connected APS administrative student data (including demographics and test scores) to college enrollment records and to student surveys conducted by SDP Data Fellows at several APS high schools. These data sources allowed the diagnostic to track students’ progress through high school to graduation, and to examine their college-going aspirations and actual college outcomes.