Teacher Effectiveness

Blazar, D., Gogolen, C., Hill, H. C., Humez, A., & Lynch, K. (2014). Predictors of Teachers' Instructional Practices.

We extend this line of research by investigating teacher career and background characteristics, personal resources, and school and district resources that predict an array of instructional practices identified on a mathematics-specific observational instrument, MQI, and a general instrument, CLASS. To understand these relationships, we use correlation and regression analyses. For a subset of teachers for whom we have data from multiple school years, we exploit within-teacher, cross-year variation to examine the relationship between class composition and instructional quality that is not confounded with the sorting of "better" students to "better" teachers. We conclude that multiple teacher- and school-level characteristics--rather than a single factor--are related to teachers' classroom practices.
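The within-teacher, cross-year analysis described above can be pictured as a fixed-effects regression. The sketch below is illustrative only, not the authors' code; the data file and column names (mqi_score, pct_low_achieving, teacher_id, year) are hypothetical stand-ins for a teacher-by-year panel.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: one row per teacher-year, with an instructional quality
# score and a class-composition measure.
df = pd.read_csv("teacher_year_panel.csv")

# Teacher fixed effects absorb stable teacher characteristics, so the
# class-composition coefficient is identified only from year-to-year variation
# within the same teacher, sidestepping the sorting of students to teachers.
model = smf.ols(
    "mqi_score ~ pct_low_achieving + C(year) + C(teacher_id)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["teacher_id"]})

print(model.params["pct_low_achieving"])  # within-teacher association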

Kraft, M. A., & Papay, J. P. (2014). Can Professional Environments in Schools Promote Teacher Development? Explaining Heterogeneity in Returns to Teaching Experience. Educational Evaluation and Policy Analysis, 36(4), 476–500.

Although wide variation in teacher effectiveness is well established, much less is known about differences in teacher improvement over time. We document that average returns to teaching experience mask large variation across individual teachers, and across groups of teachers working in different schools. We examine the role of school context in explaining these differences using a measure of the professional environment constructed from teachers’ responses to state-wide surveys. Our analyses show that teachers working in more supportive professional environments improve their effectiveness more over time than teachers working in less supportive contexts. On average, teachers working in schools at the 75th percentile of professional environment ratings improved 38% more than teachers in schools at the 25th percentile after ten years.
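One way to picture the kind of model behind this finding is an experience-by-environment interaction. The sketch below is a rough illustration under assumed data, not the study's specification; the file and variable names (va_score, experience, prof_env, teacher_id, year) are hypothetical.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical teacher-year panel with an effectiveness estimate, years of
# experience, and a school professional-environment rating.
df = pd.read_csv("teacher_effectiveness_panel.csv")

# The interaction lets returns to experience differ by professional
# environment; a positive coefficient would mean teachers improve faster
# in more supportive schools.
model = smf.ols(
    "va_score ~ experience * prof_env + C(year)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["teacher_id"]})

print(model.params["experience:prof_env"])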

Hill, H. C., Gogolen, C., Litke, E., Humez, A., Blazar, D., Corey, D., Barmore, J., et al. (2013). Examining High and Low Value-Added Mathematics: Can Expert Observers Tell the Difference? In Association for Public Policy Analysis & Management Fall Research Conference. Washington, DC.

In this study, we use value-added scores and video data to mount an exploratory study of high- and low-VAM teachers' instruction. Specifically, we seek to answer two research questions: First, can expert observers of mathematics instruction distinguish between high- and low-VAM teachers solely by observing their instruction? Second, what instructional practices, if any, consistently characterize high- but not low-VAM teachers' classrooms? To answer these questions, we use data generated by 250 fourth- and fifth-grade math teachers and their students in four large public school districts. Preliminary analyses indicate that a teacher's value-added rank was often not obvious to this team of expert observers.

Kane, T. J., Taylor, E., Tyler, J., & Wooten, A. (2011). Identifying Effective Classroom Practices Using Student Achievement Data. The Journal of Human Resources, 46(3), 587–613.

This paper combines information from classroom-based observations and measures of teachers’ ability to improve student achievement as a step toward addressing the challenge of identifying effective teachers and teaching practices. The authors find that classroom-based measures of teaching effectiveness are related in substantial ways to student achievement growth. The authors conclude that the results point to the promise of teacher evaluation systems that would use information from both classroom observations and student test scores to identify effective teachers. Information on the types of practices that are most effective at raising achievement is also highlighted.

Chin, M., Hill, H., McGinn, D., Staiger, D., & Buckley, K. (2013). Using Validity Criteria to Enable Model Selection: An Exploratory Analysis. Association for Public Policy Analysis and Management Fall Research Conference.

In this paper, the authors propose that an important determinant of value-added model choice should be alignment with alternative indicators of teacher and teaching quality. Such alignment makes sense from a theoretical perspective, because better alignment is thought to indicate a more valid system. To provide initial evidence on this issue, they first calculated value-added scores for all fourth- and fifth-grade teachers within four districts, then extracted scores for 160 intensively studied teachers. Initial analyses using a subset of alternative indicators suggest that alignment between value-added scores and alternative indicators differs by model, though not significantly.
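A minimal way to operationalize "alignment" is the correlation between each model's value-added scores and each alternative indicator. The sketch below is hypothetical, not the authors' analysis; the file and column names (va_model_a, va_model_b, mqi_score, student_survey) are invented placeholders.

import pandas as pd

# Hypothetical teacher-level file: one row per intensively studied teacher,
# with value-added scores from two candidate models plus alternative
# indicators of teaching quality.
df = pd.read_csv("teacher_indicators.csv")

# Correlate each model's scores with each alternative indicator; the model
# whose scores track the indicators more closely is the better-aligned one.
alignment = df[["va_model_a", "va_model_b"]].apply(
    lambda va: df[["mqi_score", "student_survey"]].corrwith(va)
)
print(alignment)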

Cascio, E. U., & Staiger, D. O. (2012). Knowledge, Tests, and Fadeout in Educational Interventions.

Educational interventions are often evaluated and compared on the basis of their impacts on test scores. Decades of research have produced two empirical regularities: interventions in later grades tend to have smaller effects than the same interventions in earlier grades, and the test score impacts of early educational interventions almost universally “fade out” over time. This paper explores whether these empirical regularities are an artifact of the common practice of rescaling test scores in terms of a student’s position in a widening distribution of knowledge. If a standard deviation in test scores in later grades translates into a larger difference in knowledge, an intervention’s effect on normalized test scores may fall even as its effect on knowledge does not. We evaluate this hypothesis by fitting a model of education production to correlations in test scores across grades and with college-going using both administrative and survey data. Our results imply that the variance in knowledge does indeed rise as children progress through school, but not enough for test score normalization to fully explain these empirical regularities.
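The rescaling mechanism can be made concrete with a small worked example (the numbers below are illustrative, not estimates from the paper). Let \Delta K be an intervention's constant effect on knowledge and \sigma_g the standard deviation of knowledge in grade g. The effect measured in normalized test-score units is

\[
  \beta_g = \frac{\Delta K}{\sigma_g}.
\]

If the knowledge distribution widens from \sigma_1 = 1.0 to \sigma_5 = 2.0, the same knowledge gain \Delta K = 0.2 appears as

\[
  \beta_1 = \frac{0.2}{1.0} = 0.20\ \text{SD}, \qquad \beta_5 = \frac{0.2}{2.0} = 0.10\ \text{SD},
\]

an apparent fadeout in test-score units even though the effect on knowledge has not changed.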

Bacher-Hicks, A., Chin, M., Hill, H., & Staiger, D. (Working Paper). Explaining Teacher Effects on Achievement Using Measures from Multiple Research Traditions.

Researchers have identified many characteristics of teachers and teaching that contribute to student outcomes. However, most studies investigate only a small number of these characteristics, likely underestimating the overall contribution. In this paper, we use a set of 28 teacher-level predictors drawn from multiple research traditions to explain teacher-level variation in student outcomes. These predictors collectively explain 28% of teacher-level variability in state standardized math test scores and 40% in a predictor-aligned math test. In addition, each individual predictor explains only a small, relatively unique portion of the total teacher-level variability. The first finding highlights the importance of choosing predictors and outcomes that are well aligned, and the second suggests that the phenomenon underlying teacher effects is multidimensional.
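The headline shares correspond to the R-squared from a teacher-level regression of estimated teacher effects on the full predictor set. The sketch below is a rough illustration under assumed data, not the authors' pipeline; the file and column names (state_test_effect, aligned_test_effect, pred_*) are hypothetical.

import pandas as pd
import statsmodels.api as sm

# Hypothetical teacher-level file: one row per teacher, with estimated teacher
# effects on two outcomes and a wide set of predictors.
df = pd.read_csv("teacher_level_data.csv")
predictors = [c for c in df.columns if c.startswith("pred_")]  # e.g., 28 predictors

X = sm.add_constant(df[predictors])

# R-squared gives the share of teacher-level variability in each outcome
# jointly explained by the predictors.
fit_state = sm.OLS(df["state_test_effect"], X).fit()
fit_aligned = sm.OLS(df["aligned_test_effect"], X).fit()
print(fit_state.rsquared, fit_aligned.rsquared)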

2013 May 15

Beyond the Numbers Convening 2013

Wed May 15 (All day) to Fri May 17 (All day)

The third annual SDP Beyond the Numbers Convening, From Classroom to Boardroom: Analytics for Strategy and Performance, was held in Boston, MA on May 15–17, 2013. Participants included SDP Fellows (current and alumni); SDP Fellows’ Supervisors; members of ActivateED, a collaborative effort with Education Pioneers and the Broad Center; SDP Faculty Advisors; Bill and Melinda Gates Foundation representatives; Center for Education Policy Research staff; and key leaders in K–12 education.

2012 Apr 23

Beyond the Numbers: The Power of Analytic Leaders

(All day)

Location: Boston, MA

Two hundred education leaders from across the country gathered in Boston, MA to attend the 2nd annual Strategic Data Project (SDP) Spring Convening, Beyond the Numbers: The Power of Analytic Leaders. Participants engaged in discussions about improving the use of data and analysis in their respective partner agencies, and took part in a rich set of sessions focused on college-going success and the human capital pipeline. Speakers included Amy Briggs, Chief Operating Officer at Student Achievement Partners; John Friedman, Assistant Professor of Public Policy at the Harvard Kennedy School; and...

2012 Sep 13

CEPR Faculty Director Thomas Kane on Panel at The New York Times Schools for Tomorrow Conference

(All day)

Location: New York, NY

The New York Times hosted the second annual Schools for Tomorrow Conference with a focus on "Building a Better Teacher." Four hundred educators, government officials, philanthropists, and investors participated in workshops and debates to explore ways government, the private sector, and parents can create and support the best teachers possible. CEPR Faculty Director Thomas Kane discussed teacher evaluation and measurement along with fellow panelists ...

2013 Jan 30

Thomas Kane "Measures of Effective Teaching" lecture at Clark University

2:00pm to 3:00pm

Location: Clark University

On Wednesday, January 30, CEPR Faculty Director Thomas Kane spoke at Clark University for the annual Dr. Lee Gurel '48 Lecture. His lecture, "Measures of Effective Teaching," focused on two questions: how do we tell who the great teachers are, and how can we produce more of them and more effective teaching? A video recording of the lecture is available online.

2013 Mar 12

Reinventing Teacher Evaluation: Bureaucratic Intrusion or Vital Educational Infrastructure?

12:00pm to 1:30pm

Location: Penn Graduate School of Education, University of Pennsylvania

CEPR Faculty Director Tom Kane will speak at the Penn Graduate School of Education (Penn GSE) at the University of Pennsylvania as part of the lecture series held for the Pre-Doctoral Training Program in Interdisciplinary Methods for Field-Based Research in Education, sponsored by the Institute of Education Sciences (IES). He will discuss findings from the MET project and the importance of providing regular, reliable feedback to teachers.

2013 Apr 27

CEPR-Affiliated Sessions at AERA 2013

Sat Apr 27 (All day) to Wed May 1 (All day)

Location: San Francisco, CA

CEPR affiliates will present at the American Educational Research Association (AERA) Annual Meeting 2013, held April 27–May 1 in San Francisco, CA. The theme of this year's meeting is “Education and Poverty: Theory, Research, Policy, and Praxis.” CEPR presenters include researchers; steering committee members; and SDP Fellows, alumni, and Faculty Advisors. The sessions offer an opportunity to learn more about CEPR research, as well as research conducted outside the center by its affiliates.

2013 Oct 17

SDP Toolkit for Effective Data Use Webinar

(All day)

The Strategic Data Project has released the human capital edition of the SDP Toolkit for Effective Data Use. The SDP Toolkit is a free resource guide for education agency analysts who collect and analyze data on student achievement. Completing the toolkit produces a set of basic, yet essential, human capital and college-going analyses that every education agency should have as a foundation to inform strategic management and policy decisions. The addition of human capital analyses will give users insight into teacher recruitment, placement, evaluation, development, and retention.
