Value-Added Analysis

By the summer of 2009, several school districts were piloting the value-added model, including Dallas, Houston, New York City, Washington, DC, and Chicago. Secretary of Education Arne Duncan was a supporter. When the Obama administration in July 2009 announced Race to the Top, a national initiative that allowed states to compete for federal funds to improve schools, it featured incentives to link teacher evaluations to student test scores.

Value-added metrics emerged from the world of economics. The model used complex algorithms to compute how much value a teacher added to students’ mastery of math and English, as measured by standardized tests. It tracked individual teachers over the course of several years to determine whether their students’ test scores consistently improved, declined, or stagnated.
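The core idea can be sketched in a few lines of code. The Python below is a deliberately minimal illustration on synthetic data, not any district’s actual model: it predicts each student’s expected score from the prior year’s score, then treats the average amount by which a teacher’s students beat or missed that prediction as the teacher’s “value added.” All names and numbers here are invented for the example.

```python
# A minimal sketch of the value-added idea, using synthetic data;
# real district models were far more elaborate and proprietary.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "teacher": rng.integers(0, 20, n),        # 20 hypothetical teachers
    "prior_score": rng.normal(300, 40, n),    # last year's test score
})
# Invented relationship: this year's score mostly reflects prior achievement.
df["score"] = 0.8 * df["prior_score"] + rng.normal(60, 25, n)

# Step 1: predict each student's expected score from the prior year's score.
X = np.column_stack([np.ones(n), df["prior_score"]])
beta, *_ = np.linalg.lstsq(X, df["score"], rcond=None)
df["expected"] = X @ beta

# Step 2: a teacher's "value added" is the average amount by which his or
# her students beat (or fell short of) their expected scores.
value_added = (df["score"] - df["expected"]).groupby(df["teacher"]).mean()
print(value_added.sort_values(ascending=False).head())
```

In this toy version, a teacher whose students consistently outperform their statistical expectations earns a positive rating regardless of where those students started, which is the property supporters emphasized.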

Support. Value-added supporters believed it could be a useful tool for school superintendents, principals, and parents to hold teachers accountable. They argued that it controlled for socioeconomic differences among students because it rated teachers on how much their students improved. Thus, a teacher was judged not on standard grade-level expectations for students, but on the students’ progress. Teachers in low-income areas, for example, with fewer students performing at grade level, could still receive an effective rating so long as the majority of their students made significant progress.

Criticism. The method’s critics, however, charged that its calculations were misleading and often wrong. For one thing, researchers in each jurisdiction were free to decide which variables to include or exclude, such as student family income, parent educational level, race, or proficiency in English. Thus, the same set of data could generate different results depending on the variables selected. This also meant that results could not be compared across states or districts, because the methodology was not consistent. The model also failed to factor in important information, such as whether a class was team-taught or whether a student or teacher had been absent for prolonged periods. Finally, it depended for its findings on standardized tests, whose own validity had been the subject of intense debate for decades.
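The critics’ first objection, that the choice of control variables drives the results, can be made concrete by extending the sketch above. In the hypothetical data below, some teachers serve more low-income students; rating them with and without an income variable produces visibly different numbers from the same data. Every variable and coefficient here is invented for illustration.

```python
# Hypothetical illustration of the critics' point: the same synthetic data,
# scored under two different covariate choices, yields two different ratings.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "teacher": rng.integers(0, 20, n),
    "prior_score": rng.normal(300, 40, n),
})
# Suppose higher-numbered teachers serve more low-income students, and
# (in this invented data) low income depresses scores by 10 points.
df["low_income"] = (rng.random(n) < df["teacher"] / 20).astype(int)
df["score"] = (0.8 * df["prior_score"] - 10 * df["low_income"]
               + rng.normal(60, 25, n))

def value_added(data, covariates):
    """Average residual per teacher after regressing scores on covariates."""
    X = np.column_stack([np.ones(len(data))] + [data[c] for c in covariates])
    beta, *_ = np.linalg.lstsq(X, data["score"], rcond=None)
    residual = data["score"] - X @ beta
    return residual.groupby(data["teacher"]).mean()

# Excluding the income variable makes teachers of poorer students look worse;
# including it shifts the ratings. Same data, different answers.
without_income = value_added(df, ["prior_score"])
with_income = value_added(df, ["prior_score", "low_income"])
print((without_income - with_income).round(1))
```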

One issue that both supporters and critics agreed on was that value-added should be only one component in a teacher’s evaluation. They differed, however, on the weight it should be given, with recommendations ranging from a high of 50 percent to a low of 3 percent.

At the LA Times, reporters and editors knew that value-added analysis had its limitations. But the more Felch and Song researched it, the more they came to believe that it was likely the best method available for assessing a teacher’s abilities. As Felch says: “This was the key that these researchers were using to kind of unlock this world, where we suddenly were able to see dynamics that were going on that had been blurred before.” They decided to see whether the value-added approach could work for teachers in Los Angeles public schools.

Do it here? To do a valid value-added analysis, researchers required several consecutive years of student test scores linked to teachers. The LAUSD had been collecting data from the California Standards Tests (which it adopted in 2002) for seven years. Felch and Song proposed to their editors, Marquis and Shuster, that the Times try to obtain the LAUSD data. If successful, the Times could hire an expert in value-added analysis to rate the LAUSD teachers. The paper could then post the results on its website, along with a series of articles putting the data in context. To increase the chance of influencing real policy change, the paper might even name the teachers.

Marquis and Shuster thought the idea had potential, but believed it unlikely LAUSD would release the data. AME Lauter agreed that the prospects for LAUSD cooperation were slim. But he loved the idea, and felt that if they could get the data, the Times would be in a position to provide a valuable public service.

Lauter also worried about the cost, especially the expense of an outside consultant. While Times management wanted to support ambitious journalism, 2009 had been a particularly bad year financially for newspapers. Nonetheless, Lauter advised the reporters to push ahead. In the meantime, he sought funding from the Hechinger Institute on Education and the Media, an arm of Teachers College at Columbia University that supported major journalism projects focused on education. [6]



[6] The Hechinger Institute in August 2010 awarded the Times a $15,000 grant that the paper used to help defray the cost of the consultant.