School Report Cards vs. Teacher Ratings: What Do They Measure?

February 29, 2012

By Mary Frost
Brooklyn Daily Eagle

NEW YORK – Last Friday, the New York City Department of Education (DOE) released ratings for 18,000 public school teachers.

The evaluations, officially called Teacher Data Reports, are based on incomplete state test results in English and math for a group of 4th to 8th graders over a three-year period.

Originally intended as a pilot study to help teachers increase their effectiveness, the ratings were never considered to be especially accurate, nor were they designed to be made public.

For example, at P.S. 8 in Brooklyn Heights, teachers averaged 51.3 percent over three years (only six teachers received a multi-year rating), with stunning margins of error for each rating. One teacher received a rating of 47 percent in English, with a margin of error spanning 11 to 81 percent; another received a 61 percent in math, with a margin of error of 33 to 83 percent.

Like the city’s School Progress Reports (school report cards), teacher ratings have raised a number of questions about what is being measured and how accurately it reflects the education of children at that school.

Many observers have noted that some of the city’s most sought-after schools have received mediocre ratings on school report cards. Likewise, while the results are still new, a number of parents say that some of their schools’ best teachers have received low ratings.

Here’s a comparison of key differences and similarities between School Report Cards and Teacher Ratings:

School Report Cards

Based Mostly on Improvement on State Tests: Weight is given not to students’ actual performance on state tests, but to their improvement on the tests. While this is advantageous for poorly performing schools, it works against schools with little room to improve.

Criteria: Over their five-year history, School Progress Reports have received criticism for using confusing criteria to calculate a school’s final grade.

High Volatility: Over five years, schools’ grades have swung wildly from year to year. (To counter criticism of this volatility, the DOE deliberately stabilized the reports this past year, while doubling the number of “failing” schools.)

Based Mostly on Invalid Exams: Report cards for 2007, 2008 and 2009 were based on state exams from those years, which the state later called unreliable and subsequently changed.

Graded On a Curve: This year, like last year, the city graded schools on a curve; a sketch at the end of this list shows how such a curve works. Regardless of actual test scores, the top 25 percent of all schools received A’s, the next 35 percent received B’s, 30 percent C’s, 7 percent D’s, and the bottom 3 percent F’s.

Not Correlated to Actual School Quality: The most sought-after schools have received low grades, and some of the most dangerous schools in the city have received A’s and B’s.

Who Was Rated: Public schools and charter schools.
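
To make the curve concrete, here is a minimal sketch in Python, written for illustration only (the function and cutoffs are assumptions, not anything drawn from a DOE system). It maps a school’s percentile rank to a letter grade using the shares reported above.

# Illustrative only: grade a school by where it ranks citywide, using the
# fixed shares reported above (25/35/30/7/3 percent).
def curve_grade(percentile: float) -> str:
    if percentile >= 75:     # top 25 percent
        return "A"
    elif percentile >= 40:   # next 35 percent
        return "B"
    elif percentile >= 10:   # next 30 percent
        return "C"
    elif percentile >= 3:    # next 7 percent
        return "D"
    else:                    # bottom 3 percent
        return "F"

# A fixed share of schools gets each grade every year, so a school at the
# 50th percentile earns a B regardless of how raw scores move citywide.
print(curve_grade(50))  # prints "B"

In other words, a curved grade reflects a school’s rank against other schools, not whether its own scores rose or fell.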

Teacher Ratings

Based Solely on Improvement on State Tests: A student’s “predicted performance” on state tests, adjusted according to “key characteristics” (like poverty level), is compared with the student’s actual performance to determine the teacher’s contribution (“added value”); see the sketch at the end of this list. Since high-performing students can’t improve as much as average students, many of their teachers were penalized.

Criteria: The complex statistical technique called “value-added measurement” has been criticized as confusing and as producing large margins of error.

High Volatility: A study by the Annenberg Institute found high volatility over the three years evaluated. A third of the English teachers (and a quarter of the math teachers) who got the top grade in 2007 received the worst grade in 2008.

Based on Invalid Exams: The ratings being released are based on exams from 2007, 2008 and 2009, which the state later called unreliable and subsequently changed.

Graded On a Curve: The city graded teachers on a curve: the top 5 percent of teachers received the grade of “high”; the next 20 percent scored “above average”; 50 percent were ranked “average”; 20 percent “below average”; and the bottom 5 percent “low.”

Not Correlated to Actual Teacher Quality: Though the results are still new, many parents are reporting that some of their schools’ best teachers received low ratings.

Who Was Rated: 18,000 public school teachers (4th through 8th grade). No charter school teachers or special education teachers were rated in this round.
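
The value-added idea can be illustrated with a minimal sketch, again in Python and again hypothetical: a model predicts each student’s score from prior results and “key characteristics,” and the average gap between actual and predicted scores is treated as the teacher’s contribution. The function and numbers below are invented for illustration, not the DOE’s actual model.

# Hypothetical illustration of the value-added calculation described above.
# "predicted" is what a model expects a student to score; "actual" is what
# the student earned. All numbers are invented.
def added_value(students: list[dict]) -> float:
    gaps = [s["actual"] - s["predicted"] for s in students]
    return sum(gaps) / len(gaps)  # average gap for one teacher's class

# A teacher of already high-scoring students has little headroom for gains:
high_performers = [{"predicted": 94.0, "actual": 95.0},
                   {"predicted": 96.0, "actual": 95.0}]
average_students = [{"predicted": 70.0, "actual": 78.0},
                    {"predicted": 65.0, "actual": 74.0}]

print(added_value(high_performers))   # 0.0 -> looks mediocre in this toy model
print(added_value(average_students))  # 8.5 -> looks strong

This headroom problem is one reason teachers of high-performing students could land low in the ratings even when their students score very well.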


A Sampling of Brooklyn Schools

School Report Cards: Brooklyn Heights’ P.S. 8 (District 13) received an overall C on its most recent report card.
Teacher Ratings: P.S. 8 teachers averaged 51.3 percent (only six teachers were rated over multiple years).

School Report Cards: P.S. 9 (District 13 in Prospect Heights) received a C.
Teacher Ratings: P.S. 9 teachers averaged 13.5 percent (only four teachers were rated over multiple years).

School Report Cards: In District 15, P.S. 261 on Pacific Street received an overall C grade. Test performance was at the C level.
Teacher Ratings: P.S. 261 teachers averaged 48.8 percent (only six teachers were rated over multiple years).

School Report Cards: P.S. 29 on Henry Street in Cobble Hill (District 15) received a C.
Teacher Ratings: P.S. 29 teachers averaged 38.4 percent (10 teachers were rated over multiple years).

School Report Cards: Park Slope’s P.S. 321 on Seventh Avenue (District 15) received an A.
Teacher Ratings: P.S. 321 teachers averaged 55.2 percent (22 teachers were rated over multiple years).

School Report Cards: In Bay Ridge (District 20), P.S. 48 on 18th Avenue received an overall A.
Teacher Ratings: P.S. 48 teachers averaged 44.3 percent (12 teachers were rated over multiple years).

School Report Cards: P.S. 185 on Ridge Boulevard (District 20) received an overall C.
Teacher Ratings: P.S. 185 teachers averaged 38.4 percent (18 teachers were rated over multiple years).

