This study (see its abstract) found no correlation between value-added results and other widely accepted measures of effective teaching…


Nor did the study find associations between “multiple measure” ratings (which combine value-added measures with observations and other factors) and the amount and type of content covered in classrooms. That finding is potentially important because many states have responded to the Race to the Top grant competitions and other federal initiatives by adopting such multiple-measure evaluation systems for teachers.


It appears to be all bunk.  What value-added models are measuring is one narrow slice of what it means to be an effective teacher.

Most provocative of all are the preliminary results of a study that uses value-added modeling to assess teacher effects on a trait they could not plausibly change, namely, their students’ heights.  The authors found that teachers’ one-year “effects” on student height were nearly as large as their effects upon reading and math….

One big drawback of the value-added method, the studies showed, is that it does not account for the very real fact that teachers vary in effectiveness across different subjects, from science to poetry. Good in math, bad in English; good in English, bad in math. Nothing abnormal about that…

One study found that between 15 and 25 percent of teachers were misranked by typical value-added assessments because the models did not consider that some teachers were more effective at teaching certain subjects (for example, reading versus math) or certain types of students. Such policies will unfairly fire a large number of the wrong teachers.

Furthermore, the study notes, the pool of potential replacements is, at least on average, of lower quality than the pool of current teachers… Firing improves nothing.

A computer model tried this experiment: what would happen if you fired the bottom 10% of teachers based on test scores? The researchers then asked another question: what would happen if, rather than using value-added data to fire the “bottom” teachers, the state instead used it to match all teachers with the subjects and students for which they had demonstrated the strongest results? They found that this approach also increased test scores, and in fact by much more than the teacher-firing model had.
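The logic of that experiment can be sketched with a toy simulation. To be clear, everything in this sketch is an invented assumption for illustration only: the teacher count, the normal distributions of per-subject effects, and the slightly weaker replacement pool are mine, not figures from the study.

```python
import random
import statistics

random.seed(42)
N = 1000  # toy number of teachers (an assumption, not from the study)

# Assume each teacher has a separate "true effect" in math and in reading.
teachers = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]

# Baseline: teachers assigned to subjects arbitrarily (here, math).
baseline = statistics.mean(t[0] for t in teachers)

# Policy A (firing): drop the bottom 10% by observed value-added and
# refill from an applicant pool assumed slightly weaker (mean -0.2).
ranked = sorted(teachers, key=lambda t: t[0])
cut = N // 10
replacements = [(random.gauss(-0.2, 1), random.gauss(-0.2, 1)) for _ in range(cut)]
policy_a = statistics.mean(t[0] for t in ranked[cut:] + replacements)

# Policy B (matching): keep everyone, but pair teachers up and assign each
# pair to subjects so their combined effect is maximized, which keeps
# subject staffing balanced.
policy_b_total = 0.0
for (a_math, a_read), (b_math, b_read) in zip(teachers[::2], teachers[1::2]):
    policy_b_total += max(a_math + b_read, a_read + b_math)
policy_b = policy_b_total / N

print(f"baseline:           {baseline:+.3f}")
print(f"fire bottom 10%:    {policy_a:+.3f}")
print(f"match to subjects:  {policy_b:+.3f}")
```

Under these invented assumptions the matching policy beats the firing policy: firing only swaps the worst decile for average-or-worse recruits, while matching lets every teacher contribute from their stronger subject.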


So, with all the evidence indicating that value-added methods of evaluation are indeed flawed, why are we so hell-bent here in Delaware on throwing ourselves over the cliff? We can see the edge! We can hear the water churning hundreds of feet below! We look up and down and we see no one else doing it… And yet we are being driven at too fast a speed to stop, fast approaching the very lip of the point of no return…


Where is someone who will put a stop to this bill, 334, and either kill it with amendments or put it out of its misery on the floor? We may not know what works. But we do know what doesn’t work. We are hell-bent on purchasing and expanding something that doesn’t work…