JACKSONVILLE, FL: Quality Education for All, the privately
funded program that sought to get the best teachers to transfer to our city’s
worst-performing schools (according to state tests), has been in the news. In a
John Kerryesque moment (“I actually voted for [it] before I voted against it,” https://www.youtube.com/watch?v=esUTn6L0UDU),
the school district told 273 teachers (out of 952 in the program) that they had
earned a partial bonus ($5,000), then told them they hadn’t.
Naturally, these teachers were unhappy. They focused on a
discrepancy between their contracts, which led them to believe that
their performance would be based on calculations for QEA schools only, and the
MOU (memorandum of understanding) between the teachers’ union and the district,
which specified that their performance would be based upon calculations for the
district as a whole.
As a math teacher, I got to thinking: what would change?
First, these performance calculations are based upon student
growth, not proficiency; more specifically, upon a teacher’s Value-Added
Measurement (VAM), which is suspect in itself. Setting that aside, why would
the VAM average for the QEA schools be less than the VAM average for the entire
district? That is what these teachers are saying: if they had been measured
against other QEA schools only, they would have qualified for the bonus.
I’m not sure of that. The district has lots of struggling
schools (as measured by state or district tests, which is how they determine
these things) beyond those in the QEA group. 103rd Street? Jefferson
Davis MS, JEB Stuart MS, Ed White HS, Westside HS? All struggling. Arlington
schools: struggling.
This is a growth measurement, not a proficiency measurement.
The school district could shut this down in an instant by
providing these teachers with calculations based upon the QEA
schools only and showing that they wouldn’t have qualified anyway.
But that’s not DCPS style. You don’t question God.
(Actually, anyone who has read the Bible knows that God gets questioned a lot.)
What can we learn from this debacle?
1. Teachers working in QEA schools have the growth
potential to achieve results. They shouldn’t want to be measured against one
another; they should demand a criterion-referenced performance standard, not a
norm-referenced standard, regardless of whether the norm is QEA only or the entire
district. In the Folio Weekly story (http://folioweekly.com/BAIT-AND-SWITCH,15231),
the 3rd-grade reading teacher describes the progress she made with
her children. That progress in itself deserves a reward, not a comparison with other
teachers to determine a reward.
2. VAM is a measurement denounced by professional
statisticians: www.amstat.org/policy/pdfs/asa_vam_statement.pdf;
http://blogs.edweek.org/edweek/teacherbeat/2014/04/statisticians_group_issues_sta.html
3. VAM should not be used to measure the success of
these teachers.
4. DCPS demanded a 3-year commitment from these
teachers; it should have made a 3-year commitment as well.
5. Student learning (correct that: student
performance on the tests) correlates with zip code; that is, with the socio-economic
status of the neighborhood in which the students live. The whole premise of this
program, that teacher quality is the only factor that matters, is false. I
anticipated that many teachers who performed well in high-income
neighborhoods would not meet the performance goals once they moved to these
low-income neighborhoods.
6. Performance on state and district testing alone
is a poor way of measuring student outcomes.
7. Quality Education for All: an experiment
conceived with good motives, but doomed to fail. The devil is in the details.
(And if you don’t believe it has failed, why has the Superintendent recommended
the worst of the schools, as measured by tests, for magnet
conversion after only ONE year? Northwestern MS; A. Jackson HS.)
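The criterion- versus norm-referenced distinction in item 1 can be sketched with a toy example. All the growth scores and the 0.65 cutoff below are invented for illustration; they stand in for whatever bar the district might set.

```python
# Hypothetical growth scores for four teachers (invented numbers).
growth_scores = {"A": 0.9, "B": 0.8, "C": 0.7, "D": 0.6}

# Criterion-referenced: reward everyone who clears a fixed bar.
CRITERION = 0.65  # assumed cutoff for this sketch
criterion_winners = [t for t, g in growth_scores.items() if g >= CRITERION]

# Norm-referenced: reward only those above the group average, so some
# teachers must lose no matter how much growth they all produce.
avg = sum(growth_scores.values()) / len(growth_scores)
norm_winners = [t for t, g in growth_scores.items() if g > avg]

print(criterion_winners)  # ['A', 'B', 'C']
print(norm_winners)       # ['A', 'B']
```

Under the fixed criterion, teacher C is rewarded for real growth; under the norm-referenced rule, C loses simply for being below average, which is the comparison-to-other-teachers problem item 1 objects to.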