
Sunday, December 2, 2012

Merryl Tisch and Annual Professional Performance Review (APPR)

FRIDAY, NOVEMBER 30, 2012


APPR Is Not About Feedback


New York State Regents Chancellor Merryl Tisch is putting pressure on the UFT to agree to an evaluation system with the NYCDOE.

She writes the following in the NY Post:
In February, Gov. Cuomo stood with state Education Commissioner John B. King Jr. and the heads of city and state teachers unions to announce agreement on a new evaluation system for teachers and principals. The new law was a groundbreaking accord that laid the foundation for a fair, responsible process to provide educators with constructive evaluations that can strengthen teaching and learning. 
Nine months later, more than 600 school districts around the state have submitted evaluation plans, and Commissioner King has approved more than 250 of those plans. Unfortunately, New York City isn’t one of those districts.

This isn’t just about money, although the city stands to lose hundreds of millions of dollars if it doesn’t have an approved plan in place by Jan. 17. And it’s not about a “gotcha” system to get rid of teachers. This is about giving teachers and principals the tools they need to strengthen their skills and improve their instruction.

Research and common sense tell us the best way to improve student performance is to make sure that every child is in a class headed by a great teacher and every school is run by a great principal.
Common sense tells us something else: Just like the rest of us, teachers and principals need objective feedback to get better at their jobs. An effective evaluation plan lets educators receive professional development tailored to their needs, and gives top practitioners the opportunity to serve as mentors for their colleagues.

That’s why the state Board of Regents included implementation of strong evaluation programs as a key pillar of its education-reform agenda.
Now if she were being honest about this system being about giving feedback to teachers and helping them improve, that would be all well and good.
But she isn't.
The system is rigged against teachers - as Carol Burris has noted here and Sean Feeney has noted here.
Merryl Tisch says test scores are an "essential component" of teacher evaluations.
John Kuhn explains here why putting such high stakes on standardized tests is damaging to students.
Merryl Tisch isn't interested in giving feedback to teachers, improving schools or giving students a better education.
She's interested in giving the education reformers the tools they need to shed expensive teacher salaries at will.
That's what the unworkable teacher observations are about, that's what the endless standardized testing is about, that's what the algorithm developed by the state to measure so-called student growth is about.
This is a "gotcha" system set up to clear the rolls of as many teachers as possible and make the profession into a right-to-work job.
Unfortunately, because the sell-outs at NYSUT and the UFT agreed to this stuff, that's exactly what is going to happen.
Merryl Tisch can make believe this system is "for the kids" all she wants (and notice the usual "WE HAVE NO TIME!" urgency in her propaganda piece in the Post, straight out of the Shock Doctrine playbook).
This system is for the education reform criminals, the hedge fund managers, the for-profit and quasi-non profit charter operators and the privatizers.
How teachers are evaluated has become one of the big issues in the ongoing strike by Chicago public school teachers as well as in the many debates on school reform being conducted around the country.
Assessment experts say that the method of using student standardized scores to gauge a teacher’s effectiveness is unreliable, but reformers still insist on using this “value-added” method of evaluation. Some reformers, such as Chicago Mayor Rahm Emanuel, want as much as half of a teacher’s evaluation to be linked to student test scores.
“Value added” scores sometimes label very effective teachers as ineffective, and vice versa. How can that happen? Here’s a case that tells you how an excellent teacher got a low value-added score. This story is not an aberration.
It was written by Sean C. Feeney, principal of The Wheatley School in New York State and president of the Nassau County High School Principals’ Association. He is the co-author of an open letter of concern about New York state’s new test-based educator evaluation system that has been signed by thousands of people.
By Sean C. Feeney
New York State schools are back in session! With the new school year comes a new responsibility for principals across the state: the need to inform teachers of their “growth score” based on the New York State assessments their students took in the spring. This teacher growth score 
is one of the parts of the New York State APPR system that was implemented last year in a rushed manner against the very public objection of over one-third of the New York State principals along with thousands of other teachers, administrators, parents and concerned citizens (see www.newyorkprincipals.org for more information).
These state-supplied scores were the missing piece in a teacher’s final end-of-year score — potentially determining whether or not a teacher is deemed Ineffective and therefore subject to requiring a Teacher Improvement Plan (TIP) within 10 days of the start of the school year. These scores were not available to schools until the third week of August. So there you have it: high-stakes information that can potentially have a serious impact on a teacher’s career being supplied well past any sort of reasonable timeframe. Welcome to New York’s APPR system!
As a principal, I sat with each of the teachers who received a score from the state and tried to explain how the state arrived at these scores out of 20 points. One of the first teachers with whom I did this was Ashley.
Ashley is the type of teacher that all parents want for their child: smart in her content area and committed to making a difference in her students’ lives. Ashley works incessantly with her students, both inside and outside of the classroom.

During her free time, Ashley can always be found working with small groups of students in the hallways or any free space in the area. She has taken our school’s math teams on weekend trips as our mathematics team has found success in various competitions. Over the past four years, 91% of her 179 Algebra 1, Geometry or Algebra 2/Trigonometry students have passed the corresponding Regents examination on their first attempt.
At the end of every year, students and parents send in countless notes of thanks to Ashley for her tireless efforts. Ashley has worked with our highest achieving students as well as many of those who struggle with mathematical understanding. For those who struggle, Ashley has a well-deserved reputation for making them more confident, successful and comfortable with the material. Last spring, Ashley was recognized as the Parent Teacher Organization teacher of the year.
So what score did the state assign Ashley? Well, she earned a score of 7 out of 20 points. According to the state’s guidelines, this makes Ashley a Developing teacher. Goodness. To those of us who know Ashley and have had the pleasure of working with her over the years, this is a jaw-dropping result. Ashley’s score defies all understanding of who she is as an educator. Her score flies in the face of how she is valued in our school and what she has done for students in our school. Her score contradicts the thoughtful evaluations given to her over the past five years.
How, then, is one to understand this score?
Officials at our State Education Department have certainly spent countless hours putting together guides explaining the scores. These documents describe what they call an objective teacher evaluation process that is based on student test scores, takes into account students’ prior performance, and arrives at a score that is able to measure teacher effectiveness. Along the way, the guides are careful to walk the reader through their explanations of Student Growth Percentiles (SGPs) and a teacher’s Mean Growth Percentile (MGP), impressing the reader with discussions and charts of confidence ranges and the need to be transparent about the data. It all seems so thoughtful and convincing! After all, how could such numbers fail to paint an accurate picture of a teacher’s effectiveness?
(One of the more audacious claims of this document is that the development of this evaluative model is the result of the collaborative efforts of the Regents Task Force on Teacher and Principal Effectiveness. Those of us who know people who served on this committee are well aware that the recommendations of the committee were either rejected or ignored by State Education officials.)
One of the items missing from this presentation, however, is an explanation of how State officials translated SGPs and MGPs into a number from 1 to 20. In order to find out how the State went from MGPs to a teacher effectiveness score out of 20 points, one needs to refer to the 2010-11 Beta Growth Model for Educator Evaluation Technical Report. Why a separate document for explaining these scores? Most likely because there are few State officials who are fluent in the psychometrics necessary to explain how this part of our APPR system works.
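To make the pipeline described above concrete, here is a minimal sketch of how an SGP-based rating might be computed. This is emphatically not the state's actual model: NYSED's beta model uses quantile regression over multiple prior-score variables, while this sketch defines a student's "academic peers" as students with the same prior score and assumes a plain linear rescaling of the MGP onto the 20-point scale purely for illustration.

```python
# Illustrative sketch of an SGP -> MGP -> 20-point score pipeline.
# Assumptions (not from the state's documents): peers share one prior
# score, and the MGP-to-score mapping is a simple linear rescaling.

from collections import defaultdict

def student_growth_percentiles(records):
    """records: list of (prior_score, current_score) pairs, one per student.
    A student's SGP is the percentage of academic peers (here: students
    with the same prior score) whose current score falls below theirs."""
    peers = defaultdict(list)
    for prior, current in records:
        peers[prior].append(current)
    sgps = []
    for prior, current in records:
        group = peers[prior]
        below = sum(1 for score in group if score < current)
        sgps.append(100.0 * below / len(group))
    return sgps

def mean_growth_percentile(sgps):
    """A teacher's MGP is the mean of her students' SGPs."""
    return sum(sgps) / len(sgps)

def growth_score(mgp):
    """Map an MGP (0-100) onto the 0-20 point scale -- a linear
    rescaling assumed here purely for illustration."""
    return round(mgp / 100 * 20)

# Hypothetical class of six students (prior score, current score):
records = [(70, 85), (70, 80), (70, 75), (82, 90), (82, 88), (82, 84)]
mgp = mean_growth_percentile(student_growth_percentiles(records))
print(growth_score(mgp))
```

Note what the sketch makes visible: every student is ranked only against peers, so in any peer group someone must land near the bottom regardless of how much everyone learned, and the final score depends entirely on how the rescaling thresholds are drawn.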
It is beyond belief that the state feels it is perfectly fine to use a statistical model still in a beta phase to arrive at these amorphous teacher effectiveness scores. I make it a point not to use beta software on my computer, for I do not want something untested and filled with bugs to contaminate the programs that are working fine on my machine. It is a shame that the State does not have the same opinion regarding its reform initiatives.
As explained in the technical paper, the SGP model championed by New York State claims to account for students who are English Language Learners (ELL), students with disabilities (SWD) and even economically disadvantaged students as it determines a teacher's adjusted mean growth percentile. While the statistical explanation underlying the SGP model is carefully developed, nowhere do the statisticians justify the underlying cause for any change in student score measured. In other words, what is the research basis for attributing any change in score from year to year to the singular variable of a teacher? The reason this is never explained is that there is virtually no research justifying the attribution of a change in a student's score from year to year solely to the teacher.
So if it is not solely the teacher who caused the change in score, to what should one attribute a change in student score? Well, that is a question that continues to challenge statisticians and educational researchers. Despite the hopes and declarations of so many of our present-day “reformers,” we simply do not have the tools necessary to quantify the impact a single teacher has on an individual student’s test score over the course of time. Derek Briggs presented a critique of the use of SGPs in this paper.
How can one explain Ashley’s shockingly low score, however? As a principal who has always availed himself of data when evaluating teachers, I would sit down and have a conversation about the test results so that I could put them in context. Here is what we know about the context of Ashley’s score:
* This year, Ashley’s score was based on her two eighth-grade classes, not the results of her Regents-level classes.
* The two eighth-grade classes followed different curricula: one was an Algebra course and the other was a Math 8 course.
* The Algebra course is geared towards the Regents exam, which is a high-school-level assessment that is beyond the mathematical level of the NYS Math 8 examination. Ninety-one percent of Ashley’s students in this class passed the Regents Algebra 1 examination. There is different content on the Math 8 exam, which can make it a challenge for some of our weaker Algebra students. In fact, of the students who took the Algebra course, one-quarter passed the Regents examination but scored below proficiency on the Math 8 exam.
* In the two weeks prior to the three-day administration of the Math 8 exam in April 2012, students in Ashley’s class had one week of vacation followed by three days of English testing. In the two weeks leading to the beginning of the Math 8 exam, Ashley saw her class only three times.
Rather than place the student results in context, the State issued a blind judgment based on data that was developed through unproven and invalid calculations. These scores are then distributed with an authority and “scientific objectivity” that is simply unwarranted. Along the way, teacher reputations and careers will be destroyed.
Despite the judgment of the New York State Education Department, Ashley remains a model teacher in our school: beloved by students and parents; respected by colleagues and supervisors. She continues to work on perfecting her practice and helping her students gain confidence and skills. My hope, of course, is that she will continue to feel that she is part of a profession that respects teachers and students alike, not one that reduces them to a poorly conceived and incoherent number.
Follow The Answer Sheet every day by bookmarking www.washingtonpost.com/blogs/answer-sheet.
By   |  12:15 PM ET, 09/13/2012

Annual Professional Performance Review (APPR)

Download: Complete Bulletin. PDF file.

Introduction
Section 100.2 of the Commissioner’s Regulations regarding the Annual Professional Performance Review (APPR) requires school districts and BOCES to annually evaluate the performance of probationary and tenured teachers providing instructional and pupil personnel services. The procedures for evaluating teachers are a mandatory subject of collective bargaining. This bulletin includes amendments to Section 100.2 of the Regulations to conform with Chapter 57 of the Laws of 2007 (CR 100.2(o)(2)(iii)2(b)(vi)).
Regulatory Information
This bulletin provides information on Section 100.2 (o) of the Commissioner’s Regulations, Annual Professional Performance Review. The regulation specifies formal procedures for the review of the performance of teachers, which must be determined by the school district or BOCES, consistent with the requirements of Article 14 of the Civil Service Law. The bulletin provides advice to local leaders to assist them in the implementation of these procedures.
Who is subject to the APPR Regulations?
Each school district and BOCES must adopt an annual professional performance review plan affecting the following individuals:
  • All teachers providing instructional services. Evening school teachers of adults of nonacademic, vocational subjects are exempt from the APPR requirements.
  • All teachers providing pupil support services including: school attendance teacher, school counselor, school dental hygiene teacher, school nurse-teacher, school psychologist and school social worker.

