Monday, March 14, 2016

The Peer Validator Scam

The above letter says that fair evaluations are over. Well, it doesn't say that directly, but I'm sure that you can see what is written there just as well as I can.

Adam, what were you thinking when you signed this?

Historically, observations were supposed to help support the teacher. Teaching for the 21st Century outlined the process to be used, with Component A and Component B, pre- and post-observation feedback and reports, and never anything put into your personnel file that was unsigned.

See ARTICLE TWENTY-ONE: DUE PROCESS AND REVIEW PROCEDURES in the UFT Contract (see p. 110):
A. (1) "No material derogatory to a teacher's conduct, service, character or personality shall be placed in the [teacher] files unless the teacher has had an opportunity to read the material. The teacher shall acknowledge that he/she has read such material by affixing his/her signature on the actual copy to be filed, with the understanding that such signature merely signifies that he/she has read the material to be filed and does not necessarily indicate agreement with its content."

I think it is astonishing that Adam Ross, the Attorney for the UFT, would sign away a right that is clearly given to all members in the Collective Bargaining Agreement.

By the way, no one wanted me to see this letter, above. I was the paralegal at a 3020-a, and the private Attorney I was working with and I submitted a Motion To Dismiss Any Unsigned Documents from the hearing. The DOE Attorney, Nicole Andrade, argued that this could not happen, because the UFT agreed to the submission of unsigned documents. I really did not believe this. We asked to see this agreement, and the Arbitrator ordered that we get a copy. That's how the letter got into my possession.

Article 21 of the CBA A(1) requires the removal of any and all documents submitted to 3020-a Arbitration and/or a personnel file which do not have the signature, or protest, written by the Respondent teacher on the document. Without any signature or protest by the Respondent, the Arbitrator at a 3020-a must assume that the Respondent never received the document under review. If the Respondent did not see the document requested by the Department to be placed into evidence, the document must not be allowed into evidence, as placing such a document into evidence would be a violation of not only the CBA, but also Respondent's right to due process.

Arbitration supports the mandate of the CBA and State Law, which require Arbitrators to determine whether the requirements of due process and just cause were met before, and after, the employee was disciplined or charged.

Arbitration has informal rules of evidence which are designed to allow both parties wide latitude in bringing forth facts to present their side of the story; however, the CBA contains language which prohibits the arbitrator from looking beyond the contract. The most important point here is that due process requires that Respondent be informed of the charges made against him/her and his/her pedagogy, and then be provided with reasonable access to material that could be used in his/her defense.

All this is now moot.

Do you have a peer validator coming to see you? Be ready. They do what their name suggests: they validate the "whatever" your administrator has said or written about you.

And then there is this:

From: AdvanceSupport-NoReply
Sent: Friday, October 24, 2014 5:28 PM
Subject: Peer Validator Program


Dear Teacher,
Based on current records of your 2013-14 overall Annual Professional Performance Review rating, you will be assigned a Peer Validator during school year 2014-15.

The Peer Validator program is a joint initiative of the New York City Department of Education (DOE) and the United Federation of Teachers (UFT) that exists as part of Advance, our teacher evaluation and development system. This program, which is new this school year, provides all teachers who received an overall Annual Professional Performance Review (APPR) rating of “Ineffective” (Safety Net Result, if applicable) for the 2013-14 school year with a Peer Validator in the 2014-15 school year. The Peer Validator’s job is to independently evaluate a teacher’s classroom performance.

Peer Validators are trained New York City teachers who are assigned to the Division of Teaching and Learning. Each Peer Validator applied to work in the program and met qualifications consistent with the terms set forth in the DOE’s collective bargaining agreement with UFT. They were selected for the position by a hiring committee comprised of DOE and UFT representatives. Each teacher who is assigned a Peer Validator will receive three unannounced, full-period classroom observations. The Peer Validator will assess teacher practice based on components 2a, 2d, 3b, 3c and 3d of the Danielson Framework for Teaching. The Peer Validator will not communicate with you or your school’s administration about the APPR process. His/her role is to observe you in your classroom, in order to provide an independent assessment of the Measures of Teacher Practice component of the APPR. The Peer Validator cannot disclose his/her ratings for any observation until the annual rating period is over, at which point both you and your lead evaluator will be provided with copies of the three completed Peer Validator observation reports.

The following answers to frequently asked questions will help you to understand more about this program and how it supports you:

1. Why was the Peer Validator program created?
New York State Education Law 3012-c requires that Independent Validators be assigned to teachers who received an overall APPR rating of “Ineffective” in a school year who were not rated “Ineffective” the year prior. As part of the DOE-UFT contract agreement this summer, the Independent Validators were replaced with Peer Validators, in recognition of the skills and abilities of teachers who work within our schools.

2. What do Peer Validators do?

Peer Validators perform their work entirely independent of the school-based evaluation process. They confer with neither teachers nor their supervisors during the program year. Visits are unannounced for both the school and the teacher. Peer Validators do not have access to any historical information regarding the teachers who they are observing. Finally, they do not disclose their ratings for any observation until the annual rating period is over, at which point both lead evaluator and teachers are provided with copies of the three completed observation reports.

Peer Validators provide teachers being served by the program with three independent and unannounced observation visits to their classrooms. The observations must occur at least 20 school days apart. During those visits, the Peer Validator takes notes of what she/he sees and hears, and develops observation ratings for components 2a, 2d, 3b, 3c and 3d using the same process and tools that school-based evaluators do.

3. What should I expect during a Peer Validator visit?
Each Peer Validator observation will be unannounced and last a full period. When the Peer Validator comes to your classroom, she/he will greet you and give you a copy of this letter. She/he will then observe and take detailed notes. As is also true for school-based evaluators, she/he may circulate around the classroom, examine student work, confer with students and take photographs unobtrusively. When the observation is concluded, the Peer Validator will leave. Other than the initial greeting, there is no communication between the teacher and Peer Validator.

If you have additional questions about the Peer Validator program, please contact the Advance Support Team at AdvanceSupport@schools.nyc.gov.

Thank you.

- The Advance Support Team

And:

6.  TEACHER EVALUATION/PEER VALIDATOR


Article 8J of the Teachers’ CBA shall be amended to include the following:

The Board (DOE) and UFT agree that the following, subject to approval by the Commissioner of Education, represents the Parties APPR Plan as required by Education Law § 3012-c.

This Article replaces the Commissioner’s June 1, 2013 APPR decision and subsequent clarification decisions dated September 5, 2013 and November 27, 2013 (collectively “the Commissioner’s Decision”).

Except as modified herein, the terms of the Commissioner’s Decision are incorporated by reference and remain in full force and effect. Except as stated herein, any dispute regarding this APPR Plan and the Commissioner’s Decision shall be resolved exclusively through negotiation between the parties or the grievance process set forth in Article 22 of the parties’ collective bargaining agreement. Any issue regarding the implementation of the APPR Plan with respect to the Measures of Student Learning and scoring that was not addressed in the Commissioner’s Decision, shall be resolved through negotiations between the parties and, in the absence of an agreement, referred to the State Education Department for clarification.

The Parties agree to submit a draft APPR Plan to the State Education Department no later than May 15, 2014.

Teacher Practice Rubric


In order to simplify and focus the use of Danielson’s Framework for Teaching (2013 Edition), and reduce unnecessary paperwork, only the following eight (8) components of the rubric shall be rated: 1(a), 1(e), 2(a), 2(d), 3(b), 3(c), 3(d), and 4(e). These eight (8) components shall be referred to herein as the “Danielson Rubric.” Any reference to Danielson or the Danielson Rubric in the Commissioner’s Decision shall be deemed to refer only to these eight (8) components. In each observation, all components of the Danielson Rubric shall be rated for which there is observed evidence. The remaining components of the Danielson Framework for Teaching (2013 Edition) not described herein will continue to be used by the Parties for formative purposes.

Observation Cycle


1.                    Feedback following an observation must be provided to the teacher within fifteen (15) school days of the observation. Feedback must be evidence-based and aligned to the Danielson Rubric.

2.                    Evaluator forms shall be provided to the teacher no later than forty-five (45) school days following the observation. From the time an observation (formal or informal, as defined by the Commissioner’s Decision) is conducted until the time the teacher receives the evaluator form for that observation, only one (1) additional evaluative observation (formal or informal) may be conducted.

3.                    The parties agree that Teacher Artifacts (as defined in the Commissioner’s Decision) shall not be used in determining the Other Measures of Effectiveness (“Measures of Teaching Practice”) subcomponent rating. Teachers are not required to submit Teacher Artifacts (as defined in the Commissioner’s Decision) except principals have the discretion to collect evidence related to the Danielson Rubric in a manner consistent with the collective bargaining agreement and the Commissioner’s Decision. The DOE and UFT shall jointly create guidance for evaluators on the collection of evidence for the Danielson Rubric. Whenever possible, the Parties will jointly present this guidance to school communities.

4.                    An evaluator shall provide a score on any component that is observed from the Danielson Rubric regardless of the observation option selected by the teacher and regardless of whether it is a formal or informal observation (as defined by the Commissioner’s Decision).

5.                    In addition to the two observation options set forth in the Commissioner’s Decision, teachers who have received “Highly Effective” as their final APPR rating in the previous year may choose Option 3. Option 3 consists of a minimum of three (3) informal observations that are used for evaluative purposes. Option 3 is subject to the same procedures and scoring rules as Options 1 and 2 as provided for in the Commissioner’s Decision as modified by this APPR Plan.

A teacher that chooses Option 3 shall make his/her classroom available for three (3) classroom visits by a colleague per school year. The classroom visits described herein shall not be used for any evaluative purpose. Any additional classroom visits by colleagues shall only be with the consent of the teacher selecting Option 3. The date and time of such visits shall be scheduled jointly by the teacher selecting Option 3 and the principal.

6.                    An evaluator may assess a teacher’s preparation and professionalism only if the evaluator’s conclusions are based on observable evidence pertaining to components 1a, 1e, and/or 4e of the Danielson Rubric during an observation or if the evaluator observes evidence for these components during the fifteen (15) school days immediately preceding a classroom observation.

7.                    The parties agree to create an evaluator form that will allow evaluators to rate and delineate between all components observed during a classroom observation as well as (for components 1a, 1e, and 4e only) observed within fifteen (15) school days prior to the classroom observation as part of an assessment of a teacher’s preparation and professionalism. Each evaluator form shall contain lesson-specific evidence for components observed during a classroom observation and teacher-specific evidence for components observed as part of an assessment of a teacher’s preparation and professionalism.

8.                    An evaluator shall not include or consider evidence regarding the preparation and professionalism on an evaluator form if such evidence (or conduct) is also contained in a disciplinary letter to the teacher’s file, unless the evidence was directly observed by the evaluator during a classroom observation (in which case the evidence may be on both an evaluator form and in a disciplinary letter). Evidence not related to components 1a, 1e, and/or 4e, or directly observed by the evaluator in the fifteen (15) school day period immediately preceding a classroom observation shall not be considered in a teacher’s evaluation.

9.                    Consistent with the Commissioner’s Decision, there shall be Initial Planning Conferences (“IPC”) and Summative End of Year Conferences (as defined therein). Teachers shall have the sole discretion of setting professional goals as part of the IPC. The DOE will explicitly state this in guidance for evaluators and educators for the 2014-15 school year and thereafter.

Videotaping and Photographing


1.        All observations shall be conducted in person. The teacher and evaluator may mutually consent to evaluators not being present when videotaping.

2.        A teacher may choose to have his/her observations videotaped.  If a teacher chooses to have his/her observations videotaped he/she shall select among the following options:

(a) the evaluator will choose what observations, if any, will be videotaped; or (b) the evaluator shall videotape the observations in the following manner: (i) if the teacher selected Option 1, the formal observation shall be videotaped; (ii) if the teacher selected Option 2, two (2) of the informal observations shall be videotaped (at the evaluator’s option); or (iii) if the teacher selected Option 3, one (1) of the informal observations shall be videotaped (at the evaluator’s option).

3.        Evaluators who take photographs during observations relevant to the Danielson Rubric, should, to the extent practicable, be unobtrusive (for example, photographs may be taken at the end of the observation).

Covered Employees


1.                    The DOE and the UFT agree to jointly request that the State Education Department issue a determination as to whether teachers of programs for suspended students and teachers of programs of incarcerated students are subject to Education Law § 3012-c (and therefore subject to this APPR Plan). Such decision shall be incorporated by reference into this APPR Plan.

2.                    In order for a classroom teacher to be covered by this APPR Plan, the teacher must be teaching for at least six (6) cumulative calendar months in a school year. If the teacher does not satisfy this requirement he/she shall not be covered by this APPR Plan and shall be subject to the evaluation system set forth in Article 8J of the collective bargaining agreement and Teaching for the 21st Century.

3.                    The following shall apply to teachers who are teaching for more than six (6) cumulative calendar months in a school year but less than the full year due to either (a) paid or unpaid leave of absence; (b) reassignment from teaching responsibilities; or (c) the teacher commenced, or separated from, employment mid-year:
(a)     When a teacher is absent from the first day of school until the last Friday of October, the IPC (as defined in this APPR Plan) shall be conducted within ten (10) school days of his/her return to school.
(b)     When a teacher is absent between the last Friday of April and the last Friday of June, and the absence was foreseen and the evaluator was aware that the teacher would not be present during this period (e.g., they are taking a maternity leave), the Summative Conference shall be held before the teacher leaves.
(c)     When a teacher is absent between the last Friday of April and the last Friday of June and the absence was unforeseen (e.g., extended leave) and therefore the evaluator could not conduct the Summative Conference ahead of time, the Summative Conference shall be held no later than the last Friday of October in the following school year. Evaluators shall have the discretion to conduct the IPC and Summative Conference at the same time but must fulfill all the requirements of both conferences.
(d)     When a teacher is unexpectedly absent for the remainder of the school year (e.g., extended leave), the teacher shall have a minimum of two (2) observations, which shall fulfill the observation requirements set forth herein.
(e)     When a teacher is absent during the period when the baseline or post-test assessments are administered, and the teacher was assigned individual target populations for his/her State and/or Local Measures, the teacher will still receive Local and/or State Measures for individual target populations.
(f)    When a teacher is absent during the period when the targets are set (for assessments with goal-setting), the teacher shall set targets and have their targets approved within the first month of his/her return to school.

The DOE shall explicitly state the rules described herein in guidance for educators for the 2014-15 school year and all school years thereafter.


Multiple Observers


For formative purposes (observations conducted entirely for non-evaluative purposes), no more than four (4) observers (either school-based or from outside of the school) may be present in a classroom. Additional observers may be present in teacher’s classroom with the teacher’s consent. The visits described in this paragraph shall not be considered when scoring the Measures of Teacher Practice subcomponent.

For evaluative purposes, no more than one (1) evaluator (as defined by the Commissioner’s Decision) and two (2) school-based observers (i.e., the Superintendent or Assistant Superintendent or trained administrator of the teacher’s school) may be present during a formal or informal observation. The evaluator shall be solely responsible for the observation report. The DOE and UFT shall jointly create guidance for evaluators on the role of multiple observers. Whenever possible, the Parties will jointly present this guidance to school communities.

In extraordinary circumstances, only one (1) of the two (2) observers described herein may be an observer from outside of the school. The outside observer may only be either a Network Leader or Deputy Network Leader (or its functional equivalent).

Student Surveys


The DOE shall pilot student surveys during the 2013-2014 school year at mutually agreed upon schools and in all schools during the 2014-2015 school year. During the pilot, student surveys shall not be used for evaluative purposes. At the conclusion of each pilot year, the DOE and UFT shall meet to discuss the results of the pilot and discuss the possibility of continuing/discontinuing the pilot and use of the surveys for evaluative purposes. If agreement is not reached at the conclusion of each pilot year, the student surveys shall be used for non-evaluative purposes in the 2014-2015 school year and evaluative purposes starting in the 2015-16 school year and thereafter. The implementation and scoring of the student surveys in 2015-16 and thereafter shall be consistent with the Commissioner’s Decision.

Scoring


For all formal and informal observations (as defined by the Commissioner’s Decision), all components of the Danielson Rubric shall be rated for which there is observed evidence. At the end of the school year, Overall Component Scores shall be created for each of the eight (8) components. The Overall Component Scores shall be the average of each rated component from the observations and/or assessments of a teacher’s preparation and professionalism.


An Overall Rubric Score will then be calculated by taking the weighted average of the Overall Component Scores, using the following weightings: 1a (5%), 1e (5%), 2a (17%), 2d (17%), 3b (17%), 3c (17%), 3d (17%), 4e (5%).

Formal and informal observations (as defined by the Commissioner’s Decision) shall not receive average observation ratings.
Formal and informal observations (as defined by the Commissioner’s Decision) will no longer be afforded the weights as provided for in the Commissioner’s Decision.

The Overall Rubric Score shall be the basis for the 60 points of the Measures of Teaching Practice subcomponent, unless the student surveys are used for evaluative purposes. If student surveys are used for evaluative purposes, the Overall Rubric Score shall count for 55 of the 60 points of the Measures of Teaching Practice subcomponent score. The implementation and scoring of the student surveys in 2015-16 and thereafter shall be consistent with the Commissioner’s Decision.
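To make the scoring arithmetic above concrete, here is a minimal sketch of how an Overall Rubric Score would be computed from observation ratings, assuming 1-4 component ratings and the weights quoted above. The function names, example ratings, and the renormalization of weights when a component was never rated are my own assumptions for illustration, not language from the agreement.

```python
# Illustrative sketch of the Overall Rubric Score calculation described above.
# The component weights come from the agreement text; everything else
# (names, example ratings, renormalization) is hypothetical.

WEIGHTS = {
    "1a": 0.05, "1e": 0.05,
    "2a": 0.17, "2d": 0.17,
    "3b": 0.17, "3c": 0.17, "3d": 0.17,
    "4e": 0.05,
}

def overall_component_scores(observations):
    """Average each rated component across all observations (ratings are 1-4).

    `observations` is a list of dicts mapping component -> rating; a component
    missing from an observation simply was not rated in that observation.
    """
    totals, counts = {}, {}
    for obs in observations:
        for component, rating in obs.items():
            totals[component] = totals.get(component, 0) + rating
            counts[component] = counts.get(component, 0) + 1
    return {c: totals[c] / counts[c] for c in totals}

def overall_rubric_score(component_scores):
    """Weighted average of the Overall Component Scores using the quoted weights."""
    rated = {c: s for c, s in component_scores.items() if c in WEIGHTS}
    # Assumption: if a component was never rated, its weight is dropped and the
    # remaining weights are renormalized. The agreement excerpt does not say.
    weight_sum = sum(WEIGHTS[c] for c in rated)
    return sum(WEIGHTS[c] * s for c, s in rated.items()) / weight_sum

# Hypothetical example: three observations, with the domain 1 and 4 components
# rated only once as part of an assessment of preparation and professionalism.
obs = [
    {"2a": 3, "2d": 2, "3b": 3, "3c": 3, "3d": 2},
    {"2a": 2, "2d": 3, "3b": 2, "3c": 3, "3d": 3, "1a": 3, "1e": 3, "4e": 4},
    {"2a": 3, "2d": 3, "3b": 3, "3c": 2, "3d": 3},
]
print(round(overall_rubric_score(overall_component_scores(obs)), 2))
```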

Courses That Are Not Annualized


In the event that Measures of Student Learning (MOSL) assessment options do not include options for non-annualized courses: 1) in a school where each of the terms covers content where the second term builds on content from the first, the fall teacher shall administer the baseline and the spring teacher shall administer the post-test. Teachers from all terms will be held accountable for the students’ results; or 2) in a school where the second term does not build on content from the first, these teachers shall be assigned Linked or Group Measures. Notwithstanding the foregoing, with respect to a teacher of a course leading to a January Regents, the post-test is the January Regents and a baseline shall be administered in the fall.

For Group and Linked Measures (as defined herein), if a student takes the same Regents exam in January and June, only the higher result will be used for State and Local Measures. For non-Group and Linked Measures, if a student takes the same Regents exam in January and June, and has the same teacher in the fall and spring, only the higher result will be used for State and Local Measures. If the student has different teachers in the fall and spring, the January Regents will be used for the fall teacher and the June Regents for the spring teacher.
Students will be equally weighted in a teacher’s State and/or Local Measures subcomponent score if they are in a teacher’s course for the same length of time (regardless of whether they take the January or June Regents).
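Here is a rough sketch of the January/June Regents attribution rule as I read the paragraph above; the function and its inputs are hypothetical and only illustrate my reading.

```python
# Sketch of the January/June Regents attribution rule above, as I read it.
# Inputs and names are hypothetical.

def regents_score_for_teacher(jan, june, group_or_linked, same_teacher, term):
    """Which Regents result counts toward a teacher's State/Local Measures
    when a student sat the same exam in both January and June."""
    if group_or_linked or same_teacher:
        return max(jan, june)                  # higher result is used
    return jan if term == "fall" else june     # otherwise split by term

# A fall teacher who did not also teach the student in spring gets the January result.
print(regents_score_for_teacher(68, 81, group_or_linked=False, same_teacher=False, term="fall"))  # 68
```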

For assessments that use growth models, the DOE will calculate scores following the rules outlined above. For assessments that use goal-setting, the teacher who administers the baseline will recommend targets for the students and the principal will approve. Fall term teachers shall set targets on the same timeline as other teachers. It is recommended that in the fall principals consult with subsequent term teachers about student targets if their assignments are known. Principals shall share these targets with subsequent term teachers within the first month of the start of the new term and provide these teachers with an opportunity to recommend any additional changes to student targets. Principals shall communicate any changes to targets to all affected teachers.

For assessments that use goal-setting, teachers of subsequent term courses who have students who have not previously had targets for them shall set and have their targets approved within the first month of the start of the new term.
State and Local Measures selections for teachers of non-annualized courses, including the application of the 50% rule, shall be determined based upon the teachers’ entire school year schedule. As subsequent term selections may not be known in the fall, teachers shall administer all applicable assessments for the grades/subjects they are teaching in the fall.

Rules Regarding Measures of Student Learning


For the 2014-2015 school year and thereafter the DOE shall issue guidance to the School MOSL Committee that sets forth and explains the rules described herein.

There is no limit on the number of Local Measures that a School MOSL Committee, as defined in this APPR Plan, can recommend for a particular grade or subject. If a School MOSL Committee selects the same assessment but different group for the Local Measures subcomponent, the following are allowable subgroups since the DOE is currently analyzing the performance of these groups of students: 1) English Language Learners, 2) students with disabilities, 3) the lowest-performing third of students, 4) overage/under-credited students, or 5) Black/Latino males (consistent with New York City’s Expanded Success Initiative).

School MOSL Committees shall consider, when selecting subgroups for Local Measures that the intent of having both Local and State Measures is to have two different measures of student learning. Using subgroups for Local Measures, by nature of the fact that they are a subset of the overall population, will in many instances mean that State and Local Measures are more similar to one another than if different assessments are used for State and Local Measures. Therefore, subgroups should not be selected for teachers in some schools if the subgroup selected reflects the entire population of students the teacher serves (e.g., if a teacher only teaches English Language Learners, the Committee shall not select English Language Learners for their Local Measures and all of their students for the same assessment on their State Measures).

In the event that schools inadvertently select the same measures for State and Local Measures (after, to the extent possible, they have had an opportunity to correct), the lowest-performing third of students will be used for Local Measures and the entire population of students used for State Measures.

The Central MOSL Committee will revisit the list of allowable subgroups annually, taking into account feedback from educators. If the Central MOSL Committee cannot agree on new/different subgroups, the current list of subgroups will be used.

Evaluators cannot choose to go above the 50% rule in selecting teachers’ State Measures. The 50% rule will be followed for State Measures, per State Education Department guidance, such that teachers’ State Measures must be determined as follows: for teachers of multiple courses, courses that result in a state growth score must always be used for a teacher’s State Measures. If a teacher does not teach any courses that result in state growth scores, or state growth score courses cover less than 50% of a teacher’s students, courses with the highest enrollment will be included next until 50% or more of students are included.
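As I read the 50% rule paragraph above, the course-selection logic works roughly like this; a sketch under my reading, with hypothetical course data and names.

```python
# Rough sketch of the 50% rule for selecting which courses feed a teacher's
# State Measures, as I read the paragraph above. Course data, names, and the
# handling of students enrolled in multiple courses are assumptions.

def select_state_measure_courses(courses, total_students):
    """courses: list of dicts like {"name": ..., "enrollment": int, "has_growth_score": bool}."""
    # Courses that result in a state growth score are always included.
    selected = [c for c in courses if c["has_growth_score"]]
    covered = sum(c["enrollment"] for c in selected)

    # If growth-score courses cover fewer than 50% of students, add remaining
    # courses by highest enrollment until at least 50% of students are covered.
    if covered < 0.5 * total_students:
        remaining = sorted(
            (c for c in courses if not c["has_growth_score"]),
            key=lambda c: c["enrollment"], reverse=True,
        )
        for c in remaining:
            if covered >= 0.5 * total_students:
                break
            selected.append(c)
            covered += c["enrollment"]
    return selected

# Hypothetical example: the growth-score course covers only a third of students,
# so the largest remaining course is added to pass the 50% threshold.
courses = [
    {"name": "Algebra I", "enrollment": 60, "has_growth_score": True},
    {"name": "Geometry", "enrollment": 90, "has_growth_score": False},
    {"name": "Math Lab", "enrollment": 30, "has_growth_score": False},
]
print([c["name"] for c in select_state_measure_courses(courses, total_students=180)])
```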

The 50% rule shall not apply to Local Measures. School MOSL Committees shall select the method that shall be used to determine which courses shall be included in a teacher’s Local Measure. In the 2014-15 school year and thereafter, the DOE will 1) state this rule, provide guidance for teachers of multiple courses, and describe the benefits and considerations of not following the 50% rule for Local Measures and 2) explain how to record and track Local Measures selections for individual teachers when the 50% rule is and is not used for Local Measures. The process for setting student targets for Local Measures is the same as the process for setting student targets for State Measures. The only exception is Group Measures (not including Linked Measures) for Local Measures. For Group Measures, the School MOSL Committees will have the option of recommending for Local Measures that student targets are set either 1) following the process used for State Measures or 2) by the Committee. If the School MOSL Committee chooses to create the targets and the principal accepts the School MOSL Committee’s recommendation, the School MOSL Committee must create these targets no later than December 1. Targets must be submitted using a format determined by the DOE. In the event that the School MOSL Committee cannot agree on Group Measures targets for Local Measures, Group Measures targets will be determined following the process used for State Measures which requires that superintendents must finalize targets by January 15.

School MOSL Committees may recommend which baselines will be used for Local Measures from a menu of options created by the DOE. The only exceptions are instances where the same assessments are used for teachers in the same grades/subjects for State Measures. In these instances, the Principal shall select the baselines that will be used for State and Local Measures.

School MOSL Committees may recommend that Local Measures, Group Measures and Linked Measures may be used with state-approved 3rd party assessments. The DOE shall create guidance that will include a description of which 3rd party assessments it can use to create growth models.

School MOSL Committees may recommend that for Local Measures, Group Measures and Linked Measures may be used with NYC Performance Assessments. The DOE shall create guidance which will include a description of which NYC Performance Assessments it can use to create growth models, as well as the implications of selecting Group Measures with NYC Performance Assessments for scoring.

Regarding the Local Measures school-wide default, if a School MOSL Committee makes recommendations for Local Measures in only some grades/subjects, the principal may accept those recommendations and the Local Measures default would apply for the grades and subjects for which there is no recommendation.
Principals must choose to accept either all a School MOSL Committee’s recommendations or none of the School MOSL Committee’s recommendations. If the School MOSL Committee recommends the Local Measures default (or the principal does not accept the School MOSL Committee’s recommendations and therefore the Local Measures default must be used), teachers must administer NYC Performance Assessments in grades 4-8 ELA and Math (if they are included in the DOE’s menu of NYC Performance Assessments that are approved by the Commissioner annually). In the foregoing scenario, the DOE growth models will be used to calculate a teacher’s score on the NYC Performance Assessments in grades 4-8 ELA and Math.

Growth Model Conversion Charts


For assessments where schools opt to use DOE-created growth models for State or Local Measures, including the Local Measures default, the DOE shall create scoring charts that convert growth model scores into 0-20 points, taking into account confidence intervals. These charts must be shared and discussed with the MOSL Central Committee (as defined herein) annually. In addition, analyses will be conducted and shared with the MOSL Central Committee regarding the comparability of Individual, Group, and Linked Measures. If members of the MOSL Central Committee do not agree with any element of the growth model conversion charts and/or how they were created, the MOSL Central Committee members that are in disagreement may submit in writing to the Chancellor their reasons for disagreement.

The parties agree to convene a MOSL Technical Advisory Committee (the “MOSL TAC”) consisting of one person designated by the DOE, one person designated by the UFT, and a person mutually-selected by the Parties.  To ensure a meaningful and fair distribution of ratings, the MOSL TAC shall review the methodology and approach to the creation of growth models and their conversion charts and provide recommendations to the Chancellor. The Chancellor shall have final decision-making authority on the growth model conversion charts.



Measures of Student Learning Options


1.                    For the 2014-15 school year and thereafter the DOE shall create new measures (referred to as “Linked Measures”) for Local and State Measures of Student Learning such that there is an option for each teacher to be evaluated based upon assessment results of students he/she teaches where some or all assessments are not linked to courses the teacher teaches.

2.                    For the 2013-14 school year, the following process for “procedural appeals” will only apply to “Group Measures” (i.e., measures where teachers are evaluated based on the performance of some or all students they do not teach). For the 2014-15 and 2015-16 school years, the following process for “procedural appeals” will apply to Linked Measures and Group Measures. For the 2016-17 school year and thereafter the following process for “procedural appeals” will apply only to Group Measures. In all cases, teachers with 50% or more of their Local or State Measures based on Linked Measures/Group Measures shall be eligible for the procedural appeals process.

3.                    If a teacher receives “Ineffective” ratings in both the State and Local Measures subcomponents and either is based on Linked Measures or Group Measures, and in that year the teacher receives either a “Highly Effective” or “Effective” rating on the Measures of Teaching Practice subcomponent, the teacher shall have a right to a “procedural appeal” of such rating to a representative of the DOE’s Division of Teaching and Learning.
a.                    If the teacher receives a “Highly Effective” rating on the Measures of Teaching Practice subcomponent, there shall be a presumption that the overall APPR rating shall be modified by the DOE such that the overall “Ineffective” rating becomes either an “Effective” rating (in the instance where both the State and Local Measures of Student Learning subcomponents are based on Linked Measures or Group Measures) or a “Developing” rating (in the instance where only one of the State or Local Measures of Student Learning subcomponents is based on Linked Measures or Group Measures);
b.                    If the teacher receives an “Effective” subcomponent rating on the Measures of Teaching Practice, there shall be a presumption that the overall APPR rating shall be modified by the DOE such that the overall “Ineffective” rating becomes a “Developing” rating if both the State and Local Measures of Student Learning subcomponents are based on Linked Measures or Group Measures. If only one of the State or Local Measures of Student Learning subcomponents is based on Linked Measures or Group Measures, the rating shall be appealed to the principal, who shall have the discretion to increase the teacher’s overall APPR rating. If the principal does not respond to the appeal, the teacher’s overall APPR rating shall be modified to a “Developing” rating.
c.                    The above-described procedural appeal process is separate and distinct from, and in addition to the appeal processes set forth in the Commissioner’s Decision.



4.                    In the event a teacher receives a “Highly Effective” rating in both the State and Local Measures of Student Learning, and neither is based on Linked Measures or Group Measures, and in that year the teacher is rated “Ineffective” on the Measures of Teaching Practice subcomponent, and this results in the teacher receiving an “Ineffective” overall APPR rating, the UFT may choose to appeal the rating to a three (3) member Panel consistent with the rules for Panel Appeals as described in Education Law § 3012-c (5-a) and the Commissioner’s Decision. However, these appeals shall not be counted towards the 13% of “Ineffective” ratings that may be appealed pursuant to Education Law §3012-c (5-a)(d) and the Commissioner’s Decision.

5.                    The Parties agree to meet each fall to review and discuss other types of anomalies in scoring and determine appropriate actions.

6.                    The DOE and UFT shall establish a Measures of Student Learning Central Committee consisting of an equal number of members selected by the DOE and the UFT (herein referred to as the “MOSL Central Committee”). The MOSL Central Committee shall convene within sixty (60) days after the ratification of this agreement by the UFT and each month thereafter. The MOSL Central Committee shall explore additional assessment options for the 2014-15 school year, which could include state-approved 3rd party assessments or existing assessments (e.g., Fitnessgram, LOTE exams), and review and approval by the Chancellor, which would be offered as non-mandated options for State and Local Measures. The MOSL Central Committee shall also examine the current range of options and discuss expanded options for the State and Local Measures of Student Learning including, but not limited to, subject-based assessments, the use of portfolios, project-based learning, and/or semi-annualized/term course assessments. The MOSL Central Committee will also examine potential changes to the Local Measures default each school year. The MOSL Central Committee shall propose expanded options for the 2015-16 school year and thereafter. Expanded options proposed by the MOSL Central Committee shall be implemented for the 2015-2016 school year and thereafter subject to review and approval by the Chancellor. All MOSL options for the 2014-15 school year and thereafter shall be shared with the MOSL Central Committee. The MOSL Central Committee shall review all MOSL options to determine which options shall be proposed to the Chancellor for approval. If members of the MOSL Central Committee cannot agree which options should be proposed to the Chancellor, the MOSL Central Committee members that are in disagreement may submit in writing to the Chancellor their reasons for disagreement. The Chancellor shall have final decision-making authority.

7.                    There will be no State Measures default. Principals must make decisions for State Measures for all applicable grades/subjects in their school by the deadline. For the 2014-15 school year, the Local Measures default for all schools shall be a school-wide measure of student growth based on all applicable assessments administered within the building which are limited to NYC Performance Assessments, if developed by August 1 prior to the start of the school year, and/or state-approved 3rd party assessments (Chancellor must select by August 1 prior to the start of the school year), and/or state assessments. The DOE and UFT shall annually review the Local Measures default and discuss the possibility of altering the default. If agreement is not reached at the conclusion of each year, the default will be the same as that used in the 2014-15 school year.

8.                    All decisions of the School MOSL Committee (as defined in the Commissioner’s Decision) must be recommended to the principal and the principal must 1) accept the recommendation (or opt for the Local Measures default) and 2) select the State Measures no later than ten (10) school days after the first day of school for students.

9.                    In the event that a school uses the goal-setting option for State or Local Measures, teachers must submit their proposed goals to their building principal or designee no later than November 1 of each school year absent extraordinary circumstances. The principal or designee must finalize teacher’s goals no later than December 1 of each school year, absent extraordinary circumstances.

10.                 Teachers whose MOSL scores would have been subject to chart 2.11 or 3.13 of the Commissioner’s Decision shall now be assigned points such that 85%-100% of students must meet or exceed targets for a teacher to be rated Highly Effective; 55%-84% of students must meet or exceed targets for a teacher to be rated Effective; 30%-54% of students must meet or exceed targets for a teacher to be rated Developing; and 0%-29% of students must meet or exceed targets for a teacher to be rated Ineffective.
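Read as a simple lookup, the bands in item 10 map the percentage of students meeting or exceeding targets to a rating; a minimal sketch follows, where the function name and the treatment of boundary values are my assumptions.

```python
# Illustrative mapping of "% of students meeting or exceeding targets" to a
# rating, per the bands in item 10 above. Band edges and labels come from the
# text; the function name and boundary handling are assumptions.

def mosl_rating(pct_meeting_targets: float) -> str:
    if pct_meeting_targets >= 85:
        return "Highly Effective"   # 85%-100%
    if pct_meeting_targets >= 55:
        return "Effective"          # 55%-84%
    if pct_meeting_targets >= 30:
        return "Developing"         # 30%-54%
    return "Ineffective"            # 0%-29%

print(mosl_rating(62))  # -> "Effective"
```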

Peer Validator


1.                    Except as modified herein, the Peer Validator shall replace the Independent Validator and fulfill all of the duties of and comply with the provisions applicable to the Independent Validator set forth in Education Law § 3012-c(5-a) and the Commissioner’s Decision.

2.                    Term: The Peer Validator program shall be two (2) school years (2014-15 and 2015-16). At the end of the two years, the parties must agree to extend the Peer Validator program and in the absence of an agreement the parties shall revert to the Independent Validator process as set forth in Education Law § 3012-c(5-a) and the Commissioner’s Decision.

3.                    Selection: A joint DOE-UFT committee composed of an equal number of members from the UFT and the DOE (the “Selection Committee”) shall be established to determine selection criteria and screen and select qualified applicants to create a pool of eligible candidates. The Deputy Chancellor of Teaching and Learning shall select all Peer Validators from the pool of all eligible candidates created by the Selection Committee. To be eligible to become a Peer Validator an applicant must have at least five (5) years teaching experience; be tenured as a teacher; have received an overall APPR rating of Highly Effective or Effective (or Satisfactory rating where applicable) in the most recent school year; and either be a teacher, a teacher assigned, an assistant principal with reversion rights to a tenured teacher position, or an education administrator with reversion rights to a tenured teacher position.

4.                    Duties: The term for a Peer Validator shall be for two (2) years. All Peer Validators shall work under the title of Teacher Assigned A and shall have the same work year and work day as a Teacher Assigned A as defined in the collective bargaining agreement. Peer Validators shall report to the Deputy Chancellor of Teaching and Learning or his/her designee. Peer Validators shall conduct observations consistent with the Commissioner’s Decision and shall not review any evidence other than what is observed during an observation by the Peer Validator. All assignments are at the discretion of the DOE, however Peer Validators shall not be assigned to any school in which s/he previously worked. The parties agree to consult regarding Peer Validator assignments and workload. Peer Validators shall be reviewed and evaluated by the Deputy Chancellor of Teaching and Learning or his/her designee. The review and evaluation of a Peer Validator shall not be based in any way on whether the Peer Validator agrees or disagrees with the principal’s rating. A Peer Validator may be removed from the position at any point during the program provided that both the DOE and UFT agree. Teachers who become Peer Validators shall have the right to return to their prior school at the end of their term as a Peer Validator.


5.                    Compensation: Peer Validators shall receive additional compensation in the amount of fifteen thousand dollars ($15,000.00) per year for the term of this agreement above the applicable teacher compensation in accordance with the collective bargaining agreement.

Friday, February 17, 2012


Carol Burris, principal, on the new NY State teacher evaluation plan announced yesterday


Carol is the courageous Long Island principal who co-authored the letter, signed onto by one third of all NY State principals, protesting the NYS teacher evaluation system. Her follow-up article for the Washington Post Answer Sheet was called, “Forging ahead with a nutty teacher evaluation plan.”

Carol Burris
Below is the email she sent out late last night, appended to a press release from Commissioner King and Regents Chancellor Merryl Tisch that contained an outline of the provisions in the agreement announced yesterday.
The agreement, as summarized by King and Tisch, says that test scores will trump all, as “Teachers rated ineffective on student performance based on objective assessments must be rated ineffective overall.”
In addition, the Commissioner can reject any locally devised system that isn’t “rigorous” enough, and can require “corrective” action if “districts evaluate their teachers positively regardless of students’ academic progress”, i.e. refuse to rate them as ineffective based on test scores alone. 
This all will be done by means of unreliable state tests that in recent years have been repeatedly shown to be defective, as filtered through a “growth” model that has been shown to have even less reliability. 
The only possible meaning of “multiple measures” in this context is that there are multiple ways to ensure that a teacher can be judged as a failure.
___
From: Carol Burris
Sent: Thursday, February 16, 2012 10:16 PM
Dear friends,
Every teacher I know in NY is in a state of shock after seeing what NYSUT agreed to.  I think principals were somewhat prepared, but I do believe that teachers had hoped that somehow NYSUT would come through for them.  The power transferred to the commissioner is unprecedented. The governor's 'shot clock' flies in the face of the Taylor Law. The fact that Randi Weingarten applauded this agreement is beyond comprehension.  Teachers feel abandoned.
The governor who just last week said "I am the government" is a bully who is now thumping his chest in victory as the 'student lobbyist'. My greatest fear is that educators will be so discouraged they will bow their heads. Anything you can write, post, blog or send as an editorial in the next days and weeks will help to lift spirits.  I know Sean and I will keep the letter going but we will need your help in keeping resistance to this wrongheaded policy alive. thank you for all you do. Carol
