(Revisions Adopted April 2023)

(Revised June 2023)

Assessment Design Standard 1: Assessment Designed for Validity and Fairness

The sponsor* of a teaching performance assessment seeking approval for use in California (model sponsor) designs a Teaching Performance Assessment (TPA) in which complex pedagogical assessment tasks and multi-level scoring scales are linked to and assess California’s Teaching Performance Expectations (TPEs). The model sponsor clearly describes the uses for which the assessment has been validated (i.e., to serve as a determination of a candidate’s status with respect to the TPEs and to provide an indication of preparation program quality and effectiveness), anticipates its potential misuses, and identifies appropriate uses consistent with the assessment’s validation process. The model sponsor maximizes the fairness of the assessment design for all groups of candidates in the program. A passing standard is recommended by the model sponsor based on a standard setting study where educators have made a professional judgment about an appropriate performance standard for beginning teachers to meet prior to licensure.

* Note: the “model sponsor” refers to the entity that represents the assessment and is responsible to programs using that model and to the Commission. Model sponsors may be a state agency, individual institutions, a consortium of institutions and/or partners, a private entity, and/or combinations of these.

Required Elements for Assessment Design Standard 1: Assessment Designed for Validity and Fairness

1(a) The Teaching Performance Assessment includes complex pedagogical assessment tasks to prompt aspects of candidate performance that measure the TPEs. Each task is substantively related to two or more major domains of the TPEs. For use in judging candidate-generated responses to each pedagogical task, the assessment also includes multi-level scoring rubrics that are clearly related to the TPEs that the task measures. Each task and its associated rubrics measure two or more TPEs. Collectively, the tasks and rubrics in the assessment address key aspects of all major domains of the TPEs. The sponsor of the performance assessment documents the relationships between TPEs, tasks, and rubrics.

1(b) 1. The multiple subject general education TPA model sponsor must include in its performance assessment a focus on content-specific pedagogy within the design of the TPA tasks and scoring scales to assess the candidate’s ability to effectively teach literacy in a manner aligned to the requirements of subparagraphs (A) and (B) of paragraph (4) of subdivision (b) of Education Code section 44259; the Commission’s standards of program quality and effectiveness and current Teaching Performance Expectations (TPEs); and the current English Language Arts/English Language Development (ELA/ELD) Framework adopted by the State Board, as well as the content areas authorized by the credential.

1(b) 2. The single subject general education TPA model sponsor must include in its performance assessment a focus on content-specific pedagogy within the design of the TPA tasks and scoring scales to assess the candidate’s ability to effectively teach the content area(s) authorized by the credential.

1(b) 3. The education specialist TPA model sponsor must include in its performance assessment a focus on content-specific pedagogy and on providing consultative, collaborative, and coordinated specially designed instruction with students, parents, teachers, and other community and school personnel within the design of the TPA tasks and scoring scales. It must also assess the candidate’s ability to effectively teach literacy in a manner aligned to the requirements of subparagraphs (A) and (B) of paragraph (4) of subdivision (b) of Education Code section 44259; the Commission’s standards of program quality and effectiveness and current Teaching Performance Expectations (TPEs); and the current English Language Arts/English Language Development (ELA/ELD) Framework adopted by the State Board, as well as the content areas authorized by the credential.

1(b) 4. The PK-3 TPA model sponsor must include a focus on developmentally appropriate pedagogy within the design of the TPA tasks and scoring scales to assess the candidate’s ability to effectively teach literacy in a manner aligned to the requirements of subparagraphs (A) and (B) of paragraph (4) of subdivision (b) of Education Code section 44259; the Commission’s standards of program quality and effectiveness and current Teaching Performance Expectations (TPEs); and the current English Language Arts/English Language Development (ELA/ELD) Framework adopted by the State Board, as well as the content areas authorized by the credential.

1(c) Consistent with the language of the TPEs, the model sponsor defines scoring rubrics so candidates for credentials can earn acceptable scores on the Teaching Performance Assessment with the use of different literacy and content-specific pedagogical practices that support implementation of the state-adopted content standards, curriculum frameworks, and Preschool Learning Foundations. The model sponsor takes steps to plan and anticipate the appropriate scoring of candidates who use a wide range of pedagogical practices that are educationally effective and builds scoring protocols to take these variations into account.

1(d) 1. For Multiple Subject, Single Subject, and PK-3 candidates, the model sponsor must include within the design of the TPA candidate tasks a focus on addressing the teaching of English learners, all underserved education groups or groups that need to be served differently, and students with disabilities in the general education classroom to adequately assess the candidate’s ability to effectively teach all students.

1(d) 2. For Education Specialist candidates, the model sponsor must include within the design of the TPA candidate tasks a focus on addressing the teaching of students who have an IEP (students aged 3 through 22), students who have an IEP and are English learners, and students who have an IEP and are members of underserved education groups or groups that need to be served differently, to adequately assess the candidate’s ability to effectively teach all students with disabilities.

1(e) 1. For Multiple Subject, PK-3, and Education Specialist candidates, the model sponsor must include assessments of the candidate’s ability to demonstrate pedagogical competence related to teaching current, state-adopted core content areas of at least Literacy and Mathematics. Programs use local program performance assessments for History/Social Science and Science if not already included as part of the TPA.

1(f) The model sponsor must include a teaching performance within the TPA during the required clinical experience, including a video of the candidate’s teaching performance with candidate commentary describing the lesson plan and the rationale for the teaching decisions shown, and evidence of the effect of that teaching on student learning.

1(g) The TPA model sponsor must provide materials appropriate for use by programs in helping faculty become familiar with the design of the TPA model, the candidate tasks, and the scoring rubrics so that faculty can effectively assist candidates to prepare for the assessment. The TPA model sponsor must also provide candidate materials to assist candidates in understanding the nature of the assessment, the specific assessment tasks, the scoring rubrics, submission processes and scoring processes.

1(h) The model sponsor develops scoring rubrics and assessor training procedures that focus primarily on teaching performance and that minimize the effects of candidate factors that are not clearly related to pedagogical competence. Such factors may include any actual or perceived characteristic protected by AB 537 (sex, sexual orientation, gender identity, ethnic group identification, race, ancestry, national origin, religion, color, or mental or physical disability) or any other potential source of bias that is not likely to affect job effectiveness and/or student learning, such as appearance, hairstyles and/or hair texture, demeanor, speech patterns and accents, or personal attire.

1(i) 1. The model sponsor provides a clear statement acknowledging the intended uses of the assessment. The statement demonstrates the model sponsor’s clear understanding of the implications of the assessment for Multiple Subject, PK-3, and Education Specialist candidates, preparation programs, public schools, and public school students within the authorization of the credential. The statement includes appropriate cautions about additional or alternative uses for which the assessment is not valid. All elements of assessment design and development are consistent with the intended uses of the assessment for determining the literacy and content-specific pedagogical competence of candidates for Preliminary Teaching Credentials in California and as information useful for determining program quality and effectiveness.

1(i) 2. The model sponsor provides a clear statement acknowledging the intended uses of the assessment. The statement demonstrates the model sponsor’s clear understanding of the implications of the assessment for Single Subject candidates, preparation programs, public schools, and public school students within the authorization of the credential. The statement includes appropriate cautions about additional or alternative uses for which the assessment is not valid. All elements of assessment design and development are consistent with the intended uses of the assessment for determining the content-specific pedagogical competence of candidates for Preliminary Teaching Credentials in California and as information useful for determining program quality and effectiveness.

1(j) The model sponsor completes content review and editing procedures to ensure that literacy and content-specific pedagogical assessment tasks and directions to candidates are culturally and linguistically responsive, sustaining, fair and appropriate for candidates from diverse backgrounds.

1(k) The model sponsor completes initial and periodic basic psychometric analyses to identify pedagogical assessment tasks and/or scoring rubrics that result in differential effects in relation to candidates’ race, ethnicity, language, gender or disability. When group pass rate differences are found, the model sponsor investigates the potential sources of differential performance and seeks to eliminate construct-irrelevant sources of variance.
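
For illustration only (this sketch is not part of the standard), one minimal way a model sponsor might screen operational results for group pass-rate differences is shown below. The data file, column names, and the 0.05 significance threshold are hypothetical assumptions; an operational analysis would typically go further (e.g., rubric-level differential item functioning, effect sizes, and adjustment for multiple comparisons).

```python
# Illustrative sketch: screen TPA pass/fail outcomes for differential effects
# across candidate groups. File name, column names, and threshold are
# hypothetical assumptions, not prescribed by the Assessment Design Standards.
import pandas as pd
from scipy.stats import chi2_contingency

results = pd.read_csv("tpa_results.csv")  # assumed columns: group, passed (0/1)

# Pass rate by candidate group
pass_rates = results.groupby("group")["passed"].mean()
print(pass_rates)

# Chi-square test of independence between group membership and pass/fail status
table = pd.crosstab(results["group"], results["passed"])
chi2, p, dof, _ = chi2_contingency(table)
if p < 0.05:
    print(f"Group pass-rate differences detected (p = {p:.3f}); "
          "investigate potential construct-irrelevant sources of variance.")
```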

1(l) In designing assessment administration procedures, the model sponsor includes administrative accommodations that preserve assessment validity while addressing issues of access for candidates with disabilities or learning needs.

1(m) In the course of determining a passing standard, the model sponsor secures and reflects on the considered judgments of teachers, supervisors of teachers, support providers of new teachers, and other preparers of teachers regarding necessary and acceptable levels of proficiency on the part of entry-level teachers. The model sponsor periodically reviews the reasonableness of the scoring scales and established passing standard, when and as directed by the Commission.
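
As a purely illustrative sketch of one common standard setting approach (a modified-Angoff-style computation, which this standard does not prescribe), a recommended passing standard can be derived by averaging panelists' judgments of the total rubric score a minimally competent beginning teacher would earn. All values below are invented for illustration.

```python
# Illustrative modified-Angoff-style sketch: each panelist estimates the total
# rubric score a minimally competent beginning teacher would earn on the TPA.
# The recommended passing standard is the mean of those judgments.
# All numbers are invented; an actual study would include training, multiple
# rounds of judgment, and impact data before a recommendation is finalized.
panelist_judgments = [34, 36, 33, 35, 37, 34, 36]  # total-score estimates

recommended_cut = sum(panelist_judgments) / len(panelist_judgments)
print(f"Recommended passing standard: {recommended_cut:.1f} "
      f"(based on {len(panelist_judgments)} panelists)")
```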

1(n) To preserve the validity and fairness of the assessment over time, the model sponsor may need to develop and field test new literacy and content-specific pedagogical assessment tasks and multi-level scoring rubrics to replace or strengthen prior ones. Initially and periodically, the model sponsor analyzes the assessment tasks and scoring rubrics to ensure that they yield important evidence that represents candidate knowledge and skill related to the TPEs and serve as a basis for determining entry-level pedagogical competence to teach the curriculum and student population of California’s public schools. The model sponsor documents the basis and results of each analysis, and modifies the tasks and rubrics as needed.

1(o) The model sponsor must make all TPA materials available to the Commission upon request for review and approval, including materials that are proprietary to the model sponsor. The Commission will maintain the confidentiality of all materials designated as proprietary by the model sponsor.

1(p) For concurrent bilingual candidates, no candidate can be required to translate student work or provide English transcriptions for the video component(s) of the TPA if the student work or video is in a language other than English. Model sponsors must ensure candidates may demonstrate their knowledge and skills teaching literacy in the language of instruction, including in a language other than English.

1(q) All candidates must demonstrate, as part of the TPA, effective strategies for teaching an English learner, in English, with the use of the language of instruction as appropriate, within the content area of the intended credential. Each candidate must submit his or her analyses and reflections primarily in English.

Assessment Design Standard 2: Assessment Designed for Reliability and Fairness

The sponsor of the performance assessment requests approval of an assessment that will yield, in relation to the key aspects of the major domains of the TPEs, enough collective evidence of each candidate’s pedagogical performance to serve as a valid basis to judge the candidate’s general pedagogical competence for a Preliminary Teaching Credential. The model sponsor carefully monitors assessment development to ensure consistency with this stated purpose of the assessment. The Teaching Performance Assessment includes a comprehensive program to train and calibrate assessors and to maintain assessor calibration over time. The model sponsor periodically evaluates the assessment system to ensure equitable treatment of candidates. The assessment system and its implementation contribute to local and statewide consistency in the assessment of teaching competence.

Required Elements for Assessment Design Standard 2: Assessment Designed for Reliability and Fairness

2(a) In relation to the key aspects of the major domains of the TPEs, the pedagogical assessment tasks, rubrics, and the associated directions to candidates are designed to yield enough evidence of each candidate’s pedagogical qualifications for a Preliminary Teaching Credential as one part of the requirements for the credential.

2(b) Pedagogical assessment tasks and scoring rubrics are extensively field tested in practice before being used operationally in the Teaching Performance Assessment. The model sponsor evaluates the field test results thoroughly and documents the field test design, participation, methods, results and interpretation.

2(c) The Teaching Performance Assessment system includes a comprehensive process to select and train California educators as assessors who score candidate responses to the pedagogical assessment tasks. An assessor training program demonstrates convincingly that prospective and continuing assessors gain a deep understanding of implicit bias as it relates to scoring, the TPEs, the pedagogical assessment tasks and the multi-level scoring rubrics. The training program includes task-based scoring trials in which an assessment trainer evaluates and certifies each assessor’s scoring accuracy and calibration in relation to the scoring rubrics associated with the task. The model sponsor for multiple subject, PK-3, and education specialist TPAs establishes selection criteria for assessors of candidate responses to the TPA. The selection criteria include but are not limited to appropriate literacy and pedagogical expertise in the content areas and TPE domains assessed within the TPA. The model sponsor for the single subject TPA establishes selection criteria for assessors of candidate responses to the TPA. The selection criteria include but are not limited to appropriate pedagogical expertise in the content areas and TPE domains assessed within the TPA. The model sponsor selects assessors who meet the established selection criteria and uses only assessors who successfully calibrate during the required TPA model assessor training sequence. When new pedagogical tasks and scoring rubrics are incorporated into the assessment, the model sponsor provides additional training to the assessors, as needed.

2(d) In conjunction with the provisions of the applicable Teacher Preparation Program Standards relating to the Teaching Performance Assessment, the model sponsor plans and implements periodic evaluations of the assessor training program, which include systematic feedback from assessors and assessment trainers, and which lead to substantive improvements in the training as needed.

2(e) The model sponsor provides a consistent scoring process for all programs using that model, including programs using a local scoring option provided by the model sponsor. The scoring process conducted by the model sponsor to assure the reliability and validity of candidate outcomes on the assessment may include, for example, regular auditing, selective back reading, and double scoring of candidate responses near the cut score by the qualified, calibrated scorers trained by the model sponsor. All approved models must include a local scoring option in which the assessors of candidate responses are California program faculty and/or other individuals identified by the program who meet the model sponsor’s assessor selection criteria. These local California assessors are trained and calibrated by the model sponsor, and their scoring work and scoring results are facilitated and reviewed by the model sponsor. The model sponsor provides a detailed plan for establishing and maintaining scorer accuracy and inter-rater reliability during field testing and operational administration of the assessment. The model sponsor demonstrates that the assessment procedures, taken as a whole, maximize the accurate determination of each candidate’s overall pass-fail status on the assessment. The model sponsor must provide an annual audit process that documents that local scoring outcomes are consistent and reliable within the model for candidates across the range of programs using local scoring and informs the Commission where inconsistencies in local scoring outcomes are identified. If inconsistencies are identified, the sponsor must provide a plan to the Commission for how it will address and resolve the scoring inconsistencies both for the current scoring results and for future scoring of the TPA.
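
For illustration only (not a requirement of this element), the sketch below shows one way agreement between two calibrated assessors on double-scored responses might be summarized, using exact agreement, exact-plus-adjacent agreement, and Cohen's kappa. The score vectors and the five-level rubric scale are hypothetical assumptions.

```python
# Illustrative sketch: summarize inter-rater consistency on double-scored
# rubric levels (assumed 1-5 scale). Scores are invented; an operational audit
# would run this per rubric and per program across the full scoring window.
from sklearn.metrics import cohen_kappa_score

assessor_a = [3, 4, 2, 5, 3, 3, 4, 2, 4, 3]
assessor_b = [3, 4, 3, 5, 3, 4, 4, 2, 4, 3]

n = len(assessor_a)
exact = sum(a == b for a, b in zip(assessor_a, assessor_b)) / n
adjacent = sum(abs(a - b) <= 1 for a, b in zip(assessor_a, assessor_b)) / n
kappa = cohen_kappa_score(assessor_a, assessor_b)

print(f"Exact agreement: {exact:.0%}, exact+adjacent: {adjacent:.0%}, "
      f"Cohen's kappa: {kappa:.2f}")
```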

2(f) The model sponsor’s assessment design includes a clear and easy to implement appeal procedure for candidates who do not pass the assessment, including an equitable process for rescoring of evidence already submitted by an appellant candidate in the program, if the program is using centralized scoring provided by the model sponsor. If the program is implementing a local scoring option, the program must provide an appeal process as described above for candidates who do not pass the assessment. Model sponsors must document that all candidate appeals granted a second scoring are scored by a new assessor unfamiliar with the candidate or the candidate’s response.

2(g) The model sponsor conducting scoring for the program provides results on the TPA to the individual candidate based on performance relative to TPE domains and/or to the specific scoring rubrics within a maximum of three weeks following candidate submission of completed TPA responses. The model sponsor provides results to programs based on both individual and aggregated data relating to candidate performance relative to the rubrics and/or domains of the TPEs. The model sponsor also follows the timelines established with programs using a local scoring option for providing scoring results.

2(h) The model sponsor provides program level aggregate results to the Commission, in a manner, format and time frame specified by the Commission, as one means of assessing program quality. It is expected that these results will be used within the Commission’s ongoing accreditation system.

Assessment Design Standard 3: TPA Model Sponsor Support Responsibilities

The sponsor of the performance assessment provides technical support to teacher preparation programs using that model concerning fidelity of implementation of the model as designed. The model sponsor is responsible for conducting and/or moderating scoring for all programs, as applicable, within a national scorer approach and/or the local scoring option. The model sponsor has ongoing responsibilities to interact with the Commission, to provide candidate and program outcomes data as requested and specified by the Commission, and to maintain the currency of the model over time.

3(a) The model sponsor provides technical assistance to programs implementing the model to support fidelity of implementation of the model as designed. Clear implementation procedures and materials such as a candidate and a program handbook are provided by the model sponsor to programs using the model.

3(b) A model sponsor conducting scoring for programs is responsible for providing TPA outcomes data at the candidate and program level to the program within three weeks and to the Commission, as specified by the Commission. The model sponsor supervising/moderating local program scoring oversees data collection, data review with programs, and reporting.

3(c) The model sponsor is responsible for submitting at minimum an annual report to the Commission describing, among other data points, the programs served by the model, the number of candidate submissions scored, the date(s) when responses were received for scoring, the date(s) when the results of the scoring were provided to the preparation programs, the number of candidate appeals, first time passing rates, candidate completion passing rates, and other operational details as specified by the Commission.

3(d) The model sponsor is responsible for maintaining the currency of the TPA model, including making appropriate changes to the assessment tasks and/or to the scoring rubrics and associated program, candidate, and scoring materials, as directed by the Commission when necessitated by changes in state-adopted content standards and frameworks, as well as Commission adopted teacher preparation standards and TPEs.

3(e) The model sponsor must define retake policies, for candidates who fail one or more parts of the TPA, that preserve the reliability and validity of the assessment results. The retake policies must include whether the task(s) on which the candidate was not successful must be retaken in whole or in part, with appropriate guidance for programs and candidates about which task and/or task components must be resubmitted for scoring by a second assessor and what the resubmitted response must include.

Updated November 20, 2023