Approved by Commission February 12, 2026

Assessment Design Standard 1: Assessment Designed for Validity and Fairness

The sponsor* of a teaching performance assessment seeking approval for use in California (model sponsor) designs a Teaching Performance Assessment (TPA) in which complex pedagogical assessment tasks, authentic to the practice of teaching, are aligned with key aspects of all major domains of California's Teaching Performance Expectations (TPEs) and assessed through multi-level analytic scoring rubrics.

The model sponsor clearly:

  • describes and provides data to support the specific uses for which the assessment is valid (i.e., to serve as a determination of a candidate's competence with respect to the TPEs and to provide an indication of preparation program quality and effectiveness)
  • anticipates its potential misuses
  • maximizes the fairness of the assessment for all groups of candidates
  • documents the relationship among the tasks, the TPEs, and the rubrics

A passing standard is recommended by the model sponsor and presented to the Commission for adoption, based on a standard-setting study in which educators have made a professional judgment about an appropriate performance standard for beginning teachers to meet prior to licensure.

The model sponsor carefully monitors and documents assessment development to ensure the TPA's consistency with its stated purpose.

* Note: the “model sponsor” refers to the entity that represents the assessment and is responsible to programs using that model and to the Commission. Model sponsors may be a state agency, individual institutions, a consortium of institutions and/or partners, a private entity, and/or combinations of these.

1(a) TPA Required Specifications:

1(a)(1) General Task Specifications:

The Teaching Performance Assessment (TPA), completed during a candidate's clinical experience, includes complex pedagogical assessment tasks, authentic to teaching in the credential area sought, that require candidates to, at a minimum, engage in an instructional cycle that incorporates:

  1. gathering data about students,
  2. using the data to inform the design and implementation of an instructional sequence that is responsive to students' learning needs and cultural and linguistic identities and is aligned with appropriate content and grade-level standards,
  3. gathering data on the instruction and students' learning through informal and formal assessments,
  4. analyzing that data and reflecting on the results, and
  5. identifying next steps for instruction.

All tasks must be constructed to reduce bias to ensure fairness and validity.

Each TPA task and its related rubrics:

  • measures two or more major domains of the TPEs
  • supports the implementation of state-adopted content standards, curriculum frameworks, and/or Preschool Learning Foundations
  • requires candidates to demonstrate their ability to provide instruction that is considerate of the whole child, thus attending to the diversity of learners in their classroom and in the state of California, including
    • effective strategies for supporting emergent bilingual students in English, with the use of the language of instruction as appropriate, within the content area of the intended credential
    • effective strategies for supporting students with an IEP or 504 plan or who are otherwise designated as having specialized learning needs

The TPA includes a minimum of one video of the candidate's teaching performance. The candidate is required to provide written, audio, or video commentary on the video of instruction that includes a description of the lesson plan, a rationale for the teaching decisions shown, and reflection on the effect of those decisions on student learning.

Collectively, the tasks candidates complete are constructed to:

  • measure key aspects of all major domains of the TPEs
  • be culturally and linguistically responsive and sustaining (Gay, 2010; Ladson-Billings, 1995; Paris & Alim, 2012)
  • be appropriate for candidates from diverse backgrounds
  • incorporate Universal Design for Learning principles (CAST, 2024), providing candidates multiple options for how to demonstrate their knowledge
  • be embedded within teacher preparation coursework and clinical practice
  • be modular in design and able to be incorporated into programs' existing learning management systems

TPA tasks and/or scoring rubrics and associated program, candidate, and scoring materials must maintain alignment with state-adopted content standards and frameworks and Commission-adopted teacher preparation standards and TPEs.

All TPA materials, including but not limited to task specifications, rubrics, templates, handbooks, and related data, are available to the Commission upon request for review and approval. The Commission maintains the confidentiality of all materials designated as proprietary by the model sponsor.

1(a)(2) PK3-Specific Task Specifications:

In addition to the General Task Specifications listed above, the PK3 TPA includes tasks and rubrics to assess the candidate's ability to utilize developmentally appropriate pedagogy to effectively:

  • teach literacy in a manner aligned with the requirements of subparagraphs (A) and (B) of paragraph (4) of subdivision (b) of Education Code section 44259
  • teach current, state-adopted core content areas of at least Literacy and Mathematics
  • align all elements of instruction (including, at least, the selected curriculum, student engagement opportunities, and assessments used) with the current English Language Arts/English Language Development (ELA/ELD) Framework adopted by the California State Board of Education

1(a)(3) Multiple Subject-Specific Task Specifications:

In addition to the General Task Specifications listed above, the Multiple Subject TPA includes tasks and rubrics to assess the candidate's ability to effectively:

  • teach literacy in a manner aligned with the requirements of subparagraphs (A) and (B) of paragraph (4) of subdivision (b) of Education Code section 44259
  • teach current, state-adopted core content areas of at least Literacy and Mathematics
  • align all elements of instruction (including, at least, the selected curriculum, student engagement opportunities, and assessments used) with the current English Language Arts/English Language Development (ELA/ELD) Framework adopted by the California State Board of Education

1(a)(4) Education Specialist-Specific Task Specifications:

In addition to the General Task Specifications listed above, the Education Specialist TPA includes tasks and rubrics to assess the candidate's ability to effectively:

  • teach literacy in a manner aligned with the requirements of subparagraphs (A) and (B) of paragraph (4) of subdivision (b) of Education Code section 44259
  • teach current, state-adopted core content areas of at least Literacy and Mathematics
  • align all elements of instruction (including, at least, the selected curriculum, student engagement opportunities, and assessments used) with the current English Language Arts/English Language Development (ELA/ELD) Framework adopted by the California State Board of Education

1(a)(5) Single Subject-Specific Task Specifications:

In addition to the General Task Specifications listed above, the Single Subject TPA includes tasks and rubrics to assess the candidate's ability to:

  • demonstrate pedagogical competence related to teaching the content area(s) authorized by the credential
  • teach current, state-adopted core content in their single subject credential area

1(b) TPA Required Scoring Specifications:

1(b)(1) Rubric Specifications:

Each TPA task includes analytic rubrics (Brown, 2018) that provide construct-specific scores clearly aligned with each task and its associated TPEs. As such, submissions are clearly scored on each rubric construct, allowing candidates and programs to receive feedback in the form of a specific score for each.

Any administrative accommodations preserve the TPA's validity while addressing issues of access for candidates with disabilities or learning needs.

1(b)(2) Bias Review:

TPA scoring processes focus on candidates' teaching performance and minimize the effects of candidate factors not clearly related to pedagogical competence. These factors may include any actual or perceived characteristic protected by AB 537 (sex, sexual orientation, gender identity, ethnic group identification, race, ancestry, national origin, religion, color, or mental or physical disability) or any other factor not likely to affect job effectiveness and/or student learning, such as appearance, hairstyle and/or hair texture, demeanor, speech patterns and accents, or personal attire.

Initial and ongoing psychometric analyses are conducted to identify and eliminate potential sources of bias in relation to candidates' race, ethnicity, language, gender, or disability.

1(b)(3) Condition Codes:

Technical condition codes may be applied only when an assessment cannot be scored because of a technical issue. Possible technical issues include, but are not limited to: a candidate uploads an incorrect or blank document, an uploaded document is illegible, an uploaded video or audio file cannot be viewed or heard, and/or a submission is uploaded to the incorrect credential area.

Content condition codes, previously allowed when required content is missing or determined to be inappropriate, may not be used. Instead, assessors should be directed to score submissions as they are submitted. Any potential content issues should be addressed through the rubrics.

1(b)(4) Standard Setting Process:

To determine the passing standard for the TPA, a standard setting is held that includes the judgments of individuals familiar with the expectations of beginning teachers in the credential area aligned with the TPA, including currently practicing credentialed teachers, supervisors of teachers, support providers of new teachers, and other preparers of teachers. The standard setting process must align with Standards 5.21-5.23 of the Standards for Educational and Psychological Testing of the American Educational Research Association, American Psychological Association, and National Council on Measurement in Education.

1(c) Model Revisions:

Any revisions made to a TPA model must be reported to the Commission.

Each time substantive modifications or revisions are made, a new standard setting is held, and results must be reported to the Commission.

1(d) Statement of Intended Use:

The TPA includes a clear statement acknowledging the intended uses of the assessment. The statement demonstrates the model sponsor's clear understanding of the implications of the assessment for candidates, preparation programs, public schools, and public school students within the authorization of the credential. The statement includes appropriate cautions about additional or alternative uses for which the assessment is not valid.

1(d)(1) PK3, Multiple Subject, and Education Specialist Statements of Intended Use:

The Statement of Intended Use includes language that indicates all elements of assessment design and development are consistent with the assessment's intended uses for determining the literacy and content-specific pedagogical competence of candidates for Preliminary Teaching Credentials in California and as information useful for determining program quality and effectiveness.

1(d)(2) Single Subject Statement of Intended Use:

The Statement of Intended Use includes language that indicates all elements of assessment design and development are consistent with the assessment's intended uses for determining the content-specific pedagogical competence of candidates for Preliminary Teaching Credentials in California and as information useful for determining program quality and effectiveness.

Assessment Design Standard 2: Assessment Designed for Reliability and Fairness

The TPA model sponsor designs and maintains assessment scoring processes that fairly and reliably yield enough collective evidence, in relation to the key aspects of major domains of the TPEs, to serve as a valid basis to judge a candidate's general pedagogical competence for a Preliminary Teaching Credential.

The TPA includes a comprehensive program to calibrate assessors for scoring accuracy and maintain assessor calibration over time.

The model sponsor develops scoring processes that include ongoing monitoring of the assessment system to ensure equitable evaluation of candidates. The assessment system and its implementation contribute to local and statewide consistency in the assessment of teaching competence.

TPA model sponsors provide a technical manual outlining the full development and validation process that includes, at a minimum, the process used for developing the TPA and details about and results of field testing.

2(a) Field Testing:

TPA tasks and scoring rubrics are extensively field tested in practice before being used operationally. The TPA model sponsor evaluates the field test results thoroughly and documents the field test design, participation, methods, results, and interpretation. Results of the field test are presented to the Commission for review prior to operational use.

2(b) Assessor Specifications:

All assessors, at a minimum, demonstrate pedagogical expertise in the content areas and TPE domains assessed within the specific instrument they will score.

  • PK3, Multiple Subject, and Education Specialist TPA assessors demonstrate expertise in the content areas of literacy and mathematics appropriate to the TPA credential area.
  • Single Subject TPA assessors demonstrate expertise in the specific content areas measured.

All assessors hold a valid credential and have at least three years of experience working with PK-12 learners in California schools in the credential area being assessed, and/or are faculty members in a California educator preparation program with expertise in the credential area being assessed.

All assessors successfully calibrate on a TPA model before participating in scoring for that model and must re-calibrate each scoring session to continue scoring.

2(c) Assessor Calibration Specifications:

The TPA Assessor Calibration Process includes, at a minimum:

  • A comprehensive introduction to the TPA that includes the conceptual framework of the TPA and an overview of each of the tasks, their corresponding rubrics, and their alignment with the TPEs
  • Task-based scoring trials using sample submissions in which a model sponsor scoring expert evaluates and certifies each assessor's scoring accuracy and calibration in relation to the scoring rubric(s) associated with the task
  • Feedback from assessors and calibration session leaders that informs revisions to the calibration process.

The TPA Assessor Re-Calibration Process includes, at a minimum:

  • Task-based scoring trials using sample submissions in which a model sponsor scoring expert evaluates and certifies each assessor's scoring accuracy and calibration in relation to the scoring rubric(s) associated with the task
  • Feedback from assessors and calibration session leaders that informs revisions to the calibration process.

2(d) Scoring Processes:

2(d)(1) Development of Scoring Processes to Ensure Reliability and Validity of Scores:

The developed scoring process assures and documents the reliability and validity of candidate outcomes on the assessment and includes, at a minimum: (1) regular auditing; (2) selective back reading; and (3) double scoring of candidate responses near the cut score by qualified, calibrated scorers.

The model sponsor:

  • Develops a detailed plan for establishing and maintaining scorer accuracy and inter-rater reliability during both field testing and operational administration of the TPA
  • Demonstrates that the assessment procedures, taken as a whole, maximize the accurate determination of each candidate's overall pass-fail status on the assessment.

2(d)(2) Local Scoring Option:

Approved TPA models provide a local scoring option that is commensurate with the model sponsor scoring described above and in which assessors are California program faculty; mentor teachers working with student teachers, interns, and new teachers; and/or other individuals, including those from partner local education agencies, identified by the program. All selected assessors must meet the minimum qualifications described in the Assessor Specifications provided above.

All individuals selected as local assessors are calibrated using the same process as other selected assessors. Calibration scoring results are evaluated and approved by the model sponsor before an individual can participate in a scoring session.

When local scoring is utilized, a system of blind double scoring of at least 15 percent of submissions must be implemented.

The model sponsor develops an annual audit process that documents that local scoring outcomes are consistent and reliable within the model for candidates across the range of programs using local scoring. At a minimum, the audit process includes a report of blind double-scoring at least 15 percent of submissions to ensure reliability. If two or more institutions engage in local scoring of the same assessment, scores are compared across institutions to ensure consistency.

  • The model sponsor informs the Commission if/when inconsistencies in local scoring outcomes are identified.
  • If inconsistencies are identified, the sponsor provides a plan to the Commission for how it will address and resolve the scoring inconsistencies both for the current scoring results and for future scoring of the TPA.

If a program chooses to engage in local scoring, any assessor fees incorporated into a model sponsor's registration costs must be given to the program to support the costs associated with local scoring.

2(d)(3) Appeal Process:

The TPA includes a clear, easy-to-implement appeal procedure for candidates who do not pass the assessment, including an equitable process for rescoring evidence already submitted by an appellant candidate in the program.

All candidate appeals granted a second scoring are scored by a new assessor unfamiliar with the candidate or the candidate's response.

If a program implements a local scoring option, the program provides an appeal process as described above for candidates who do not pass the assessment.

2(e) Retake Policy:

A retake policy that preserves the reliability and validity of the assessment results is developed for candidates who fail one or more parts of the TPA.

The retake policy includes notification to candidates and programs with the following information, at a minimum:

  • rubric scores, including a rationale for any scores of one
  • whether the task(s) on which the candidate was not successful must be retaken in whole or in part
  • appropriate guidance for programs and candidates about which task and/or task components must be resubmitted for scoring by a second assessor
  • what the resubmitted response must include

2(f) Technical Assistance Manual:

TPA model sponsors create a Technical Assistance Manual that outlines the full development and validation process of the TPA. The Technical Assistance Manual includes, at a minimum, the process used for developing the TPA, including steps taken to validate the assessment, and details about and results of field testing.

Assessment Design Standard 3: TPA Model Sponsor Support Responsibilities

TPA model sponsors provide technical support to teacher preparation programs using that model, including support for embedding the TPA within programs.

TPA model sponsors conduct and/or moderate scoring of all TPAs, as applicable, within a statewide scorer approach and/or the local scoring option. Model sponsors are responsible for reporting all TPA outcome data to programs and to the Commission.

3(a) Required Program Supports:

Through the embedding of TPA tasks, faculty provide formative feedback to candidates based on the content of the scoring rubrics (formative feedback will not be in the form of specific scores) prior to the submission of tasks for summative scoring. TPA model sponsors provide materials appropriate for use by programs to support faculty in effectively embedding the TPA within their programs.

All assessment materials will be available to programs annually by August 1.

Program support materials include at least the following:

  • TPA handbooks
  • TPA tasks
  • TPA rubrics
  • annotated passing and non-passing sample responses for each credential area
  • examples of commonly assigned technical condition codes

All program support materials are created to be dynamic, searchable, and interactive.

TPA model sponsors provide on-going technical assistance to program faculty to support the embedding of the TPA, including orientation and calibration materials for all individuals involved in the preparation of candidates. TPA model sponsors establish a process that facilitates faculty review of a candidate's submission before it moves to scoring.

3(b) Required Candidate Supports:

The TPA model sponsors provide materials, at no additional cost, to assist candidates in understanding the nature of the assessment, the specific assessment tasks, the scoring rubrics, the submission processes, and the scoring processes.

3(c) Score Reporting Requirements:

3(c)(1) To Candidates:

  • TPA results, in the form of both overall passing status and individual rubric scores aligned to TPE domains, are provided to candidates within a maximum of three weeks following candidate submissions.
    • If a candidate's submission is scored at a level one on any rubric, the candidate must be provided with a rationale for why the submission received that score.
  • If a submission is determined un-scorable because of a technical issue and thus receives a technical condition code, the submission is returned to the candidate within one week. The candidate may address the identified issue, resubmit the submission within one week, and receive a score within the same scoring window.

3(c)(2) To Programs:

  • Individual and aggregate TPA results data, relative to overall passing rate, individual rubric scores, and rubric-construct scores, are provided to programs at the conclusion of each scoring window. Aggregate results, which reflect candidate performance relative to the rubrics and/or domains of the TPEs, include all submissions, including those returned to candidates for a technical issue and not resubmitted.
  • For programs utilizing the local scoring option, individual and aggregate TPA results data, relative to overall passing rate and individual rubric scores aligned to TPE domains, are provided to the program at an interval established in collaboration with the program.
  • For any candidate submissions that received a level one on any rubrics, the program must also be provided the rationale for why the submissions were scored at a level one.

3(c)(3) To the Commission:

  • Individual and program-level aggregate TPA results data, relative to overall passing rate, individual rubric scores, and rubric-construct scores, are provided to the Commission at the conclusion of each scoring window.
  • An annual report is submitted to the Commission that includes the following, at a minimum:
    • the programs served by the model
      • for each program, the number of candidates who submitted materials for scoring, the number of submissions scored, and the number of submissions returned or not scored, with a rationale for why
    • the date(s) when responses were received for scoring
    • the date(s) when scoring results were provided to preparation programs
    • the number of candidate appeals, disaggregated by race, ethnicity, and gender
    • first-time passing rates, disaggregated by race, ethnicity, and gender, that include submissions returned for technical issues
    • candidate completion passing rates, disaggregated by race, ethnicity, and gender
    • other operational details as specified by the Commission

3(d) Candidate Survey:

All TPA models administer a survey to candidates at the time of submission to gather information about the candidates' experience. Commission-required items address candidates' perception of the formative nature of the TPA, the alignment between the TPA and required coursework, and the appropriateness of the TPA tasks for assessing candidates' readiness to begin teaching.

References

Brown, J. (2018). Rubrics. In The SAGE encyclopedia of educational research, measurement, and evaluation (Vol. 4, pp. 1437-1440). SAGE Publications, Inc.

CAST (2024). CAST Universal Design for Learning Guidelines version 3.0. Retrieved from https://udlguidelines.cast.org

Gay, G. (2010). Culturally Responsive Teaching (2nd ed.). New York: Teachers College Press.

Ladson-Billings, G. (1995). Toward a theory of culturally relevant pedagogy. American Educational Research Journal, 32(3), 465-491.

Paris, D., & Alim, H. S. (2012). Culturally Sustaining Pedagogies: Teaching and Learning for Justice in a Changing World. New York: Teachers College Press.

Updated March 09, 2026