February 2016

Assessment Design Standard 1: Assessment Designed for Validity and Fairness

The developer of an Administrator Performance Assessment (model sponsor*), consistent with Commission practice and the provisions of Education Code section 44288, will appoint a design team consisting of recognized experts in the field of education administration, including public school administrators, teachers, college and university faculty, and other stakeholders, to participate in and advise on the selection, development, administration, and interpretation of the assessment. The Administrator Performance Assessment (APA) shall contain complex assessment tasks and multi-level scoring rubrics that are linked to and assess California’s Administrator Performance Expectations (CAPEs), with particular emphasis on school leadership. The assessment model sponsor clearly describes the uses for which the assessment has been validated, anticipates its potential misuses, and identifies appropriate uses consistent with the assessment’s validation process. The assessment is designed and validated to serve as a determination of a candidate’s status with respect to the CAPEs and to provide an indication of preparation program quality and effectiveness. The model sponsor maximizes the fairness of the assessment design for all groups of candidates in the program. A passing standard is recommended to the Commission based on a standard setting study in which educators have made a professional judgment about an appropriate performance standard for beginning administrators to meet prior to licensure.

*Note: The “model sponsor” refers to the entity or entities that develop an administrator performance assessment, administer and score the assessment, and are responsible to programs using the assessment and to the Commission. The model sponsor may be a state agency, an individual institution, a consortium of institutions and/or partners, a private entity, or a combination of these. The model sponsor could be a single entity that both develops and administers and scores the assessment, or these tasks may be divided across several entities within a partnership or collaborative arrangement.

Required Elements for Assessment Design Standard 1: Assessment Designed for Validity and Fairness

1(a) The Administrator Performance Assessment includes complex assessment tasks designed to elicit aspects of candidate performance that measure the CAPEs. Each task is substantively related to two or more domains of the CAPEs. For use in judging candidate-generated responses to each administrative task, the assessment also includes multi-level scoring rubrics that are clearly related to the CAPE elements that the task measures. Collectively, the tasks and rubrics in the assessment address key aspects of the CAPEs, with particular emphasis on school leadership. The developer of the performance assessment documents the relationships between CAPE elements, tasks, and rubrics.

1(b) The Administrator Performance Assessment includes a focus on two key school administrator job roles within the design of the APA tasks and scoring rubrics, assessing the candidate’s ability to effectively perform the roles of (1) the administrator as the instructional leader of the school and (2) the administrator as the school improvement leader.

1(c) Consistent with the language of the CAPEs, the model sponsor defines scoring rubrics so candidates for credentials can earn acceptable scores on the APA with the use of different administrative practices that support implementation of effective teaching and learning for all students, and improvements of student and other educational outcomes. The model sponsor takes steps to plan and anticipate the appropriate scoring of candidates who use a wide range of administrative practices that are educationally effective and builds scoring protocols to take these variations into account.

1(d) APA candidate tasks focus on an administrator’s role in promoting and supporting effective teaching and specific learning outcomes for English learners, underserved education groups or groups that need to be served differently, and students with special needs, to adequately assess the candidate’s ability to effectively perform the job role of the school’s instructional and improvement leader.

1(e) The APA may include a video or other media evidence of the administrative services candidate’s performance during fieldwork. If included in the APA, the video or other media must be accompanied by a commentary describing the activity and rationale for the leadership decisions and actions shown, and evidence of the effect of those decisions and actions in relation to selected aspects of the CAPEs.

1(f) The APA model sponsor must develop materials appropriate for use by programs in helping faculty become familiar with the design of the APA, the candidate tasks and the scoring rubrics so that faculty can effectively assist candidates to prepare for the assessment. The APA model sponsor must also develop candidate materials to assist candidates in understanding the nature of the assessment, the specific assessment tasks, the scoring rubrics, submission processes, scoring processes, and appeal policies.

1(g) The model sponsor develops scoring rubrics and assessor training procedures that focus primarily on administrator performance and that minimize the effects of candidate factors that are not clearly related to administrative services competence, which may include (depending on the circumstances) factors such as gender, height, speech patterns, volume, and/or accents, or any other aspect of appearance or behavior that is not likely to affect the candidate’s job effectiveness.

1(h) The model sponsor provides a clear statement acknowledging the intended uses of the administrator performance assessment. The statement demonstrates the model sponsor’s clear understanding of the implications of the assessment for candidates, preparation programs, the public schools, and TK-12 students. The statement includes appropriate cautions about additional or alternative uses for which the assessment is not valid. All elements of assessment design and development are consistent with the intended uses of the assessment for determining the competence of candidates for a Preliminary Administrative Services Credential in California and as a source of useful information about preparation program quality and effectiveness.

1(i) The model sponsor completes content review and editing procedures to ensure that administrator assessment tasks, rubrics, and directions to candidates are free of cultural and linguistic bias, fair, and appropriate for candidates from diverse backgrounds.

1(j) The model sponsor completes initial and periodic basic psychometric analyses to identify administrator assessment tasks and/or scoring rubrics that show differential effects in relation to candidates’ race, ethnicity, language, gender, or disability. When group pass-rate differences are found, the model sponsor investigates the potential sources of differential performance and documents steps taken to eliminate construct-irrelevant sources of variance.
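
For illustration only, one basic analysis of the kind described in 1(j) is a comparison of task-level pass rates across candidate groups. The sketch below assumes a simple tabular layout of scored responses and uses a two-proportion z-test; neither the field names nor the choice of statistic is prescribed by this standard.

    # Sketch of a basic differential-performance screen; the data layout and
    # the statistical test are illustrative assumptions, not requirements.
    import pandas as pd
    from statsmodels.stats.proportion import proportions_ztest

    scores = pd.DataFrame({
        "task": ["task1"] * 40 + ["task2"] * 40,
        "group": (["A"] * 20 + ["B"] * 20) * 2,
        "passed": [1] * 18 + [0] * 2 + [1] * 12 + [0] * 8    # task1: A then B
                  + [1] * 16 + [0] * 4 + [1] * 15 + [0] * 5,  # task2: A then B
    })

    for task, df in scores.groupby("task"):
        # Pass counts and attempt counts per group, then a two-proportion z-test.
        counts = df.groupby("group")["passed"].agg(["sum", "count"])
        stat, p = proportions_ztest(counts["sum"].values, counts["count"].values)
        if p < 0.05:
            print(f"{task}: group pass-rate difference warrants review (p={p:.3f})")

A flagged task is not automatically biased; as the element states, the next step is to investigate potential sources of the differential performance and document steps taken to eliminate construct-irrelevant variance.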

1(k) In designing assessment administration procedures, the assessment model sponsor includes administrative accommodations that preserve assessment validity while addressing issues of access for candidates with disabilities or specific learning needs.

1(l) In the course of determining a passing standard, the model sponsor secures and reflects on the considered judgments of administrators, supervisors of administrative services candidates, and appropriate other preparers of administrators regarding necessary and acceptable levels of proficiency on the part of entry-level school administrators. The model sponsor periodically reviews the reasonableness of the scoring scales and established passing standard, when and as directed by the Commission.
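
For illustration only, a modified-Angoff procedure is one common way to translate such panel judgments into a recommended passing standard. The panel size, task count, and score values in the sketch below are hypothetical; this standard does not prescribe a particular standard-setting method.

    # Sketch of a modified-Angoff style computation; the panel judgments and
    # the averaging rule are illustrative assumptions, not requirements.
    # Each panelist estimates the rubric score a minimally competent
    # beginning administrator would earn on each task.
    panelist_judgments = [
        [2, 3, 2, 3],  # panelist 1: estimated scores on four tasks
        [3, 3, 2, 2],  # panelist 2
        [2, 2, 3, 3],  # panelist 3
    ]

    # Average each panelist's total to obtain the recommended cut score.
    totals = [sum(judgments) for judgments in panelist_judgments]
    recommended_cut = sum(totals) / len(totals)
    print(f"Recommended passing standard: {recommended_cut:.1f} points")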

1(m) To preserve the validity and fairness of the assessment over time, the model sponsor may need to develop and field test new administrator assessment tasks and multi-level scoring rubrics to replace or strengthen prior ones. Initially and periodically, the model sponsor analyzes the assessment tasks and scoring rubrics to ensure that they yield important evidence that represents candidate knowledge and skill related to CAPEs, and serve as a basis for determining entry-level administrator competence to lead California’s TK-12 public schools. The model sponsor documents the basis and results of each analysis, and modifies the tasks and rubrics as needed.

Assessment Design Standard 2: Assessment Designed for Reliability and Fairness

The APA model sponsor designs and develops an assessment that will yield, in relation to the key aspects of the major domains of the CAPEs, enough collective evidence of each candidate’s performance to serve as a valid basis to judge the candidate’s general administrative competence for a Preliminary Administrative Services Credential. The model sponsor carefully monitors assessment development to ensure consistency with the stated purpose of the assessment. The Administrator Performance Assessment includes a comprehensive program to train and calibrate assessors and to maintain assessor calibration over time. The model sponsor periodically evaluates the assessment system to ensure equitable treatment of candidates. The assessment system and its implementation contribute to local and statewide consistency in the assessment of administrator competence.

Required Elements for Assessment Design Standard 2: Assessment Designed for Reliability and Fairness

2(a) In relation to the key aspects of the major domains of the CAPEs, the administrator assessment tasks, rubrics, and the associated directions to candidates are designed to yield valid evidence for an overall judgment of each candidate’s qualifications for a Preliminary Administrative Services Credential as one part of the requirements for the credential.

2(b) Administrator assessment tasks and scoring rubrics are pilot and field tested in practice before being used operationally in the APA. The model sponsor evaluates the pilot and field test results thoroughly and documents the pilot and field test designs, participation, methods, results, and interpretation.

2(c) The Administrator Performance Assessment system includes a comprehensive process to select and train assessors who score candidate responses to the administrator assessment tasks. The assessor training program demonstrates convincingly that prospective and continuing assessors gain a deep understanding of the CAPEs, the tasks and the multi-level scoring rubrics. The training process includes task-based scoring trials in which an assessment trainer evaluates and certifies each assessor's scoring accuracy and calibration in relation to the scoring rubrics associated with the task. The assessment model sponsor establishes selection criteria for assessors of candidate responses to the APA. The selection criteria must include but are not limited to appropriate administrative expertise in the content areas assessed within the APA. Only assessors who meet the sponsor’s established criteria are selected to score APAs, and only assessors who successfully calibrate during the required APA assessor training sequence are used. If new administrator tasks and scoring scales are incorporated into the APA, the assessment sponsor provides additional training to the assessors, as needed.

2(d) The model sponsor plans and implements periodic evaluations of the assessor training process, which include systematic feedback from assessors and assessment trainers, and which lead to improvements in the assessor training as needed.

2(e) The model sponsor provides a consistent scoring process for all programs using the assessment, including programs using a local scoring option provided by the model sponsor. The scoring process conducted by the model sponsor to assure the reliability and validity of candidate outcomes on the assessment may include, for example, regular auditing, selective back reading, and double scoring of candidate responses near the cut score by the qualified, calibrated scorers trained by the model sponsor. All approved APAs must include a local scoring option in which the assessors of candidate responses are program faculty and/or other individuals identified by the program who meet the model sponsor’s assessor selection criteria. These local assessors are trained and calibrated by the model sponsor, and their scoring work is facilitated and reviewed by the model sponsor. The model sponsor provides a detailed plan for establishing and maintaining scorer accuracy at the local and state levels, and inter-rater reliability during pilot and field testing and during operational administration of the assessment.
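
For illustration only, weighted Cohen’s kappa is one common index of inter-rater reliability for double-scored responses. The scores below and the choice of statistic are assumptions for the example, not requirements of this standard.

    # Sketch of one inter-rater reliability check on double-scored responses;
    # the scores and the kappa statistic are illustrative assumptions.
    from sklearn.metrics import cohen_kappa_score

    assessor_1 = [3, 2, 4, 3, 2, 3, 4, 2]  # rubric scores from assessor 1
    assessor_2 = [3, 2, 3, 3, 2, 3, 4, 3]  # same responses, assessor 2

    # Quadratic weighting penalizes large score disagreements more heavily.
    kappa = cohen_kappa_score(assessor_1, assessor_2, weights="quadratic")
    print(f"Quadratically weighted kappa: {kappa:.2f}")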

2(f) The model sponsor must demonstrate that the assessment procedures, taken as a whole, maximize the accurate determination of each candidate’s overall pass-fail status on the APA. The model sponsor must provide an annual audit process that documents that local scoring outcomes are consistent and reliable within the assessment for candidates across the range of programs using centralized and local scoring, and must inform the Commission when inconsistencies in scoring outcomes are identified. If inconsistencies are identified, the sponsor must provide a plan to the Commission for how it will address and resolve the scoring inconsistencies, both for the current scoring results and for future scoring of the APA.
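
For illustration only, one simple audit comparison of the kind described in 2(f) is a pass-rate comparison between centralized and local scoring. The data layout in the sketch below is an assumption for the example, not a prescribed audit design.

    # Sketch of one audit comparison between scoring modes; the data layout
    # and the pass-rate gap check are illustrative assumptions.
    import pandas as pd

    results = pd.DataFrame({
        "scoring_mode": ["centralized"] * 6 + ["local"] * 6,
        "passed":       [1, 1, 0, 1, 1, 1, 1, 0, 1, 0, 1, 1],
    })

    # Compare overall pass rates between the two scoring modes.
    pass_rates = results.groupby("scoring_mode")["passed"].mean()
    gap = abs(pass_rates["centralized"] - pass_rates["local"])
    print(f"Centralized vs. local pass-rate gap: {gap:.2f}")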

2(g) The model sponsor’s APA system includes a clear, easy-to-implement appeal procedure for candidates who do not pass the assessment, including an equitable process for rescoring of evidence already submitted by an appellant candidate in the program. Model sponsors must document that all candidate appeals granted a second scoring are scored by a new assessor unfamiliar with the candidate’s response.

2(h) The model sponsor provides results on the APA for individual candidates based on performance relative to the specific scoring rubrics within three weeks following candidate submission of completed APA responses. The model sponsor must provide results to programs based on both individual and aggregate data relating to candidate performance relative to the rubrics and the CAPEs. The model sponsor also follows the timelines established with programs using a local scoring option for providing scoring results.

2(i) The model sponsor provides program level aggregate results to the program and the Commission, in a manner, format, and timeframe specified by the Commission, as one means of assessing program quality. Programs have an opportunity to ensure accuracy in the data, and will report any inaccuracies to the model sponsor and the Commission. APA candidate and program results will be used within the Commission’s ongoing accreditation system.

Assessment Design Standard 3: APA Assessment Sponsor Support Responsibilities

The APA model sponsor provides technical support to administrator preparation programs using the assessment to support fidelity of implementation of the assessment as designed. The model sponsor is responsible for conducting and/or moderating scoring for all programs, as applicable, within a centralized scoring approach and/or the local scoring option. The model sponsor has ongoing responsibilities to interact with the programs and the Commission, to provide candidate and program outcomes data as requested and specified by the Commission, and to maintain the currency of the assessment over time.

Required Elements for Assessment Design Standard 3: APA Assessment Sponsor Support Responsibilities

3(a) The model sponsor provides ongoing technical assistance to programs implementing the APA concerning fidelity of implementation of the assessment as designed. Clear implementation procedures and materials, such as candidate and program handbooks, are provided by the model sponsor to programs using the assessment.

3(b) A model sponsor conducting centralized scoring for programs is responsible for providing APA outcomes data at the candidate and program level to the program within three weeks and to the Commission, as specified by the Commission. The model sponsor supervising/moderating local program scoring oversees data collection, data review with programs, and reporting.

3(c) The model sponsor is responsible for submitting at minimum an annual report to the Commission describing, among other data points, the programs served by the assessment, the number of candidate submissions scored, the date(s) when responses were received for scoring, the date(s) when the results of the scoring were provided to the preparation programs, the number of candidate appeals, first-time passing rates, candidate completion passing rates, and other operational details as specified by the Commission.

3(d) The model sponsor is responsible for maintaining the currency of the APA assessment, including making appropriate changes to the assessment tasks and/or to the scoring rubrics and associated program and candidate materials, as directed by the Commission when necessitated by changes in TK-12 standards and/or in teacher or administrator preparation standards or expectations.

3(e) The model sponsor must define retake policies for candidates who fail one or more parts of the APA; these policies must preserve the reliability and validity of the assessment results. The retake policies must include whether the task(s) on which the candidate was not successful must be retaken in whole or in part, with appropriate guidance for programs and candidates about which task and/or task components must be resubmitted for scoring by a second assessor and what the resubmitted response must include.

Updated December 26, 2023