CAEP Revised 2022
Standards Workbook
June 11, 2021
Based on CAEP Revised Standards for Initial-Licensure Preparation
Adopted by the CAEP Board of Directors December 2020
And CAEP Standards for Advanced-Level Preparation
Adopted by the CAEP Board of Directors, June 2016; Amended June 2021
NOTICE: This workbook is in effect for self-study reports submitted by
educator preparation providers preparing for a review to be conducted
using CAEP Revised Standards for Initial-Licensure Preparation
scheduled Spring 2022 and beyond. Spring 2022 self-study reports
submitted by educator preparation providers for a review to be conducted
using CAEP Standards for Advanced-Level Preparation must continue to
use the Consolidated Handbook for Advanced Standards only. The CAEP
Consolidated Handbook remains in effect through site visits scheduled
Fall 2021.
1140 19th Street, NW Suite 400
Washington, DC 20036
Caepnet.org
Contents
Preface
CAEP Accreditation Process Timeline
Steps in Preparing to Write the Self-Study Report (SSR)
The Standards
Standards for Initial-Licensure Preparation
Standard R1 Content and Pedagogical Knowledge
Component R1.1 The Learner and Learning
Component R1.2 Content
Component R1.3 Instructional Practice
Component R1.4 Professional Responsibility
Standard R2 Clinical Partnerships and Practice
Component R2.1 Partnerships for Clinical Preparation
Component R2.2 Clinical Educators
Component R2.3 Clinical Experiences
Standard R3 Candidate Recruitment, Progression, and Support
Component R3.1 Recruitment
Component R3.2 Monitoring and Supporting Candidate Progression
Component R3.3 Competency at Completion
Standard R4 Program Impact
Component R4.1 Completer Effectiveness
Component R4.2 Satisfaction of Employers
Component R4.3 Satisfaction of Completers
Standard R5 Quality Assurance System and Continuous Improvement
Component R5.1 Quality Assurance System
Component R5.2 Data Quality
Component R5.3 Stakeholder Involvement
Component R5.4 Continuous Improvement
Standards for Advanced-Level Preparation
Standard RA.1 Content and Pedagogical Knowledge
Component RA1.1 Candidate Knowledge, Skills, and Professional Dispositions
Component RA1.2 Provider Responsibilities
Standard RA.2 Clinical Partnerships and Practice
Component RA2.1 Partnerships for Clinical Preparation
Component RA2.2 Clinical Experiences
Standard RA.3 Candidate Quality and Selectivity
Component RA3.1 Recruitment
Component RA3.2 Candidates Demonstrate Academic Achievement and Ability to Complete Preparation Successfully
Component RA3.3 Monitoring and Supporting Candidate Progression
Component RA3.4 Competency at Completion
Standard RA.4 Satisfaction with Preparation
Component RA4.1 Satisfaction of Employers
Component RA4.2 Satisfaction of Completers
Standard RA.5 Quality Assurance System and Continuous Improvement
Component RA5.1 Quality Assurance System
Component RA5.2 Data Quality
Component RA5.3 Stakeholder Involvement
Component RA5.4 Continuous Improvement
Standard 6: Fiscal and Administrative Capacity
Standard 7: Record of Compliance with Title IV of the Higher Education Act
Appendices
Appendix A: CAEP Criteria for Evaluation of EPP-Created Assessments & Surveys
CAEP Criteria for Evaluation of EPP-Created Assessments
CAEP Criteria for Evaluation of EPP-Created Surveys
Appendix B: Transition and Phase-in Plan Schedules and Guidelines
Standards for Initial-Licensure Preparation
Sufficiency Criteria for Initial-Licensure Preparation Transition Plans
Phase-In Plan Schedule for Standards for Advanced-Level Preparation
Sufficiency Criteria for Advanced-Level Preparation Phase-In Plans
Appendix C: Additional Resources
Preface
This CAEP Workbook accompanies the 2022 CAEP Revised Standards for Initial-Licensure Preparation
approved by the Board in December of 2020 and is in effect for anyone preparing for a review to be
conducted using CAEP Revised Standards for Initial-Licensure Preparation during Spring of 2022 and
beyond. This Workbook, like CAEP’s accreditation policy and procedures, has been written, in part, to
meet requirements established by CHEA and the USDOE. The term “workbook” is used rather than
“handbook” to better reflect its purpose and differentiate it from earlier published aids to help those
preparing for CAEP accreditation.
The Workbook addresses three requests/goals from those who have previously written self-studies and
those currently going through the process:
- It is more targeted and refined, reflecting changes to the revised standards components themselves. We believe, based on feedback from the field, the revised standards components eliminate redundancy and improve clarity, and we hope the Workbook reflects those same efforts.
- It reflects the style of a workbook in that it provides step-by-step actions taking a provider from self-study through site evaluation. Our intention is that it is easier to use than a handbook and that it offers more examples of possible evidence. You will notice it is substantially smaller in size than previous versions as a result.
- It updates sufficiency criteria, including those for assessments and surveys, and offers transition plan guidelines for those moving from previous standards components.
It is not uncommon during the standards revisions process to receive many recommendations for the
implementation of the standards. I would like to thank two Board Committees for their collective thinking
on both the standards and workbook. The Equity and Diversity Committee provided invaluable
recommendations on better addressing equity, diversity, and inclusion through the CAEP accreditation
process. As well, the Research Committee systematically reviewed underlying research related to each
standard and component. Additionally, members of the Accreditation Council, the task force on revision
of the standards, and those who volunteered their time to make recommendations and review workbook
drafts provided an invaluable service during a raging pandemic. Finally, thanks to those who responded to
the call for public comment on the standards.
Each EPP has its own mission and faces specific challenges in recruiting, preparing, and supporting
high-quality candidates from a broad range of backgrounds and diverse populations to better reflect the
diversity found in America’s P-12 classrooms. Accreditation is a means for EPPs to strive for equity and
excellence in their P-12 educator preparation through evidence and discussion. There is not “one way” to
make a case for accreditation. A goal of the standards revision process was to acknowledge the unique
context each EPP brings. Therefore, this workbook provides a process that anticipates many forms of
evidence, different assessments, differing approaches to candidate recruitment, and multiple ways to
monitor candidate progress and efforts to support them.
Thank you for your pursuit of CAEP accreditation!
Christopher A. Koch, Ed.D.
President
CAEP Accreditation Process Timeline
NOTE: The Initial Accreditation Process and Renewal of Accreditation Process differ. Both
processes are described in greater detail in the Accreditation Policy and Procedures.
Following the submission of the rejoinder response, the Accreditation Council will meet in either the
Fall or Spring semester after the site review.
Steps in Preparing to Write the Self-Study Report (SSR)
A self-study is the process through which an EPP uses CAEP Standards to evaluate its
preparation programs. The self-study report primarily focuses on how the EPP provides
evidence and analysis that its programs are meeting the CAEP standards. There are some
basic steps to consider when beginning the CAEP self-study process. These are not mandates
or requirements. They are suggestions for how a provider might proceed to address the CAEP
components and the accreditation process. When in doubt, contact CAEP staff.
1. Review the CAEP Scope of Accreditation, CAEP Standards, and the Workbook.
The provider should access the CAEP website or Accreditation Policy and Procedures to
review the CAEP Scope of Accreditation to determine which preparation programs are to
be included in the EPP review. Once the provider has identified the EPP programs within
the scope of CAEP review, the provider should study the CAEP Standards for
Initial-Licensure Preparation and Standards for Advanced-Level Preparation, as
applicable, including all components. This Workbook provides examples of “quality of
evidence” and “sources of evidence” as well as “guiding questions” for each component.
These sections are provided as a framework for EPPs to focus their accreditation report.
Providers are welcome to employ evidence different from that described and to select
the evidence they believe will make the strongest case that the EPP has met each Standard.
2. Review current data and processes against the components. The provider should
consider developing an inventory of the evidence/data it currently collects, does not
collect, and might begin to collect in support of the CAEP standard components. Make a
crosswalk grid with the standards/components and the inventory of evidence. Expand
the grid to include the criteria for EPP-created tools (data cycles, data quality, need for
revision). The provider documents the processes that it currently conducts, does not
conduct, or may need to begin, related to the CAEP standard components. A minimal
sketch of one way to build such a grid follows.
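Where the inventory already lives in a spreadsheet or data system, the crosswalk grid can be prototyped in a few lines of code. The sketch below is purely illustrative, not a CAEP requirement; the evidence names, component tags, and column labels are hypothetical, and Python (pandas) is only one tool choice among many.

```python
# Illustrative sketch of an evidence-to-component crosswalk grid.
# All evidence names, component tags, and columns are hypothetical
# examples; adapt them to the EPP's own inventory.
import pandas as pd

inventory = pd.DataFrame([
    # One row per piece of evidence: the components it supports and
    # how many data cycles have been collected so far.
    {"evidence": "Lesson plan rubric",        "components": "R1.1;R1.3", "cycles": 3},
    {"evidence": "Clinical observation form", "components": "R1.3;R2.3", "cycles": 2},
    {"evidence": "Employer survey",           "components": "R4.2",      "cycles": 0},
])

# Expand the semicolon-separated component tags into one row per
# component, then pivot into a component-by-evidence grid.
crosswalk = (
    inventory.assign(component=inventory["components"].str.split(";"))
             .explode("component")
             .pivot_table(index="component", columns="evidence",
                          values="cycles", aggfunc="first")
)

# Blank cells mark components with no supporting evidence yet;
# values below 3 flag evidence still short of three data cycles.
print(crosswalk)
```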
3. Engage internal and external stakeholders in the process. The provider identifies
appropriate internal and external stakeholders to engage in the self-study process at
multiple points. Faculty, staff, EPP leadership, school partners, employers, and
school-based clinical educators are informed and included in the accreditation process.
Everyone needs to be able to discuss the accreditation process and their role in
designing, implementing, analyzing and reporting for continuous improvement.
4. Analyze and interpret the evidence, and then formulate the case for each
component. The provider makes a case for meeting each standard component through
evidence, the analysis and interpretation of evidence, and its conclusions demonstrating
the component is met. The provider is responsible for showing it has addressed the
CAEP standard component through evidence and interpretation of evidence in
supporting narrative. Three cycles of data ensure the EPP can adequately discuss trends
within the data and the consistency of findings.
Analysis and interpretation of the evidence:
- Data should be disaggregated and presented in a manner that best informs an interpretation of patterns, trends, differences, or similarities for demographic groups or for the EPP’s individual preparation programs.
- Multiple sources of data should be used to triangulate and inform different aspects of each point the EPP chooses to make in the self-study.
- Describe how the provided data inform the EPP’s ability to meet the standard component, including highlighting confirming and conflicting conclusions from the data; if disparities are identified, explain the disparities and include steps to remedy them.
- Illustrate how the data inform the EPP’s continuous improvement.
- Make comparisons between the EPP’s data and any existing benchmarks, normative comparisons to peers, or performance standards.
A sketch of one way to prototype these disaggregation and trend checks follows.
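For EPPs whose assessment results are stored electronically, the disaggregation and trend checks above can be prototyped quickly before building formal reports. The following is a minimal sketch assuming a hypothetical file of scores with columns for data cycle, preparation program, demographic group, and score; the file name, column names, and the flagging threshold are illustrative assumptions, not CAEP criteria.

```python
# Illustrative disaggregation of assessment results by preparation
# program and demographic group across data cycles. The file name,
# column names, and threshold below are hypothetical; adapt as needed.
import pandas as pd

scores = pd.read_csv("assessment_scores.csv")
# expected columns: cycle, program, demographic_group, score

# Mean score per demographic group within each program and cycle.
group_means = (scores.groupby(["cycle", "program", "demographic_group"])["score"]
                     .mean().rename("group_mean").reset_index())

# Compare each group to its program's overall mean in the same cycle.
program_means = (scores.groupby(["cycle", "program"])["score"]
                       .mean().rename("program_mean").reset_index())
merged = group_means.merge(program_means, on=["cycle", "program"])
merged["gap"] = merged["group_mean"] - merged["program_mean"]

# Flag potential disparities that would need explanation and remedy
# steps (the 0.5-point threshold is an illustrative choice, not a rule).
print(merged[merged["gap"] < -0.5])

# Trend check across cycles: one row per program, one column per cycle.
print(group_means.pivot_table(index="program", columns="cycle",
                              values="group_mean", aggfunc="mean"))
```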
5. Review AIMS and the Self-Study Report Template. As described in Accreditation
Policy and Procedures, CAEP staff will make an electronic Self-Study Report shell
available for the EPP to use, either within 30 days of an EPP attaining Applicant Status
or no less than 18 months prior to the expiration of the EPP’s current term. Review the
template for criteria and format. Note file size requirements, the number of evidence
files allowed, and narrative character limits. In addition, make note of tables and additional
information required for Standard 6. These include a table of faculty qualifications,
program listing, branch campus details, facilities, and copies of current regional
accreditation letters and other programmatic accreditor letters. These items can take
some time to acquire from institutional leadership.
6. Draft the Self-Study Report. Draft a response to each CAEP standard component and
create supporting evidence attachments. While drafting it is important to consider
consistency: a consistent voice in the narrative; consistent representation of data
throughout the report; consistent organization of evidence, including titling, numbering,
and tagging of evidence in the narrative. Do not work solely in AIMS. It is strongly
recommended that providers draft their evidence and narrative in another tool and
copy/paste them into AIMS when completed.
7. Submit the Self-Study Report into the SSR Template. The SSR is due in AIMS
nine months prior to the Site Visit. Submitting the evidence, narrative, and required tables is
a time-intensive process. It is important to allow ample time to complete the Self-Study
Template in AIMS before the deadline. No tables or weblinks may be used in the
narrative text boxes. Supporting evidence can be uploaded and “tagged” to show
alignment to the appropriate CAEP standard(s) and component(s). It is important to note
that weblinks/hotlinks are deactivated in evidence documents; all documents must be
static.
The Standards
This part of the Workbook presents the text of the CAEP Standards for Initial-Licensure
Preparation together with their associated components. Each section begins with the standard,
then continues with guidelines for preparation of a Self-Study Report, including key concepts
that identify the main points of the standards and components language, guiding questions,
descriptions of quality evidence with possible sources of evidence, and connections to other
components. It is important to note that the guiding questions and descriptions of quality
evidence with possible sources of evidence are not exhaustive as EPPs may have different
evidence based on EPP systems, structures, and/or mission.
The designation of R before standards and components is used to differentiate the 2022
revised standards from their previous counterparts (the 2013 standards).
Standards for Initial-Licensure Preparation
Standard R1 Content and Pedagogical Knowledge
Content and Pedagogical Knowledge
The provider ensures that candidates develop an understanding of the critical concepts and
principles of their discipline and facilitates candidates’ reflection of their personal biases to
increase their understanding and practice of equity, diversity, and inclusion. The provider is
intentional in the development of their curriculum and clinical experiences for candidates to
demonstrate their ability to effectively work with diverse P-12 students and their families.
Component R1.1 The Learner and Learning
R1.1 The Learner and Learning
The provider ensures candidates are able to apply their knowledge of the learner and learning at
the appropriate progression levels. Evidence provided should demonstrate that candidates are
able to apply critical concepts and principles of learner development (InTASC Standard 1),
learning differences (InTASC Standard 2), and creating safe and supportive learning
environments (InTASC Standard 3) in order to work effectively with diverse P-12 students and
their families.
Key Concepts
The provider presents evidence that candidates are able to apply their knowledge of:
- Learner development (e.g., cognitive, linguistic, social, emotional, physical)
- Learning differences (e.g., individual differences, diverse cultures and communities, prior knowledge and experiences, multiple perspectives, cultural norms, language development)
- Learning environment (e.g., individual and collaborative learning, positive social interaction, active engagement in learning, self-motivation)
- Diversity, equity, and inclusion in the learner and learning (e.g., candidates believe all learners can achieve at high levels, examine and understand their personal biases, persist in supporting and scaffolding all learners, respect learners as individuals, make learners feel valued, promote respect among learners)
NOTE: The parenthetical examples provided (e.g.,) are not intended to be used as a checklist but rather a guide to unpack the key concept with which it is aligned.

Guiding Questions
- How does the EPP know candidates can apply the InTASC standards relating to:
  - learner development?
  - learning differences?
  - the learning environment?
- How does the EPP know candidates are prepared to teach diverse learners under the different situations they may encounter on the job?
- How does the EPP assess candidate examination of their own personal biases?
- How are EPP candidates able to engage families in the P-12 learning process?
- How does the EPP evidence demonstrate increasing complexity in candidate application of the learner and learning aligned with the InTASC Learning Progression for Teachers?
- How does the EPP define equity, diversity, and inclusion in relation to the learner and learning?
- Describe the evidence that most compellingly demonstrates the EPP’s case, what they have learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
Evidence documents candidates’ application of P-12 student growth and development
across cognitive, linguistic, social, emotional, and physical areas, as well as of
individual differences and diverse cultures and communities.
Disaggregated evidence indicates that candidates understand student growth and
development across racial/ethnic demographic populations.
Disaggregated data by preparation program and race/ethnicity show no or few disparities
OR disparities are identified and explained, including steps to remedy them.
Evidence should include three cycles of data and subsequent analyses of the
assessment results.
EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for
Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created
Surveys
Possible sources of evidence:
Differentiated planning for learners (unit plan, lesson plan)
EPP created measures
Performance-based assessments
Differentiated assessments (task, communication, response, materials)
Studies of student populations for purposes of planning and differentiation
Connections
R2.3 (Clinical Experiences), R3.2 (Monitoring and Supporting Candidate Progression), R3.3
(Competency at Completion)
Standard R5 encompasses process and structure components from R1-R4. Therefore, all
standards connect back to Standard R5.
Component R1.2 Content
R1.2 Content
The provider ensures candidates are able to apply their knowledge of content at the appropriate
progression levels. Evidence provided demonstrates candidates know central concepts of their
content area (InTASC Standard 4) and are able to apply the content in developing equitable and
inclusive learning experiences (InTASC Standard 5) for diverse P-12 students. Outcome data
can be provided from a Specialized Professional Association (SPA) process, a state review
process, or an evidence review of Standard 1.
Key Concepts
The provider presents evidence candidates are able to apply their knowledge of:
- Central concepts, tools of inquiry, and structures of the discipline specific to content
- Accessible and meaningful learning experiences to ensure mastery of content
- Content-specific pedagogy (e.g., connecting concepts, using differing perspectives, engaging learners in critical thinking, creativity, and collaborative problem solving; encouraging learner exploration, discovery, and expression across content areas)
- Diversity and equity in content knowledge (e.g., intentional design and implementation of inclusive curriculum, awareness that content knowledge is culturally situated, teaching with multiple perspectives, promoting critical analysis, awareness of and responsiveness to bias, inclusion of authentic and global issues)
NOTE: The parenthetical examples provided (e.g.,) are not intended to be used as a checklist but rather a guide to unpack the key concept with which it is aligned.

Guiding Questions
- How does the EPP ensure candidates know central concepts of their content area and are able to apply the content in developing equitable and inclusive learning experiences for diverse P-12 students?
- How does the evidence demonstrate increasing complexity in candidate application of the content aligned with the InTASC Learning Progression for Teachers?
- How does the EPP define equity, diversity, and inclusion in relation to content knowledge?
- Describe the evidence that most compellingly demonstrates the EPP’s case, what they have learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
Evidence demonstrates candidates can apply critical concepts and principles in their
discipline and pedagogical knowledge in their content field.
Disaggregated data by preparation program and race/ethnicity show no or few disparities
OR disparities are identified and explained, including steps to remedy them.
Evidence should include three cycles of data and subsequent analyses of the
assessment results.
EPP-created assessments and surveys must meet the criteria on the CAEP Criteria
for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of
EPP-Created Surveys.
Possible sources of evidence:
Outcome assessments submitted as part of the SPA National Recognition or state
approval process or used for internal review of programs using specialty area
standards.
EPP created measures
Proprietary measures (e.g., edTPA rubrics related to content, PPAT rubrics related
to content, Praxis Content Exams)
State Required Licensure measures
Connections
R2.3 (Clinical Experiences), R3.2 (Monitoring and Supporting Candidate Progression), R3.3
(Competency at Completion)
Standard R5 encompasses process and structure components from R1-R4. Therefore, all
standards connect back to Standard R5.
Component R1.3 Instructional Practice
R1.3 Instructional Practice
The provider ensures that candidates are able to apply their knowledge of InTASC standards
relating to instructional practice at the appropriate progression levels. Evidence demonstrates
how candidates are able to assess (InTASC Standard 6), plan for instruction (InTASC Standard
7), and utilize a variety of instructional strategies (InTASC Standard 8) to provide equitable and
inclusive learning experiences for diverse P-12 students. Providers ensure candidates model
and apply national or state approved technology standards to engage and improve learning for
all students.
Key Concepts
The provider presents evidence candidates are able to apply their knowledge of:
- Multiple methods of assessment to monitor learner progress and guide decision making.
- Planning instruction that draws on content knowledge, curriculum, cross-disciplinary skills, and pedagogy to support every student in meeting rigorous learning goals.
- A variety of instructional strategies to encourage learners to develop content knowledge and content connections to build skills and knowledge in meaningful ways.
- Technology for enhancement of P-12 learning (e.g., design authentic learning activities that align with content area standards and use digital tools and resources to maximize active, deep learning).
- Diversity and equity in instructional practice (e.g., adapting instructional resources and assessments to create culturally responsive, equitable learning opportunities; making accommodations in assessment conditions; identifying and using learner strengths and needs; respecting learners’ diverse strengths and needs; using formative assessment; openness to adjustment and revision based on learner needs).
NOTE: The parenthetical examples provided (e.g.,) are not intended to be used as a checklist but rather a guide to unpack the key concept with which it is aligned.

Guiding Questions
- How does the EPP know candidates can apply the InTASC standards relating to measuring P-12 students’ progress?
- How does the EPP know candidates can apply the InTASC standards relating to planning for instruction?
- How does the EPP know candidates understand and can apply the InTASC standards relating to using a variety of instructional strategies?
- Describe the evidence that demonstrates effective integration of technology as supported by state or national technology standards.
- How does the evidence demonstrate increasing complexity in candidate understanding and application of instructional practice aligned with the InTASC Learning Progression for Teachers?
- How does the EPP define equity, diversity, and inclusion in relation to instructional practice?
- How do the EPP’s candidates identify potential biases and adapt instructional resources and assessments to create culturally responsive, equitable learning opportunities?
- Describe the evidence that most compellingly demonstrates the EPP’s case, what they have learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
Evidence demonstrates candidates have developed proficiencies to apply their content
and pedagogical knowledge effectively in instruction and other interactions with diverse
P-12 students.
Evidence indicates that candidates are proficient in the applications of technology for
enhancement of P-12 learning.
Disaggregated data by preparation program and race/ethnicity show no or few disparities
OR disparities are identified and explained, including steps to remedy them.
Evidence should include three cycles of data and subsequent analyses of the
assessment results.
EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for
Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created
Surveys
Possible sources of evidence:
Assignments or tasks from courses
Assignments or tasks from clinical experiences
Proprietary assessments (e.g., edTPA rubrics related to instructional practice,
PPAT rubrics related to instructional practice, Teacher Work Sample-TWS rubrics
related to instructional practice)
Pedagogical knowledge tests
Observational measures
Digital portfolios demonstrating application of national or state technology
standards
Connections
R2.3 (Clinical Experiences), R3.2 (Monitoring and Supporting Candidate Progression), R3.3
(Competency at Completion)
Standard R5 encompasses process and structure components from R1-R4. Therefore, all
standards connect back to Standard R5.
Component R1.4 Professional Responsibility
R1.4 Professional Responsibility
The provider ensures candidates are able to apply their knowledge of professional responsibility
at the appropriate progression levels. Evidence provided should demonstrate candidates engage
in professional learning, act ethically (InTASC Standard 9), take responsibility for student
learning, and collaborate with others (InTASC Standard 10) to work effectively with diverse P-12
students and their families.
Key Concepts
The provider presents evidence that candidates are able to apply their knowledge of:
- Professional standards of practice, relevant laws, and policies and codes of ethics.
- Collaboration with learners, families, and colleagues and other school professionals to ensure learner growth.
- Engagement in ongoing professional learning and using evidence to continually evaluate his/her practice, particularly the effects of his/her choices and actions on others (learners, families, other professionals, and the community).
- Diversity and equity in professional responsibility (e.g., adaptation of practice to meet needs of each learner; taking responsibility for learning of all students; deepening understanding of own frames of reference and potential bias; seeing role as one of advocacy for learners and accountable for learner success; embracing challenge of continuous improvement and change).
NOTE: The parenthetical examples provided (e.g.,) are not intended to be used as a checklist but rather a guide to unpack the key concept with which it is aligned.

Guiding Questions
- How does the EPP know candidates can apply the InTASC standards relating to professional learning and ethical practice?
- How does the EPP know candidates can apply the InTASC standards relating to collaboration and leadership?
- How does the EPP ensure candidates have knowledge of professional standards of practice, relevant laws, and policies and codes of ethics?
- How does the evidence demonstrate increasing complexity in candidate understanding and application of professional responsibility aligned with the InTASC Learning Progression for Teachers?
- How does the EPP define equity, diversity, and inclusion in relation to professional responsibility?
- Describe the evidence that most compellingly demonstrates the EPP’s case, what they have learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
Evidence of candidates’ understanding of professional standards of practice, relevant
laws, policies, and codes of ethics; reflections addressing their own cultural background,
unconscious biases, and systemic biases; and ability to collaborate with learners, families,
and colleagues to ensure learner growth.
Disaggregated data by preparation program and race/ethnicity show no or few disparities
OR disparities are identified and explained, including steps to remedy them.
Evidence should include three cycles of data and subsequent analyses of the
assessment results.
EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for
Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created
Surveys
Possible sources of evidence:
Course assignments or tasks
EPP created measures
Dispositions assessments
Relevant sections of state licensure requirements
Required state/EPP ethics training
Connections
R2.3 (Clinical Experiences), R3.2 (Monitoring and Supporting Candidate Progression), R3.3
(Competency at Completion)
Standard R5 encompasses process and structure components from R1-R4. Therefore, all
standards connect back to Standard R5.
Standard R2 Clinical Partnerships and Practice
Clinical Partnerships and Practice
The provider ensures effective partnerships and high-quality clinical practice are central to
candidate preparation. These experiences should be designed to develop candidates’
knowledge, skills, and professional dispositions to demonstrate positive impact on diverse
students’ learning and development. High quality clinical practice offers candidates experiences
in different settings and modalities, as well as with diverse P-12 students, schools, families, and
communities. Partners share responsibility to identify and address real problems of practice
candidates experience in their engagement with P-12 students.
Component R2.1 Partnerships for Clinical Preparation
R2.1 Partnerships for Clinical Preparation
Partners co-construct mutually beneficial P-12 school and community arrangements for clinical
preparation and share responsibility for continuous improvement of candidate preparation.
Key Concepts
The provider presents evidence:
- they establish and maintain partnerships with schools and school districts, as well as other appropriate organizations.
- P-12 schools and/or community partners and EPPs have both benefited from the partnership.
- all partners are active participants in the on-going, collaborative process to improve candidate preparation (co-construction).

Guiding Questions
- How does the EPP document partnerships?
- How are the partnerships mutually beneficial?
- How does the EPP ensure all partners are involved, or have the opportunity to be involved, in the development, maintenance, and modification of the partnership? In other words, how does the EPP ensure that partnerships are co-constructed?
- How does the EPP engage P-12 partners in an on-going collaborative process?
- Describe the evidence that most compellingly demonstrates the EPP’s case, what they have learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
The provider presents evidence that a collaborative process is in place with P-12 partners
that is reviewed periodically and involves activities such as:
Collaborative development, review, or revision of instruments and evaluations
Collaborative development, review, or revision of the structure and content of the
clinical activities
Mutual involvement in ongoing decision-making about partnership structure and
operations
Agreed upon provisions to ensure diversity of clinical settings
Creation of opportunities for candidates to work with diverse P-12 students who
have differing needs
The EPP provides evidence that the P-12 schools and EPPs have both benefited from
the partnership.
Possible evidence can include:
Documentation of collaboration (meeting decisions, agenda topics)
MOUs
Advisory Boards
Connections
Standard R5 encompasses process and structure components from R1-R4. Therefore, all
standards connect back to Standard R5.
Component R2.2 Clinical Educators
R2.2 Clinical Educators
Partners co-select, prepare, evaluate, and support high-quality clinical educators, both provider-
and school-based, who demonstrate a positive impact on candidates’ development and diverse
P-12 student learning and development.
Key Concepts
The provider presents evidence that the EPP and partners (e.g., P-12, community, agency):
- develop criteria for the co-selection of clinical educators* that include demonstrating a positive impact on candidate and/or P-12 student learning and development.
- collaborate in the preparation of clinical educators* to ensure they are prepared for the role and responsibilities.
- collaborate on the evaluation of clinical educators* as it relates to roles and responsibilities.
- collaborate to develop, review, and revise supports provided for clinical educators*.
* Clinical educators refers to both provider- and school-based clinical educators.
NOTE: The parenthetical examples provided (e.g.,) are not intended to be used as a checklist but rather a guide to unpack the key concept with which it is aligned.

Guiding Questions
- What features of partnerships, including clinical educator participation, selection, or training, have had positive effects on candidate development?
- How does the EPP work with partners to select clinical educators?
- How does the EPP prepare clinical educators for the role and responsibilities in working with candidates?
- How does the EPP evaluate clinical educators and their impact on candidate success?
- How does the EPP engage partners in data-informed decision-making for clinical educators?
- How does the EPP support clinical educators as they engage in the role of working with candidates?
- How does the EPP define equity, diversity, and inclusion in relation to professional responsibility?
- Describe the evidence that most compellingly demonstrates the EPP’s case, what they have learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
Evidence documents that clinical educators have the opportunity to receive feedback on
their experiences.
Evidence documents that the EPP and its P-12 partners participate in the design and
delivery of training for clinical educators.
Examples of training might include:
Understanding the roles and responsibilities of clinical educators and of
the clinical curriculum
Use of evaluation instruments, evaluating professional dispositions of
candidates,
Setting specific goals/objectives of the clinical experiences, and
Providing feedback
Evidence should include three cycles of data and subsequent analyses of the results.
EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for
Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created
Surveys
Possible sources of evidence:
Memoranda of Understanding (MOUs)
Process documents and training materials for clinical educators
Feedback tools for clinical educators
Criteria for serving as a clinical educator
Job descriptions and expectations for clinical educators
Meeting decisions/active discussions for partnership
Connections
Standard R5 encompasses process and structure components from R1-R4. Therefore, all
standards connect back to Standard R5.
Component R2.3 Clinical Experiences
R2.3 Clinical Experiences
The provider works with partners to design and implement clinical experiences, utilizing various
modalities, of sufficient depth, breadth, diversity, coherence, and duration to ensure candidates
demonstrate their developing effectiveness and positive impact on diverse P-12 students’
learning and development as presented in Standard R1.
Key Concepts
The provider presents evidence:
- they document the relationship between the attributes and outcomes of clinical experiences.
- they work with partners to design and implement clinical experiences to ensure candidates demonstrate their developing effectiveness and positive impact on all students’ learning and development. These clinical experiences are designed and implemented to include:
  - Depth: the intentional programmatic design for the relationship between clinical experiences, coursework, and candidate assessments
  - Breadth: the opportunities candidates are provided within clinical experience to observe and practice within a wide variety of settings
  - Diversity: the opportunities candidates are provided to work with students of varied learning needs and backgrounds
  - Coherence: the sequence of experiences is deliberate, purposeful, sequential, and is assessed using performance-based protocols
  - Duration: the appropriate time for candidates to demonstrate their developing effectiveness and positive impact
  - Modality: the opportunity to demonstrate their effectiveness and positive impact in a variety of delivery methods
- they document clinical experience goals/outcomes and operational design along with evidence that clinical experiences are being implemented as described.
- their candidates engage in high-quality clinical experiences including various modalities (e.g., virtual instruction, hybrid, face to face).
- candidates have experiences in diverse settings with diverse P-12 students.
- of how candidate progression is monitored and supported.
- of how clinical experiences provide opportunities for candidates to apply technology to enhance instruction in P-12 learning for diverse students.
NOTE: The parenthetical examples provided (e.g.,) are not intended to be used as a checklist but rather a guide to unpack the key concept with which it is aligned.

Guiding Questions
- What opportunities have candidates had to prepare in diverse settings and to work with students having different needs?
- What features of clinical experiences (e.g., depth, breadth, coherence, and duration) has the EPP studied, through comparisons across preparation programs or more formal investigations, to improve candidate outcomes?
- What clinical experiences have enhanced completers’ knowledge of diversity, equity, and inclusion issues and their readiness to use that knowledge in teaching situations?
- What applications of technology have prepared completers for their responsibilities on the job?
- How are clinical experiences effective in preparing candidates for initial employment in education in their field of specialization?
- Describe the evidence that most compellingly demonstrates the EPP’s case, what they have learned from the evidence, and what conclusions and interpretations have been made.
NOTE: The parenthetical examples provided (e.g.,) are not intended to be used as a checklist but rather a guide to unpack the guiding question with which it is aligned.
Quality Evidence
Clinical experiences are guided, hands-on, practical applications of program curriculum with P-12
teachers and students. These include, but are not limited to, early field experiences,
observations, and culminating clinical practices such as student teaching or internship.
Evidence documents that all candidates have active clinical experiences in diverse
settings and experiences with diverse P-12 students (which may be in the same or
different schools).
Evidence is provided that clinical experiences are assessed using performance-based
criteria.
Evidence documents a sequence of clinical experiences with specific goals that are
focused, purposeful, and varied.
Attributes (depth, breadth, diversity, coherence, and duration) are linked to student
outcomes and candidate performance.
Evidence shows that candidates have purposefully assessed impact on student learning
using both formative and summative assessments in more than one clinical setting (which
may be in the same or different schools) and have:
Used comparison points or other means to interpret findings
Used the impact data to guide instructional decision-making
Modified instruction based on impact data, and have differentiated instruction
Disaggregated data by preparation program and race/ethnicity show no or few disparities
OR disparities are identified and explained, including steps to remedy them.
Evidence should include three cycles of data and subsequent analyses of the
assessment results.
EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for
Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created
Surveys.
Possible sources of evidence:
Scope and sequence chart/graphic of clinical experiences
Performance-based assessment data
Tracking system of placements/experiences
Portfolio of clinical experiences
Proprietary Assessments to demonstrate a positive impact on student learning in clinical
experiences (e.g., edTPA rubrics, PPAT rubrics)
Connections
R1.1 (The Learner and Learning), R1.2 (Content), R1.3 (Instructional Practice), R1.4
(Professional Responsibility), R3.2 (Monitoring and Supporting Candidate Progression), R3.3
(Competency at Completion)
Standard R5 encompasses process and structure components from R1-R4. Therefore, all
standards connect back to Standard R5.
Standard R3 Candidate Recruitment, Progression, and Support
Candidate Recruitment, Progression, and Support
The provider demonstrates the quality of candidates is a continuous and purposeful focus
from recruitment through completion. The provider demonstrates that development of
candidate quality is the goal of educator preparation and that the EPP provides support
services (such as advising, remediation, and mentoring) in all phases of the program so
candidates will be successful.
Component R3.1 Recruitment
R3.1 Recruitment
The provider presents goals and progress evidence for recruitment of high-quality candidates
from a broad range of backgrounds and diverse populations that align with their mission. The
provider demonstrates efforts to know and address local, state, regional, or national needs for
hard-to-staff schools and shortage fields. The goals and evidence should address progress
towards a candidate pool which reflects the diversity of America’s P-12 students.
Key Concepts
The provider presents evidence of:
- goals towards admitting high-quality initial program candidates from a broad range of backgrounds and diverse populations.
- routinely monitoring the employment landscape to identify shortage areas, openings, forecasts, and related information in the community, state, regional, or national markets for which it is preparing completers.
- recording, monitoring, and using recruitment results to plan and, as appropriate, modify recruitment strategies and goals.
- descriptions of strategies and actions in place to achieve the EPP’s goals together with periodic evaluation of the effectiveness of those strategies.

Guiding Questions
- How does the EPP recruit an increasingly diverse and strong pool of candidates?
- How do the EPP’s recruitment strategies respond to and serve employer needs?
- How does the EPP determine the success of recruitment efforts?
- How are recruitment efforts supported as evidence-informed, meaningful, and feasible given the context of the EPP?
- How do the recruitment strategies and actions meet the needs of employers for which the EPP prepares candidates?
- How do the recruitment strategies and actions align with the mission of the EPP?
- How have the recruitment strategies and actions and their implementation moved the EPP toward the goal of greater candidate diversity?
- In what ways do disaggregated data on candidates (admitted and enrolled candidates by a broad range of backgrounds and diverse populations) inform decisions that align with the EPP mission and the goals of achieving a diverse, highly qualified candidate pool?
- Describe the evidence that most compellingly demonstrates the EPP’s case, what they have learned from the evidence, and what conclusions and interpretations have been made.
Connections
Standard R5 encompasses process and structure components from R1-R4. Therefore, all
standards connect back to Standard R5.
Component R3.2 Monitoring and Supporting Candidate Progression
R3.2 Monitoring and Supporting Candidate Progression
The provider creates and monitors transition points from admission through completion that
indicate candidates’ developing content knowledge, pedagogical knowledge, pedagogical
skills, critical dispositions, professional responsibilities, and the ability to integrate technology
effectively in their practice. The provider identifies a transition point at any point in the
program when a cohort grade point average of 3.0 is achieved and monitors this data. The
provider ensures knowledge of and progression through transition points are transparent to
candidates. The provider plans and documents the need for candidate support, as identified
in disaggregated data by race and ethnicity and such other categories as may be relevant for
the EPP’s mission, so candidates meet milestones. The provider has a system for effectively
maintaining records of candidate complaints, including complaints made to CAEP, and
documents the resolution.
Key Concepts
The provider presents evidence:
- of criteria for transition points from admission through completion.
- of monitoring progression from admission through completion, including attention to how candidates develop:
  - content knowledge
  - pedagogical knowledge
  - pedagogical skills
  - critical dispositions
  - professional responsibilities
  - ability to integrate technology effectively
- transition points and related criteria are shared with candidates.
- of using disaggregated demographic data to advise and support candidates who may not progress.
- a cohort grade point average of 3.0 is achieved at some transition point in the program.
- of a system for tracking and resolving candidate complaints.

Guiding Questions
- How does the EPP monitor candidate progress, including performance on non-academic factors like critical dispositions and professional responsibilities?
- How does the EPP communicate with candidates the progress monitoring points and requirements for each point?
- How does the EPP collect and respond to complaints/appeals?
- How is the evidence for monitoring progression from admission through completion identified in Standard R1 connected to identified transition points?
- How does the EPP demonstrate the transition point process is followed with fidelity within the EPP (e.g., how does the EPP ensure there are no loopholes to work around the system)?
- Identify and describe the support mechanisms available for candidates not meeting program expectations (e.g., advising, remediation, or mentoring) and how recommendations for support occur.
- How are support mechanisms (e.g., remediation and mentoring) culturally responsive for candidates?
- Describe the evidence that most compellingly demonstrates the EPP’s case, what they have learned from the evidence, and what conclusions and interpretations have been made.
NOTE: The parenthetical examples provided (e.g.,) are not intended to be used as a checklist but rather a guide to unpack the guiding question with which it is aligned.
Quality Evidence
Evidence documents performance reviews, remediation efforts, and/or provisions
illustrating that the EPP sets goals for candidate support and monitors progress
towards goals of providing sufficient support to candidates to facilitate successful
program completion.
Disaggregated data by preparation program, race/ethnicity, and other demographic
items highlighted in R3.1 show no or few disparities OR disparities are identified and
explained, including steps to remedy them.
Evidence should include three cycles of data and subsequent analyses of the
assessment results.
Evidence that actions are taken when there are problems with the progression of
individual candidates.
EPP-created assessments and surveys must meet the criteria on the CAEP Criteria
for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of
EPP-Created Surveys.
Possible sources of evidence:
Crosswalk/curriculum of key evidences from Standard R1 aligned with transition
points
Assessments used at key points during the program including data and analyses
Documentation of complaints/appeals (no identifying names) and demographics of
those submitting complaints/appeals.
Description of support services available, frequency of use, and results in terms of
keeping candidates on the path to completion or counseling out of program
Connections
R1.1 (The Learner and Learning), R1.2 (Content), R1.3 (Instructional Practice), R1.4
(Professional Responsibility), R2.3 (Clinical Experiences), R3.1 (Recruitment), R4.1
(Completer Effectiveness)
Standard R5 encompasses process and structure components from R1-R4. Therefore, all
standards connect back to Standard R5.
Component R3.3 Competency at Completion
R3.3 Competency at Completion
The provider ensures candidates possess academic competency to teach effectively with
positive impacts on diverse P-12 student learning and development through application of
content knowledge, foundational pedagogical skills, and technology integration in the field(s)
where certification is sought. Multiple measures are provided and data are disaggregated
and analyzed based on race, ethnicity, and such other categories as may be relevant for the
EPP’s mission.
Key Concepts
The provider presents evidence:
- of using disaggregated data to verify candidate quality at completion to teach diverse P-12 students.
- that candidates reach the expected level of proficiency at completion in the following areas:
  - content knowledge
  - pedagogical knowledge
  - pedagogical skills
  - critical dispositions
  - professional responsibilities
  - ability to integrate technology effectively
- illustrating proficiency at completion in the areas identified.
- documenting candidates’ effective teaching, including positive impacts on diverse P-12 student learning and development.

Guiding Questions
- What evidence does the EPP use to ensure by the end of the program a candidate is ready to move into the profession?
- How does the EPP use multiple sources of evidence to triangulate that candidates are prepared for certification at completion?
- How does the EPP ensure candidates are proficient in effective teaching and have a positive impact on diverse P-12 student learning and development?
- How does the EPP ensure candidates’ critical dispositions reflect positive beliefs about the learning potentials of all students and a commitment to continued growth in cultural awareness and reflection on bias and equitable practices?
- How does the EPP disaggregate the completion data and what has been learned from the analysis across demographic groups?
- Describe the evidence that most compellingly demonstrates the EPP’s case, what they have learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
Disaggregated data by preparation program, race/ethnicity, and other demographic
items highlighted in R3.1 show no or few disparities OR disparities are identified and
explained, including steps to remedy them.
Evidence that actions are taken when there are problems with the progression of
individual candidates.
Evidence is triangulated so there is more than one source that demonstrates
candidates are proficient in the areas identified.
Evidence should include data and subsequent analyses of the assessment results.
EPP-created assessments and surveys must meet the criteria on the CAEP Criteria
for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of
EPP-Created Surveys.
Possible sources of evidence:
Progression level threshold/criteria for success at completion
EPP-created measures
Proprietary measures (e.g., edTPA rubrics, PPAT rubrics, Praxis Content Exams)
State Required Licensure measures
Student-teaching evaluation instruments
Dispositions/Non-Academic Factor Instruments
Connections
R1.1 (The Learner and Learning), R1.2 (Content), R1.3 (Instructional Practice), R1.4
(Professional Responsibility), R2.1 (Partnerships for Clinical Preparation), R2.3 (Clinical
Experiences), R3.1 (Recruitment)
Standard R5 encompasses process and structure components from R1-R4. Therefore, all
standards connect back to Standard R5.
Standard R4 Program Impact
Program Impact
The provider demonstrates the effectiveness of its completers’ instruction on P-12 student
learning and development, and completer and employer satisfaction with the relevance and
effectiveness of preparation.
Component R4.1 Completer Effectiveness
R4.1 Completer Effectiveness
The provider demonstrates that program completers:
effectively contribute to P-12 student-learning growth
AND
apply in P-12 classrooms the professional knowledge, skills, and dispositions that the
preparation experiences were designed to achieve.
In addition, the provider includes a rationale for the data elements provided.
Key Concepts
The provider presents evidence:
- completers have a positive impact on P-12 student-learning growth with impact data from a representative sample of completers and programs, AND
- completers apply the professional knowledge, skills, and dispositions corresponding with teaching effectiveness.

Guiding Questions
- How does the EPP demonstrate completer impact on P-12 student learning and development?
- How does the EPP ensure its sample of completers and the measures used are representative in showing that completers have a positive impact on P-12 student learning and development?
- How does the EPP measure completer teaching effectiveness in the classroom?
- What is the rationale for the measures chosen to measure impact?
- How does the EPP ensure a representative sample inclusive of licensure areas or a purposive sample to be enlarged over time?
- How does the EPP ensure completers are effective in contributing to diverse P-12 student learning growth?
- Describe the evidence that most compellingly demonstrates the EPP’s case, what they have learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
Disaggregated data by preparation program, race/ethnicity, and other demographic items
show no or few disparities OR disparities are identified and explained, including steps to
remedy them.
Rationale/methodology for selection of impact measures used.
Evidence should include three cycles of data and subsequent analyses of the
assessment results.
Refer to Appendix D for more detail about quality evidence and possible evidence for
R4.1.
EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for
Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created
Surveys.
While the most recent three cycles of data must be provided as part of the accreditation
review, over the course of a seven-year accreditation cycle the data will be representative
of all programs. One way an EPP with electronic records might check this coverage is
sketched below.
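The following is a minimal, purely illustrative sketch of such a coverage check; the file names and column names are hypothetical assumptions, not CAEP requirements.

```python
# Illustrative check that, across accreditation-cycle data, every
# preparation program is represented. File and column names are
# hypothetical; adapt them to the EPP's own records.
import pandas as pd

impact = pd.read_csv("completer_impact_data.csv")
# expected columns: cycle, program, completer_id

# Programs covered in each data cycle.
print(impact.groupby("cycle")["program"].unique())

# Programs never yet represented in any cycle (candidates for a
# purposive sample to be enlarged over time).
all_programs = set(pd.read_csv("program_list.csv")["program"])
covered = set(impact["program"])
print(sorted(all_programs - covered))
```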
Possible sources of evidence:
Contribute to P-12 student-learning growth:
- State-level data of student performance (e.g., student growth measures, value-add measures)
- Performance portfolios
- Case study
Apply professional knowledge, skills, and dispositions in the P-12 classroom:
- State-level data of teacher performance (e.g., teacher evaluations)
- Focus groups/interviews (completers, P-12 students, observers)
- Observations of completers
- Surveys
Connections
R1.1 (The Learner and Learning), R1.2 (Content), R1.3 (Instructional Practice), R1.4
(Professional Responsibility), R4.2 (Satisfaction of Employers), R4.3 (Satisfaction of Completers)
Standard R5 encompasses process and structure components from R1-R4. Therefore, all
standards connect back to Standard R5.
Component R4.2 Satisfaction of Employers
R4.2 Satisfaction of Employers
The provider demonstrates employers are satisfied with the completers’ preparation for their
assigned responsibilities in working with diverse P-12 students and their families.
Key Concepts
The provider presents evidence:
- that employers perceive completers’ preparation was sufficient for their job responsibilities.
- data from a representative sample of employers.
- employers are satisfied with completers’ preparation to work with diverse P-12 students and their families.

Guiding Questions
- How does the EPP measure satisfaction with preparation as viewed by employers?
- How does the EPP ensure a representative sample inclusive of most licensure areas or a purposive sample to be enlarged over time?
- How does the EPP ensure instruments/methods elicit responses specific to the criteria in R1 (e.g., the learner and learning, content, instructional practice, professional responsibilities, technology)?
- Describe the evidence that most compellingly demonstrates the EPP’s case, what they have learned from the evidence, and what conclusions and interpretations have been made.
NOTE: The parenthetical examples provided (e.g.,) are not intended to be used as a checklist but rather a guide to unpack the guiding question with which it is aligned.
Quality Evidence
- Evidence should include data and subsequent analyses of the assessment results.
- Evidence should demonstrate a representative sample (in one cycle of data or over multiple cycles of data; see the sketch following the sources list below).
- EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created Surveys.
Possible sources of evidence:
- Employer satisfaction surveys
- Focus groups or interviews with detailed methodology
- Employer satisfaction case study
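One way an EPP might check the representativeness called for above is to compare survey respondents against the population of employed completers by licensure area. The sketch below is illustrative only, with hypothetical counts, and uses a chi-square goodness-of-fit test as one possible method.

```python
# Minimal sketch: is the employer-survey respondent pool representative of
# completers' licensure areas? All counts are hypothetical.
from scipy.stats import chisquare

areas = ["Elementary", "Secondary Math", "Special Education"]
respondents = [42, 11, 9]    # survey responses received per area
population = [300, 80, 70]   # employed completers per area

# Expected responses if the sample mirrored the population distribution.
total_resp, total_pop = sum(respondents), sum(population)
expected = [total_resp * p / total_pop for p in population]

stat, p_value = chisquare(respondents, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")
# A small p-value would suggest some areas are over- or under-represented,
# pointing to a purposive sample that should be enlarged over time.
```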
Connections
R1.1 (The Learner and Learning), R1.2 (Content), R1.3 (Instructional Practice), R1.4
(Professional Responsibility), R4.3 (Satisfaction of Completers)
Standard R5 encompasses process and structure components from R1-R4. Therefore, all
standards connect back to Standard R5.
Notes
Component R4.3 Satisfaction of Completers
R4.3 Satisfaction of Completers
The provider demonstrates program completers perceive their preparation as relevant to the
responsibilities they encounter on the job, and their preparation was effective.
Key Concepts
The provider presents evidence that:
- completers perceive their preparation was sufficient for their job responsibilities
- data come from a representative sample of completers
- completers are satisfied with their preparation to work with diverse P-12 students and their families.

Guiding Questions
- How does the EPP measure satisfaction with preparation as viewed by completers?
- How does the EPP ensure instruments/methods elicit responses specific to the criteria in R1 (e.g., the learner and learning, content, instructional practice, professional responsibilities, technology)?
- How does the EPP ensure all of the programs are included within the data cycles?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.

NOTE: The parenthetical examples provided (e.g., ...) are not intended to be used as a checklist but rather as a guide to unpack the guiding question with which each is aligned.
Quality Evidence
- Disaggregated data by program and other demographic items show no or few disparities OR disparities are identified and explained, including steps to remedy them.
- Evidence should include three cycles of data and subsequent analyses of the assessment results.
- Evidence should demonstrate a representative sample (in one cycle of data or over multiple cycles of data).
- EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created Surveys.
Possible sources of evidence:
- Completer/alumni satisfaction surveys
- Focus groups or interviews with detailed methodology
- Completer satisfaction case study
- State proprietary measure (administered by state entities)
Connections
R1.1 (The Learner and Learning), R1.2 (Content), R1.3 (Instructional Practice), R1.4
(Professional Responsibility), R4.2 (Satisfaction of Employers)
Standard R5 encompasses process and structure components from R1-R4. Therefore, all
standards connect back to Standard R5.
Notes
Standard R5 Quality Assurance System and Continuous
Improvement
Quality Assurance System and Continuous Improvement
The provider maintains a quality assurance system that consists of valid data from multiple
measures and supports continuous improvement that is sustained and evidence-based. The
system is developed and maintained with input from internal and external stakeholders. The
provider uses the results of inquiry and data collection to establish priorities, enhance program
elements, and highlight innovations.
Component R5.1 Quality Assurance System
R5.1 Quality Assurance System
The provider has developed, implemented, and modified, as needed, a functioning quality
assurance system that ensures a sustainable process to document operational effectiveness.
The provider documents how data enter the system, how data are reported and used in decision
making, and how the outcomes of those decisions inform programmatic improvement.
Key Concepts
The provider presents evidence:
- of a functioning Quality Assurance System documenting operational effectiveness
- of a rationale for the selection of the multiple measures
- that Quality Assurance System data are used in decision making
- of a responsive Quality Assurance System with the ability to provide data relevant for aspects of the EPP's context
- of how outcomes of Quality Assurance System data analysis are used for program improvement.

Guiding Questions
- How does the EPP maintain a functioning Quality Assurance System capable of providing data output that enables quality control and continuous improvement?
- How are data describing the EPP's effectiveness (as provided for Standards R1-R4) collected, analyzed, monitored, and reported?
- What are examples of questions the system is called on to answer that make use of the system's capabilities to combine data from various sources and/or disaggregate data by different categories?
- How is the system used by the EPP to provide information for review and decision making?
- Can the faculty, staff, candidates, and stakeholders articulate their role and engagement in the Quality Assurance System?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- Overview or flow chart demonstrating (at a high level) the entire Quality Assurance System, including the roles and responsibilities for data collection, monitoring, analysis, and reporting.
- Evidence is provided that the EPP regularly reviews system operations and data.
- The provider demonstrates the Quality Assurance System has the capacity to collect, analyze, monitor, and report data/evidence from Standards R1-R4.
- The provider's Quality Assurance System supports the disaggregation of data by licensure area/program, race/ethnicity, and other dimensions identified by the EPP.
Possible sources of evidence:
- Graphic representation of the Quality Assurance System
- Crosswalk of all measures included in the Quality Assurance System
- Verification of the Quality Assurance System through demonstration
Connections
R5.1 is an overarching structure and process component. Therefore, all components are
connected with R5.1.
Component R5.2 Data Quality
R5.2 Data Quality
The provider's quality assurance system from R5.1 relies on relevant, verifiable, representative,
cumulative, and actionable measures to ensure interpretations of data are valid and consistent.
Key Concepts
The provider presents evidence:
- of a clear link between what is being measured and what the EPP intends to measure (relevant)
- that a measure or result can be confirmed or substantiated; all assessments should be valid and reliable (verifiable)
- that data encompass candidates and completers from programs under review (representative)
- of measures of candidate or EPP performance results across successive administrations (3 cycles of data) (cumulative)
- of a clear link between the measure and the EPP action taken as a result of this measure (actionable)
- that EPP-created assessments meet the CAEP Criteria for EPP-Created Assessments
- that EPP-created surveys meet the CAEP Criteria for EPP-Created Surveys.

Guiding Questions
- What strengths and weaknesses in the Quality Assurance System do faculty find when they use data and analyses from the system?
- How are the data relevant, verifiable, representative, cumulative, and actionable?
- How are the scoring procedures aligned with the CAEP Criteria for Evaluation of Assessments?
- What procedures does the EPP take in the design, collection, analysis, and interpretation of data to ensure validity?
- What procedures does the EPP take in the design, collection, analysis, and interpretation of data to ensure reliability?
- Can findings be triangulated with multiple data points so they can be confirmed or found conflicting?
- How do the EPP-created assessments meet the CAEP Criteria for EPP-Created Assessments?
- How do the EPP-created surveys meet the CAEP Criteria for EPP-Created Surveys?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- Distinguish between EPP-created and proprietary assessments/instruments.
- Documentation of steps taken to establish instrument validity and reliability, and of the resultant data (see the sketch following the sources list below).
- Description of modifications to instruments based on feedback, validity, and reliability work.
- EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created Surveys.
Possible sources of evidence:
- Reliability and validity process documentation and data
- Sampling procedures
- At least 3 cycles of data for R1-R4
- Documentation of instrument revision with timeline
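As one concrete form the reliability documentation above might take, an EPP could report an internal-consistency estimate for an EPP-created rubric or survey. The sketch below computes Cronbach's alpha from hypothetical item-level scores; it is one common statistic among several and is not mandated by CAEP.

```python
# Minimal sketch: Cronbach's alpha for an EPP-created instrument.
# Rows are candidates, columns are rubric items; all scores are hypothetical.
import numpy as np

scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
    [3, 3, 2, 3],
    [4, 3, 4, 4],
])

k = scores.shape[1]                         # number of items
item_vars = scores.var(axis=0, ddof=1)      # variance of each item
total_var = scores.sum(axis=1).var(ddof=1)  # variance of candidates' totals

alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")    # values near 1 suggest consistency
```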
Connections
R5.2 is about data quality for the instruments and data presented in each of the other standards. Therefore, overall data quality in any standard is connected to R5.2.
Notes
Component R5.3 Stakeholder Involvement
R5.3 Stakeholder Involvement
The provider includes relevant internal (e.g., EPP administrators, faculty, staff, candidates) and
external (e.g., alumni, practitioners, school and community partners, employers) stakeholders in
program design, evaluation, and continuous improvement processes.
Key Concepts
The provider presents evidence of:
- internal and external stakeholder involvement in program design, evaluation, and continuous improvement processes.

Guiding Questions
- What EPP process is used to involve stakeholders in data-driven decision making?
- How and when do external partners participate in the EPP's continuous improvement process?
- How are clinical partners (external stakeholders) included in the continuous improvement process?
- In what ways are stakeholders involved in program design?
- In what ways are stakeholders involved in evaluation?
- In what ways are stakeholders involved in continuous improvement?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- Evidence identifies examples of input from stakeholders and uses of that input.
- Evidence that stakeholder groups include members with a variety of roles and responsibilities.
- EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created Surveys.
Possible sources of evidence:
- MOUs/partnerships
- Advisory Board feedback/input
- Co-construction of assessments/surveys
- Documentation of meetings and decisions
Connections
R5.3 is about stakeholder involvement in the accreditation process. Therefore, all standards are connected with R5.3.
Notes
Component R5.4 Continuous Improvement
R5.4 Continuous Improvement
The provider regularly, systematically, and continuously assesses performance against its goals and relevant standards, tracks results over time, and documents modifications and/or innovations and their effects on EPP outcomes.
Key Concepts
The provider presents evidence:
- that the EPP assesses performance in relation to EPP goals and standards, documenting performance results over time and documenting modifications and tracking their effects over time
- that information from the Quality Assurance System is the basis for a continuous improvement function
- that the EPP documents regular and systematic data-driven changes grounded in data analyses and interpretations from its quality assurance system, with changes linked to its goals and relevant standards
- that program decisions are directly supported by data and/or contradictory data are explained.

Guiding Questions
- How does the EPP support continuous improvement through procedures that gather, input, analyze, interpret, and use information from the QAS effectively?
- What actions has the EPP taken to pilot specific program improvements and study their effectiveness?
- What examples of changes in courses, clinical experiences, or other candidate experiences represent the effectiveness of continuous improvement efforts?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- The examples indicate that changes are clearly connected to evidence and that provider performance is systematically assessed against goals.
- Written documentation confirms that the EPP regularly and systematically reviews, analyzes, and interprets QAS data; identifies patterns across programs; and uses data for continuous improvement and innovative modifications.
- Not all changes need to lead to improvement, as CAEP encourages data-driven experimentation, but changes should trend toward improvement.
- The EPP examines the outcomes currently achieved, identifies gaps between current results and established standards, and examines why these results occur.
- The EPP documents the process of examining results and the decisions made (e.g., keep, modify, discontinue).
Possible sources of evidence:
- Decision grid (Question, Data, Stakeholder group, Decision; illustrated in the sketch below)
- Condensed meeting minutes that highlight data review and decisions
- Outcomes of changes/modifications (what happened after changes were made)
- Goals crosswalk table (goals and where they fall in the process)
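To make the decision grid concrete, the sketch below shows one hypothetical way to record its four fields so that entries can be filtered and reported alongside other QAS evidence. The structure and the example entry are illustrative assumptions, not a prescribed format.

```python
# Minimal sketch: one decision-grid entry with the four fields named above.
# The example content is hypothetical.
from dataclasses import dataclass

@dataclass
class DecisionGridEntry:
    question: str           # the question the EPP asked of its data
    data: str               # the evidence consulted
    stakeholder_group: str  # who reviewed the evidence
    decision: str           # e.g., keep, modify, or discontinue, with rationale

grid = [
    DecisionGridEntry(
        question="Are candidates ready for the culminating clinical experience?",
        data="Three cycles of midpoint observation scores",
        stakeholder_group="Faculty and clinical partners",
        decision="Modify: add a remediation checkpoint before placement",
    ),
]

for entry in grid:
    print(f"{entry.question} -> {entry.decision}")
```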
Connections
R5.4 is the overarching theme of continuous improvement. Therefore, all standards are
connected with R5.4.
Notes
Standards for Advanced-Level Preparation
Spring 2022 visits must still use the Standards for Advanced-Level Preparation in the
Consolidated Handbook. The revisions to the Standards for Advanced-Level Preparation
included in this workbook will apply to Fall 2022 visits and beyond.
Standard RA.1 Content and Pedagogical Knowledge
Content and Pedagogical Knowledge
The provider ensures that candidates for professional specialties develop an understanding of the critical concepts and principles of their discipline and facilitates candidates' reflection on their personal biases to increase their understanding and practice of equity, diversity, and inclusion.
The provider is intentional in the development of their curriculum for candidates to demonstrate
their ability to effectively work with diverse P-12 students and their families.
Component RA1.1 Candidate Knowledge, Skills, and Professional
Dispositions
RA1.1 Candidate Knowledge, Skills, and Professional Dispositions
Candidates for advanced preparation demonstrate their proficiencies to understand and apply knowledge and skills appropriate to their professional field of specialization so that learning and development opportunities for all P-12 students are enhanced, through:
- applications of data literacy;
- use of research and understanding of qualitative, quantitative, and/or mixed methods research methodologies;
- employment of data analysis and evidence to develop supportive, diverse, equitable, and inclusive school environments;
- leading and/or participating in collaborative activities with others such as peers, colleagues, teachers, administrators, community organizations, and parents;
- supporting appropriate applications of technology for their field of specialization; and
- application of professional dispositions, laws and policies, codes of ethics, and professional standards appropriate to their field of specialization.
Key Concepts
The provider presents evidence that most advanced program candidates:
- can identify the different types of data that exist; evaluate the appropriateness and sufficiency of the data; and analyze and synthesize data into meaningful forms that guide decision making
- identify problems and employ one or more research methodologies to develop solutions or understandings
- use research and data to improve teaching and learning
- demonstrate application of professional standards of practice, relevant laws and policies, and codes of ethics.

Guiding Questions
- How does each specialty program address each of the six skills within its program?
- Of the six addressed, how does each specialty program assess the three most important for its specialized field?
- How does the EPP know candidates have developed the proficiencies expected of professionals in their specialized field?
- How does the EPP know candidates are able to apply their skills effectively to enhance learning and development opportunities for all P-12 students?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- Most advanced program candidates perform adequately or better on at least three of the six generic knowledge and skill abilities that are most relevant for the professional specialty field.
- Evidence is not just coverage of skills in course materials, but evidence of candidate performance on generic advanced-level skill areas.
- Disaggregated data by preparation program and race/ethnicity show no or few disparities OR disparities are identified and explained, including steps to remedy them.
- Evidence should include data and subsequent analyses of the assessment results.
- EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created Surveys.
Possible sources of evidence:
- Course assignments or tasks
- State Licensure Measures
- SPA Status and Use of Results
- Professional Portfolio/Tasks in Field of Specialization
- Legal compliance assessments (e.g., reporting requirements)
- Problem-based projects/group projects
- Synthesis and interpretation of research relevant to a specialty-specific problem
- Action research project, thesis, or dissertation
Connections
RA3.3 (Monitoring and Supporting Candidate Progression), RA3.4 (Competency at Completion),
RA5.2 (Data Quality), RA5.4 (Continuous Improvement)
Component RA1.2 Provider Responsibilities
RA1.2 Provider Responsibilities
Providers ensure that program completers have opportunities to learn and apply specialized
content and discipline knowledge contained in approved state and/or national discipline-specific
standards. These specialized standards include, but are not limited to, Specialized Professional
Association (SPA) standards, individual state standards, standards of the National Board for
Professional Teaching Standards, and standards of other accrediting bodies [e.g., Council for
Accreditation of Counseling and Related Educational Programs (CACREP)]. Evidence of
candidate content knowledge appropriate for the professional specialty should be documented.
Key Concepts
The provider presents evidence of candidate understanding and application of:
- central concepts, tools of inquiry, and structures of the discipline specific to the content area
- accessible and meaningful learning experiences to ensure mastery of content
- content-specific pedagogy (e.g., connecting concepts, using differing perspectives, engaging learners in critical thinking, creativity, and collaborative problem solving; encouraging learner exploration, discovery, and expression across content areas)
- diversity and equity in content knowledge (e.g., intentional design and implementation of inclusive curriculum, awareness that content knowledge is culturally situated, teaching with multiple perspectives, promoting critical analysis, awareness of and responsiveness to bias, inclusion of authentic and global issues).

Guiding Questions
- How does the EPP know candidates know the specialized content of their field?
- How does the EPP know candidates are able to apply their specialized knowledge effectively in education settings?
- What evidence does the EPP have that candidates in advanced preparation are able to work effectively with diverse students and colleagues to create effective learning environments for diverse P-12 students?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- Demonstrate that a majority of candidates enrolled in P-12 licensure, certificate, or endorsement programs understand critical concepts and principles for their specialized field of study, for example through a high level of proficiency as might be represented by completion of SPA National Recognition or state approval of meeting state standards.
- Disaggregated data by preparation program and race/ethnicity show no or few disparities OR disparities are identified and explained, including steps to remedy them.
- Evidence should include data and subsequent analyses of the assessment results.
- EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created Surveys.
Possible sources of evidence:
- Outcome assessments submitted as part of the SPA National Recognition or state approval process
- EPP-created measures
- Proprietary measures (e.g., edTPA, PPAT, Praxis Content Exams)
- State-required licensure measures
Connections
RA3.3 (Monitoring and Supporting Candidate Progression), RA3.4 (Competency at Completion),
RA5.2 (Data Quality), RA5.4 (Continuous Improvement)
Standard RA.2 Clinical Partnerships and Practice
Clinical Partnerships and Practice
The provider ensures that effective partnerships and high-quality clinical practice are central to
preparation so that candidates develop the knowledge, skills, and professional dispositions
appropriate for their professional specialty field.
Component RA2.1 Partnerships for Clinical Preparation
RA2.1 Partnerships for Clinical Preparation
Partners co-construct mutually beneficial P-12 school and community arrangements for clinical
preparation and share responsibility for continuous improvement of candidate preparation.
Key Concepts
The provider presents evidence that:
- it establishes and maintains partnerships with schools and school districts, as well as other appropriate organizations
- P-12 schools and/or community partners and the EPP have both benefited from the partnership
- both partners are active participants in the ongoing, collaborative process to improve candidate preparation (co-construction).

Guiding Questions
- How does the EPP document partnerships?
- How are partnerships mutually beneficial?
- How does the EPP ensure all partners are involved, or have the opportunity to be involved, in the development, maintenance, and modification of the partnership? In other words, how does the EPP ensure partnerships are co-constructed?
- How does the EPP engage P-12 partners in an ongoing collaborative process?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- The provider presents evidence that a collaborative process is in place with P-12 partners that is reviewed periodically and involves activities such as:
  - collaborative development, review, or revision of instruments and evaluations
  - collaborative development, review, or revision of the structure and content of the clinical activities
  - mutual involvement in ongoing decision-making about partnership structure and operations
  - agreed-upon provisions to ensure diversity of clinical settings
  - creation of opportunities for candidates to work with diverse P-12 students who have differing needs.
- The EPP provides evidence that the P-12 schools and the EPP have both benefited from the partnership.
Possible sources of evidence:
- Documentation of collaboration (meeting decisions, agenda topics)
- MOUs
- Advisory Boards
Connections
RA5.3 (Stakeholder Involvement), RA5.4 (Continuous Improvement)
Notes
Component RA2.2 Clinical Experiences
RA2.2 Clinical Experiences
The provider works with partners to design varied and developmental clinical experiences that
allow opportunities for candidates to practice applications of content knowledge and skills that
the courses and other experiences of the advanced preparation emphasize. The opportunities
lead to appropriate culminating experiences in which candidates demonstrate their proficiencies,
through problem-based tasks or research (e.g., qualitative, quantitative, mixed methods, action)
that are characteristic of their professional specialization, as detailed in component RA1.1.
Key Concepts
The provider presents evidence that:
- it designs varied and developmental clinical experiences
- candidates have opportunities to practice applications of the content knowledge and skills their preparation emphasizes
- candidates engage in a culminating experience to demonstrate the proficiencies identified in RA1.1
- candidates engage in problem-based tasks or research that are characteristic of their professional specialization.

Guiding Questions
- What opportunities have candidates had to prepare in diverse settings and to work in their specialized field of study?
- What features of clinical experiences allow candidates to demonstrate their proficiencies through problem-based tasks or research?
- How has the EPP studied clinical experience data to improve candidate outcomes?
- What clinical experiences have enhanced completers' understanding of diversity and equity issues and their readiness to use that understanding in employment situations?
- How are clinical experiences effective in preparing candidates for the chosen proficiencies in RA1.1?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- Evidence describes the role of clinical practice in the advanced-level program, including campus-based and field-based activities that involve practical applications of knowledge and skills appropriate for the advanced-level specialty field.
- Evidence shows that the EPP and its partners ensure advanced-level clinical experiences are planned, purposeful, and sequential.
- Evidence shows that clinical experiences are designed to help candidates grow and develop in the practice of the knowledge and skills that make up the advanced-level preparation program, and are assessed with performance-based protocols.
- Evidence shows that candidates engage in a culminating experience and are able to demonstrate their proficiencies through problem-based tasks or research that are characteristic of their professional specialization.
- Disaggregated data by preparation program and race/ethnicity show no or few disparities OR disparities are identified and explained, including steps to remedy them.
- Evidence should include data and subsequent analyses of the assessment results.
- EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created Surveys.
Possible sources of evidence:
- Artifacts or completed assignments reflective of on-the-job tasks for the specialized field
- Candidate evaluations of their preparatory activities for clinical practice
- Preparation of a school budget or school improvement plan
- A proposal for a district's response to criticism of some aspect of school functions (e.g., complaints of discriminatory responses given by principals to parent complaints)
- Demonstration of an understanding of a student's IEP and suggestion of appropriate activities responsive to the IEP
Connections
RA1.1 (Candidate Knowledge, Skills, and Professional Dispositions), RA1.2 (Provider
Responsibilities), RA3.3 (Monitoring and Supporting Candidate Progression), RA3.4
(Competency at Completion), RA5.2 (Data Quality), RA5.3 (Stakeholder Involvement), RA5.4
(Continuous Improvement)
Standard RA.3 Candidate Quality and Selectivity
Candidate Quality and Selectivity
The provider demonstrates that the quality of advanced program candidates is an ongoing and
intentional focus so that completers are prepared to perform effectively and can be
recommended for certification where applicable.
Component RA3.1 Recruitment
RA3.1 Recruitment
The provider presents goals and progress evidence for recruitment of high-quality candidates
from a broad range of backgrounds and diverse populations that align with their mission. The
provider demonstrates efforts to know and address community, state, national, regional, or local
needs for hard-to-staff schools and shortage fields. The goals and evidence should address
progress towards a candidate pool which reflects the diversity of America’s P-12 students.
Key Concepts
The provider presents evidence of:
- goals towards admitting high-quality advanced program candidates from a broad range of backgrounds and diverse populations
- routinely monitoring the employment landscape to identify shortage areas, openings, forecasts, and related information in the community, state, regional, or national markets for which it is preparing completers
- recording, monitoring, and using recruitment results to plan and, as appropriate, modify recruitment strategies and goals
- descriptions of strategies and actions in place to achieve the EPP's goals, together with periodic evaluation of the effectiveness of those strategies
- recruitment results disaggregated by demographic groups, particularly race and ethnicity, with analysis of progress toward greater diversity in the pool of candidates.

Guiding Questions
- How does the EPP recruit an increasingly diverse and strong pool of candidates?
- How do recruitment strategies respond to and serve employer needs?
- How does the EPP determine the success of recruitment efforts?
- How are recruitment efforts supported as evidence-informed, meaningful, and feasible given the context of the EPP?
- How do the recruitment strategies and actions meet the needs of employers for which the EPP prepares candidates?
- How do the recruitment strategies and actions align with the mission of the EPP?
- How have recruitment strategies and actions and their implementation moved the EPP toward the goal of greater candidate diversity?
- Is evidence disaggregated on applicants, admitted candidates, and enrolled candidates by a broad range of backgrounds and diverse populations that align with the EPP's mission?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- Evidence documents the EPP's periodic examination of the employment landscape in order to identify shortage areas, openings, forecasts, and related information in the community, state, regional, or national market for completers.
- Evidence documents baseline points and longitudinal data on current measures of academic achievement and diversity.
- Evidence documents measurable target outcomes and a timeline for achievement.
- Evidence documents that the EPP monitors annual progress toward admission goals and fields where there are employment opportunities. Data are disaggregated to describe gender, ethnicity, academic achievement, and/or candidate fit for high-need areas or communities, and trends are analyzed.
- Evidence documents that admissions data are disaggregated for enrolled candidates by relevant demographics, branch campuses, and individual programs.
- Evidence documents strategies and actions specifically for the EPP and its programs. While this can be part of an institutional recruitment strategy, the evidence must document recruitment for specific EPP programs and the EPP's opportunities for input into the institutional goals.
Possible sources of evidence:
- Basic descriptive information such as baseline points and numerical goals
- Results from annual monitoring of characteristics related to academic achievement, diversity, and employment needs
- Results of the EPP's monitoring of progress towards goals
- The EPP's interpretation of its progress and revision of goals, as needed
Connections
RA5.3 (Stakeholder involvement), RA5.4 (Continuous Improvement)
Component RA3.2 Candidates Demonstrate Academic Achievement and
Ability to Complete Preparation Successfully
RA3.2 Candidates Demonstrate Academic Achievement and Ability to Complete
Preparation Successfully
The provider sets admissions requirements for academic achievement, including CAEP minimum criteria (group average college GPA of 3.0 or group average performance in the top 50th percent of those assessed on a nationally normed assessment), the state's minimum criteria, or graduate school minimum criteria, whichever is highest, and gathers data to monitor candidates from admission to completion.
Key Concepts
The provider presents evidence of:
- descriptions of its criteria used to ensure that candidates are likely to complete preparation successfully, together with its analysis of the efficacy of the criteria it uses.

Guiding Questions
- How does the EPP define cohorts?
- If the EPP uses GPA to demonstrate academic achievement, when is GPA measured?
- If the EPP uses nationally normed assessment data to demonstrate academic achievement, are cohorts' group averages in the top 50th percent of those assessed?
- Does the EPP have additional admission requirements for academic achievement? If so, describe them and explain how the EPP uses those data in admission decisions.
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- Evidence disaggregates results on the CAEP minima (GPA OR test performance) by admission year (see the sketch following the sources list below).
- Evidence presents results separately for mathematics, reading, and writing if providing test scores.
- Disaggregated data by preparation program, race/ethnicity, and other demographic items highlighted in RA3.1 show no or few disparities OR disparities are identified and explained, including steps to remedy them.
Possible sources of evidence:
- College GPA
- Nationally normed assessment scores
- Program admission requirements
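Because the CAEP minimum is framed as a group average rather than an individual cutoff, an EPP might verify each admission-year cohort against it as in the sketch below. The 3.0 criterion comes from the component text; the file and column names are hypothetical.

```python
# Minimal sketch: checking each admission-year cohort against the
# group-average GPA criterion (3.0) named in RA3.2. File and column
# names are hypothetical placeholders for an EPP's admissions records.
import pandas as pd

admits = pd.read_csv("admitted_candidates.csv")  # columns: admission_year, gpa

by_year = admits.groupby("admission_year")["gpa"].agg(n="count", group_avg="mean")
by_year["meets_caep_min"] = by_year["group_avg"] >= 3.0
print(by_year)
```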
Connections
RA3.1 (Recruitment), RA5.4 (Continuous Improvement)
Notes
Component RA3.3 Monitoring and Supporting Candidate Progression
RA3.3 Monitoring and Supporting Candidate Progression
The provider creates criteria for program progression and uses disaggregated data to monitor
candidates’ advancement from admissions through completion. The provider ensures that
knowledge of and progression through transition points are transparent to candidates. The
provider plans and documents the need for candidate support, as identified in disaggregated
data by race and ethnicity and such other categories as may be relevant for the EPP’s mission,
so candidates meet milestones. The provider has a system for effectively maintaining records of
candidate complaints, including complaints made to CAEP, and documents the resolution.
Key Concepts
The provider presents evidence:
- of support for candidates who are at risk, with the intent to help ensure their successful program completion
- of criteria for program progression by way of transition points from admission through completion
- for monitoring progression from admission through completion, including attention to how candidates develop in their specialized field
- that transition points and related criteria are shared with candidates
- of using disaggregated demographic data to advise and support candidates who may not progress
- of a system for tracking and resolving candidate complaints/appeals.

Guiding Questions
- How does the EPP monitor candidate progress through transition points in the program?
- How does the EPP communicate with candidates the criteria required for each transition point?
- How does the EPP collect and respond to complaints/appeals?
- How does the EPP demonstrate the transition point process is followed with fidelity within the EPP (e.g., how does the EPP ensure there are no loopholes to work around the system)?
- Identify and describe the available support mechanisms for candidates not meeting program expectations (e.g., advising, remediation, or mentoring) and how recommendations for support occur.
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- Evidence documents performance reviews, remediation efforts, and/or provisions illustrating that the EPP sets goals for candidate support and monitors progress toward providing sufficient support to candidates to facilitate successful program completion.
- Disaggregated data by preparation program, race/ethnicity, and other demographic items highlighted in RA3.1 show no or few disparities OR disparities are identified and explained, including steps to remedy them.
- Evidence should include data and subsequent analyses of any transition point assessment and results.
- Evidence that actions are taken when there are problems with the progression of individual candidates.
- Evidence should include data and subsequent analyses of the assessment results.
- EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created Surveys.
Possible sources of evidence:
- Crosswalk/curriculum map of transition points that includes where the six skills from RA1.1 and the content expectations from RA1.2 are addressed and assessed
- Assessments used at key points during the program, including data and analyses
- Documentation of complaints/appeals (with no identifying names) and demographics of those submitting complaints/appeals
Connections
RA1.1 (Candidate Knowledge, Skills, and Professional Dispositions), RA1.2 (Provider
Responsibilities), RA2.2 (Clinical Experiences), RA3.1 (Recruitment), RA5.1 (Quality Assurance
System), RA5.2 (Data Quality), RA5.4 (Continuous Improvement)
Component RA3.4 Competency at Completion
RA3.4 Competency at Completion
The provider ensures candidates possess academic competency to help facilitate learning with positive impacts on diverse P-12 student learning and development through application of content knowledge, data literacy and research-driven decision making, effective use of collaborative skills, and application of technology in the field(s) where certification is sought. Multiple measures are provided, and data are disaggregated and analyzed based on race, ethnicity, and such other categories as may be relevant for the EPP's mission.
Key Concepts
The provider presents evidence:
- of using disaggregated data to verify candidate quality at completion for their field of specialization
- that candidates reach a high standard in the following areas:
  - content knowledge
  - data literacy
  - research-driven decision making
  - effective use of collaborative skills
  - applications of technology
  - applications of dispositions, laws, codes of ethics, and professional standards
- illustrating proficiency at completion in the areas identified.

Guiding Questions
- How does the EPP support candidates who may not meet the expected level of proficiency in each of the areas by completion?
- How does the EPP disaggregate the completion data, and what has been learned from analysis across demographic groups?
- How does the EPP use multiple sources of evidence to triangulate that candidates are prepared for certification at completion?
- What evidence does the EPP use to ensure that by the end of the program a candidate is ready to move into the profession at the advanced level?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- Disaggregated data by program, race/ethnicity, and other demographic items highlighted in RA3.1 show no or few disparities OR disparities are identified and explained, including steps to remedy them.
- Evidence that actions are taken when there are problems with the progression of individual candidates.
- Evidence is triangulated so that there is more than one source demonstrating candidates are proficient in the areas identified.
- Evidence should include data and subsequent analyses of the assessment results.
- EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created Surveys.
Possible sources of evidence:
- EPP-created measures
- Proprietary measures
- State-required licensure measures
- Dispositions/non-academic factor instruments
Connections
RA1.1 (Candidate Knowledge, Skills, and Professional Dispositions), RA1.2 (Provider
Responsibilities), RA2.2 (Clinical Experiences), RA3.1 (Recruitment), RA5.1 (Quality Assurance
System), RA5.2 (Data Quality), RA5.4 (Continuous Improvement)
Notes
Standard RA.4 Satisfaction with Preparation
Satisfaction with Preparation
The provider documents the satisfaction of its completers and their employers with the
relevance and effectiveness of their preparation.
Component RA4.1 Satisfaction of Employers
RA4.1 Satisfaction of Employers
The provider demonstrates that employers are satisfied with the completers’ preparation for their
assigned responsibilities.
Key Concepts
The provider presents evidence that:
- employers perceive completers' preparation was sufficient for their job responsibilities
- data come from a representative sample of employers
- employers are satisfied with completers' preparation to work with diverse P-12 students and their families.

Guiding Questions
- How does the EPP measure satisfaction with preparation as viewed by employers?
- How does the EPP ensure a representative sample inclusive of most licensure areas, or a purposive sample to be enlarged over time?
- How does the EPP ensure instruments/methods elicit responses specific to the criteria in R1 (e.g., the learner and learning, content, instructional practice, professional responsibilities, technology)?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- Evidence should include data and subsequent analyses of the assessment results.
- Evidence should demonstrate a representative sample (in one cycle of data or over multiple cycles of data).
- EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created Surveys.
Possible sources of evidence:
- Employer satisfaction surveys
- Focus groups or interviews with detailed methodology
- Employer satisfaction case study
Connections
RA1.1 (Candidate Knowledge, Skills, and Professional Dispositions), RA1.2 (Provider
Responsibilities), RA4.2 (Satisfaction of Completers), RA5.2 (Data Quality), RA5.3 (Stakeholder Involvement), RA5.4 (Continuous Improvement)
Notes
Component RA4.2 Satisfaction of Completers
RA4.2 Satisfaction of Completers
The provider demonstrates that program completers perceive their preparation as relevant to the
responsibilities they confront on the job, and their preparation was effective.
Key Concepts
The provider presents evidence that:
- completers perceive their preparation was sufficient for their job responsibilities
- data come from a representative sample of completers
- completers are satisfied with their preparation to work with diverse P-12 students and their families.

Guiding Questions
- How does the EPP measure satisfaction with preparation as viewed by completers?
- How does the EPP ensure instruments/methods elicit responses specific to the criteria in R1 (e.g., the learner and learning, content, instructional practice, professional responsibilities, technology)?
- How does the EPP ensure all programs are included within data cycles?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- Disaggregated data by program and other demographic items show no or few disparities OR disparities are identified and explained, including steps to remedy them.
- Evidence should include data and subsequent analyses of the assessment results.
- Evidence should demonstrate a representative sample (in one cycle of data or over multiple cycles of data).
- EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created Surveys.
Possible sources of evidence:
- Completer/alumni satisfaction surveys
- Focus groups or interviews with detailed methodology
- Completer satisfaction case study
- State proprietary measure (administered by state entities)
Connections
RA1.1 (Candidate Knowledge, Skills, and Professional Dispositions), RA1.2 (Provider Responsibilities), RA4.1 (Satisfaction of Employers), RA5.2 (Data Quality), RA5.3 (Stakeholder Involvement), RA5.4 (Continuous Improvement)
Notes
Standard RA5 Quality Assurance System and Continuous
Improvement
Quality Assurance System and Continuous Improvement
The provider maintains a quality assurance system that consists of valid data from multiple
measures and supports continuous improvement that is sustained and evidence-based. The
system is developed and maintained with input from internal and external stakeholders. The
provider uses the results of inquiry and data collection to establish priorities, enhance program
elements, and highlight innovations.
Component RA5.1 Quality Assurance System
RA5.1 Quality Assurance System
The provider has developed, implemented, and modified, as needed, a functioning quality
assurance system that ensures a sustainable process to document operational effectiveness.
This system documents how data enter the system, how data are reported and used in decision
making, and how the outcomes of those decisions inform programmatic improvement.
Key Concepts
The provider presents evidence:
- of a functioning Quality Assurance System documenting operational effectiveness
- of a rationale for the selection of the multiple measures
- that Quality Assurance System data are used in decision making
- of a responsive Quality Assurance System with the ability to provide data relevant for aspects of the EPP's context
- of how outcomes of Quality Assurance System data analysis are used for program improvement.

Guiding Questions
- How does the EPP maintain a functioning Quality Assurance System capable of providing data output that enables quality control and continuous improvement?
- How are data describing the EPP's effectiveness (as provided for Standards RA1-RA4) collected, analyzed, monitored, and reported?
- What are examples of questions the system is called on to answer that make use of the system's capabilities to combine data from various sources and/or disaggregate data by different categories?
- How is the system used by the EPP to provide information for review and decision making?
- Can the faculty, staff, candidates, and stakeholders articulate their role and engagement in the Quality Assurance System?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- Overview or flow chart demonstrating (at a high level) the entire Quality Assurance System, including the roles and responsibilities for data collection, monitoring, analysis, and reporting.
- Evidence is provided that the EPP regularly reviews system operations and data.
- The provider demonstrates the Quality Assurance System has the capacity to collect, analyze, monitor, and report data/evidence on ALL CAEP Standards.
- The provider's Quality Assurance System supports the disaggregation of data by licensure area/program, race/ethnicity, and other dimensions identified by the EPP.
Possible sources of evidence:
- Graphic representation of the Quality Assurance System
- Crosswalk of all measures included in the Quality Assurance System
- Verification of the Quality Assurance System through demonstration
- Potential program or EPP action minutes documenting the creation or implementation of the system
- Assessment reports
- Stakeholder feedback on the system and data
Connections
RA5.1 is an overarching structure and process component. Therefore, all components are
indirectly connected with RA5.1.
Component RA5.2 Data Quality
RA5.2 Data Quality
The provider's quality assurance system from RA5.1 relies on relevant, verifiable,
representative, cumulative, and actionable measures to ensure interpretations of data are valid
and consistent.
Key Concepts
The provider presents evidence:
- of a clear link between what is being measured and what the EPP intends to measure (relevant)
- that a measure or result can be confirmed or substantiated; all assessments should be valid and reliable (verifiable)
- that data encompass candidates and completers from programs under review (representative)
- of measures of candidate or EPP performance results across successive administrations (3 cycles of data) (cumulative)
- of a clear link between the measure and the EPP action taken as a result of this measure (actionable)
- that EPP-created assessments meet the CAEP Criteria for EPP-Created Assessments
- that EPP-created surveys meet the CAEP Criteria for EPP-Created Surveys.

Guiding Questions
- What strengths and weaknesses in the Quality Assurance System do faculty find when they use data and analyses from the system?
- How are data relevant, verifiable, representative, cumulative, and actionable?
- How do the EPP's scoring procedures align with the CAEP Criteria for Evaluation of Assessments?
- What procedures does the EPP take in the design, collection, analysis, and interpretation of data to ensure validity?
- What procedures does the EPP take in the design, collection, analysis, and interpretation of data to ensure reliability?
- Can findings be triangulated with multiple data points so they can be confirmed or found conflicting?
- How do EPP-created assessments meet the CAEP Criteria for EPP-Created Assessments?
- How do EPP-created surveys meet the CAEP Criteria for EPP-Created Surveys?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- Distinguish between EPP-created and proprietary assessments/instruments.
- Documentation of steps taken to establish instrument validity and reliability, and of the resultant data.
- Description of modifications to instruments based on feedback, validity, and reliability work.
- EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created Surveys.
Possible sources of evidence:
- Reliability and validity process documentation and data
- Sampling procedures
- At least 3 cycles of data for RA1-RA4
- Documentation of instrument revision with timeline
Connections
RA5.2 is about data quality for the instruments and data presented in each of the other standards. Therefore, overall issues with data quality in any standard are connected to RA5.2.
Notes
Component RA5.3 Stakeholder Involvement
RA5.3 Stakeholder Involvement
The provider includes relevant internal (e.g., EPP administrators, faculty, staff, candidates) and
external (e.g., alumni, practitioners, school and community partners, employers) stakeholders in
the program design, evaluation, and continuous improvement processes.
Key Concepts
The provider presents evidence of:
- internal and external stakeholder involvement in program design, evaluation, and continuous improvement processes.

Guiding Questions
- What EPP process is used to involve stakeholders in data-driven decision making?
- How and when do external partners participate in the EPP's continuous improvement process?
- How are clinical partners (external stakeholders) included in the continuous improvement process?
- In what ways are stakeholders involved in program design?
- In what ways are stakeholders involved in evaluation?
- In what ways are stakeholders involved in continuous improvement?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- Evidence identifies examples of input from stakeholders and uses of that input.
- Evidence that stakeholder groups include members with a variety of roles and responsibilities.
- EPP-created assessments and surveys must meet the criteria on the CAEP Criteria for Evaluation of EPP-Created Assessments or CAEP Criteria for Evaluation of EPP-Created Surveys.
Possible sources of evidence:
- MOUs/partnerships
- Advisory Board feedback/input
- Co-construction of assessments/surveys
- Documentation of meetings and decisions
Connections
RA5.3 is about stakeholder involvement in the accreditation process. Therefore, all standards/components are indirectly connected with RA5.3.
Notes
Component RA5.4 Continuous Improvement
RA5.4 Continuous Improvement
The provider regularly, systematically, and continuously assesses performance against its goals and relevant standards, tracks results over time, and documents modifications and/or innovations and their effects on EPP outcomes.
Key Concepts
The provider presents evidence:
- that the EPP assesses performance in relation to EPP goals and standards, documenting performance results over time and documenting modifications and tracking their effects over time
- that information from the Quality Assurance System is the basis for a continuous improvement function
- that the EPP documents regular and systematic data-driven changes grounded in data analyses and interpretations from its quality assurance system, with changes linked to its goals and relevant standards
- that program decisions are directly supported by data and/or contradictory data are explained.

Guiding Questions
- How does the EPP support continuous improvement through EPP procedures that gather, input, analyze, interpret, and use information from the QAS effectively?
- What actions has the EPP taken to pilot specific program improvements and study their effectiveness?
- What examples of changes in courses, clinical experiences, or other candidate experiences represent the effectiveness of continuous improvement efforts?
- Describe the evidence that most compellingly demonstrates the EPP's case, what it has learned from the evidence, and what conclusions and interpretations have been made.
Quality Evidence
- The examples indicate that changes are clearly connected to evidence and that provider performance is systematically assessed against goals.
- Written documentation confirms that the EPP regularly and systematically reviews, analyzes, and interprets QAS data; identifies patterns across programs; and uses data for continuous improvement and innovative modifications.
- Not all changes need to lead to improvement, as CAEP encourages data-driven experimentation, but changes should trend toward improvement.
- The EPP examines the outcomes currently achieved, identifies gaps between current results and established standards, and examines why these results occur.
- The EPP documents the process of examining results and the decisions made (e.g., keep, modify, discontinue).
Possible sources of evidence:
- Decision grid (Question, Data, Stakeholder group, Decision)
- Condensed meeting minutes that highlight data review and decisions
- Outcomes of changes/modifications (what happened after changes were made)
- Goals crosswalk table (goals and where they fall in the process)
Connections
RA5.4 is the overarching theme of continuous improvement. Therefore, all standards/components are indirectly connected with RA5.4.
Notes
Standard 6: Fiscal and Administrative Capacity
For EPPs whose institution is accredited by an institutional accreditor recognized by the U.S. Secretary of Education (e.g., SACS-COC, HLC), such accreditation will be considered sufficient evidence of compliance with Standard 6. The required evidence for this standard is listed below. No narrative is required for this standard if the EPP is accredited by an accreditor recognized by the U.S. Secretary of Education.
Standard 6: Fiscal and Administrative Capacity
The EPP has the fiscal and administrative capacity, faculty, infrastructure (facilities,
equipment, and supplies) and other resources as appropriate to the scale of its operations
and as necessary for the preparation of candidates to meet professional, state, and
institutional standards. For EPPs whose institution is accredited by an institutional
accreditor recognized by the U.S. Secretary of Education (e.g., SACS-COC, HLC), such
accreditation will be considered sufficient evidence of compliance with Standard 6. If
an EPP’s institution is not accredited by an accreditor recognized by the U.S. Secretary of
Education, the EPP must address each component of ST 6 in narrative supported by
evidence.
R6.1 Fiscal Resources: The EPP has the fiscal capacity as appropriate to the scale of its
operations. The budget for curriculum, instruction, faculty, clinical work, scholarship, etc.,
supports high-quality work within the EPP and its school partners for the preparation of
professional educators.
R6.2 Administrative Capacity: The EPP has administrative capacity as appropriate to the
scale of its operations, including leadership and authority to plan, deliver, and operate
coherent programs of study so that their candidates are prepared to meet all standards.
Academic calendars, catalogs, publications, grading policies, and advertising are current,
accurate, and transparent.
R6.3 Faculty Resources: The EPP has professional education faculty who have earned
doctorates or have equivalent P-12 teaching experience that qualifies them for their assignments.
The EPP provides adequate resources and opportunities for professional development of
faculty, including training in the use of technology.
R6.4 Infrastructure: The EPP has adequate campus and school facilities, equipment, and
supplies to support candidates in meeting standards. The infrastructure supports faculty and
candidate use of information technology in instruction.
76
Required Evidence
If the EPP's institution IS accredited by an institutional accreditor recognized by the U.S.
Secretary of Education:
Institution’s regional or national accreditation letter
Program Characteristics Table
EPP Characteristics Table
Qualification Table for EPP-Based Clinical Educators
Capacity Table
Off Campus, Satellite, Branch Table
If the EPP's institution IS NOT accredited by a recognized accreditor:
Narrative addressing each component
Program Characteristics Table
EPP Characteristics Table
Qualification Table for EPP-Based Clinical Educators
Capacity Table
Off Campus, Satellite, Branch Table
Standard 7: Record of Compliance with Title IV of
the Higher Education Act
Freestanding EPPs may be able to use CAEP accreditation to access Title IV funds or other
federal funds. An EPP that intends to do so is required to meet Standard 7.
**Only for freestanding EPPs seeking to have CAEP serve as their federal Title IV
gatekeeper**
Standard 7: Record of Compliance with Title IV of the Higher Education Act
Freestanding EPPs relying on CAEP accreditation to access Title IV of the Higher
Education Act or other federal funds must demonstrate 100% compliance with their
responsibilities under Title IV of the Act, including, but not limited to, on the basis of
student loan default rate data provided by the Secretary, financial and compliance audits,
and program reviews conducted by the U.S. Department of Education. Freestanding
EPPs will need to provide narrative and evidence for all components of ST 7.
Notes
Appendices
Appendix A: CAEP Criteria for Evaluation of EPP-Created
Assessments & Surveys
CAEP Criteria for Evaluation of EPP-Created Assessments
1. Administration and Purpose
Sufficiency Criteria
The time/point at which the assessment is administered during the preparation program is explicit.
The purpose of the assessment and its use in candidate monitoring or decisions on progression are
specified and appropriate.
Instructions provided to candidates about what they are expected to do are informative and
unambiguous.
The basis for judgment is made explicit for candidates.
Evaluation categories or assessment tasks are aligned with CAEP, InTASC, national/professional, and
state standards.
2. Content of Assessment
Sufficiency Criteria
Indicators assess explicitly identified aspects of CAEP and InTASC Standards, in addition to national,
professional, or state standards.
Indicators reflect the degree of difficulty or level of effort described in the standards.
Indicators unambiguously describe the proficiencies to be evaluated.
When the standards being informed address higher-level functioning, the indicators require higher
levels of intellectual behavior (e.g., create, evaluate, analyze, and apply). For example, when a
standard specifies that candidates’ students “demonstrate” problem solving, the indicator is
specific to candidates’ application of knowledge to solve problems.
Most indicators require observers to judge consequential attributes of candidate proficiencies in the
standards.
3. Scoring
Sufficiency Criteria
The basis for judging candidate performance is well defined.
Each proficiency level descriptor (PLD) is qualitatively defined by specific criteria aligned with
indicators.
PLDs represent a developmental sequence from level to level (providing raters with explicit
guidelines to evaluate candidate performance and giving candidates explicit feedback on their
performance).
Feedback provided to candidates is actionable—it is directly related to the preparation program and
can be used for program improvement as well as for feedback to the candidate.
Proficiency level attributes are defined in actionable, performance-based, or observable behavior
terms. [NOTE: If a less actionable term is used such as “engaged,” criteria are provided to define the
use of the term in the context of the category or indicator.]
4. Data Reliability
Sufficiency Criteria
A description or plan is provided that details the type of reliability that is being investigated or has
been established (e.g., inter-rater, internal consistency, consensus building activities with
documentation) and the steps the EPP took to ensure the reliability of the data from the assessment.
Training of scorers and checking on inter-rater agreement and reliability are documented.
The described steps meet accepted research standards for establishing reliability.
5. Data Validity
Sufficiency Criteria
A description or plan is provided that details steps the EPP has taken or is taking to ensure the
validity of the assessment and its use.
The plan details the types of validity that are under investigation or have been established (e.g.,
construct, content, concurrent, predictive) and how they were established.
If the assessment is new or revised, a pilot was conducted.
The EPP details its current process or plans for analyzing and interpreting results from the
assessment.
The described steps meet accepted research standards for establishing the validity of data from an
assessment.
Notes
CAEP Criteria for Evaluation of EPP-Created Surveys
1. Administration and Purpose
Sufficiency Criteria
The point or points when the survey is administered during the preparation program are explicit.
The purpose of the survey and its use are specified and appropriate.
Instructions provided to survey respondents about what they are expected to do are informative and
unambiguous.
2. Survey Content
Sufficiency Criteria
Questions or topics are explicitly aligned with aspects of the EPP’s mission as well as CAEP, InTASC,
national, professional, or state standards as appropriate.
Individual items have a single subject; language is unambiguous.
Leading questions are avoided.
Items are stated in terms of behaviors or practices instead of opinions, whenever possible.
Surveys of dispositions make clear to candidates how the survey is related to effective teaching.
3. Data Quality
Sufficiency Criteria
Rating scale choices are clear and have balanced keying (the same number of positive and
negative options on a Likert scale, e.g., Strongly Disagree, Disagree, Agree, Strongly Agree).
Feedback provided to the EPP is actionable.
The EPP provides evidence that questions are piloted to determine whether respondents
interpret them as intended and that modifications are made as needed.
Appendix B: Transition and Phase-in Plan Schedules and
Guidelines
***ALERT***
Transition Plans are only applicable to the 2022 Revised Standards for Initial-Licensure
Preparation. Phase-in plans are only applicable to Standards for Advanced-Level Preparation
and are no longer accepted after Spring 2023 visits.
Standards for Initial-Licensure Preparation
The Transition Plan Schedule*
CAEP Standards for Initial-Licensure Preparation are required for all accreditation SSRs,
reviews, and decisions beginning in Spring 2022 (including stipulation or probation visits).
The transition plan schedule for accreditation under the Revised Standards for
Initial-Licensure Preparation is determined by the time of the site visit. Transition plans are
allowed for the following components of the CAEP Revised Standards for Initial-Licensure
Preparation:
R1.1 The Learner and Learning
R1.2 Content
R1.3 Instructional Practice
R1.4 Professional Responsibility
R2.3 Clinical Experiences
R3.3 Competency at Completion
R4.1 Completer Effectiveness
Site visit in Spring 2022 or Fall 2022: The SSR includes transition plans, accompanied by
current/previous practice and the most recent cycle of data available.
Required evidence: at least one cycle of previous data AND a CAEP-sufficient transition
plan for each revised component.
Site visit in Spring 2023 or Fall 2023: The SSR includes transition plans and progress steps.
Required evidence: at least one cycle of previous data AND at least one cycle of
pilot/preliminary data aligned with the revised components AND a CAEP-sufficient
transition plan for each revised component.
Site visit in Spring 2024 and beyond: Transition plans end; all components require the full
data complement.
Required evidence: three cycles of data (this may include one previous cycle of data and
two cycles of revised data, or three cycles of revised data).
*All reviews use the Revised Standards.
Sufficiency Criteria for Initial-Licensure Preparation Transition Plans
Relationship to Standard or Component
Sufficiency Criteria
An explicit link of the intended data/evidence to the standard or component it is meant to inform
A description of the content and objective of the data/evidence collection is included
Guiding Questions
Does the transition plan have a specific connection (e.g., identify by number and text the component) with provisions of a revised component?
Does the transition plan make a compelling argument that the data/evidence would be an appropriate and strong measure of the revised component?
Timeline and Resources
Sufficiency Criteria
Detailing of strategies, steps, and a schedule for collection through full implementation, and an indication of what is to be available by the time of the site visit
Specification of additional data/evidence that will become available in the calendar years following accreditation until completion of the transition plan steps and that will be addressed in annual reports
A description of the personnel, technology, and other resources available and needed to fulfill the plan; institutional review board approvals, if appropriate; and EPP access to data compilation and analysis capability
Timelines for transition plans include the process until three cycles of data are collected
Guiding Questions
Does the transition plan include all needed steps in detail, with a description of actions and a schedule?
Does the transition plan identify which steps have been completed by the time of the SSR and site visit?
Can the transition plan be reasonably accomplished within the resources available to the EPP?
Does the transition plan specify the resources needed to complete the plan, including personnel, technology, access, or other resources?
Does the transition plan identify all semesters/years until full implementation?
Data Quality
Sufficiency Criteria
A copy of the collection instrument, if it is available, together with the information called for in the scoring rubrics (Sufficiency Criteria for EPP-Created Assessments and Surveys)
Description of procedures to ensure that surveys and assessments reach the sufficient level of the Criteria for EPP-Created Assessments and Surveys
Steps that will be taken to attain a representative response, including actions to select and follow up with a representative sample (or a purposeful sample, if that is appropriate for the data collection) and actions to ensure a high response rate
Steps to ensure content validity and to validate the interpretations made of the data
Steps to analyze and interpret the findings and make use of them for continuous improvement
Guiding Questions
What was the EPP using previously to meet this component?
If tools are already created, do the surveys or assessments identified in the transition plan meet the CAEP Criteria for Evaluation of EPP-Created Assessments or Surveys? If the tools are not already created, does the plan include how the EPP will ensure that they meet the CAEP Criteria for Evaluation of EPP-Created Assessments or Surveys?
Can the surveys and assessments identified in the transition plan reasonably be expected to achieve a representative response and an appropriately high response rate?
Does the transition plan identify appropriate analyses that have been/will be conducted with the data/evidence, and are appropriate interpretations likely to be made?
Phase-In Plan Schedule for Standards for Advanced-Level Preparation
The Phase-In Schedule
CAEP Standards for Advanced-Level Preparation are required for all accreditation SSRs,
reviews, and decisions beginning in Fall 2018.
The phase-in schedule for accreditation at the advanced level is determined by the time of
the site visit. The policy applies to the following components of the CAEP Standards for
Advanced-Level Preparation:
RA1.1 Candidate Knowledge, Skills, and Professional Dispositions
RA2.1 Partnerships for Clinical Preparation
RA2.2 Clinical Experiences
RA3.1 Recruitment
RA3.2 Candidates Demonstrate Academic Achievement and Ability to Complete Preparation Successfully
RA3.3 Monitoring and Supporting Candidate Progression
RA3.4 Competency at Completion
RA4.1 Satisfaction of Employers
RA4.2 Satisfaction of Completers
RA5.2 Data Quality
RA5.4 (formerly A5.3) Continuous Improvement
Site visit in Spring 2021: The SSR can include plans for new evidence items if evidence is
not complete or available.
Site visit in Fall 2021 or Spring 2022: The SSR includes plans and progress steps (including
data, if any).
Site visit in Fall 2022 or Spring 2023: The SSR includes plans and progress steps (including
data, if any).
Site visit in Fall 2023 or Spring 2024: The SSR plus on-site data provide EPP evidence to
document each standard.
Sufficiency Criteria for Advanced-Level Preparation Phase-In Plans
Relationship to Standard or Component
Sufficiency Criteria
An explicit link of the intended data/evidence to the standard or component it is meant to inform
A description of the content and objective of the data/evidence collection is included
Guiding Questions
Does the phase-in plan have a specific connection (e.g., identify by number and text the component) with provisions of a CAEP component?
Does the phase-in plan make a compelling argument that the data/evidence would be an appropriate and strong measure of the component?
Timeline and Resources
Sufficiency Criteria
Detailing of strategies, steps, and a schedule for collection through full implementation, and an indication of what is to be available by the time of the site visit
Specification of additional data/evidence that will become available in the calendar years following accreditation until completion of the phase-in plan steps
A description of the personnel, technology, and other resources available and needed to fulfill the plan; institutional review board approvals, if appropriate; and EPP access to data compilation and analysis capability
Guiding Questions
Does the phase-in plan include all needed steps in detail, with a description of actions and a schedule?
Does the phase-in plan identify which steps have been completed by the time of the SSR and site visit?
Can the phase-in plan be reasonably accomplished within the resources available to the EPP?
Does the phase-in plan specify the resources needed to complete the plan, including personnel, technology, access, or other resources?
Data Quality
Sufficiency Criteria
A copy of the collection instrument, if it is available, together with the information called for in the scoring rubrics (Sufficiency Criteria for EPP-Created Assessments and Surveys)
Description of procedures to ensure that surveys and assessments reach the sufficient level of the Criteria for EPP-Created Assessments and Surveys
Steps that will be taken to attain a representative response, including actions to select and follow up with a representative sample (or a purposeful sample, if that is appropriate for the data collection) and actions to ensure a high response rate
Steps to ensure content validity and to validate the interpretations made of the data
Steps to analyze and interpret the findings and make use of them for continuous improvement
Guiding Questions
If tools are already created, do the surveys or assessments identified in the plan meet the CAEP Criteria for Evaluation of EPP-Created Assessments or Surveys? If the tools are not already created, does the plan include how the EPP will ensure that they meet the CAEP Criteria for Evaluation of EPP-Created Assessments or Surveys?
Can the surveys and assessments identified in the plan reasonably be expected to achieve a representative response and an appropriately high response rate?
Does the plan identify appropriate analyses that have been/will be conducted with the data/evidence, and are appropriate interpretations likely to be made?
Notes
Appendix C: Additional Resources
CAEP Accreditation Council Policy and Procedures
Glossary
Standard R4 Program Impact
Example Template for Transition Plans