Quality Assurance System
A Quality Assurance System is composed of multiple components, such as Analysis & Assessment Planning, Assessment Policies & Procedures, and Data Tools & Human Resources. The purpose of a quality assurance system is to demonstrate alignment with overarching standards or learning outcomes and to develop an efficient, sustainable process for systematic review of assessment instruments and data, supporting continuous improvement of the program and/or unit.
The following PowerPoint presentation was developed for a three-part workshop at the 2017 American Association of Colleges of Teacher Education (AACTE) Quality Support Workshop – Midwest.
Conn, C., Bohan, K., & Pieper, S. (2017, August). Developing a quality assurance system and examining the validity and reliability of performance assessments. Presentation at the annual conference of the American Association of Colleges of Teacher Education (AACTE) Quality Support Workshop – Midwest, Minneapolis, MN.
The sections below contain a variety of resources and templates.
Quality Assurance System
Several strategies can be used to assist with developing a comprehensive Quality Assurance System. These include:
- Conduct a high level needs analysis
- Complete an assessment audit
- Identify and implement data tools
- Develop assessment policies and procedures
- Discuss results and engage stakeholders throughout the process
The following resources and templates relate to these strategies.
- Quality Assurance System: Guiding Questions for Strategies (pdf)
- CAEP EPP Assessment Audit Template (pdf)
- CAEP EPP Assessment Plan – Actions Planned Timetable Template (pdf)
- CAEP EPP Master Assessment Plan and Calendar Template (pdf)
- CAEP Evidence File Template for Quantitative Data (pdf)
- CAEP Evidence File Template for Qualitative Evidence or Documentation (pdf)
- Process for Updating, Reviewing, and Reporting EPP Level Self-Study Data Files (pdf)
- Example Policy and Procedures for Systematic Review of Program Level Assessment Data (pdf)
- Biennial Report Template (pdf)
- Program Level Biennial Reports Master Chart Template (xlsx)
Validity Inquiry Process
The following PowerPoint presentation accompanies this conference presentation:
Conn, C., Bohan, K., & Pieper, S. (2018, April). Ensuring meaningful performance assessment results: A reflective practice model for examining validity and reliability. Presentation at the annual conference of the American Educational Research Association, New York City, New York.
The Validity Inquiry Process (VIP) Model is intended to assist in gathering evidence to build a validity argument regarding performance assessment(s). The model is aligned to eight validity criteria outlined in the literature (Linn, Baker, & Dunbar, 1991; Messick, 1994):
- Domain Coverage
- Content Quality
- Cognitive Complexity
- Meaningfulness
- Transfer and Generalizability
- Fairness
- Consequences
- Cost and Efficiency
The VIP Model provides practical strategies and instruments for examining locally developed performance assessments. The purpose of these strategies and instruments is to gather evidence for building a validity argument for the interpretation and use of performance assessment results. The instruments developed to guide the validity inquiry process include:
- Content Analysis Strategies for Building a Validity Argument (word)
- Content Analysis: Example chart for documenting Content Quality and Generalizability, Strategies 2 and 5 (word)
- Validity Inquiry Form for Examining Performance Assessments (word, pdf)
- Metarubric for Examining Performance Assessment Rubrics (word, pdf)
- Student Survey: Meaningfulness of Performance Assessments (word, pdf)
- Validity Argument (word, pdf)
Conn, C., & Pieper, S. (2014). Validity inquiry process: Model and resources for examining performance assessments. Manuscript currently under revision.
Linn, R. L., Baker, E. L., & Dunbar, S. B. (1991). Complex, performance-based assessment: Expectations and validation criteria. Educational Researcher, 20(8), 15-21.
Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher, 23(2), 13-23.
Pieper, S. L. (2012, May 21). Evaluating descriptive rubrics checklist (pdf)
Do not reprint information contained or linked from this website without written permission of the authors.
Inter-rater Agreement and Calibration Strategies
The inter-rater agreement and calibration strategies described on this page are a component of the Quality Assurance System (QAS) implemented by NAU Professional Education Programs (PEP). NAU PEP will continue to implement these strategies and may revise the process at any time.
Inter-rater agreement training:
Important note about these files: The Excel spreadsheets contain password-protected cells. If you’d like to start from scratch and modify the files, please create a copy of the file on your computer. The Final Report PDF is a secured document to prevent editing.
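To illustrate the kind of statistics these calibration materials support, the sketch below computes three common inter-rater agreement measures for rubric scores: percent exact agreement, percent adjacent agreement (within one score point), and Cohen's kappa. This is a minimal, hypothetical example; the rater scores are invented, and it is not taken from the NAU PEP spreadsheets themselves.

```python
# Sketch: inter-rater agreement measures for a rubric-scored assessment.
# The scores below are hypothetical; in practice they would come from two
# raters independently scoring the same set of candidate work samples.

from collections import Counter

rater_a = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
rater_b = [3, 2, 3, 3, 1, 2, 4, 4, 2, 3]

def percent_exact_agreement(a, b):
    """Share of items where both raters gave the identical score."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def percent_adjacent_agreement(a, b, tolerance=1):
    """Share of items where scores differ by at most `tolerance` points."""
    return sum(abs(x - y) <= tolerance for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Chance-corrected agreement between two raters (Cohen's kappa)."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    counts_a, counts_b = Counter(a), Counter(b)
    # Expected agreement if raters scored independently at their own rates.
    expected = sum((counts_a[c] / n) * (counts_b[c] / n)
                   for c in set(a) | set(b))
    return (observed - expected) / (1 - expected)

print(f"Exact agreement:    {percent_exact_agreement(rater_a, rater_b):.2f}")
print(f"Adjacent agreement: {percent_adjacent_agreement(rater_a, rater_b):.2f}")
print(f"Cohen's kappa:      {cohens_kappa(rater_a, rater_b):.2f}")
```

Percent agreement alone can look high simply because raters favor the same few score points, which is why a chance-corrected index such as kappa is often reported alongside it during calibration.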