Evidence That Tells Your Story: Building an Accreditation-Ready Data Portfolio
- kelly93055
- 6 hours ago
The EPiC™ Key Assessment was designed to do more than collect scores. It was built to organize performance evidence in ways that make program impact visible.
Accreditors don’t just review data tables. They look for coherence, consistency, and evidence of continuous improvement. They want to see how candidate performance in Part A: Planning & Instruction connects to Part B: Assessment & Analysis, how scoring decisions are grounded in clearly defined Evidence-First™ markers, and how programs respond to trends over time.
An accreditation-ready portfolio doesn’t just compile documents. It presents a coherent story.
Start with an Integrated Assessment Map
In EPiC, key assessments are not isolated events. Part A and Part B are intentionally structured across ten integrated rubrics with clearly defined evidence markers, allowing programs to map:
Which rubrics and markers align with specific professional standards
Where candidate strengths and growth areas emerge
How performance trends evolve across cohorts
When reviewers can trace evidence from rubric language to candidate artifacts to aggregated cohort data, the system becomes transparent and defensible.
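To make the traceability chain above concrete, here is a minimal sketch of how rubric markers, candidate artifacts, and cohort aggregates can be linked in one structure. The rubric, marker, and standard names are hypothetical placeholders, not EPiC's actual schema.

```python
from collections import defaultdict

# Hypothetical assessment map: each row ties a rubric marker and a
# professional standard to a scored candidate artifact.
assessment_map = [
    # (rubric, marker, standard, candidate, score)
    ("Part A: Planning", "A1", "InTASC 7", "cand_001", 3),
    ("Part A: Planning", "A1", "InTASC 7", "cand_002", 2),
    ("Part B: Analysis", "B3", "InTASC 6", "cand_001", 4),
]

# Aggregate cohort-level evidence per standard while preserving the
# trace from each standard back to its rubric markers and artifacts.
by_standard = defaultdict(list)
for rubric, marker, standard, candidate, score in assessment_map:
    by_standard[standard].append((rubric, marker, candidate, score))

for standard, rows in by_standard.items():
    mean = sum(s for *_, s in rows) / len(rows)
    print(f"{standard}: {len(rows)} scored artifacts, mean {mean:.2f}")
```

Because every aggregate is computed from rows that still carry their rubric and marker labels, a reviewer can drill from a cohort mean back to the individual scoring decisions behind it.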
Move Beyond Ratings to Evidence-Based Patterns
EPiC preserves more than final scores. It captures scoring decisions tied directly to evidence markers within each rubric.
That allows educator preparation programs (EPPs) to:
Analyze performance patterns across all ten rubrics
Identify consistent strengths in planning, instruction, assessment, and feedback
Detect performance gaps early and provide targeted support
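The gap detection described above can be sketched in a few lines: compute a cohort mean per rubric and flag any rubric that falls below a cut-point. The rubric names, scores, and threshold here are all hypothetical, illustrating the pattern rather than an EPiC-defined rule.

```python
# Hypothetical cohort scores per rubric (rubric name -> candidate scores).
cohort_scores = {
    "Planning": [3, 4, 3, 4],
    "Instruction": [3, 3, 4, 3],
    "Assessment": [2, 2, 3, 2],
    "Feedback": [4, 3, 4, 4],
}

GAP_THRESHOLD = 2.5  # placeholder cut-point, not an EPiC-defined value

def flag_gaps(scores, threshold=GAP_THRESHOLD):
    """Return rubrics whose cohort mean falls below the threshold."""
    return sorted(
        rubric for rubric, vals in scores.items()
        if sum(vals) / len(vals) < threshold
    )

print(flag_gaps(cohort_scores))  # → ['Assessment']
```

Running this kind of check each term, rather than once per accreditation cycle, is what makes early, targeted support possible.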
Instead of presenting averages alone, programs can demonstrate exactly what documented performance evidence supports those outcomes.
Align Data Directly to Accreditation Standards
Because EPiC rubrics are aligned to professional teaching standards such as InTASC, and support documentation for accreditation agencies including CAEP, AAQEP, and NASDTEC, programs can structure their portfolio evidence by mapping:
Part A and Part B results to accreditation components
Cohort-level trends to program review cycles
Scoring calibration efforts to reliability documentation
This eliminates the scramble to retrofit assessment data into accreditation language. The structure already exists.
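Scoring calibration evidence of the kind listed above is often summarized as inter-rater agreement between paired scorers. A minimal sketch, using simple percent agreement and hypothetical calibration scores (not a metric EPiC necessarily prescribes):

```python
def percent_agreement(rater_a, rater_b):
    """Share of artifacts on which two raters assigned the same score."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Raters must score the same set of artifacts")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Hypothetical paired scores from one calibration session.
rater_a = [3, 4, 2, 3, 4]
rater_b = [3, 4, 3, 3, 4]
print(f"{percent_agreement(rater_a, rater_b):.0%}")  # → 80%
```

Recording a figure like this for each calibration session gives reviewers the reliability documentation already structured in accreditation terms.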
Present a Clear Narrative of Program Impact
When built from EPiC data, an accreditation-ready portfolio demonstrates:
Clear connections between rubrics and professional standards
Documented evidence supporting scoring decisions
Cohort trend analysis across Part A and Part B
Program adjustments informed by performance data
Reviewers don’t have to infer the story. The structure makes it visible.
From Data Collection to Reviewer-Ready Evidence
The goal isn’t to gather more artifacts. It’s to organize the evidence you already collect into a clear, standards-aligned narrative grounded in documented performance markers.
When EPiC data is structured intentionally, accreditation shifts from explanation to demonstration.
Next week’s webinar, Evidence That Tells Your Story: Building an Accreditation-Ready Data Portfolio, explores how EPPs use EPiC’s unified data system to organize evidence, address gaps early, and present a clear, compelling accreditation narrative.
Want More?
Explore additional EPiC webinar topics related to teacher candidate preparation. Scheduled dates are now posted through June 2026.