Case Study

From Differentiated Assistance to Math Excellence

How one high school transformed math instruction through assessment literacy and continuous improvement—proving the AchieveMetrics framework works under real pressure.

15 years teaching socioeconomically disadvantaged students
95% free lunch eligible
Differentiated Assistance (DA) status from the California Department of Education (CDE)
2+ years of improvement

The Situation

A high school serving a predominantly low-income population was identified for Differentiated Assistance (DA) status by the California Department of Education. The math department was struggling. Standard test prep strategies weren't working. Teachers felt reactive, not empowered. And the pressure was real: either improve, or face deeper CDE intervention.

"Fix math, or we're facing CDE intervention." That's what the principal said. Not a suggestion. A directive. And the stakes were impossible to ignore.

The challenge wasn't a lack of effort. Teachers were working hard. The challenge was that everyone was operating from an incomplete understanding. They knew what students needed to learn, but not why those standards were structured the way they were, or how assessment actually measured that understanding.

The Approach

Rather than layering on test prep or external programs, we rebuilt the system from the ground up using a framework built on understanding the architecture of the CAASPP itself. This framework was first tested in Madera through custom curriculum development, then refined through six years of high-poverty classroom teaching, then pressure-tested in a DA school.

The Process

Bring secondary math teams together. Have them evaluate SBAC assessment literature—not summaries, but the actual blueprints and cognitive complexity frameworks. Ask them: What is the test actually measuring? What does mastery look like at each claim? What should we be teaching to develop this thinking?

Let the teachers build the curriculum from that understanding. Don't hand them a program. Help them see the architecture, then let them design instruction that aligns to it. This is what we did in Madera. It's what we did in Alisal. It's what those curriculum materials—still in use by other districts—prove actually works.

The difference: Test prep teaches students tricks to pass. Assessment literacy teaches teachers to understand what the test measures, then teach toward that understanding. The results aren't comparable.

2015-2016
Assessment Literacy Materials & Curriculum Development
As a secondary math district academic coach, I facilitated content-specific professional development where teachers studied SBAC assessment literature directly—blueprints, claims, depth of knowledge frameworks. I created materials like "Measuring Student Understanding" to help teachers see how student thinking is actually measured. Then I led curriculum development by course level. Teachers studied the assessment, understood the architecture, and built curriculum aligned to it. Those materials are still in use by other districts across California.
2018-2024
Real Classroom Implementation & Teacher Facilitation
I brought that framework to the classroom. I've taught secondary mathematics in socioeconomically disadvantaged schools for 15 years. During this period, I also facilitated the same curriculum alignment process at Alisal and other sites, helping teachers understand CAASPP architecture through assessment literacy—the same work I'd started in Madera. I tested the theory against reality every single day.
2020-2021
Continuous Improvement Cycles Under CDE Pressure
A school where I was teaching was identified for Differentiated Assistance. The principal asked me to lead math improvement. I implemented the framework I'd built: assessment literacy, targeted standards, claims analysis, and continuous improvement cycles. Teachers began using assessment data not to judge themselves, but to understand: "What do my students know about this claim? What comes next?" That shift from evaluation to understanding changed everything.
2024-Present
The AchieveMetrics System
The framework works. I've proven it in Madera through curriculum development, in high-poverty classrooms through six years of teaching, and under DA pressure through continuous improvement cycles that delivered. Now I'm scaling it through AchieveMetrics, helping districts replicate the same approach that transformed instruction at the schools where I worked.

Results

The outcomes didn't happen overnight. But they happened consistently, across multiple measures, and most importantly, they were noticed by the right people.

Math Performance Gap Closed
Consistent year-over-year improvement. The math department outperformed district schools with similar demographics.
Teacher Mindset Shift
From "How do we get students to pass the test?" to "How do we build the thinking the test is trying to measure?"
Sustainability Built In
Improvement wasn't dependent on one person. The system was embedded in PLC cycles and departmental practice.
Student Agency Increased
Students began understanding what they were learning and why—not just how to answer test questions.

External Validation

The real proof of a system isn't internal metrics. It's when other districts notice and want to replicate it.

A Collaborative for Educational Excellence liaison reached out directly to request a meeting. When I asked why they wanted to meet, they said: "Whatever you're doing is working. We need to replicate this across the county."

This wasn't a sales pitch. This was county-level educational leadership recognizing that the framework works and wanting to scale it. That's external validation that matters. It comes from seeing results, not from marketing.

Why This Approach Works

It's Not a Program. It's a System.

Most improvement initiatives focus on adding something new: a curriculum, a test prep program, a PD workshop. This approach removes the guessing. It builds understanding. When teachers understand CAASPP architecture, they don't need an external program telling them what to do. They become architects of their own instruction.

It Works in Real Conditions.

This framework was tested in a high-poverty school under CDE pressure. Not in a well-resourced district. Not with ideal conditions. With 180 students per teacher, limited planning time, and the real weight of accountability. If it works there, it works anywhere.

It Scales Without External Dependence.

The goal isn't to create a dependency on a consultant. It's to build capacity so deep that the school can continue improving on its own. The system becomes part of how the school operates, not an add-on.

The AchieveMetrics Framework

This case study is the proof of concept for the AchieveMetrics system. What worked at this school isn't proprietary. It's replicable. It's based on:

CAASPP Architecture
Deep understanding of test design, claims, and cognitive complexity.
Teacher Capacity
Building thinking, not compliance. Cognitive coaching that shifts how teachers approach instruction.
Assessment Literacy
Using assessment as a window into thinking, not just a test to prep for.
Continuous Improvement
PDSA cycles embedded in departmental practice and PLC protocols.

The Bottom Line

If you're in DA status, you need more than a consultant who reads about assessment. You need someone who has studied CAASPP at the architectural level, taught in high-poverty schools, and proven the system works under real CDE pressure. That's what this case study proves. That's what AchieveMetrics delivers.

The school in this case study isn't unique. 99% socioeconomically disadvantaged students, 95% free lunch, CDE pressure—these aren't anomalies in California. They're the norm for the schools that need help most. And the framework works for them. Not through luck. Through a system.