Back in March, as students were filling in the last of their CSAP ovals, I wrote a post encouraging a discussion of what to look for with 2010 CSAP scores — which were then still 6 months away. And while I agree with Mark that CSAPs are an autopsy and do next to nothing to help teachers gauge student progress and deficiency during the school year, like an autopsy they do provide valuable insight into overall trends at a broad level.
While not so useful to teachers, CSAPs and the comparisons in the Colorado Growth Model can help both a district and individual schools see where they are making progress, and where they are not. In Denver, we now also have the 2010 School Performance Framework, for which CSAPs are the primary engine, which adds a little more color and multiple measures of assessment.
Usually CSAP scores are used in hindsight to justify existing positions (um, like the end of this post). So last March, I identified four areas where I thought CSAP results would be particularly illuminating — well before anyone knew what those scores would be. Now we do.
Here are those same areas revisited, and what we might discern from the results:
1. DPS Academic Growth
March: So when the 2010 CSAPs come out, start here: how much real academic growth has the district achieved?
September: When you look at the district’s results on the basis of one year’s growth, the petty pace of progress is a little underwhelming. But it increasingly appears — as an intentional strategy or not — that DPS is pursuing a path of slow but steady operational improvement instead of trying more comprehensive and radical reforms in search of more rapid change. As most reports noted, DPS has shown growth in excess of almost all Colorado school districts two years in a row – even if the absolute growth in proficiency has been minimal.
For overall proficiency in the core subjects of math, reading and writing, DPS has seen an average annual increase of 1.7 percentage points over the past five years. Now, that does not sound like much, but the cumulative impact has been an 8.3 percentage point gain.
And now consider the size of the ship: a perfectly distributed increase in a steady 75,000 student district would mean over 6,200 more students are now proficient, or an overall gain equal to roughly 15 new quality schools of 400 students each — or three new schools each year. Now there is some considerable double counting here with the gains at specific charter and innovation schools, but over a five-year term, this is clearly a positive trend and comprises better results than in almost all urban districts nationally.
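The back-of-the-envelope arithmetic above can be sketched out; the 75,000 enrollment and 400-student school size are the round-number assumptions from this post, not official DPS figures:

```python
# Back-of-the-envelope check of the proficiency-gain arithmetic above.
# All figures are this post's round-number assumptions, not official DPS data.
enrollment = 75_000        # assumed steady district enrollment
cumulative_gain_pp = 8.3   # percentage-point proficiency gain over five years
school_size = 400          # students per hypothetical "quality school"
years = 5

newly_proficient = enrollment * cumulative_gain_pp / 100   # ~6,225 students
schools_equivalent = newly_proficient / school_size        # ~15.6 schools
schools_per_year = schools_equivalent / years              # ~3 schools a year

print(round(newly_proficient))   # "over 6,200 more students"
print(schools_equivalent)        # "roughly 15 new quality schools"
print(schools_per_year)          # "three new schools each year"
```

The point of the exercise is only to translate an abstract percentage-point gain into units people can picture — students and schools — not to claim precision.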
I’m also somewhat heartened by the increase in 10th grade proficiency, which is the last test before a student graduates. Academic gains in early years, if not sustained, are problematic — the proper goal of any school district is to have proficient high school graduates, not just proficient elementary students. Here again, the long-term results are solid, if unspectacular: 10th grade proficiency has improved from 26.8% in 2005 to 32.2% in 2010 — a smaller gain in 10th grade than overall, but many districts are only seeing improvements in the early years, which are reversed in middle and high school.
Again, a school system where just one-third of 10th graders are proficient can in no way be considered a success, but the steady, incremental progress of the District should be seen for what it is. Many people (including me) will argue for policies to accelerate this growth, but even we should pause to acknowledge the gains. One can still plausibly argue over glasses half-empty or half-full, but we should all now agree that the water is rising.
2. District turnarounds: Cole/CASA, Trevista/Horace Mann, Gilpin
March: These three schools were all part of transformation plans in the 2007-2008 reform efforts; the 2010 scores will show whether they are on track after a transitional year.
September: There are really six separate growth scores here, as each school has both an elementary and a middle program. What we see from the scores is that overall, the track that these schools are now on is heading straight to the undistinguished shed of average (in what is still an underperforming district). Of the six programs, Trevista’s elementary school looks like it is still cause for concern, with a growth score of just 39% — 11 percentage points below the median. Grouped near the median growth score of 50% were CASA’s elementary program (47%), Trevista’s middle school (52%) and both programs at Gilpin (51% and 52%).
CASA’s middle program did the best, with a growth score of 68%. The SPF shows a similar level of mediocrity: CASA and Trevista are both in the middle category of “Accredited on Watch” while Gilpin is in the penultimate category of “Priority Watch.”
It’s hard to know if this constitutes success or not — these were programs that were previously failing miserably, and the improvement to average could well be seen as an accomplishment. But if so, it is a deeply limited one, as median growth will not help DPS close its achievement gap with the rest of the state. The shift in students also largely prevents an apples-to-apples comparison. But my sense is that a set of average schools was not what DPS and community leaders envisioned during the wrenching turnaround process. It’s somewhat disappointing news.
3. Charter Expansions: West Denver Prep, DSST
March: The ability of these two schools to maintain their high academic standards while they grow is a critical test.
September: Any questions about quality replication should be banished, at least for now, as WDP and DSST combined for four of the top 10 schools in academic growth in the entire state, and were the only schools in the top 10 not serving elementary kids (full disclosure: I serve on the WDP board). Both of the new schools finished slightly higher than their existing siblings. Median growth percentiles for the new WDP campus were 89% (compared to 84% for the existing campus); the new DSST middle school came in at 80%, just ahead of the existing DSST high school at 77%.
The SPF combines scores for DSST’s middle and high schools, so WDP and DSST took three of the top five spots, and were the only schools not serving elementary students in the District’s top category of “Distinguished.” This is remarkable success and bodes well for continued expansion (both schools just opened additional campuses this fall).
4. Program Expansions: Kunsmiller Creative Arts Academy (KCAA)
March: If the program can show clear academic growth while serving its local community, it could open the door for similar attempts with different district programs, and for a movement to spread successful magnet programs to different demographic groups.
September: KCAA opened with strong backing amid the hope that it could show success at least within shouting distance of the magnet Denver School of the Arts. In its first year, it did not. In fact, the Colorado Growth Model shows that KCAA managed proficiency of 47% and growth of 41% in its elementary program, and proficiency of 30% and growth of just 41% in middle school — well under the median in all areas. Its SPF results were also discouraging: KCAA ranked in the bottom category of “On Probation” as one of the lowest 15 schools in the district — astonishingly, a worse showing than the school had in the 2009 SPF, before the redesign.
I was surprised at these scores, and my initial assumption was that there was probably considerable variation by grade. However, the five grades tested showed little difference: grade 8 was just above the median in two of three subjects (math and reading, both 52%); grade 7 was above median growth in one (reading, 57%), as was grade 5 (writing, 53%). The other 11 grade-and-subject areas — which make up almost 75% of the 15 total — all tested below median growth. Oddly enough, KCAA’s 8th grade — which consisted of legacy students from the former program and was least affected by the redesign — did the best. If you remove the 8th grade scores, KCAA managed above-median growth in just two of 12 areas.
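The tally above can be reproduced from the figures given in this post. Only the above-median percentiles are listed; the remaining areas are known only to fall below the 50th-percentile median, and the exact set of five tested grades is my assumption:

```python
# Tally of KCAA's 2010 grade/subject growth areas described above.
# Only the above-median scores are given in the post; the grade list
# (4-8) is an assumption about which five grades were tested.
grades = ['4', '5', '6', '7', '8']
subjects = ['math', 'reading', 'writing']
above_median = {
    ('8', 'math'): 52, ('8', 'reading'): 52,
    ('7', 'reading'): 57, ('5', 'writing'): 53,
}

total = len(grades) * len(subjects)        # 15 grade-and-subject areas
below = total - len(above_median)          # 11 below median
share_below = below / total                # ~73%, "almost 75%"

# Excluding the legacy 8th grade:
excl8_total = total - len(subjects)                          # 12 areas
excl8_above = sum(1 for (g, _) in above_median if g != '8')  # 2 above median
```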
This is a deeply inauspicious start for a program that carries the considerable hopes of its supporters. While I have no doubt the school’s passionate defenders will claim that KCAA’s intangible benefits outweigh the annoying application of quantitative data and comparison, I’d be hard pressed to believe that anyone in the planning stages would have voiced support for a plan that included this backward trajectory.
So these were the four areas I identified last March, but as always, there are some terrific surprises in the data. One school deserves a special shout-out: Beach Court Elementary. On the Colorado Growth Model, Beach Court had the highest overall growth in the state, with an average score of 91%. Compared to other DPS programs, Beach Court was tops in two subjects, writing (96%) and reading (92%), and tied for third in math (86%).
This is not a sudden outlier, as last year’s growth compared to other DPS programs was similar: highest in writing, second in reading, and tied for third in math. Beach Court ranked 6th on the SPF, and with a FRL population of over 90% was just one of two traditional schools managing the rank of “Distinguished” with poverty rates above the DPS average. That is simply phenomenal work – and Frank Roti and his entire team deserve ample congratulations, and a lot more of our attention.