The OKCPS 2017 Statistical Abstract is out (and embedded at the end of this article).
As we already knew, the state’s shift to higher standards produced a dramatic drop in test scores.
The Matthew Effect (basically, the rich get richer while the poor get poorer) and other research explain why future test-score growth will almost certainly be slower in our high-poverty schools than in more affluent districts.
So, we can either escalate the rhetoric about failing schools or shift our focus from bubble-in accountability to meaningful instruction.
Report raises more questions than it answers
What can we learn from OKCPS data? First, the district’s continued refusal to publish raw numbers alongside percentages compromises the data’s value. For example, what is the value of reporting how many students in a school earned a passing score in April when the number who started the year isn’t reported? I know from experience that the OKCPS planning office has the talent to produce useful data, and I hope it is still providing that data to board members, but we need to get back to the MAPS for Kids approach, where the public was given meaningful information.
On the other hand, the recently released data would have been helpful in deciding the most important question about keeping the district’s school calendar: Does it reduce students’ time in the classroom? Had the Statistical Abstract been released before the board debated the calendar, we might have had a discussion about absenteeism during the first month of school.
Only about 36,500 students show up for the first day, Aug. 1, and it is not until Sept. 12 that attendance gets to the 40,000 level. How many kids fall impossibly far behind before enrolling in school? And, if the OKCPS returned to a traditional calendar, how much smaller would that part of the chronic absenteeism problem be?
Sadly, school accountability has focused on easy-to-manipulate attendance rates rather than on chronic absenteeism, which is why the State Department of Education made chronic absenteeism one of its new metrics. For too long, the blame game has dominated, leading to the absurd claim that better instruction and principal leadership could go far toward persuading students who face structural barriers to attendance to come to class much more often. If we followed the wisdom of the Johns Hopkins Everyone Graduates Center and invested in early warning systems for fighting truancy, our high-challenge schools would benefit.
Time in class remains crucial metric
The outcomes of the schools that I know best are greatly determined by the time students spend in class. Belle Isle and Classen middle schools are unquestionably great. They serve few poor kids, but – perhaps more importantly – they have mobility rates of only 8 percent and 9 percent, respectively. Belle Isle students attended class 174 days last year, and Classen students attended 171. Conversely, Douglass and Centennial middle schools have low-income rates of 100 percent and very low passing rates on tests. The main point, however, is that Douglass has a mobility rate of 67 percent, and its students are enrolled for an average of 124 days. Centennial has a mobility rate of 64 percent and an average enrollment of 134 days.
High-performing charter middle schools follow the same pattern. Their mobility rates range from 2 percent (Santa Fe South) to 21 percent (Harding Prep), and their days in class range from 156 (Harding Fine Arts) to 172 (Harding Prep). (For some reason, a significant number of metrics for charters aren’t reported.)
The same dynamic applies to neighborhood high schools. Their mobility rates range from a low of 41 percent (Grant) to 67 percent (Douglass), and average days enrolled range from 124 (Douglass) to a high of 144 (Grant and John Marshall).
Marshall is clearly improving, and Santa Fe South H.S. is an excellent school. I don’t want to make this sound like an accusation, but the 97 percent low-income Centennial scores are pretty close to those of the 72 percent low-income Marshall and better than Santa Fe South’s (100 percent transfer, only 6.4 percent special education and a 0 percent mobility rate). That charter doesn’t face anything like the challenges of my old school, with its 59 percent mobility rate and average enrollment of 136 days. (Last year, Santa Fe South H.S. reported an average enrollment of 190 days.)
Were I to borrow the spin of reform publicists, I’d phrase the finding: Once the lowest-ranked school in the state, Miracle School nearly matches high-performing charter school in only 136 days!
Three takeaways from the OKCPS 2017 Statistical Abstract
First, we need to acknowledge the hard fact that our primitive student-performance data mostly reveal what ZIP codes the test-takers come from. To increase learning in urban districts, teachers and administrators must ignore the scoreboard and focus on blocking and tackling (i.e. teaching, learning and relationship building).
Second, we must realize that the issue we’re worrying over is largely a matter of wordsmithing. The new state tests are comparable in difficulty to the reliable NAEP national test. It’s a shame that, decades ago, the NAEP adopted the term “proficiency” for its benchmark of mastery of “challenging subject matter.” That bar is higher than being on grade level. As the National Superintendents Roundtable and Horace Mann League explain in How High the Bar?:
In no nation do a majority of students meet the NAEP Proficient benchmark in Grade 4 reading.
The misunderstanding of NAEP metrics is used to attack American schools as broken when, “… far from failing, the U.S. ranked fifth among the world’s 40 largest and wealthiest nations in Grade 4 reading.”
Last, business leaders who praise the John Rex Elementary charter school (37 percent low-income) should note that its mobility rate was 0.1 percent. North Highland, which feeds into Centennial, has a mobility rate of 56 percent, and its average student is enrolled only 133 days.
If Rex and North Highland swapped faculties, would outcomes change noticeably?
Better data, better schools
Again, the district’s Statistical Abstract should serve as a reminder that test scores are hopelessly inadequate for accountability purposes. If we were to show the professionalism that the SDE has, end the blame game and use data for diagnostic purposes, the OKCPS report could be turned into an invaluable tool for school improvement.