In the space of just two days, Kentuckians have been treated to not one, but two separate examples of how hard it is to find really incisive research about how our commonwealth’s public school system truly performs.
The first example came during Monday’s “Kentucky Tonight” show on KET:
At 5 minutes 30 seconds into the broadcast, Brigitte Blom Ramsey of the Prichard Committee said Kentucky had been at the bottom on education indicators in the 1980s but has now risen to the middle of the pack on indicators like the National Assessment of Educational Progress (NAEP).
Ramsey’s conclusion appears correct, but only if you limit your view to overall average scores without drilling down for additional context. Because student demographics vary greatly across the states, that limited approach winds up comparing a whole lot of white students’ performance in Kentucky to a lot of minority students’ performance in other states. It is misleading, giving Kentucky a very unfair advantage.
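To see how an overall average can flatter a state this way, here is a minimal sketch. All of the numbers in it are invented for illustration only – they are not actual NAEP results – but they show how a state can “win” on the combined average while trailing in every demographic group:

```python
# Hypothetical NAEP-style scores illustrating why overall averages mislead.
# Every number here is invented for illustration; none are real NAEP results.

def overall_average(groups):
    """Weighted average across demographic groups given as (share, score) pairs."""
    return sum(share * score for share, score in groups)

# "Kentucky-like" state: a larger white enrollment share
ky = {"white": (0.80, 280), "minority": (0.20, 260)}
# Comparison state: more diverse, but higher scores in EVERY group
other = {"white": (0.50, 285), "minority": (0.50, 265)}

print(round(overall_average(ky.values()), 1))     # 276.0
print(round(overall_average(other.values()), 1))  # 275.0
# The "Kentucky-like" state comes out ahead overall despite scoring
# lower in both subgroups -- the classic Simpson's paradox pattern.
```

The demographic mix alone produces the reversal, which is exactly why the breakouts by race matter.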
To get a better picture about how Kentucky education really performs, you have to break scores out by race and then do comparisons. When you do that, as you can read in my Tuesday blog, “Kentucky’s real progress (?) on the National Assessment of Educational Progress – Obviously needed update,” the picture changes drastically.
For example, it turns out that Kentucky’s white students in 2017 statistically significantly outscored whites in just two other states on NAEP Grade 8 Math. That, as Figure 1 illustrates, isn’t middle of the pack.
Get an even better handle on how Kentucky really performs on NAEP Grade 8 Math by looking at my blog above or just using the search term “NAEP” in our blog’s search feature to learn more.
So, Ramsey didn’t give you the full, right stuff about Kentucky’s educational performance. But, to my surprise, she was joined Wednesday by another group that doesn’t understand that student demographics and careful handling of the data must be considered if you want to fairly rank Kentucky’s education against other states.
The new entrant in the incomplete-at-best research category is the Pegasus Institute, which posted a blog titled “Kentucky has below average ACT scores, but there’s a catch . . .” on August 15, 2018.
Well, there is indeed a catch or two here, but it involves Pegasus’ very incomplete analysis. To see what’s going on, just click on the “Read more” link.
There are a couple of notable problems with this Pegasus blog, not the least being a dismal lack of references. In this day when hyperlinks are king, the article offers not one reference for all of its numbers. And, that’s where the trouble begins.
Pegasus actually seems to draw from two different sets of ACT data, one set provided by the ACT, Inc. itself, and a second set of data on Kentucky’s 11th grade ACT test results, which comes from the Kentucky Department of Education’s web sources.
To begin, reports issued directly from the ACT itself include results for all students in each state – public, private and homeschool combined. Also, these ACT, Inc. reports cover final ACT scores for high school graduates.
It’s important to note here that private and homeschool students in Kentucky, at least, score notably higher than public school students do. We’ve tracked that for years at the Bluegrass Institute, and the 2017 Composite Score difference was quite large – 19.7 for the public school graduates versus 23.2 for the non-public school students (see our latest Excel spreadsheet with this information by clicking here).
Even though there aren’t large numbers of private and home school students in Kentucky, adding in their better performance pulls up the Kentucky overall graduate averages that ACT, Inc. reports.
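As a rough sketch of that arithmetic – using the 2017 public (19.7) and non-public (23.2) Composite figures cited above, but with a purely hypothetical 10 percent non-public share, since the actual proportion isn’t stated here – even a small high-scoring group visibly lifts the combined average:

```python
# Weighted-average sketch of how a small non-public cohort lifts the
# statewide graduate ACT Composite. Group scores are the 2017 figures
# cited in the text; the 10% non-public share is a HYPOTHETICAL value
# chosen only for illustration, not an official enrollment figure.

public_score = 19.7     # 2017 Kentucky public school graduates
nonpublic_score = 23.2  # 2017 private/homeschool graduates
nonpublic_share = 0.10  # assumed share of graduates -- illustrative only

overall = (1 - nonpublic_share) * public_score + nonpublic_share * nonpublic_score
print(round(overall, 2))  # 20.05
# Blending in the higher-scoring group raises the reported statewide
# average about a third of a point above the public-school-only figure.
```

A gap of a few tenths of a point is exactly the scale of the discrepancies discussed below, so mixing the two reporting bases is no small matter.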
Also important to note, many students take the ACT multiple times before graduation and usually improve their scores, sometimes notably, as a result. In general, ACT testing conducted in the 11th grade will show lower scores than tests taken later, say in the fall of the 12th grade.
For example, according to ACT data in the 2015-16 Kentucky School Report Cards for the state, the 11th grade public school ACT Composite that year was 19.5. One year later, in its News Release 17-114, the Kentucky Department of Education showed that the 2017 Kentucky public high school graduates’ composite score was 19.7, 0.2 points higher.
In notable contrast to reports from ACT, Inc. and other reports dealing with high school graduates’ average scores, the Kentucky Department of Education’s reports on school district level results include only scores from a state-financed, single administration of the ACT to 11th-grade students and only include public school students.
Put all of this together, and it clearly isn’t appropriate to compare graduates’ ACT scores to scores from a single ACT administration given in the 11th grade.
And, it certainly is misleading to compare scores for all graduates combined – public, private and homeschooled – to those for public schools only without any explanation of what is being presented.
But, these are things Pegasus apparently did, though, again, they didn’t provide any references.
For example, the last table in Pegasus’ blog shows scores that my spot checking indicates are for 2015-16 ACT 11th grade testing from five Kentucky school districts. But, Pegasus’ table compares those 11th grade district scores to scores for the state which match ACT, Inc.’s report of overall average scores for all high school graduates – public, private and homeschool – for 2017 and DO NOT match the actual Kentucky public high school graduates’ 2017 scores from the department’s News Release 17-114. In fact, as expected, the statewide scores Pegasus tries to compare to the district scores are uniformly too high by 0.3 to 0.4 point, which short-changes true performance in those districts. And, when we are talking about a test with a maximum score of only 36 points, those differences are not trivial.
But, there are still more troubles with the Pegasus analysis. They never mention it, but, as we have discussed many times before, different student demographics from state to state matter. An apparent win situation can completely reverse once the data gets broken out by race, as we discussed in “How does Kentucky rank against other states on the ACT?” That blog points out how different Kentucky’s student demographics are from most other states and what happens once you break the ACT Composite Scores for the states out by race and then compare.
Once you give the ACT data the more complete attention it needs, Kentucky isn’t in the middle of the pack. In fact, if you look at only those states that have at least three years’ experience testing all their students with ACT, Kentucky is still down near the bottom, as Figure 2 shows.
But, go read “How does Kentucky rank against other states on the ACT?” to get a far more complete picture than you just got from Pegasus and Prichard. And, by the way, we put that “How does Kentucky rank…” blog out almost a year ago.