Are Kentucky’s KPREP tests starting to inflate? – Part 1

Kentucky has completed its fourth year of testing with Common-Core-State-Standards-aligned Kentucky Performance Rating for Educational Progress (KPREP) tests in reading and mathematics. Because score inflation in the state’s earlier reform testing programs – the first known as KIRIS and the second as the Kentucky Core Content Tests under CATS – proved problematic, it is time to ask whether Kentucky’s new tests are beginning to show similar signs of inflation.

To begin exploring this important issue, I assembled Kentucky’s eighth grade reading scores from the KPREP as listed in each year’s Kentucky School Report Card for the state. The Benchmark Scores from the ACT, Inc.’s EXPLORE tests are also from the Kentucky School Report Cards. I obtained the state’s scores from the National Assessment of Educational Progress (NAEP) using the Main NAEP Data Explorer web tool.

The scores I obtained for these three eighth grade reading assessments are found in the following graph. Let’s see what this information indicates.


Without question, there are disturbing signs that scoring of KPREP reading for the eighth grade, at least, is inflating.

According to the KPREP, between 2011-12 and 2014-15 eighth grade reading proficiency increased in Kentucky from 47 percent to 54 percent, a fairly substantial gain of seven percentage points.

However, in sharp contrast, the ACT’s EXPLORE test’s college readiness Benchmark Scores show that between 2011-12 and 2014-15 Kentucky’s eighth grade reading performance actually declined by two percentage points, from 42 percent to just 40 percent. That is a real decline because the EXPLORE, just like the KPREP, is given to all eighth grade students in Kentucky, so there is no sampling error in the scores.

The story from the NAEP is also problematic, though it takes a bit more work to see. We have to interpolate between the NAEP’s 2010-11 and 2012-13 scores to develop this picture because NAEP reading has only been given in odd years recently. An interpolated NAEP eighth grade reading proficiency rate for Kentucky in 2011-12 would be 37 percent, one percentage point higher than the rate in 2014-15. However, the NAEP is a sampled assessment, so its scores carry plus-or-minus errors. Thus, while the small, one-point difference in the NAEP scores is not large enough to be statistically significant, NAEP eighth grade reading results for Kentucky certainly do not show notable improvement, either.
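The interpolation step above is simple linear interpolation between the two surrounding odd-year NAEP administrations. A minimal sketch, where the endpoint values of 36 percent (2010-11) and 38 percent (2012-13) are assumptions chosen to illustrate how a 37 percent midpoint would arise, not figures quoted in this article:

```python
def interpolate(year, year0, value0, year1, value1):
    """Linearly interpolate a value for `year` between two known points."""
    return value0 + (value1 - value0) * (year - year0) / (year1 - year0)

# Hypothetical endpoints: 36 percent in 2011 and 38 percent in 2013.
# The 2011-12 school year falls midway, so the interpolated rate is the
# simple average of the two endpoints.
estimate = interpolate(2012, 2011, 36, 2013, 38)
print(estimate)
```

Because 2011-12 sits exactly halfway between the two NAEP administrations, the interpolation reduces to averaging the endpoints.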

If we take the NAEP results at face value, the gap between KPREP and our interpolated NAEP proficiency rate for 2011-12 would be 10 percentage points. By 2014-15 the gap in reported eighth grade reading proficiency rates in Kentucky for these two tests nearly doubled to 18 percentage points. That growth in the gap is more than large enough to be a real concern given that the NAEP sampling errors for these Kentucky scores are on the order of plus or minus three percentage points.

Without question, the divergence in what EXPLORE and KPREP are telling us is rather dramatic, especially when you consider that in 2011-12 the score difference was only five percentage points, but in just three years it has nearly tripled to the current 14-percentage-point difference. Furthermore, the NAEP reading results back up what EXPLORE tells us. Neither the NAEP nor the EXPLORE results support the KPREP tests’ claims of progress on eighth grade reading in Kentucky over the time frame covered by the graph.
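The gap arithmetic above can be checked directly from the percentages quoted in this article (the 2011-12 NAEP figure being the interpolated value):

```python
# Proficiency rates (percent) as quoted in the text.
kprep   = {"2011-12": 47, "2014-15": 54}
explore = {"2011-12": 42, "2014-15": 40}
naep    = {"2011-12": 37, "2014-15": 36}  # 2011-12 value is interpolated

for year in ("2011-12", "2014-15"):
    gap_explore = kprep[year] - explore[year]
    gap_naep = kprep[year] - naep[year]
    print(f"{year}: KPREP-EXPLORE gap = {gap_explore}, "
          f"KPREP-NAEP gap = {gap_naep}")
```

This reproduces the figures in the text: the KPREP-EXPLORE gap grows from 5 to 14 percentage points, and the KPREP-NAEP gap from 10 to 18.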

There certainly appears to be inflation creep with the KPREP eighth grade reading scores. Because inflation in the scores played a major role in the demise of both KIRIS and CATS testing, the Kentucky Department of Education needs to pay attention – quickly – to this growing KPREP problem.

Inflation in test scores from the Kentucky Department of Education is nothing new. Since the early days of the Kentucky Education Reform Act of 1990, the state has encountered repeated problems with inflation in scoring on its testing programs.

For example, notable inflation in performance improvement claims became apparent over time in the old Kentucky Instructional Results Information System (KIRIS) tests. The Kentucky Department of Education’s (KDE) “KIRIS School and District Accountability Results for Accountability Cycle 3 (1994-95 to 1997-98)” report (no longer online) showed the percentage of students scoring at or above proficient for elementary school reading was just 8 percent in 1993 but rose amazingly fast to 33 percent by 1998. Meanwhile, the National Assessment of Educational Progress (NAEP) showed the state’s proficiency rate for reading was 23 percent in 1992 but only rose to 29 percent by 1998 (results from NAEP Data Explorer).

Clearly, the rate of progress reported by KIRIS was seriously overstated, and by 1998 KIRIS had lost acceptance as a result.

The replacement for KIRIS, the Kentucky Core Content Tests portion of the state’s now defunct Commonwealth Accountability Testing System (CATS), also inflated badly between 1999 and 2009, when the program was voted out by the legislature.

By 2009, for example, the CATS KCCT middle school math proficiency rate was reported to be 60.92 percent per the “BRIEFING PACKET, STATE RELEASE, Kentucky Core Content Test (KCCT), 2008-2009 Results” report from the KDE (also no longer online). The NAEP said Kentucky’s Grade 8 math proficiency rate was only 27 percent that year, a dramatically lower figure by any measure.

Thus, it was no surprise when the legislature disbanded the CATS in 2009, scheduling the KCCT for the trash heap as soon as a new set of standards and aligned tests could be created.

Moving to the present, Kentucky legislators are starting to worry about progress under our new Common Core-aligned testing program. It may well be that those concerns need to be extended to the tests we are relying upon to tell us how well our students can read and do math.