Kentucky has completed its fourth year of testing with the Common Core State Standards-aligned Kentucky Performance Rating for Educational Progress (KPREP) tests in reading and mathematics. It is time to start asking whether Kentucky’s new tests are showing signs of inflation. This post closes out our examination of this important concern with a look at the KPREP trends in fourth grade math versus Kentucky’s trends on the National Assessment of Educational Progress (NAEP).
First, to briefly review: in Parts 1 and 2 of this blog series we examined the trends in eighth grade reading and math from the KPREP, the NAEP and also ACT, Inc.’s EXPLORE tests. Part 3 looked only at the fourth grade reading results from NAEP and KPREP because ACT offers no fourth grade test comparable to EXPLORE. We found considerable cause for concern about inflation in the KPREP eighth grade reading results and somewhat less concern about eighth grade math. As of 2015 the difference in reported proficiency rates between KPREP and the other tests for eighth grade is at least 12 percentage points. Part 3 shows there isn’t a detectable inflation problem at this time in the KPREP fourth grade reading results, but the KPREP has consistently reported reading proficiency rates more than 10 percentage points higher than NAEP shows.
Now, let’s complete this blog set with a look at Kentucky’s fourth grade math results.
Notice that here, just as we found in the eighth grade reading situation, there has been rather notable inflation in the KPREP fourth grade math scores over time. The KPREP and NAEP started out reporting essentially equal math proficiency rates around 2011-12, but the disparity quickly grew to a 9-point differential by 2014-15.
Essentially, given the sampling error in its scores, the NAEP indicates there has been no notable improvement in Kentucky’s fourth grade math performance since 2010-11. The picture presented by the KPREP is obviously much different. Furthermore, the difference between the two tests’ 2014-15 scores is so large that even the sampling error in the NAEP scores (again, about plus or minus 3 points) cannot hide this problem.
Let’s review some points:
For NAEP versus KPREP:
• In 2015 KPREP reported proficiency rates notably higher than NAEP for both math and reading in both the fourth and eighth grades. The differences run from a 9-point proficiency rate gap for fourth grade math to 18 points for eighth grade reading. All of the differences notably exceed the approximately 3-point statistical sampling error in the NAEP scores. Therefore, all of the 2015 KPREP scores are definitely different from, and higher than, the NAEP results.
• KPREP scores notably diverged from the NAEP between roughly 2011-12 and 2014-15 on fourth grade math and on both eighth grade reading and math. For eighth grade reading the gap between KPREP and NAEP reported proficiency rates rose from about 10 points around 2011-12 to 18 percentage points in 2014-15. For eighth grade math the gap increased from about 11 to 16 percentage points. For fourth grade math the gap rose nine points: essentially no difference in reported rates around 2011-12 grew to a 9-point gap in 2015.
• For fourth grade reading, KPREP already reported considerably higher proficiency rates than NAEP in 2011-12. The difference between NAEP and KPREP for fourth grade reading has not changed much since.
For eighth grade EXPLORE versus KPREP:
• Because neither EXPLORE nor KPREP involves statistical sampling error, and because 2011-12 EXPLORE scores are available, these trends can be analyzed somewhat more precisely.
• For eighth grade reading, KPREP and EXPLORE show very serious score divergence between 2011-12 and 2014-15. There was only a 5-point differential in 2011-12, but by 2014-15 it had nearly tripled to 14 points.
• For eighth grade math the growth in the KPREP-to-EXPLORE score differential is not large – just one point – but it is a definite increase because neither test suffers from the sampling errors found in the NAEP. Furthermore, the score differential between the two assessments was already notable at 11 points even back in 2011-12.
A key point is that, after four years of KPREP testing, we now see enough to begin to be concerned that Kentucky’s state tests are showing some signs of inflation, just as happened with our earlier KIRIS and CATS KCCT tests. To be sure, so far the problems are not nearly as pronounced as the ones that eventually developed with the CATS tests. However, there are indications that action needs to be taken now to ensure scores don’t inflate further in the future.
Furthermore, given that KPREP score improvements to date have generally been small, any inflation contained in those scores could largely negate any claim that real improvement has actually occurred.
Certainly, as Rhonda Sims, Kentucky’s associate commissioner for assessment and accountability, recently told legislators during the December 2015 meeting of the Interim Joint Committee on Education about the recent flatness of many KPREP scores:
“That’s not the pattern that anyone in this room would wish to see.”
If the scores Sims refers to are in fact inflated, Kentucky’s true education performance since the Common Core State Standards took effect might actually be even worse than flat.