The Kentucky Department of Education sent out News Release 15-051 yesterday, which contains some inaccurate information about how the state’s KPREP test compares to the National Assessment of Educational Progress (NAEP). While the release claims “Kentucky Among Handful of States with Reliable Test Scores,” that might be stretching things.
To begin, the term “reliable” has a specific meaning when we are talking about tests. This gets technical, but you can read about the formal meaning of test reliability here if you want. In any event – unless I missed something – the news release does not refer to any formal determination of test reliability for the Kentucky Performance Rating for Educational Progress (KPREP) tests. In fact, I don’t know whether any formal reliability studies have been completed for KPREP. Perhaps the media staff at the department just made an unfortunate choice of terms, but the news release needs a correction.
There are more problems, because several claims in this release are math-challenged.
For example, someone solved their Common Core number line problem wrong. The news release talks about a comparison of NAEP and KPREP scores for the 2013-14 school term. Well, that’s wrong. The most recently available state NAEP math and reading results come from the administration in early 2013, which fell during the 2012-13 school year. Comparing 2013-14 KPREP scores to 2013 NAEP scores is a bit apples to oranges, because different students took the two tests. Furthermore, such a cross-year comparison is unnecessary: we have 2013 KPREP results for the same Kentucky fourth and eighth grade students who took the NAEP in 2013.
There is another, more involved math error, as well. It has to do with this quote from Kentucky Commissioner of Education Terry Holliday in the news release:
“The report verifies the increased rigor of our assessments; statistically, we are well within NAEP standard error of measurement.”
Let’s explore the problem with the commissioner’s statement.
The NAEP is a sampled assessment, so its scores carry plus and minus errors, just like the margins of error we hear about all the time in sampled polls of likely voters during elections. Fortunately, the people who create the NAEP publish the information needed to calculate those errors. I used that information to find out whether the commissioner was right in claiming Kentucky’s 2013 KPREP math and reading scores fell within the plus and minus error range for the NAEP.
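The check described above can be sketched in a few lines of code. NAEP publishes a standard error alongside each reported score or proficiency rate, and a conventional 95 percent confidence band is the reported rate plus or minus 1.96 times that standard error. The numbers below are made up purely for illustration; they are not the actual 2013 Kentucky figures.

```python
# Sketch of the "within NAEP standard error" check.
# NAEP reports a standard error (SE) with each proficiency rate; the usual
# 95% confidence interval is rate +/- 1.96 * SE. All figures below are
# hypothetical, chosen only to illustrate the arithmetic.

def naep_interval(rate: float, se: float, z: float = 1.96) -> tuple[float, float]:
    """Return the (low, high) 95% confidence band around a NAEP rate."""
    margin = z * se
    return rate - margin, rate + margin

def within_naep_error(state_rate: float, naep_rate: float, naep_se: float) -> bool:
    """Does the state test's proficiency rate fall inside the NAEP band?"""
    low, high = naep_interval(naep_rate, naep_se)
    return low <= state_rate <= high

# Hypothetical example: NAEP proficiency of 41% with an SE of 1.2 points,
# against a state-test proficiency of 49% -- an 8-point gap.
print(within_naep_error(49.0, 41.0, 1.2))  # False: 49 is outside 41 +/- 2.35
print(within_naep_error(42.0, 41.0, 1.2))  # True: 42 is inside the band
```

A gap much larger than roughly two standard errors, as in the first call, cannot plausibly be chalked up to sampling error alone, which is the point at issue in the commissioner’s claim.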
This simplified table shows what I found.
In almost every case, even after adding the maximum likely error correction to the NAEP scores, the gap between the proficiency rates on that federal test and on KPREP is on the order of 10 percentage points. To be sure, that is much better than we had with the old CATS tests, but parents still need to be wary if their child’s scores are only a little above the KPREP proficiency minimum. That could indicate their child is going to have problems in college.
By the way, parents of high school students are probably somewhat better off. Under KPREP, high school students currently are tested with products developed by ACT, Inc., so the results should be on target with the actual ACT test. But the NAEP indicates this is small consolation for parents of younger students, some of whom may not really be on track even though they are getting Proficient scores on KPREP.
To be clear, Kentucky’s KPREP is certainly more rigorous than the old CATS assessments. But, comparisons to the NAEP show KPREP may not be rigorous enough. It may not be safe to assume your child is on target just because he or she gets a proficient score in KPREP. And, most KPREP tests are not as rigorous as the NAEP.