By Richard G. Innes, staff education analyst
Kentucky’s recently retired Commissioner of Education, Terry Holliday, spent his final day on the job tooting his own horn to the Kentucky School Advocate about improvements in his “college and career ready” statistics.
The numbers Holliday cites certainly have risen sharply. In fact, what are currently and correctly described as college and/or career ready statistics (hereafter, just college/career ready) rose again in 2015. So did the reported high school graduation rates, as the Kentucky Department of Education gleefully reported at the top of its Media Advisory on the 2015 Unbridled Learning Results.
But is this picture really accurate? Do Kentucky’s schools deserve to be “…congratulated for their continued progress on graduating more students with the skills and knowledge they will need to succeed in the 21st century,” as Interim Kentucky Commissioner of Education Kevin Brown claims?
Sadly, Holliday and others are actually hitting sour notes, using “apples to oranges” comparisons in what they consistently mislabel as the state’s “College and Career Readiness” rates. Aside from presenting numbers that are not comparable over the time period cited, a growing number of people think Kentucky Education’s latest numbers themselves are shaky.
Furthermore, when the college/career ready numbers are tied to those graduation rate figures, it turns out that a gruesomely large proportion of our students are leaving high school with only a hollow piece of paper. Thousands being declared ready are not really getting the educations they need.
How many kids are we talking about? Even if we accept both Kentucky Education’s college/career rates and graduation rate data as accurate, the Bluegrass Institute estimates that more than 40 percent of the students who started the ninth grade with the high school class of 2015 failed to leave school with an adequate preparation for life. Some of those who failed dropped out of high school, but many were socially promoted all the way to an empty diploma.
That would mean out of 51,491 ninth graders who entered high school with the Class of 2015, somewhere around 21,266 of those students either dropped out before graduation or only got a hollow piece of paper at the commencement ceremony. This is an awful lot of kids who are not ready for the demands of life in the new, very technical economy they have been turned loose to survive in.
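For readers who want to check that estimate, here is a minimal Python sketch. It uses only figures quoted in this piece; the 58.7 percent “effective” rate is simply the published 87.9 percent graduation rate multiplied by the 66.8 percent readiness rate, and the variable names are mine, not the department’s.

```python
# Rough arithmetic behind the Bluegrass Institute estimate for the Class of 2015.
ninth_graders = 51_491   # students who entered the ninth grade with the Class of 2015
effective_rate = 0.587   # share leaving school both graduated and "ready" (87.9% x 66.8%)

# Students who either dropped out or graduated without adequate preparation.
left_unprepared = round(ninth_graders * (1 - effective_rate))
print(left_unprepared)   # roughly 21,266 students
```

Small rounding differences in the published rates shift this count by a handful of students either way, but the order of magnitude is not in doubt.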
As a side note, it’s also interesting that Holliday took a shot at the Bluegrass Institute in his departure interview for criticizing his numbers. I guess we made an impression on him for our unwillingness to blindly accept anything the bureaucracy chooses to hand out to the public.
Holliday failed, however, to mention another very important critic of his readiness data – the Legislative Research Commission’s Kentucky Office of Education Accountability (OEA). The truth is that the OEA presented a very extensive research report in December 2014 that raises strong concerns about the validity of Holliday’s readiness numbers.
So, are Kentucky’s citizens getting a truly accurate picture of readiness and graduation success, or not? Read on for the detailed answer.
This blog mostly examines the Kentucky Department of Education’s determination for the “College Readiness” part of the total College and/or Career Ready statistics. The limitation is necessary because there isn’t enough data currently available to investigate whether the “Career Ready” determination is valid. The true answer on the “Career Ready” numbers will ultimately depend upon employers’ reactions to young job applicants who the Kentucky Department of Education claims are ready for a career.
Regarding the “College Ready” part of the equation, back in 2010 – the earliest year of data the commissioner has been citing and the earliest data still being reported in a misleading manner in the 2015 Kentucky School Report Cards – this was the only determination being made by the department. A Career Ready calculation didn’t begin until 2012.
Furthermore, back in 2010 the determination of college readiness was solely based on a student’s scores from the ACT college entrance tests in English, math and reading. The passing scores, called “Benchmark Scores,” were set by the Kentucky Council on Postsecondary Education (CPE). These CPE Benchmarks are not the same as, and in most cases are lower than, other Benchmark Scores actually developed by the ACT, Inc. itself. As the following table shows, the current CPE ACT Benchmarks are 18 for English, 19 for math, and 20 for reading.
Because the ACT scoring scale only goes to 36 maximum, those CPE math and reading Benchmark Scores represent notably lower performance than the Benchmark Scores established by the ACT, itself.
Clearly, the CPE has set rather undemanding minimum hurdles for both reading and math that won’t work well for many colleges.
In fact, the CPE admits its ACT math benchmark of 19 only allows entry into the most undemanding college math courses. These courses are taught below the level of college algebra. Meeting only such a very low standard might not indicate a truly adequate preparation in math at a 21st Century college. The ACT, Inc. sets its own, notably higher Math Benchmark of 22, which indicates a student can probably handle college algebra. Algebra is still a relatively undemanding math course for the college level.
It’s also worth noting that the ACT has a Benchmark Score for science of 23. There is no CPE science benchmark whatsoever. The ACT’s science benchmark indicates a student can handle a course in Freshman Biology.
Thus, the CPE’s Benchmark Scores for math and reading represent a very bare minimum preparation, at best.
In any event, the facts are that the only measure used to determine student college readiness in 2010 and 2011 in Kentucky came from applying the CPE’s Benchmarks to the ACT test. During those years, an entering Kentucky college student who failed to meet one or more of those CPE-determined ACT Benchmark Scores had to take a remedial course in the related area before being allowed to enter a credit bearing course in that area.
Things changed in 2012. Unbridled Learning, Kentucky’s new school accountability program, was launched and two additional ways for students to be declared college ready were added at the same time. The CPE now said it would admit students to credit-bearing courses if they did well enough on either the KYOTE or COMPASS college placement tests. Students who stubbed their toe on the ACT’s English, math or reading tests now could avoid remedial college courses if they scored well enough on equivalent subject tests with either the KYOTE or COMPASS.
In addition, a new Unbridled Learning graduate success category, “Career Readiness” was also created in 2012. Students who met any of the several methods used to determine readiness for a career also were counted as a success in Unbridled Learning’s newly revamped College and/or Career Ready statistics.
Are these measures accurate?
While analysis of the career ready statistics will have to wait, it is already possible to start examining the credibility of Unbridled Learning’s “College Ready” numbers. The OEA does that in some detail in their December 2014 report, in the process raising serious questions about whether the rigor in COMPASS and KYOTE is really adequate.
The figure below (Figure 3.G from the OEA’s report) shows that students who qualified to enter college credit-bearing courses only thanks to KYOTE or COMPASS tended to have very low Grade Point Averages (GPAs) in their freshman year in Kentucky’s colleges. Among those students who qualified using “COMPASS/KYOTE only,” nearly half (46 percent) got GPAs below 2.0 in their freshman year. Told they didn’t need remedial courses, those students have been set up to fail. Thus, the OEA’s report raises notable concerns about the reliability of the COMPASS and KYOTE as real readiness indicators. The error rate from the combined use of these tests looks very high.
Note: the OEA didn’t separate the impacts of the KYOTE from the COMPASS, so it is possible that most of the problems lie with only one of these assessments. Further research is obviously needed in this area because a second-chance assessment for students who don’t do well on the ACT is not an inherently bad idea.
We can learn something more from Figure 3.G.
Figure 3.G shows that 18 percent of the 6,939 students who qualified to enter college in the fall of 2012 – based on their 11th Grade ACT results and the CPE’s very low Benchmark Score hurdles – ran into trouble once admitted, posting GPAs below 2.0. Thus, among these 6,939 prior-year (2011-12) Kentucky high school graduates who met the CPE benchmarks on their 11th Grade ACT tests and were enrolled in Kentucky public colleges in the spring of 2013, about 1,249 probably were not ready.
Using a similar analysis, 503 students from the “ACT Graduate” pool of 2,513 students, plus 863 graduates from the “Combination of at Least One ACT and Compass/KYOTE” pool of 2,537 students, plus 81 students from the Compass/KYOTE pool of 175 students also were not really ready although they had been declared ready in the Unbridled Learning reports.
Thus, overall, Figure 3.G indicates that among the total of 12,164 students in the OEA’s study who were declared college ready by one of the methods listed in Figure 3.G and remained in a Kentucky public college in the spring of 2013, 2,696 students among those declared college ready actually were not well prepared at all.
The 2012 Kentucky School Report Card for the state shows that 18,766 students out of a total graduation group of 43,121 students were declared college ready in that year. This would be a college ready rate of 43.5 percent. However, the material from the OEA’s report indicates this college ready rate is too high by about 22 percent. As such, the real college ready rate for Kentucky’s high school graduates of 2012 would only be about 34 percent, a reduction of nearly 10 percentage points.
More importantly, 2,696 students might have been turned loose in college to fail because they didn’t get the remedial support they needed.
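To make that arithmetic transparent, here is a short Python sketch using only the counts quoted above from the OEA’s Figure 3.G and the 2012 Kentucky School Report Card (the pool labels and variable names are mine):

```python
# Figure 3.G pools: (students declared college ready, students with GPA below 2.0).
pools = {
    "ACT (11th grade)":        (6_939, round(0.18 * 6_939)),  # 18% had GPAs below 2.0
    "ACT graduate":            (2_513, 503),
    "ACT + COMPASS/KYOTE mix": (2_537, 863),
    "COMPASS/KYOTE only":      (  175,  81),
}

total_declared = sum(n for n, _ in pools.values())   # 12,164 students in the OEA study
not_ready = sum(bad for _, bad in pools.values())    # 2,696 probably not ready

overstatement = not_ready / total_declared           # share of "ready" calls that failed
reported_rate = 18_766 / 43_121                      # KDE's 2012 college ready rate, 43.5%
adjusted_rate = reported_rate * (1 - overstatement)  # roughly 34 percent

print(total_declared, not_ready,
      round(overstatement * 100),    # about 22 percent overstatement
      round(adjusted_rate * 100))    # adjusted college ready rate, about 34 percent
```

The adjustment assumes the GPA shortfall rates seen in the OEA’s college sample carry over to the full 2012 graduating class, which is an approximation, not an exact correction.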
The numbers are not comparable
The OEA’s report says still more. As we mentioned above, the OEA points out that the statistics being reported as “College and Career Ready” by the Kentucky Department of Education – the numbers Commissioner Holliday has been citing – have not been computed in a stable and consistent manner over the years from 2010 onward. Thus, a comparison of what the Kentucky Department of Education calls “College and Career Ready” results between 2010 and 2014 (now extended to 2015 in the new Kentucky School Report Card for the state for this year) is not valid. The 2010 numbers were calculated in an entirely different way from the 2012 and later numbers.
All of the readiness information is tied together in Figure 2 below.
In its December 2014 report, in Figure 2.L the OEA presents results from a consistent calculation of college readiness using the current CPE ACT Benchmark Scores to determine the percentage of Kentucky graduates who met the benchmarks in all three subject areas of English, math and reading. Those results through 2014 are shown by the bottom (blue) line in Figure 2. I extrapolated this line to 2015 in the graph, as well.
The middle (orange) line in the graph does not begin until 2012, the first year the data required to compute this statistic became available. The middle line shows the percentage of high school graduates the Kentucky Department of Education reported were ready for college using results from all three tests – ACT, COMPASS and KYOTE – to make that determination. Data to create the middle line comes from student counts listed in the statewide Kentucky School Report Cards’ “Accountability” – “Learners” – “CCR” tab for each year (Note: all report cards are available from menus here).
Again, I extrapolated the trend to 2015. In addition, I added the actual 2015 college ready rate of 58.4 percent to Figure 2. This is based on student count data for 2015 in the CCR section of the newly released 2015 Kentucky School Report Card for the state. This actual data point is connected by the thin black line.
Finally, the top (red) line in the graph shows the “apples to oranges” percentages of students that the Kentucky Department of Education has been reporting as “College and Career Ready” to the public for all years from 2010 onward. These “Apples and Oranges” figures for all of the years are found together in the 2015 State School Report Card under the “Delivery Targets” — “CCR” tab. The tab provides no indication that the numbers are not calculated in a consistent manner across all the years listed. I also added the actual 2015 reported total College/Career Readiness Rate of 66.8 percent from the new school report card. Again, this is connected to the earlier data with a thin black line.
To summarize, the 2010 and 2011 numbers are actually a college ready number only, while the data for 2012 and later are properly titled “College and/or Career Ready” statistics. The 2010 and 2011 data are not comparable to the later years’ data.
What is this data really telling us?
The OEA explicitly points out in their December 2014 report that the only data consistently available across the entire time period Commissioner Holliday has been talking about is shown by the bottom line in Figure 2. There has been a small improvement here, but nothing terribly exciting. Nearly two out of three Kentucky graduates in 2014 were not ready for college per the ACT, even using the CPE’s rather watered down Benchmark Scores.
In sharp contrast, the numbers Commissioner Holliday and many others have been citing are those shown by the graph’s top line. These individuals often inaccurately refer to this information as the percentage of students “College and Career Ready” and make the more serious error of presenting the data as though it is consistently calculated across the period from 2010 to 2015.
Notice that the middle (orange) line in the graph shows a very significant part of the total “College and Career Ready” figures Commissioner Holliday has been reporting to the public actually comes from the college ready data. To date, the “Career Ready” data hasn’t made much of a contribution to the overall figures. And, as the OEA’s report has made clear, the college ready figures shown by the orange line are MUCH larger than the higher confidence data based only on the ACT as shown by the blue line.
Basically, most of the increase in claimed college and/or career readiness (to use the proper term) that the commissioner has been talking about comes from the KYOTE and COMPASS, tests the OEA’s report shows may have dubious validity as true readiness measures. The Council on Postsecondary Education might accept those KYOTE and COMPASS test results, but the students’ actual college GPAs, as the OEA points out, show the council probably does a disservice to a number of students by doing that.
While the commissioner claims great gains, his boasting may be coming at the expense of thousands of students who got turned loose in Kentucky’s colleges without getting the remedial support they needed to survive there.
What about the empty diplomas?
Even if we were to rely on the graduation and readiness numbers the Kentucky Department of Education has been hawking, it is obvious that an astonishingly large proportion of our entering ninth grade students are not leaving high school with the educations they need. Table 2 shows what the Bluegrass Institute is calling the “Effective High School Graduation Rates” for Kentucky’s public high schools. The Effective High School Graduation Rates combine the published high school graduation rates with the college and/or career readiness rates for those graduates, both of which are also listed in Table 2.
The Effective High School Graduation Rate is an important concept, so let me explain how it was developed by considering the latest, 2014-15 data.
According to the new Kentucky School Report Card for the state for 2015, 87.9 percent – that is, 87.9 out of every 100 students – of those who entered the ninth grade in the 2011-12 school year graduated on time from high school in 2015. However, that same report card says only 66.8 percent of those 87.9 graduates were actually ready for what would come next in their lives – either college or a career – even using the undemanding criteria we discussed earlier. That works out to only 58.7 of each original 100 students who entered the ninth grade with the Class of 2015 leaving school ready for the rest of their lives. That is an effective graduation rate – after more than 25 years of expensive KERA reforms – of only 58.7 percent! No one should be proud of that.
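Expressed as a quick Python sketch, the Effective High School Graduation Rate is simply the product of the two published rates (the 2015 figures quoted above):

```python
# Effective High School Graduation Rate = graduation rate x readiness rate.
grad_rate = 0.879    # on-time graduation rate, Class of 2015
ready_rate = 0.668   # college and/or career ready share of those graduates

effective_rate = grad_rate * ready_rate
print(round(effective_rate * 100, 1))   # 58.7 percent
```

The same two-step multiplication produces the Effective High School Graduation Rate for any year in Table 2 once that year’s two published rates are plugged in.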
Don’t forget that the Effective High School Graduation Rates shown above are actually too high. They are based in large measure on the inflated information we are getting from the combined use of KYOTE and COMPASS to compute college readiness. The true effective high school graduation rates are certainly lower than those shown in Table 2.
(Note: For more on the Effective High School Graduation Rate, read “What the new high school graduation and readiness rates really tell us.”)
To sum up
- It is misleading to compare the college readiness (only) data from 2010 and 2011 to data from 2012 and later. The more recent data include a lot more ways for students to be declared college and/or career ready.
- Some of the ways students in Kentucky are currently being declared college ready based on the combined impacts of KYOTE and COMPASS testing appear to be excessively watered down. Students declared ready with this process are producing very low GPAs in college.
- Even using the Kentucky Department of Education’s and the Kentucky Council on Postsecondary Education’s watered down criteria for who is college and/or career ready, it is clear that a large amount of social promotion to a high school diploma is currently under way in Kentucky.
- It is too soon to know if the definitions for career ready status are accurate, and research with Kentucky’s employers is badly needed to confirm this information. However, so far this part of the College and/or Career Ready calculation isn’t adding much input to the final numbers.
- There is nothing inherently wrong about using more than one method to determine college or career readiness, but it is incumbent on Kentucky’s educators to ensure that the methods employed are producing accurate information. Otherwise, students and colleges will be misled about the real level of preparation, and that sets students up to fail.
Richard G. Innes is staff education analyst for the Bluegrass Institute for Public Policy Solutions, Kentucky’s first and only free-market think tank. He can be reached at 859-466-8198 or firstname.lastname@example.org.