– Quick answer – no one is really sure
Over at the Prichard Blog they are happy about a simplistic comparison of the performance of Kentucky’s students with learning disabilities on the federal government’s National Assessment of Educational Progress (NAEP) tests.
Let’s temper that enthusiasm a bit.
As recently reported in Education Week (subscription), the people running the NAEP are uncomfortable about the way they test these kids and how many of them get excluded from the tests. The governing board that oversees the NAEP has been uncomfortable for a decade, ever since I pointed out issues about uneven exclusion rates from state to state on the NAEP 1998 reading assessment.
Right now, no one knows how a number of factors might be impacting the validity of the scores for these special kids, not even the people running the NAEP program.
So, approach those NAEP learning disabled scores with caution – the people running the program do.
There are a number of issues that could impact the validity of State NAEP scores for students with learning disabilities. Here is a short discussion of a few of them.
First, the rules on who gets identified as learning disabled vary from state to state. Thus, for the recent 2009 NAEP fourth grade mathematics assessment, the percentage of each state’s raw sample of students identified as learning disabled varied from a low of 10 percent of all students in the state to a high of 20 percent. That is a very large difference, which raises questions about whether some states over- or under-identify truly learning disabled students.
If kids who really are not learning disabled are unevenly included in the mix in some states, then the learning disabled scores might become inflated. That is especially true if those kids get special testing accommodations for learning disabled students that regular students are not allowed to have.
Second, the percentage of each state’s raw sample that gets excluded because those students are considered too disabled to take the NAEP also varies widely. In some states, only one percent of the entire sample was excluded in the 2009 NAEP fourth grade math assessment, while in others the rate was as high as five percent. Kentucky’s fourth grade exclusion rate on that assessment was three percent, above the national average of two percent.
If you exclude more of your weakest students, your scores are going to get inflated. The problem here is that while we know the percentage of kids excluded varies, no one knows for sure how much that impacts scores. Research to date in this area has not included scientific split sample models and is therefore unconvincing, and the issue remains controversial.
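To see why uneven exclusion matters, here is a toy simulation – with entirely made-up score numbers on a NAEP-like scale, not actual NAEP data – showing how dropping the lowest scorers mechanically inflates a state’s average:

```python
import random

random.seed(42)

# Hypothetical population of 1,000 student scores on a 0-500 NAEP-like scale.
scores = sorted(random.gauss(240, 30) for _ in range(1000))

def mean(xs):
    return sum(xs) / len(xs)

mean_all = mean(scores)            # no exclusions
mean_excl_1pct = mean(scores[10:]) # bottom 1 percent excluded
mean_excl_5pct = mean(scores[50:]) # bottom 5 percent excluded

print(f"No exclusion: {mean_all:.1f}")
print(f"1% excluded:  {mean_excl_1pct:.1f}")
print(f"5% excluded:  {mean_excl_5pct:.1f}")
```

The state that excludes five percent of its sample reports a higher average than the state that excludes one percent, even though the underlying students are identical.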
A third problem involves the testing accommodations used in the NAEP.
Those accommodations can include such things as having people read the tests to the student (not allowed for NAEP reading, but legal on all of Kentucky’s state assessments and on other NAEP assessments), providing extra time, and even providing someone to write down the student’s answers. They can contribute to inflated scores if improperly used. They are also easy to abuse (scribe to student – “Are you SURE that is your final answer?”).
There has been little, if any, real scientific research into whether or not accommodations might be inflating scores. The decision to go completely to accommodated testing in NAEP was a policy decision made without adequate research. That is why, years later, the governing board for the assessment is still troubled about the situation, as Education Week reported.
Accommodations could give Kentucky another unfair advantage. Kentucky’s state assessments have a much longer history of using accommodations in testing than most other states. So our teachers and students have a lot of experience with accommodations that could create an advantage that isn’t present elsewhere. That could corrupt the meaning of our scores, as well. Again, I am not aware of any research on this familiarity issue.
Finally, NAEP is a sampled test, pulling somewhere around 3,000 students in Kentucky for each assessment at each grade level. Because learning disabled students form only a portion of the sample, the relatively small size of the group leads to rather notable statistical sampling errors. What looks like a big score difference might not be.
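Back-of-the-envelope arithmetic shows how a small subgroup widens the margin of error. The numbers below are assumed for illustration only – roughly 300 learning disabled students within a 3,000-student state sample, and a score standard deviation of about 35 points – not actual NAEP statistics:

```python
import math

# Assumed, illustrative values (not actual NAEP figures)
n_state, n_nation = 300, 10000  # learning disabled students sampled
sd_scores = 35.0                # assumed score standard deviation

se_state = sd_scores / math.sqrt(n_state)
se_nation = sd_scores / math.sqrt(n_nation)

# Standard error of the difference between the two subgroup means
se_diff = math.sqrt(se_state**2 + se_nation**2)

# Approximate 95 percent margin of error
margin = 1.96 * se_diff

diff = 4.0  # an observed state-vs-nation gap, in score points
print(f"SE of difference: {se_diff:.2f}")
print(f"95% margin:       {margin:.2f}")
print(f"Significant?      {abs(diff) > margin}")
```

Under these assumed numbers, the margin of error comes out around four points, so a four-point gap between a state and the nation can fall entirely within sampling noise.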
Furthermore, Kentucky’s very non-standard student demographics – our students run about 85 percent white versus a much lower national percentage – sometimes give Kentucky unfair advantages in NAEP comparisons.
The data Prichard used includes both true students with learning disabilities and students in a closely related, but different, category known as the federal 504 program. I think the numbers Prichard cites also include English Language Learners (ELL), which again gives Kentucky an advantage because we have few ELL students here, and they tend to score much higher than ELL students in states with heavier immigration.
I assembled the set of tables below with the NAEP Data Explorer.
Table 1 below separates the scores for students with learning disabilities only (SD), as shown in the orange frame, from the ELL students.
Note that Kentucky’s true learning disabled students (without the 504 plan students included) scored only four points above the national average for similar public school students. Also note that Kentucky has hardly any learning disabled students who are also ELL. In fact, there were so few of them in Kentucky that the NAEP won’t even report their scores.
However, across the rest of the nation, learning disabled ELL students score notably lower than other learning disabled students (3 points lower). That lack of very low-scoring ELL students in Kentucky gives the Bluegrass State an unfair advantage in the comparison of the numbers Prichard cites.
By the way, as the following figure shows (also assembled from the NAEP Data Explorer), the four point score difference for Kentucky versus the national SD students isn’t statistically significant.
Clearly, the real situation here is quite complex and not nearly the same as Prichard’s simple analysis might lead you to believe.
Even the people running the NAEP have doubts.
That is why they have been running research for a decade to try to figure out what is really happening – and these NAEP experts still do not have firm answers.
– How come only a “handful” of Kentucky school districts use it?
Kentucky schools have chronic problems with teaching math. For example, white eighth grade students – they comprise the vast majority of all students here – were recently outscored in math on the National Assessment of Educational Progress by their peers in 42 other states around the nation. Kentucky’s white kids bested whites in only one state, and it wasn’t Mississippi. Mississippi’s whites actually tied us!
In sharp contrast, Singapore’s math program consistently comes out on top every time there is international testing such as the Trends in International Mathematics and Science Study (TIMSS).
Since Singapore math is readily available in a North American edition, and because math instruction here badly needs improvement, you’d think all Kentucky schools would be picking it up.
Well, guess again.
According to the November 2009 “Kentucky Teacher” article titled “Singapore Math digs deeper into mathematics learning (See Page 5),” only a “handful” of school districts in the state are using Singapore Math.
In fact, the article names only two districts, Fayette County and Marshall County, that use this top-notch program. And both have used it for only a short time, even though “Singapore Math has rated tops in the nation in mathematics achievement for more than 12 years,” according to Natalee Feese, the Fayette County mathematics coordinator.
Singapore Math does exactly what state education leaders said was needed during the CATS Task Force meetings in late 2008 and in the discussions that led up to Senate Bill 1 in early 2009. It goes deeper into fewer subjects rather than doing the ineffective “mile wide, inch deep” sort of thing that current math programs in Kentucky tend to do.
Says Marshall County teacher Julie Teague, “Singapore Math teaches each concept to mastery. The philosophy behind the program is that students gain a deep understanding and develop number sense, which will help them be more successful when entering higher-level mathematics classes at the middle school and high school levels.”
Teague adds that the program focuses on problem solving – which is exactly what KERA has supposedly required from day one. She didn’t say it, but Singapore Math also helps kids master fractions, which is absolutely critical to learning algebra and higher-level math.
I hope a lot more districts pay attention to Fayette and Marshall counties. Their programs are too new to show results, but if Singapore Math is competently taught, as it is in other parts of the US and in Singapore itself, we should see some good things happening soon.
Certainly, if Kentucky’s kids are going to successfully compete against the rest of the world, they need what works best worldwide – and it looks like that is Singapore Math.
RE: Your status as a competent, independent adult
This letter is to inform you that, in light of recently enacted policies, you will soon be treated as children. UK has decided that you are not capable of exercising your rights as free-thinking, independent, and responsible people. Henceforth, you are not allowed to use tobacco on the university campus. You read that correctly. You cannot use tobacco at all. Anywhere.
Do not be alarmed. This is as far as it goes. This power will never be used to impose further sanctions on your freedoms as adult citizens.
Signs have been posted around campus to remind you that UK is tobacco free and it is now a “healthy place to live, work, and learn.” Pay no attention to the underlying implications in the message.
Students, staff, and visitors – you are adults but you are not to be trusted with tobacco.
Thank you for your cooperation.
Your babysitter and nanny,