“Super Saver” won the 136th Kentucky Derby today at Louisville’s Churchill Downs. Wouldn’t that be a great description for the commonwealth’s government? But only if it were true.
There’s a smokescreen of allegations that charter schools are leading to school re-segregation.
Now, a University of Arkansas team takes a look at one specific example of the shaky research behind such claims in a new article in the Summer 2010 edition of Education Next, “A Closer Look at Charter Schools and Segregation.”
“A Closer Look” takes strong issue with the methodology of another recent report from the UCLA-based Civil Rights Project (CRP), claiming that study’s “flawed comparisons lead to overstated conclusions.”
In one easy-to-understand example, the Arkansas team points out that if you are looking for data on segregation, you cannot fairly do the job by comparing the student demographics of inner-city Washington, DC charter schools to the demographics of the upscale suburban schools that surround Washington.
You have to compare ‘apples to apples’ by looking at the demographics for typical public schools that serve the same population within Washington, DC. Once that is done, assertions about segregation start to crumble.
The University of Arkansas team points out that throughout its nationwide analysis, the UCLA group makes the same mistake: it compares charter schools, which are predominantly found in inner-city locations with high minority populations, to much broader geographic areas where far more whites are located.
The Arkansas team provides interesting data showing why this is an inappropriate comparison. Once inner-city charter schools are compared to typical public schools that are also in the inner city, most of the racial gaps between charter and traditional public schools disappear. This graph from the Education Next article shows how that works.
The two sets of bars on the left show there isn't much difference in the segregation rates of typical public schools and charter schools in the inner (central) city; both have a very high minority presence. In contrast, the set of bars on the right shows that the flawed method used by the CRP at UCLA identifies less segregation in both systems but also a much larger gap between charters and typical public schools.
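The comparison-group effect described above is easy to demonstrate with numbers. Here is a minimal sketch using hypothetical enrollment figures (invented for illustration, not taken from the article or any real school data) showing how the same charter schools look far more "segregated" when benchmarked against a metro-wide average that includes mostly white suburbs:

```python
# Illustrative sketch with hypothetical numbers (not from the article):
# how the choice of comparison group changes the measured charter/public gap.

def minority_share(schools):
    """Enrollment-weighted minority share for a list of
    (total_enrollment, minority_enrollment) tuples."""
    total = sum(n for n, _ in schools)
    minority = sum(m for _, m in schools)
    return minority / total

# Hypothetical enrollments: (total students, minority students)
central_city_charters = [(400, 380), (300, 285)]
central_city_publics  = [(500, 460), (600, 555)]
suburban_publics      = [(800, 160), (900, 135)]

charter = minority_share(central_city_charters)

# Apples-to-apples: compare charters to public schools in the same central city.
fair_gap = charter - minority_share(central_city_publics)

# Flawed method: compare charters to all public schools metro-wide,
# suburbs included, which drags the "public school" baseline far down.
metro_gap = charter - minority_share(central_city_publics + suburban_publics)

print(f"fair gap:  {fair_gap:.3f}")   # small
print(f"metro gap: {metro_gap:.3f}")  # much larger
```

With these made-up numbers the within-city gap is a few percentage points, while the metro-wide comparison manufactures a gap of roughly fifty points from the very same charter schools.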
Sadly, this argument highlights yet another case where poor education data hampers understanding. The Arkansas team admits that the available data still isn't good enough to do a really high-quality 'apples to apples' comparison, and says that even the gap shown in the left bars of its graph would probably shrink further if better data were available. Such data would allow precise tracking of each charter school student back to the typical public school he or she would otherwise attend.
So, here is another example of how educators are failing to produce the data we all need to analyze how our schools are doing. The lack of that data is leading to smokescreens and confusion about many important education questions including this one about segregation. Considering the huge amount of money we now spend on our public schools, such poor research is simply unacceptable.
The 22 states with right-to-work laws, which protect workers’ economic freedoms, enjoyed an average 38.5 percent increase in jobs from 1990 to 2009. States with right-to-work laws experienced more than twice as much job growth as states that force workers to join unions, including Kentucky. —Bureau of Labor Statistics cited in Buckeye Institute’s “State of the State” report.
The Kentucky General Assembly concluded its legislative session on April 15 without a budget. Gov. Beshear, who’s trying to blame the Legislature, didn’t help matters much by proposing a spending plan containing $780 million in gambling revenue that he knew did not have lawmakers’ support.
Click here to read the latest Bluegrass Beacon.
Education Week says a new National Research Council report shows that the best available data indicates teachers who get certified through alternative programs do just as well as teachers who come from the traditional education school route.
However, the report says a lot more, and it is pretty disturbing. The main reason there are no differences in teachers from different certification routes is that very little is really known about how to best train teachers.
Imagine that: despite years of focus on education reform around the nation, the report indicates that credible research about what works in teacher preparation remains very thin. Basically, all those ed school types are being guided by hunches and guesses, not thoughtful research, because there isn't much thoughtful research.
Anyway, the chaotic lack of knowledge about how to effectively prepare teachers really isn’t news. We have written before about the same issues, including discussing very forthright comments from Arthur Levine, the former president of Columbia Teachers College in New York City. Levine has been decrying the lousy research on teacher preparation for years. He sounds off again in the Education Week article.
But, nothing seems to be changing.
So, somehow, as I read the Education Week article, I can’t help recalling a quote from Kentucky Board of Education Chair Joe Brothers:
“I came on the local (school) board in 1987. What you just said to me is no different than what I heard in 1987. So why should I be hopeful?”
(Comment made at the October 8, 2009 Kentucky Board of Education meeting in Frankfort, Kentucky, after department of education staff briefed the board on still more fad ideas about how to fix our education system. An audiovisual recording of this meeting is online.)
One last point: There is an interesting quote in the Education Week article with a Kentucky connection:
“The research we have on teacher education isn’t up to answering some of the most basic questions that people would like to have answers to,” said panel member Andrew C. Porter, the dean of the University of Pennsylvania’s graduate school of education. “We don’t want to be in the same position 10 years from now.”
Porter was a member of Kentucky’s National Technical Advisory Panel on Assessment and Accountability for many years. That panel was supposed to help make the CATS assessments really functional and valuable.
CATS is now dead, of course, and we never got credible data on teacher performance from it. CATS was never "up to answering some of the most basic questions that people would like to have answers to." But I don't recall Porter ever pointing that out.