TAKS grade inflation is nothing new
By RICK CASEY
June 13, 2010, 5:47PM
When my colleague Ericka Mellon broke the story this week that the “passing” mark for some parts of the TAKS test had been altered to as low as 44 percent this year after the results came in, some political and educational leaders expressed concern.
Some even suggested that politics was involved, a suggestion that Education Commissioner Robert Scott called “silly.”
“It’s ridiculous in that the people that make the decision on the cut scores are not political operatives,” Scott told the Austin American-Statesman.
After a careful review of the data, I’m with Scott.
The evidence strongly indicates that the bar wasn’t lowered because, for example, Gov. Rick Perry’s sunny portrayal of the condition of Texas this election year needs rose-tinted tests.
No. We’ve been looking through rose-tinted glasses for years.
The reality is that Texas has set the TAKS bar exceedingly low going back at least to 2003, following a consistent policy of serious grade inflation on our high-stakes tests.
Since that time Dr. Paul E. Peterson, a Harvard political scientist who directs the university’s Program on Education Policy and Governance, has rated the tests used by various states.
His method is straightforward. Every other year, a national test is given to a representative sample of fourth- and eighth-grade students in all 50 states. The National Assessment of Educational Progress (NAEP) is a tough test, for the simple reason that it is designed to match the level of difficulty of tests used by the industrialized nations that belong to the Organization for Economic Co-operation and Development.
By contrast, under the U.S. No Child Left Behind law, each state gets to define “proficiency” for itself.
Peterson’s staff looks at what percentage of students in each state received passing grades on the NAEP and compares it to what percentage passed their individual state tests.
Gauging the gap
No states consistently score better on the NAEP than on their state tests, but some do quite well. Massachusetts students, for example, scored better on the NAEP than on their state tests in math, though they did worse in reading, especially in eighth grade.
Peterson’s group ranks the states not on how well their students do, but according to how big the gap is between what percentage of their students pass the NAEP and what percentage are deemed proficient according to their state tests. Massachusetts tops the list, followed by Missouri, Washington, Hawaii and New Mexico to round out the top five.
And where is Texas? Well, let’s just say we can be thankful for Illinois, Michigan, Alabama, Nebraska and Tennessee.
Texas vs. the world
In 2009, only 38 percent of our fourth-graders rated as proficient in math and 28 percent in reading on the NAEP.
But 86 percent passed the TAKS in math and 84 percent in reading.
The eighth-grade results were similar. On the national test, 36 percent passed in math and 27 percent in reading.
But on TAKS, 79 percent passed the math portion and a stunning 93 percent passed the reading test — more than three times the percentage that passed the national test.
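For readers who want to see the arithmetic behind those comparisons, the gap can be sketched from the 2009 figures cited above. This is only an illustration using the column’s numbers, not the actual methodology of Peterson’s staff:

```python
# Illustration of the pass-rate gap using the 2009 figures cited
# in this column. The subtraction is a simple sketch, not the exact
# calculation used by Peterson's group.

scores = {
    # subject: (TAKS pass %, NAEP proficient %)
    "4th-grade math":    (86, 38),
    "4th-grade reading": (84, 28),
    "8th-grade math":    (79, 36),
    "8th-grade reading": (93, 27),
}

for subject, (taks, naep) in scores.items():
    gap = taks - naep  # points by which the state test exceeds NAEP
    print(f"{subject}: TAKS {taks}% vs. NAEP {naep}% -> gap of {gap} points")
```

Run this way, the eighth-grade reading gap comes out at 66 percentage points, the widest of the four.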
Let me be clear. Some states, such as Washington, didn’t do that much better on the national test. They just didn’t mislead their citizens by making their state test a lot easier. New Mexico students do even worse than Texas students on the NAEP, but roughly half also fail their state tests.
Could it be that the sample of Texas children who took the national test this year was poor? No. We ranked second from the bottom in 2003, 12th in 2005, eighth in 2007 and sixth last year.
“States will claim they have their own standards of proficiency, and that’s fine,” Peterson said Friday. “But the students in Texas are going to have to compete with the rest of the world.”
Noting that leaders don’t like to face up to the fact of low performances, Peterson said, “Texas is one of the states that is trying to obscure that to their citizens.”