When a doctor says you have a health issue, you might get a second opinion, but when the second opinion confirms the original diagnosis, reasonable people take steps to address the issue. Somehow this basic life lesson seems lost on the Colorado Board of Education.

Recently, several members of the State Board of Education defaulted to the predictable but disappointing hand-wringing and hyperbole about Colorado’s results from the state’s new standardized test, the Partnership for Assessment of Readiness for College and Careers, also known as PARCC.

“If you take these at face value, almost two-thirds of our students have failed,” said board chair Steve Durham, a Colorado Springs Republican. “I have a hard time believing that number.” Durham also said accepting the results as accurate means acknowledging that the state’s education system is a “catastrophic failure.”

Board member Val Flores, a Democrat from Denver, said she’d like to throw PARCC out of Colorado. State board observers have grown accustomed to Flores’ frequent detours from logic or coherence, but this one was more bizarre than most. “Kids need a lot of knowledge and I didn’t see that this test was very knowledge based,” she said. “I think probably an engineer or an accountant worked on the English language portion of it. It wasn’t beautiful; it wasn’t a beautiful test at all.”

It shouldn’t have surprised anyone who pays attention that the scores were low, because PARCC sets a considerably higher bar than the previous state test. And that higher bar was the result of careful and considered study of what Colorado must do to ensure that its students can become productive citizens in an increasingly complex and interconnected world.
Preparing Kids for College
Colorado signed on to PARCC in 2012 after adopting the rigorous Common Core State Standards in 2010. The idea behind the Common Core was to set academic standards high enough to prepare kids for college and to bring states more into alignment with one another. Previously, states like Massachusetts had rigorous standards, while Mississippi’s, for example, were lax. PARCC and its competitor, the Smarter Balanced Assessment Consortium, were designed to measure student progress against the Common Core.

More rigorous standards mean more challenging tests, which result, initially at least, in lower percentages of students meeting a new, higher “proficiency” bar. And that’s exactly what happened. Whereas previous state tests showed proficiency scores in the 50-60 percent range, the PARCC test showed scores considerably lower. In fourth grade, 30 percent of students met or exceeded expectations in math and 42 percent in reading. In eighth grade, 19 percent “met/exceeded” expectations in math and 41 percent in reading.
A Second Opinion
Those seeking a second opinion have only to look at Colorado’s 2015 scores on the federal NAEP assessments (known as the “nation’s report card”), which were in the high 30s for the same grades and subjects.

Board President Durham decided the test must be flawed, saying, “I am not inclined to believe that these results are as dire as they appear, but there is no way for me to assert that with any greater degree of certainty than anyone else on this dais can assert that they are an accurate and true measure of what's going on and that is the fundamental problem with these tests.”

It would be easy to plummet down a philosophical rabbit hole about the meaning of knowledge and truth. Can we ever truly know anything with certainty? And so forth. But let’s keep it simple. The PARCC tests were designed with significant educator input, from Colorado and elsewhere. PARCC items went through multiple layers of review, with educators mapping test questions meticulously against the Common Core State Standards (which Durham and others of his ideological bent also dislike). Here’s what Colorado educator Joanie Funderburk, who helped develop PARCC math tests, wrote in a commentary earlier this year:
Although my work focused on content, there were other groups that reviewed each item through other lenses, such as bias and sensitivity. All in all, each item is reviewed by about 30 educators before becoming eligible for inclusion on the test. This process, along with the evidence-centered design of the test, supports the validity of the test items, with many experts affirming that the item is indeed designed to assess the desired content.
But Are We Getting Better?
The real question is, does PARCC tell us how Colorado’s students are doing, measured against the mark they’ll need to hit for success after high school? More important, does it tell us how to get better? Educators here in Colorado and across the country have spent years building toward this moment of objective truth, and forward-thinking states are using this opportunity to improve.

Kentucky, for example, has been administering Common Core-aligned tests since 2012. In the first year there, test scores also plunged, but they have climbed each year since, and Kentuckians understand that as teachers and students get used to the new standards and curriculum, the scores will go back up.

Durham and company, on the other hand, keep veering toward the ideological extreme, bemoaning the fact that the Colorado attorney general has said that the board can’t allow districts to opt out of the PARCC assessments. “The law requires us to use it and whether we like it or not we have to follow the law,” he said. “Until the policymakers conclude either that this is the right way or this is a flawed process, we are stuck.”

And for the time being, we’re stuck with this State Board of Education.
Alan Gottlieb is a veteran Colorado journalist who has covered education at the Denver Post and as the former editor and co-founder of EdNews Colorado and Chalkbeat. He's currently a freelance writer, editor, and communications consultant at
Write. Edit. Think.