Achievement Gaps Closing, but Not Necessarily because of NCLB, says CEP Report
A new report from the Center on Education Policy finds that in most states since 2002, test scores on the NCLB-required reading and math tests have gone up—and in many cases, achievement gaps have closed. While that is good news, CEP makes it clear that such gains cannot necessarily be attributed to NCLB.
NEA urged caution in evaluating the CEP report's findings in its June 5, 2007 news release. NEA President Reg Weaver noted that “we should be cautious and remember that NCLB was not and is not the only education reform effort in place. States and local school boards have been working together for years with teachers and other educators in making significant progress to improve the quality of education that our children receive. If anything, this report should sound an alarm that we are drawing conclusions without all the facts. The report clearly indicates that given the current available data, 'an accurate and complete picture of NCLB is a moving target.'” NEA has been proactive in creating a comprehensive plan to improve NCLB.
Secretary Spellings, of course, used the report to once again declare that “the law is working.” The report on student achievement trends since 2002 is a massive analysis of state test score data for all 50 states. (DC schools didn't submit their test data.) Jack Jennings, president and CEO of CEP, stressed that the report is a quantitative but neutral analysis that does not make policy recommendations (other than the need for better data collection and reporting). The complete report is on CEP’s website.
A day after the release of the CEP report, the National Center for Education Statistics issued a report, Mapping 2005 State Proficiency Standards Onto the NAEP Scales, that compares the percentage of students reaching the proficient level on state tests with the percentage of students reaching that level on the National Assessment of Educational Progress (NAEP).
The report found “the NAEP score equivalents to the states’ proficiency standards vary widely....” However, it concluded: “These results should be employed cautiously, as differences among states in apparent stringency can be due, in part, to reasonable differences in the assessment frameworks, the types of item formats employed, and the psychometric characteristics of the tests. Moreover, there is some variation among states in the proportion of NAEP sample schools that could be employed in the analysis.”
June 5, 2007