Historically, schools have employed testing as a way to measure students’ performance and have used test results to inform parents about their children’s progress. In recent times, Ohio has become one of many states that have established a "high-stakes" testing program. Ohio’s now-infamous proficiency tests have become a lightning rod for criticism from parents, educators, and others who believe there is too much testing and that current tests do not measure the right things.
The perception is spreading that teachers are teaching to the test and centering their lessons on the proficiency test itself, at the expense of a more balanced program of study. Teachers have tried hard to make proficiency testing work, but even teachers and their unions have formally protested the tests after seeing the impact on students.
How are the tests constructed, implemented and interpreted? Are they valid? Well-documented flaws in the tests appear to undermine the entire standards process.
Evidence of a lack of test validity is mounting. Two years ago, the results of the fourth-grade proficiency test in mathematics indicated that only four of Ohio’s roughly 600 school districts could boast that half of their students passed the test. In school systems in Bexley, Granville and Upper Arlington, where college-entrance examinations historically have placed high-schoolers in the top 2 percent in the country in mathematics, the passing rates were unbelievably low and lacked credibility.
On the Ohio test, more than half of the top academic achievers in the state could not pass muster. The process created animosity between school systems and the state and undermined the self-confidence of some of the state’s finest students. It is difficult to believe that the education process in Ohio has become an inverted Lake Wobegon, where the majority of our students are below average.
At least one college professor, Randy Hoover of Youngstown State University, concluded that the testing is "immoral, unethical and irresponsible. They absolutely should get rid of it because it does not measure academic achievement or, if it does, it’s by accident."
Hoover and my colleague Steven Sundre of SchoolMatch independently conducted similar studies. They concluded that more than 60 percent of a child’s performance on the Ohio proficiency tests could be predicted by socioeconomic factors, such as the education level of the mother.
"We are spending millions of dollars and wasting a tremendous amount of time and effort," Sundre said, "in designing a system that simply mirrors the census tracts."
Every reputable study has found that socioeconomic status accounts for an overwhelming proportion of the variance in test scores when different schools, school systems and states are compared. The level of education in the community, the number of children living in poverty and the income level of residents continue to be the best predictors of differences in the cognitive levels of school-age children.
The high-stakes testing in Ohio requires students to pass reading, writing, math, citizenship and science proficiency tests in order to graduate. What are these test results actually worth? What is the basis for comparison? The tests really tell what we already know: Disadvantaged children need more resources and time to develop their academic skills.
Yet Ohio law demands that the Ohio Department of Education continue to consume forests of trees and invest already scarce dollars in distributing report cards based on what appear to be seriously flawed tests; those resources could be better spent helping students learn.
State assessments aren’t about to go away, especially since President Bush and even many Democrats are advocating more, rather than less, testing. Late last year, Gov. Bob Taft appointed a commission to review student academic performance. It recommended a "sweeping overhaul of proficiency tests and creation of new ‘in-between’ grades in which students who don’t pass receive intensive tutoring."
Obviously, the state needs either to improve the proficiency-testing program vastly or to abolish it, so that all students receive a fair and impartial assessment.
If we do not abolish the program, Ohio officials need to change the process radically.
A critical factor is ensuring that all students exposed to proficiency tests have school-provided instruction in exam-taking skills before their performance is measured. If the future of our state’s young people depends on promotion and exit examinations, then the test-taking skills of the students and the content of the proficiency exams should be pertinent to all, not just to those who can relate to the questions being asked.
Unfortunately, more often than not, behind standards legislation and new initiatives lies a gross misunderstanding of teaching and learning. Experts know the purpose of testing should be to diagnose students’ knowledge and skills in order to guide instruction and enhance learning. For some reason, states have decided that time-proven achievement tests, such as the Iowa Test of Basic Skills and the Stanford 9, were not good enough and that customized tests unique to the political boundaries of the state are more important.
The growing pains in the development of these tests have created undue stress for students, teachers and parents. Rather than punishing students, proficiency testing should be redesigned to reward them for their efforts.
is Distinguished Research Professor at the University of Dayton and is President & Chief Executive Officer of SchoolMatch®, a Columbus-based educational auditing, research, data