"Apples to Apples--Using 'Mean-Matching' to Compare Schools." By William L. Bainbridge.
School and College. November 1992.
APPLES TO APPLES: USING "MEAN-MATCHING" TO COMPARE SCHOOLS
By William L. Bainbridge, Ph.D.
It seemed inevitable. Rarely did a meeting of the school board go by when comparisons weren't drawn between our district and others. The problem, as I saw it in those days as superintendent of a "county seat" system, was that none of the comparisons was fair.
The housewife would suggest that perhaps we should be doing things more like the Uppity Hill schools, which had nearly twice our taxable wealth per pupil. The businessman sitting on the board would make comparisons with a rural district which had no distinguishable disadvantaged students and bused all of its handicapped kids to us. The board veteran would talk about a program he had heard of at a convention that was then being implemented by the Metropolis #1 district, which had nearly five times our student population and twenty times our central office staff.
These well-meaning people were trying to improve our schools. The comparisons of settings, however, served only to frustrate a young "change agent" administrator. The teacher union would create proposals based upon districts of comparable size with no regard for resources. We were a blue-collar city district in the middle of a rural county. We had little in common with any of our neighbors. They had no understanding of the complexities that our rapid turnover rate and high numbers of youngsters on ADC and the free lunch program brought. Nor did they often deal with the pressures that a daily newspaper and electronic media present.
One of our contiguous districts had parent education levels among the highest in the country, and the superintendent would constantly point to its high scholastic examination scores. Our central administrators frequently joked that the students in that college town would outscore us on the exams even if they didn't have a school!
Meanwhile, the area representative for the computer manufacturer was equally frustrated. Her district manager continued to slap quotas in her direction that were comparable to those of her counterpart on the other side of the district. Her pleas to the head sales honcho that the schools in her area just didn't seem to have the motivation or resources of those in her counterpart's territory continued to fall on deaf ears. The fact was that her colleague had been assigned to school systems with high expectations and moderately high state and local revenues. It was just a whole lot tougher to sell a new computer system in a place where the business people and parents didn't yet use them in their daily lives.
These scenarios are not unusual. At the higher education level, differences are readily understood. Everybody knows that M.I.T. and the local community college have student bodies and resources that are not comparable. Television sit-coms even highlight the differences between the fictitious campuses of "Minnesota State" (COACH) and "Hillman College" (A DIFFERENT WORLD). Regrettably, many people like to compare elementary and secondary schools as if they could all be plucked at the same time from the same apple tree. We all know there are vast differences which have a lot to do with student performance, economies of scale and operations in general.
A few years ago, Dr. M. Donald Thomas, highly respected former Superintendent of Schools in Salt Lake City (UT), identified the problem and a solution in the course of conducting educational effectiveness audits for various school districts. When making a presentation to a school board, Don would focus on goals for school improvement based upon performance of "mean-matched" schools.
Thomas explained that schools should be compared with those that have similar characteristics. "You don't want to compare apples to oranges," Thomas said. He explained that schools should be compared in terms of socioeconomic status (family incomes), parent education level, readiness scores for students in kindergarten, dollars expended per pupil on instructional materials, salaries and experience of the teachers, etc.
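The idea behind "mean-matching" can be sketched in a few lines of code: standardize each characteristic so that dollars per pupil and parent education years carry comparable weight, then rank candidate districts by how close they sit to the target district. This is a minimal illustration only — the field names and numbers below are invented, and nothing here describes SchoolMatch's actual proprietary methodology.

```python
def mean_match(target, candidates, keys, n=3):
    """Rank candidate districts by similarity to the target district.

    Each characteristic is standardized across all districts, then
    candidates are sorted by Euclidean distance from the target.
    """
    districts = [target] + candidates
    # Per-characteristic mean and standard deviation for standardization.
    stats = {}
    for k in keys:
        vals = [d[k] for d in districts]
        mean = sum(vals) / len(vals)
        sd = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5 or 1.0
        stats[k] = (mean, sd)

    def distance(d):
        # Distance in standardized units; the means cancel in the subtraction.
        return sum(((d[k] - target[k]) / stats[k][1]) ** 2 for k in keys) ** 0.5

    return sorted(candidates, key=distance)[:n]


# Hypothetical data echoing the districts in the article.
keys = ["median_income", "parent_ed_years", "dollars_per_pupil"]
us = {"name": "Our District", "median_income": 31000,
      "parent_ed_years": 12.4, "dollars_per_pupil": 4800}
others = [
    {"name": "Uppity Hill", "median_income": 62000,
     "parent_ed_years": 15.8, "dollars_per_pupil": 7900},
    {"name": "Mill Town", "median_income": 30500,
     "parent_ed_years": 12.2, "dollars_per_pupil": 4700},
    {"name": "College Town", "median_income": 45000,
     "parent_ed_years": 16.5, "dollars_per_pupil": 6200},
]
matches = mean_match(us, others, keys, n=1)
```

On these made-up numbers, the blue-collar Mill Town district — not wealthy Uppity Hill — comes back as the fair comparison, which is the whole point of matching apples to apples.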
Initially, Don developed his own databases on three states where he had been relied upon by numerous districts for audits of educational effectiveness. Subsequently, he began using our SchoolMatch databases in order to bring a national perspective and to fine-tune the selection of "comparable" districts.
In his missionary fashion, Don has converted a significant number of school leaders to the "mean-match" concept. More recently, forward-thinking vendors of school products have begun to target their school market by identifying schools and school systems similar to ones in which they have been successful. School systems have used the "mean-match" system to identify "sister districts" around the country with which they can compare programs and results. They seem to be relieved when the "mean-matched" districts are not in their local newspaper's circulation area.
My colleague, Steve Sundre, and I have had the pleasure of joining Don on some of these audits of educational effectiveness. We have been extremely impressed with the way in which school boards, administrators, teacher groups and communities have reacted to Don's expertise and commonsense approach. The comparisons with similar districts sometimes bring a completely new tone to attitudes about district performance.
Pressures on administrators to provide school boards, corporate benefactors and community members with quantitative measures of comparison between districts continue to grow. People want to know how well the school district is doing. Many systems have been resistant to releasing comparative information that may show their efforts in a less than favorable light. The "mean-matching" system is one way to draw fairer comparisons that could be less controversial.
Dr. Bainbridge heads SchoolMatch, a Columbus, OH, research firm assisting corporations with school data and consulting services.