STUDIES UNDERCUT VALUE OF TEACHER CERTIFICATION COURSE


Date: Saturday, February 26, 2005


National Board Certification for teachers costs them and their districts a lot of time and money, but does very little to identify truly effective teachers. Directly measuring how much students improve in any given teacher's class works better and costs very little.


That's the message brought by George Cunningham, of the University of Louisville, and J.E. Stone of East Tennessee State University in an article, "Value-added assessment of teacher quality as an alternative to the National Board for Professional Teaching Standards: What recent studies say." It is to appear in a book, Value-Added Models in Education: Theory and Applications, edited by Robert Lissitz, but it is available now online at www.education-consumers.com.


The last time I had occasion to write about national board certification, it was because some officials from the group were in Denver and stopped by the paper to try to win me over to their point of view. They didn't, but I was at least willing to concede that evidence might one day prove them right.


It isn't happening. Stone and Cunningham report on a very large study by Dan Goldhaber and Emily Anthony of board-certified teachers in North Carolina. It found their students made slightly larger one-year gains in reading and math than the students of teachers who had never applied for certification or had been unsuccessful. But the differences were so small that they were unlikely to make any practical difference in the classroom. What statisticians call the "effect size" -- the difference between the groups' average gains, expressed as a percentage of one standard deviation -- ranged from 6 percent to 14 percent.


Using the same data, Cunningham and Stone were able to calculate effect sizes for the top 10 percent of teachers, those whose students made the largest one-year gains. For them, the effect size was 128 percent. Or to put it differently, the average gain in reading produced by the top teachers (regardless of certification) was 27 times larger than the average for board-certified teachers. In math, it was 40 times larger.
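For readers unfamiliar with the statistic, an effect size is just the gap between two groups' averages divided by the spread of the scores. A minimal sketch, using the standard pooled-standard-deviation formula and entirely hypothetical gain scores (not the North Carolina data):

```python
# Illustrative only: the gain scores below are made up for demonstration;
# they are NOT the Goldhaber-Anthony data. "Effect size" here is the
# difference in mean one-year gains divided by the pooled standard
# deviation of gains (Cohen's d).
import statistics

def effect_size(group_a, group_b):
    """Difference in means as a fraction of the pooled standard deviation."""
    mean_diff = statistics.mean(group_a) - statistics.mean(group_b)
    n_a, n_b = len(group_a), len(group_b)
    var_a = statistics.variance(group_a)
    var_b = statistics.variance(group_b)
    # Pooled SD: weighted average of the two sample variances.
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return mean_diff / pooled_sd

# Hypothetical one-year test-score gains for two groups of classes.
certified = [12, 14, 11, 15, 13, 12, 14]
comparison = [11, 13, 12, 13, 12, 11, 13]
d = effect_size(certified, comparison)
print(f"effect size: {d:.2f}")
```

On this scale, the study's reported range of 6 percent to 14 percent corresponds to d values of 0.06 to 0.14, against 1.28 for the top tenth of teachers.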


Board certification costs $2,300 and requires up to 200 hours of unpaid work. School districts often pay board-certified teachers thousands of dollars more each year. If there's no significant difference in achievement, all that money is being wasted. Why not skip the intermediary and just directly reward teachers whose students learn more? There are ways to do that which compensate for students' differing abilities.


Because the differences are so small, there is very substantial overlap between board-certified and non-board-certified teachers. Cunningham and Stone estimate that more than 40 percent of those without board certification perform better than the average of those with it, while more than 40 percent with certification perform worse than the average of those without it.
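That overlap figure follows directly from the small effect sizes. A back-of-the-envelope sketch, assuming (as an illustration, not as the authors' exact method) that teacher effects are roughly normally distributed with equal spread in both groups:

```python
# Illustrative only: assumes normally distributed teacher effects with
# equal spread in both groups. For an effect size d, the fraction of the
# lower-mean group scoring above the higher-mean group's average is the
# standard normal tail beyond d.
from math import erf, sqrt

def fraction_above_other_mean(d):
    """Fraction of the lower-mean group above the higher-mean group's average."""
    # Standard normal CDF via erf: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    return 1 - 0.5 * (1 + erf(d / sqrt(2)))

# The study's reported effect-size range, 6 percent to 14 percent of a SD.
for d in (0.06, 0.14):
    print(f"d = {d:.2f}: {fraction_above_other_mean(d):.0%}")
```

For d between 0.06 and 0.14, this works out to roughly 44 to 48 percent, consistent with the "more than 40 percent" overlap Cunningham and Stone describe.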


"It is doubtful," Stone and Cunningham write with deliberate understatement, "that policymakers understood just how much NBPTS-certified teachers overlapped the noncertified group when they agreed to fund the various state and local salary awards for NBPTS certification."


The authors analyze two other small studies that purport to show board-certified teachers perform better, but in both cases the differences are too small to be practically significant.


One by Leslie G. Vandevoort, Audrey Amrein-Beardsley and David C. Berliner, all of Arizona State University, involved 35 teachers board-certified in early or middle childhood education. Of 48 possible comparisons, by grade/subject/year, with a similar group of teachers without board certification, there were 11 that showed a statistically significant difference, all favoring the board-certified teachers, though they were small -- on average 2.45 points on a scale of 65.


But 37 showed no statistically significant difference, 24 of them favoring the teachers with board certification and 13 those without it.


"For reasons not explained," Cunningham and Stone write, "the authors exclude these 13 comparisons as they discuss the results of their study."


Not that it's hard to explain; leaving out the comparisons unfavorable to their desired conclusion allows them to exaggerate the importance of their results. Might not be worth mentioning, except that Amrein-Beardsley and Berliner have pulled this trick before, and I caught them at it.


The other study found that 61 board-certified ninth- and 10th-grade math teachers in Florida produced yearly gains on statewide math tests of 66.70 points, just 1.25 points higher than the gains produced by teachers who were not certified.


Board certification is proving to be an enormously costly boondoggle, entirely based on the assumption that it is possible to identify truly effective teaching by measuring "attributes and qualifications that are said to be predictive of classroom performance," the authors conclude.


"Value-added assessment, by contrast, is a direct measure of classroom performance and, as such, reflects that which is actually accomplished with students."


And given that student test results already exist, they can be analyzed for around $1 a student and $25 a teacher. That's a bargain.