Quality in higher education is an elusive goal. I'm persuaded of the enduring value of a liberal-arts program (I used to say I majored in interesting professors, though really I was a math major), so I'd probably say a college provided a high-quality education if its graduates had broad and deep knowledge across the curriculum, and also the intellectual skills to apply their knowledge in practical situations.

But I'd also have to admit that's a pretty squishy definition. Or rather, to use the somewhat more lofty terminology I learned in my freshman Contemporary Civilization course at Gettysburg College in Pennsylvania, I have no idea how to make it an operational definition, one characterized by a carefully specified method for establishing whether the definition is satisfied.

The Colorado Commission on Higher Education has been struggling with the same definitional problem. In 1996, the legislature passed the Higher Education Quality Assurance Act, assigning to the CCHE responsibility for developing a "quality indicator system" for the state's public colleges and universities. That makes Colorado one of 21 states with such a system, and one of only seven (the others are Florida, Missouri, Ohio, South Carolina, Tennessee and Washington) that tie financial incentives to performance.
The statewide goals specified in the act are "providing a high quality undergraduate education, collaborating with elementary and secondary education, providing workforce preparation and training, using technology to improve both administration and instruction, and providing all services productively and effectively." Nothing to object to there, obviously; the difficulty is coming up with measurable proxies for those squishy goals, especially given that the institutions being measured have very different missions.

The CCHE has just issued its first report, a baseline for the nine statewide measures it has chosen. They are after-graduation performance, undergraduate student success rates, student satisfaction, advising, employer satisfaction, instructional expenses, technology plan, assessment and accountability, and K-12 linkages and teacher preparation. Nothing to object to there, either, yet I can't shake the impression that they fail to capture what's essential about learning.

Still, parents and prospective students will find information here worth pondering. The measure of student success, for instance, is the percentage of students who have graduated or transferred, or are still enrolled, three years after enrolling at the community colleges and six years at the four-year schools. In general, the institutions that require higher scores for admission have higher success rates, and within each institution, higher-scoring students are more likely to succeed. That's not surprising, but some of the details may be.

The commission has set minimum admission standards for the state schools, based on an index score computed from an applicant's high school record (grade point average and class rank) and standardized test scores (SAT or ACT). Each school is allowed to accept up to 20 percent of its freshman class from applicants whose index scores are below the minimum.
At Metropolitan State College of Denver, whose state-set minimum index score is 76, the success rate for students with index scores below 100 is 27.9 percent; for students with index scores of at least 100, it is 47.8 percent. (A student with average SAT scores for Colorado and a GPA of 3.1 would have an index score of 100.) At the Colorado School of Mines, whose minimum index score is 110, the success rate for students whose scores are below 100 is 57.9 percent, while for those whose scores are at least 100 it's 58.4 percent. These are very different populations, and it's not fair to compare them directly, but the anomaly surely warrants investigation.

The CCHE notes that its report summarizes far more detailed data provided by the institutions, which is available on request. I'd like to see the data widely distributed, on the Web perhaps, in the hope that somewhere in it there's a way to measure the miracle that happens between teacher and student.

Linda Seebach is an editorial writer for the News. She can be reached at (303) 892-2519 or seebach@denver-rmn.com by e-mail.