Monday, November 24, 2008

Dazed and Confused - at 500 (UST editorial, The Varsitarian newspaper)

FIRST the good news: UST is back in the Top 500 of the Times Higher Education Supplement-Quacquarelli Symonds (THES-QS) listing of the world’s universities. The bad news: the survey has become even less credible. (??)

In the survey, Ateneo de Manila University pulled ahead of the University of the Philippines to land 254th and become the top university in the country. This is not to disparage Ateneo, but how could an institution that is barely more than an expanded liberal arts college, with only a smattering of degree programs tested by state licensure exams, become the top Philippine university? (not to disparage this editorial, but this sentence does exactly what it is "trying" not to do: look down at Ateneo. mission accomplished)

According to the Professional Regulation Commission (PRC), the best-performing universities are UP and UST, which placed 274th and somewhere in the 401-500 bracket in the Times list, respectively. In coverage and quality of programs, UP and UST are the top universities in the country. But alas, the Times survey does not really grade the quality of programs and graduates. None of its indicators really address these.

UST and UP, which sport well-defined academic portfolios in various areas of higher learning, have branded as “problematic” the indicators which THES-QS employed in its survey. At face value, this argument seems like sour-graping, but their observation is neither old-man whining nor childish nagging. The survey indicators – research quality, graduate employability, international outlook, and teaching quality – may not be problematic, but the methodology is.

The research-quality criterion was hardly defined except through the academic peer review (40 percent) and citations-per-faculty (20 percent) indicators. According to the THES-QS website, only 3,703 of the 190,000 academics asked to do the peer review responded; each selected up to 30 institutions from his or her region(s) considered the best in the following areas: arts and humanities, engineering and information technology, life sciences and biomedicine, natural sciences, and social sciences.

By this token, how then could Ateneo have outscored UST, UP, and De La Salle University (another 401-500 dweller)? The Jesuit-run institution could claim as virtual academic strongholds, at best, the arts and humanities and the social sciences, while the three other universities not only excel in the same areas (between them, UP and UST have the most National Artists and Centers of Excellence in the humanities, for instance) but can also claim expertise and prestige in the more formal sciences and in IT and engineering.

It may be that Ateneo managed to top the three other universities because of its research output, but the kind of output needed to justify “research quality” in the survey was never specified.

As for citations per faculty, how did THES-QS manage to rate the four universities when in fact its data were outdated rather than pertinent for “factoring” the “research performance” of the four universities “against the size of its research body”? Meanwhile, graduate employability was anchored on employer/recruiter review (10 percent). So what professions were checked for employability? Apparently, professions that did not require licensure exams and were outside the more relevant science and engineering fields where UST graduates excel commendably here and abroad.

The key to all of these indicators is perception. Much depends, in fact, on how well a school presents and markets itself to its peers and to the world. As Clarita Carillo, the vice-rector for academic affairs, puts it, “The ranking impacts on the public image of UST.” While it is good that UST is back in the list, its relative distance from its peers should show that it has much catching up to do, not so much in academic excellence as in marketing and public relations.

But even if it improves its public image, it is doubtful that UST will improve its ranking in a survey about which, in Carillo’s very cautious remarks, “reservations (have been made) about its validity and reliability.”

Very telling is the fact that student-faculty ratio became the statistical buttress of the survey’s teaching-quality objective. As the THES-QS said on its website, “the higher the number of faculty per student, the higher the score.” In short, the indicator equates class size with quality instruction. In reality, of course, this does not necessarily reflect the quality of instruction in a university. Ateneo may have fewer students per faculty member, but that is because only the rich can afford its high tuition. (??)

A university’s international outlook, gauged through its numbers of foreign faculty and foreign students (5 percent each), again reeks of numerical irrelevance and perhaps a betrayal of reality. To be sure, who would want to study in a third-world university other than for reasons of geographic and cultural proximity and financial constraints?

The THES-QS said that for each indicator, the highest-scoring institution was assigned a score of 100, and the scores of other institutions were calculated as a percentage of the top scorer’s, a distinction Harvard University attained for the nth time. How could Philippine universities compete with Harvard? As Carillo has noted, the survey did not consider the “local context of universities.”
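For the curious, here is a minimal sketch, in Python, of the normalize-and-weight arithmetic the survey describes. The weights are the ones cited above (peer review 40 percent, citations per faculty 20 percent, employer review 10 percent, international faculty and students 5 percent each); the 20 percent for faculty-student ratio is assumed from the published methodology rather than stated in the editorial, and the two universities and all their raw figures are invented purely for illustration.

# Sketch of THES-QS-style scoring: each indicator is scaled so the
# top scorer gets 100, then the scaled scores are combined using
# the published weights. All raw figures below are invented.
WEIGHTS = {
    "peer_review": 0.40,            # academic peer review
    "citations_per_faculty": 0.20,
    "faculty_student_ratio": 0.20,  # assumed weight, not stated above
    "employer_review": 0.10,
    "intl_faculty": 0.05,
    "intl_students": 0.05,
}

raw = {  # hypothetical universities; ratios are faculty per student
    "Univ A": {"peer_review": 90, "citations_per_faculty": 4.0,
               "faculty_student_ratio": 0.08, "employer_review": 70,
               "intl_faculty": 0.30, "intl_students": 0.20},
    "Univ B": {"peer_review": 60, "citations_per_faculty": 2.5,
               "faculty_student_ratio": 0.05, "employer_review": 85,
               "intl_faculty": 0.10, "intl_students": 0.05},
}

def overall(raw):
    totals = {u: 0.0 for u in raw}
    for indicator, weight in WEIGHTS.items():
        top = max(v[indicator] for v in raw.values())
        for u, v in raw.items():
            # top scorer gets 100; the rest, a percentage of that
            totals[u] += weight * (100.0 * v[indicator] / top)
    return totals

print(overall(raw))  # -> roughly {'Univ A': 98.2, 'Univ B': 64.6}

Note what the arithmetic implies: because every indicator is rescaled against the single best performer, a university strong on the heavily weighted, perception-driven indicators (peer and employer review, half the total weight) can leapfrog one with broader but less-marketed strengths, which is precisely the editorial’s complaint.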

In short, the survey was a questionable exercise in globalization. Sadly, Filipino education planners and managers missed both the import and the problematic aspects of the survey. Emmanuel Angeles, the new chair of the Commission on Higher Education, even said that should Philippine schools decide to advertise in Times publications and special projects, their rankings “would move forward.” Spoken in the tenor of an “enterprising” schoolteacher: “If you buy my sandwiches, you will get higher grades.”

---------------------------------------------------------------------------------

christ. i'll comment when i'm less offended.
