New Canadian research ranking study reveals a few surprises (Canada)

Some small universities rank higher than much larger research-intensive schools.

by Rosanna Tamburri

University rankings have become something of a sport in higher education circles, and institutions alternately love and hate them. Now, a new report by Higher Education Strategy Associates, a Toronto consulting firm, provides a different way of measuring research strength at Canadian universities. Rather than using stand-alone publication or citation counts as other measures do, the HESA index combines individual researchers’ H-index scores with the grants they receive from federal granting agencies. The H-index is a measure of a scholar’s productivity and impact based on a combination of publication and citation counts.
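
By way of illustration (this example is not drawn from the report itself): a researcher’s H-index is the largest number h such that h of their papers have each been cited at least h times. A minimal sketch in Python:

    def h_index(citations):
        """Return the largest h such that h papers have at least h citations each."""
        sorted_counts = sorted(citations, reverse=True)  # highest-cited papers first
        h = 0
        for rank, count in enumerate(sorted_counts, start=1):
            if count >= rank:
                h = rank  # the rank-th paper still has at least rank citations
            else:
                break
        return h

    # Example: five papers cited 10, 8, 5, 2 and 1 times give an H-index of 3,
    # because three papers have at least three citations each.
    print(h_index([10, 8, 5, 2, 1]))  # -> 3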

The problem with traditional research rankings, argue the study’s authors Paul Jarvey and Alex Usher, is that they favour institutions that are strong in disciplines in which researchers tend to publish more and receive larger grants. Physicists, for example, tend to publish and cite each other more often than historians and also receive larger grants, the study noted.

The HESA index tries to eliminate these biases and “ensure that schools do not receive an undue advantage simply by being particularly good in a few disciplines with high publication and citation cultures.” “These aren’t simply about raw money and publication totals,” wrote Mr. Usher in his daily blog. “Our methods help to correct some of the field biases of normal research rankings.”
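
The article does not reproduce the report’s exact formula, but the basic idea of field normalization can be sketched as comparing each researcher to the average of their own discipline rather than to researchers in other fields. A hypothetical Python illustration (the discipline averages below are invented for the example, not HESA figures):

    # Hypothetical sketch of field normalization: a researcher's H-index is
    # expressed relative to the national average for their own discipline, so a
    # historian and a physicist are measured against their peers, not each other.

    FIELD_AVG_H = {"physics": 25.0, "history": 8.0}  # assumed averages, illustration only

    def normalized_h(h_index, field):
        """Ratio of a researcher's H-index to the average H-index in their field."""
        return h_index / FIELD_AVG_H[field]

    # A physicist with H-index 25 and a historian with H-index 8 both score 1.0,
    # even though their raw publication and citation counts differ widely.
    print(normalized_h(25, "physics"))  # -> 1.0
    print(normalized_h(8, "history"))   # -> 1.0
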
As well as providing an overall ranking, the study, “Measuring Academic Research in Canada: Field Normalized Academic Rankings 2012,” divides the data into two categories: natural sciences and engineering, and social sciences and humanities. It excluded medical and health-related disciplines because the authors said they couldn’t distinguish between university researchers and those on staff at hospitals. They acknowledged that this was detrimental to the rankings of some schools, particularly the University of Toronto, which would likely have scored higher had the data been included.

Not surprisingly, big research-intensive institutions including the University of British Columbia, McGill University and U of T ranked highest overall. UBC came out on top in both categories: science and engineering, and social sciences and humanities. In science and engineering it was followed closely by Université de Montréal and U of T; the University of Ottawa and McGill placed fourth and fifth, respectively. In the social sciences and humanities, UBC was followed by McGill, U of T, and the University of Alberta. (Where an institution, such as U of T, had multiple campuses of sufficient size, they were ranked separately.)

But there were some notable exceptions too. Université du Québec à Rimouski scored exceptionally well in the sciences and engineering category, coming in seventh, ahead of such heavyweights as the University of Waterloo, Université Laval, Alberta, McMaster and Western. Rimouski’s ranking reflects the high productivity of its marine sciences researchers, the study noted.

Another outlier was the University of Guelph, which ranked fifth in the social sciences and humanities category despite its reputation for being stronger in science disciplines. Trent University also stood out, performing best among small universities because its professors had good publication records. Simon Fraser University ranked among the top 10 in both categories (sixth in sciences and engineering and 10th in social sciences and humanities), ahead of many leading research-intensive universities.

The study commented that among the schools with substantial research presences in the natural sciences and engineering, Laval, Ottawa, Calgary, Saskatchewan, Sherbrooke, Guelph and Alberta all receive substantially more money in granting council funds on a field-normalized basis than one would expect given their bibliometric performance. Conversely, Simon Fraser, Concordia, York, Manitoba, Trent, Toronto (St. George), UQAM and Memorial all receive substantially less funding than expected, given their bibliometric performance.

In the social sciences and humanities, the study said that McGill, Laval, Guelph, Alberta, Montreal and McMaster receive substantially more money in granting council funds on a field-normalized basis than one would expect, given bibliometric performance. And Queen’s, Trent and Toronto (Mississauga) receive substantially less.

Overall, Ontario institutions, although “they’re funded abysmally,” ranked highly because they perform substantially better on publication measures than anyone else in the country, Mr. Usher said. The report noted that Quebec institutions fared worse in the social sciences and humanities because research papers written in French are cited less often. This doesn’t apply in engineering and the sciences because researchers in those areas, including Francophones, publish in English.

“Perhaps the main thing that has been learned in this exercise is that stripping away the effects of institutional size and field-normalizing bibliometrics and grant awards results in a slightly different picture of university performance than what we are used to,” the study concluded. “If one can be bothered to look behind the big, ugly institution-wide aggregates that have become the norm in Canadian research metrics, one can find some little clusters of excellence across the country that are deserving of greater recognition.”

HESA spent eight months compiling a citation database containing publication records for about 50,000 Canadian researchers and derived an H-index score for each using the Google Scholar database. It also used grants from federal granting councils to calculate an average grant per professor.
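
The article doesn’t spell out how these per-researcher figures are combined into an institutional result, but building up from professor-level data might look roughly like the following sketch (the weights and sample records are illustrative assumptions, not HESA’s data or method):

    from statistics import mean

    # Each professor contributes a field-normalized H-index score and a
    # field-normalized grant figure; the institution's score here is the average
    # of the combined per-professor scores. Equal weighting is assumed for
    # illustration only.
    professors = [
        (1.2, 0.9),  # above the field average on publications, slightly below on grants
        (0.8, 1.1),
        (1.0, 1.0),
    ]

    H_WEIGHT, GRANT_WEIGHT = 0.5, 0.5  # assumed weights

    def institutional_score(records):
        """Average the per-professor combined scores into an institutional result."""
        return mean(H_WEIGHT * h + GRANT_WEIGHT * g for h, g in records)

    print(round(institutional_score(professors), 2))  # -> 1.0
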
Ian Clark, professor at U of T’s School of Public Policy and Governance, said gathering data on so many researchers was “an impressive feat” that no one else has done. “The reason that I find the HESA technique so intriguing is that it builds up an institutional result from professor level data,” Dr. Clark said, calling it a superior methodology. He believes the data will be of interest to provosts, deans and department heads who want to see how their department stacks up against others in the country. “I think that is something that is not available currently,” he said. “Despite what everybody tries to tell themselves, most people are fascinated by rankings.”

Glen Jones, Ontario Research Chair in Postsecondary Education Policy and Measurement at U of T’s Ontario Institute for Studies in Education, said the study has some drawbacks. For one thing, it relies on university websites to count faculty, but what universities do and don’t put on their websites, and how they account for faculty, vary considerably, he said. That’s why the authors weren’t able to include medical researchers, but the difficulty would extend to other fields too. Another downside, he noted, is that the study takes into account only research funding from federal granting agencies, which works against fields with a history of strong industry and commercial support. Nonetheless, said Dr. Jones, there’s probably no better data available that the authors could have used. “We’re all looking for better rankings. We’re all looking for different metrics,” he said. The HESA study makes some headway in this area and certainly “holds some potential.”

Source: University Affairs http://www.universityaffairs.ca/new-canadian-research-ranking-study-reveals-a-few-surprises.aspx