Note to readers: This post appeared as a column in the Armidale Express on 13 April 2011. I am repeating the columns here with a lag because the Express columns are not online. You can see all the columns by clicking here for 2009, here for 2010, here for 2011.
I finished my last column with a reference to ERA, or Excellence in Research for Australia, an exercise that aims to benchmark research in Australian universities against international standards.
This, I suggested, may sound reasonable, but ERA was actually yet another example of a Government-induced change that was adversely affecting Australian university education in general and New England in particular.
When I said New England, I meant not just the University, but Northern NSW in general. I said that I would explain why in my next column.
There can be no doubt that ERA is having significant rolling effects.
As one example, at Central Queensland University around one third of academic staff have accepted teaching scholar positions that do not have a research requirement. As best I can work out, this allows the university to focus research funds on staff and in areas that will give it brownie points as measured by ERA.
While some of the effects are becoming clear, I always like to check my facts. For that reason, I spent some hours on the ERA web site working my way through the explanatory material.
This is mind-numbing stuff. As I read and tried to understand the various technical papers describing the way ERA worked, I found my eyes glazing.
I still have some work to do to fully understand the process. It does seem clear, though, that both the creation of the international benchmarks and the measurement of Australian university performance against those benchmarks involve a combination of certain types of publication with citations, the number of times an article is referenced by other writers in the selected publications.
The University of Southern Queensland describes citation indexes as compilations of all the cited references from particular groups of journal articles published during a particular year or group of years.
In a citation index, you look up a reference to a work that you know to find journal articles that have cited it, although you can also search by concepts and authors. This allows you to identify articles of interest and to find cross-links.
Described in this way, citation indexes are a useful tool, an aid to research. However, they have become more than that.
While the first proposal for the creation of a science citation index was made in 1955, the first index was not published in the US until 1963, covering the 1961 literature. By the end of 1978, citation indexes had spread to cover all academic disciplines.
The rise of the citation index paralleled the rise of new computing and communications technologies. It also paralleled a growing interest in measurement, one facilitated by those same technologies.
Herein lies the rub, for this interest in measurement meant that the indexes were now to be used in new ways.
By the time I became CEO of the Royal Australian (now Australian and New Zealand) College of Ophthalmologists (RANZCO) at the end of 1997, the citation system was very well entrenched.
At the time we were worried that the College's scientific journal, the Australian and New Zealand Journal of Ophthalmology, was dropping down the citation list. The journal really needed to be in the top ten globally to attract the required level of scientific and clinical articles, so under the leadership of the editorial team the journal was renamed Clinical and Experimental Ophthalmology and effectively relaunched to achieve the required citation level.
This is a simple practical example of the influence of these indexes. However, I can give you another more subtle example, the rise of that dreaded university phrase publish or perish.
Searching around, the earliest reference to the phrase that I can find with the same connotations as today was in a 1942 book by Logan Wilson, published in the US. However, to my knowledge, the phrase did not become really common until the 1980s, with use then exploding.
The idea that academics should research and publish, that academic advancement should be linked to publication, was not new. However, the rise of the citation indexes provided a new pecking order mechanism in a university world that was becoming larger and more complex.
This led to some scandals. These included citation “clubs” (you cite me and I’ll cite you), as well as the misuse of student research to achieve publication.
Today we have taken publish or perish to an entirely new level in that it applies not just to individual academics, but to entire institutions. I don’t think that’s a good thing.