Niloofar Mohaghegh; Hasan Ahmadi
Volume 9, Issue 20, October 2005, Pages 149-165
Abstract
Evaluating scientific quality is a notoriously difficult task for which there is no standard method. Ideally, published scientific results should be scrutinized by experts in the relevant field(s) and given scores for qualitative and quantitative factors according to established rules.
It is hardly surprising that methods for evaluating research are being sought, such as citation rates and journal impact factors, which seem to be quantitative and objective indicators directly related to published science. The citation data are obtained from a database produced by the Institute for Scientific Information (ISI) in Philadelphia, which continuously records scientific citations as represented by the reference lists of articles from a large number of the world's scientific journals. The references are rearranged in the database to show how many times each publication has been cited within a certain period, and by whom, and the results are published as the Science Citation Index (SCI). On the basis of the Science Citation Index and authors' publication lists, the annual citation rate of papers by a scientific author or research group can thus be calculated.
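The abstract does not describe the calculation procedure itself. As a minimal illustrative sketch only (the citation records, paper identifiers, and publication list below are hypothetical placeholders, not data from the SCI), annual citation counts for an author's papers could be tallied roughly as follows:

```python
from collections import Counter

# Hypothetical citation records extracted from reference lists:
# each entry is (citing_year, cited_paper_id). The identifiers and
# the author's publication list are illustrative placeholders.
citation_records = [
    (2003, "paper-A"), (2003, "paper-B"), (2004, "paper-A"),
    (2004, "paper-A"), (2004, "paper-C"), (2005, "paper-B"),
]
author_publications = {"paper-A", "paper-B"}

# Count, per citing year, how often the author's own papers were cited.
annual_citations = Counter(
    year for year, cited in citation_records if cited in author_publications
)

for year in sorted(annual_citations):
    print(year, annual_citations[year])
# Output for this toy data: 2003 2, 2004 2, 2005 1
```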
Since journal impact factors are so readily available, it has been tempting to use them for evaluating individual scientists or research groups.
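For reference, since the abstract does not reproduce the definition, the standard two-year journal impact factor reported by ISI is commonly defined as (the symbols C and N are introduced here only for illustration):

\[
\mathrm{IF}_{Y} = \frac{C_{Y}(Y-1) + C_{Y}(Y-2)}{N_{Y-1} + N_{Y-2}}
\]

where \(C_{Y}(y)\) is the number of citations received in year \(Y\) by items the journal published in year \(y\), and \(N_{y}\) is the number of citable items the journal published in year \(y\). One frequently noted feature of this construction, relevant to the point made next, is that the numerator counts citations to all document types while the denominator counts only "citable" items.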
Journal impact factors, however, are calculated in a way that introduces bias; for the evaluation of scientific quality there seems to be no alternative to having qualified experts read the publications. Much can nevertheless be done to improve and standardize the principles, procedures, and criteria used in evaluation, and the scientific community would be well served if efforts were concentrated on this rather than on developing ever more sophisticated versions of basically useless indicators. In the words of Sidney Brenner, "What matters absolutely is the scientific content of a paper, and nothing will substitute for either knowing or reading it."