In this paper, we assess the bibliometric parameters of 37 Dutch professors in clinical cardiology. These are the Hirsch index (h-index) based on all papers, the h-index based on first-authored papers, the number of papers, the number of citations, and the number of citations per paper. A top 10 for each of the five parameters was compiled. In theory, the same 10 professors might appear in each of these top 10s. Alternatively, each of the 37 professors under assessment could appear one or more times. In practice, we found 22 of these 37 professors in the five top 10s. Thus, there is no golden parameter. In addition, there is too much inhomogeneity in citation characteristics, even within a relatively homogeneous group of clinical cardiologists. Therefore, citation analysis should be applied with great care in science policy. This is even more important when different fields of medicine are compared in university medical centres. It may be possible to develop better parameters in the future, but the present ones are simply not good enough. We also observed a remarkable explosion in the number of publications per author, which, paradoxical as it may sound, probably reflects not an increase in the productivity of scientists but the effect of an increase in the number of co-authors and the strategic effect of networks.
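For readers unfamiliar with the first of these parameters: the h-index of an author is the largest number h such that h of the author's papers have each received at least h citations. A minimal sketch of that definition (not part of the paper; the citation counts used here are invented for illustration):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each.

    `citations` is a list with one citation count per paper.
    """
    ranked = sorted(citations, reverse=True)  # most-cited paper first
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:  # the paper at this rank still clears the bar
            h = rank
        else:
            break
    return h

# Hypothetical author with five papers cited 10, 8, 5, 4 and 3 times:
# four papers have at least 4 citations, but not five with at least 5.
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

The same citation list also yields the other parameters discussed in the paper (number of papers, total citations, citations per paper), which is why, as the authors note, one might expect the five top 10s to largely coincide.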
Bibliometric data in clinical cardiology revisited. The case of 37 Dutch professors
A. A. M. Wilde
Bohn Stafleu van Loghum