Eleftherios Diamandis is Professor and Head, Division of Clinical Biochemistry, Department of Laboratory Medicine and Pathobiology, Faculty of Medicine, University of Toronto.


In the 1970s, my mentor and Professor at the University of Athens, Greece, Dr. Themistokles Hadjiioannou, periodically asked me to go to the library and check his citation record. I remember pulling printed volumes of the “Science Citation Index”, each as heavy as 5 kg, from the library shelves, going through the pages and manually recording who had cited his work. This task required many days of intellectual and physical work.

Now, the situation is very different. Google Scholar and Web of Science are just two of many automated services for tracking publication records, covering author impact, journal impact, papers published, lifetime citations, citations per year and various other indices such as the h-index and g-index. The new technology lets you not only monitor the indices of your own published work but also check those of your competitors. It is useful for authors to know which of their papers are most cited (and, by extrapolation, most interesting to others); this information can help in planning future research endeavors. In this respect, tracking one’s own citations is not much different from the way musicians might keep track of how their latest song ranked in the charts and how many copies they sold.
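
For readers unfamiliar with these metrics, the h-index is the largest number h such that an author has h papers with at least h citations each. A minimal Python sketch of the calculation (the citation counts below are invented for illustration):

    def h_index(citations):
        # Sort citation counts in descending order, then find the largest
        # rank i at which the i-th paper still has at least i citations.
        h = 0
        for i, c in enumerate(sorted(citations, reverse=True), start=1):
            if c >= i:
                h = i
            else:
                break
        return h

    # Five papers with these citation counts yield an h-index of 3,
    # because exactly three papers have 3 or more citations each.
    print(h_index([10, 8, 3, 2, 1]))  # -> 3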

We and others have pointed out repeatedly that citation indices should always be interpreted with great caution. Analysis of personal citations and related indices, such as a journal’s impact factor, can lead to over-interpretation and cause unnecessary anxiety in some individuals. Having observed over the years how some fellow scientists behave with respect to research impact, I have come to the conclusion that a few develop a neurosis which, I believe, belongs within the group of obsessive-compulsive disorders. I have coined this “The Google Scholar Syndrome” (GSS). The syndrome usually afflicts scientists who have an excellent citation record, often with one or more papers that have gone “viral”, collecting a disproportionately high number of citations relative to their scientific value. GSS can be recognized by the following three cardinal symptoms:

  1. Checking your citation record more and more frequently: at first once a month, then once a week, then once a day and, in advanced stages, several times a day, checking total citations, the citations of your ‘greatest hit’ and your h-index, looking for upward trends and improvements. I believe it is within the realm of normal to check your citation analysis once every 3-6 months, or on specific occasions (e.g. when you apply for an award, job or promotion).
  2. Comparing your citation indices with those of your competitors (local, national or international), in the hope that you have overtaken a few, at least in some indices.
  3. On reaching a certain milestone, e.g. 100,000 or more lifetime citations, starting to compare yourself to recent Nobel Prize winners and developing insomnia in early October as you wait for a telephone call from the Swedish Academy.

I do not have quantitative data on the frequency of this syndrome among scientists, but I suspect that it may be widespread, albeit of varying severity. I invite Google Scholar to publish anonymized statistics on how frequently scientists visit and query the site, to give an idea of the prevalence of GSS. I am not aware of any therapy, but I suspect that the best remedy is prevention. As mentors, we have an obligation to educate young scientists that any indicator of scientific output and quality should always be interpreted with caution. Let us not forget what William Bruce Cameron said about indicators (a quote frequently misattributed to Albert Einstein):

Not everything that can be counted counts,

and not everything that counts can be counted.