Jorge Hirsch published a very interesting article in PNAS in 2005. The full text is available from the journal's webpage. The abstract of this publication is the shortest abstract I have seen so far. The revolutionary vision of the paper was that it does not matter where you publish your work but how many times it is cited. Likewise, the total number of papers matters less than the number of papers with high citation counts.

The approach suggested by Jorge Hirsch made the impact factor of journals less important. However, if you publish in a high-impact journal, it is read by more people, so your paper has a better chance of attracting citations later. In the end, you arrive at one simple number, h, that characterises a scientist at a given career age: a scientist has index h if h of their papers have each been cited at least h times. In the scientific community, however, a big debate started: is "h" fair, and is "h" all we need? Measuring the impact of research beyond "h" can be more complicated.
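The index itself is simple to compute from a list of citation counts: sort the counts in descending order and find the largest rank h at which the h-th paper still has at least h citations. A minimal sketch in Python (the function name and the sample citation counts are my own illustration, not taken from Hirsch's paper):

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank  # the rank-th paper still has >= rank citations
        else:
            break  # counts only decrease from here, so h cannot grow
    return h

# Six hypothetical papers: three of them have at least 3 citations,
# but no four papers have at least 4 citations each.
print(h_index([25, 8, 5, 3, 3, 1]))  # → 3
```

Note that a single spectacularly cited paper barely moves h, which is exactly the point: the index rewards a sustained body of cited work rather than one hit or sheer paper count.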

The Leiden Manifesto for research metrics (2015) summarises the discussion started by Hirsch's paper. The DORA declaration from 2013 represents the current state of the art of research assessment worldwide. The Higher Education Funding Council for England describes in great detail how to develop bibliometric indicators for the Research Excellence Framework. A very detailed report is given in the article called The Metric Tide. It is interesting to note that ORCID iDs are shown for all of its authors.

Common to all bibliometric exercises is that the underlying data are kept secret. It is therefore difficult to assess whether the data presented are correct. When I asked Times Higher Education which of my publications were included in their published university ranking, I was unable to retrieve this information. Given the way data are associated with individual researchers, it is hard to believe that they used the correct list of publications for my name. This is even more of a problem when lists of highly cited scientists are computed automatically; my guess is that no scientist with a common name has a chance to be included in such lists. With the availability of the ORCID database, the situation might improve in the future; at the least, databases like Scopus or Microsoft Academic Research have no excuse not to get it right.