Self-adapted Faculty Evaluation System

Due to the perverse incentives in the current higher education system, faculty members, especially new assistant professors, are under intense pressure to earn tenure or secure desirable funding. To help mitigate this hyper-competition and build a healthy academic environment, I believe a more comprehensive and self-adapted faculty evaluation system should be built into future higher education. Right now, we spend too much time assessing a faculty member's achievement through quantified impact factors, publication counts, and funding amounts. This can jeopardize the development of academia and lead to a deterioration in research ethics and to misconduct.

Recently, a more advanced quantification tool, iCite, was developed by the National Institutes of Health (NIH) around a new metric named the Relative Citation Ratio (RCR). iCite provides a dashboard of bibliometrics for academic publications. Scholars can upload the PubMed IDs of articles (from SPIRES or PubMed) and get a full portfolio: iCite outputs the number of articles, articles per year, citations per year, and the Relative Citation Ratio (a field-normalized metric that shows the citation impact of one or more articles relative to the average NIH-funded paper). For customized analysis, the scholar can choose the year range and the article type (all, or only research articles), and individual articles can be excluded. The final report table with article-level detail can be downloaded for future use. Below is my personal example (click on the figure to enlarge it); all I need to do is type my name into the system.
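Beyond the web dashboard, iCite also exposes a public web API. The sketch below shows how a portfolio might be fetched programmatically; the `/api/pubs` endpoint, the `pmids` parameter, and the `relative_citation_ratio` field name reflect the API as I understand it and should be checked against the current iCite documentation, and the PMIDs here are made up for illustration.

```python
import json
from urllib.request import urlopen

# Public iCite endpoint (assumption: URL current as of this writing).
API_URL = "https://icite.od.nih.gov/api/pubs"

def build_query(pmids):
    """Build the iCite request URL for a list of PubMed IDs."""
    return f"{API_URL}?pmids={','.join(str(p) for p in pmids)}"

def extract_rcrs(response):
    """Map each PMID to its Relative Citation Ratio.

    The response is assumed to look like
    {"data": [{"pmid": ..., "relative_citation_ratio": ...}, ...]};
    papers too recent to have an RCR report None and are skipped.
    """
    return {rec["pmid"]: rec["relative_citation_ratio"]
            for rec in response["data"]
            if rec.get("relative_citation_ratio") is not None}

# Mocked response with hypothetical PMIDs; a live call would instead be:
#   response = json.load(urlopen(build_query([23456789, 23456790])))
mock = {"data": [{"pmid": 23456789, "relative_citation_ratio": 1.8},
                 {"pmid": 23456790, "relative_citation_ratio": None}]}
print(build_query([23456789, 23456790]))
print(extract_rcrs(mock))  # only the paper with a computed RCR remains
```

An RCR above 1.0 in the resulting map would indicate a paper cited more often per year than the average NIH-funded paper in its field.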


The cites/year column gives you the maximum, the mean, the standard error of the mean (SEM), and the median (MED) across your papers. With all of this data, the evaluation system provides a more comprehensive assessment of your research work. In the following column, RCR represents a citation-based measure of your publications' influence: a paper with an RCR of 1.0 received the same number of cites/year as the average NIH-funded paper in your field. With this dynamic comparison within your own field, the evaluation becomes more direct and unbiased.
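These summary statistics are easy to recompute from the downloaded article-level table. A minimal sketch, using made-up cites/year values and taking the SEM as the sample standard deviation over the square root of the number of papers:

```python
from math import sqrt
from statistics import mean, median, stdev

def summarize(cites_per_year):
    """Return the max, mean, SEM, and median of a list of cites/year values."""
    n = len(cites_per_year)
    return {
        "max": max(cites_per_year),
        "mean": mean(cites_per_year),
        "sem": stdev(cites_per_year) / sqrt(n),  # standard error of the mean
        "med": median(cites_per_year),
    }

# Hypothetical portfolio: cites/year for five papers.
print(summarize([12.0, 3.5, 7.0, 1.2, 6.3]))
```

A large gap between the mean and the median here would signal that one heavily cited paper is dominating the portfolio, which is exactly the kind of nuance a single total-citation count hides.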

The iCite tool offers information supplementary to the current impact factor and h-index. As the authors put it, “Scientists and administrators agree that the use of these metrics is problematic, but in spite of this strong consensus, such judgments remain common practice, suggesting the need for a valid alternative.” Fortunately, the whole of academia is becoming aware of the issue, and more self-adapted evaluation systems should be put forward in the future higher education system.
