The Tenure System Is Broken. Here’s How to Fix It.

News and views from academia.
Jan. 16 2014 11:34 AM

This article originally appeared in Inside Higher Ed.

For the last 15 years, I have been involved in the study and reform of academic reward systems. They are fascinating to study because they reflect the assumptions, values, goals, and aspirations held by institutions and fields.

I have studied academic reward system change in such areas as redefining scholarship, post-tenure review, stopping the tenure clock, and efforts to appraise new and diverse approaches to scholarly dissemination in the tenure process. This work has led me to reflect on the current state of dominant academic reward systems, the assumptions that guide them, and the specific things I would like to see colleges and universities stop doing and start changing.


As a preamble to what I want us to stop doing, I set forth the following principles. First, most colleges and universities are charged with advancing knowledge within and through a diverse, inclusive community. By inclusive, I mean inclusive of both diverse individuals and diverse contributions to knowledge. Second, academic reward systems are about the valuing of professional lives and contributions; they are symbolic and concrete artifacts of what an institution values and aspires to become. Third, academic reward systems should ensure that faculty making excellent contributions to scholarship, teaching, and service are retained and advanced. Yet what excellence looks like in 2013 may differ from what it looked like in 1960, and from what it will look like 50 years from now.

Now, to the attitudes and policies that need to change:

First, the assumption that the process is unbiased, objective, and without partiality is naive at best, and at worst, harmful to professional lives.

Across the world, quality social science has demonstrated the pervasive nature of implicit and explicit bias in every aspect of a faculty career that is evaluated. When professors write letters of recommendation for male and female candidates for academic positions, there is bias in how candidates of equal qualifications are presented. When academics are sent two job applications with equal qualifications but different names for a laboratory manager position, they choose the male candidate and offer him a higher salary. Across research and doctoral universities, more women and faculty of color resign from their institutions before going up for promotion, or are advised to withdraw from the process. White researchers applying for grants from the National Institutes of Health are nearly twice as likely to win them as African-American scientists. There is bias embedded in scholarly publishing and the order of authors, as well as in service activities and years to advancement in many fields. The Matthew effect, wherein certain senior scholars benefit from cumulative advantage in the number of scholars who cite them while lesser-known scholars go uncited for equally meritorious work, has a well-documented negative influence on women's citations. Bias is more than a possibility; it is probable and real.

Second, the assumption that we know a scholar's work is excellent only if it has been recognized by a very narrow set of legitimacy markers adds bias to the process and works against recognition of newer forms of scholarship.

On May 16, 2013, a group of 150 scientists and 75 science organizations released a joint statement, the San Francisco Declaration on Research Assessment, which noted that metrics such as the Journal Impact Factor are used as quick-and-dirty assessments of academic performance and should not be. Thus, even scientists whose work is most likely to be judged by these indexes are increasingly negative about their use.

Typically, candidates for tenure and promotion submit a personal narrative describing their research, a description of the circulation, acceptance rate and impact factors of the journals or press where they published, a count and list of their citations, and material on external grants. This model of demonstration of impact favors certain disciplines over others, disciplinary as opposed to interdisciplinary work, and scholarship whose main purpose is to add to academic knowledge.

In my view, the problem is not that citation counts and journal impact factors are used to document the quantity and quality of one's scholarship. The problem is that they have been normalized as the only way. All other efforts to document scholarship and contributions, whether for interdisciplinary work, work using critical race theory or feminist theory, qualitative analysis, digital media, or policy analysis, are then suspect and marginalized.

Using the prestige of academic book presses, citation counts, and federal research awards to judge the quality of scholarship whose purpose is to directly engage with communities and public problems misses the point. Interdisciplinary and engaged work on health equity should be measured by its ability to affect how doctors act and think. Research on affirmative action in college admissions should begin to shape admissions policies. One may find key theoretical and research pieces in these areas published in top-tier journals and cited in the Web of Science, but one should also find them in policy reports cited at NIH, or used by a local hospital board to reform doctor training. We should not be afraid to look for the impact of scholarship there, or to give that evidence credibility.

Work that addresses contemporary social problems deserves to be evaluated by criteria better suited to its purposes, not relegated to the back seat behind basic or traditional scholarship.