Washington Post sports columnist Michael Wilbon says he can't understand the formula the National Football League uses to rate quarterbacks. Explainer to the rescue! How does the NFL calculate its passer ratings?
The NFL's system uses four metrics: completion percentage, yards per attempt, percentage of touchdowns thrown per attempt, and percentage of interceptions per attempt. The four factors are weighted equally.
A score between zero and 2.375 is calculated for each metric. A score of 1.0 is supposed to be average. A completely average quarterback would complete 50 percent of his passes, average 7 yards per attempt, throw 5 percent of his passes for touchdowns, and throw an interception 5.5 percent of the time.
Here's how each metric is calculated (remember that no score can be lower than zero or higher than 2.375, no matter how well or how poorly the QB throws):
1. Completion percentage: Subtract 30 from the percentage of passes that are thrown for completions, then multiply by .05.
2. Yards per attempt: Subtract three from yards per passing attempt, then multiply by .25.
3. Touchdown percentage: Multiply the percentage of touchdown passes per passing attempt by .2.
4. Interception percentage: Multiply the percentage of interceptions per passing attempt by .25, then subtract that number from 2.375.
The scores for each category are added together. That sum is divided by six and multiplied by 100, which converts it into a rating on a scale from zero to 158.3. A putatively average QB would receive a rating of 66.7 (1 + 1 + 1 + 1 = 4, and 4/6 * 100 = 66.7).
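The steps above can be sketched as a short function. This is a minimal illustration, not the NFL's official code; the function name and the percentage-based inputs are my own choices:

```python
def passer_rating(comp_pct, yards_per_att, td_pct, int_pct):
    """NFL passer rating from four per-attempt stats, each given as a
    percentage (except yards per attempt, which is a raw average)."""
    # No component score may fall below zero or exceed 2.375.
    clamp = lambda x: max(0.0, min(2.375, x))

    a = clamp((comp_pct - 30) * 0.05)        # completion percentage
    b = clamp((yards_per_att - 3) * 0.25)    # yards per attempt
    c = clamp(td_pct * 0.2)                  # touchdown percentage
    d = clamp(2.375 - int_pct * 0.25)        # interception percentage

    # Sum the four scores, divide by six, multiply by 100.
    return (a + b + c + d) / 6 * 100

# The putatively average QB from above: each component works out to 1.0.
print(round(passer_rating(50, 7, 5, 5.5), 1))   # → 66.7
```

A perfect game (every component clamped at 2.375) yields the familiar maximum of 158.3.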
If you don't have the patience for the math, you can enter the numbers into this ratings calculator.