Tip Top 25

How to Rate College Football Teams

The Record

At the simplest level, a team's record for a given season is its number of wins and losses (and ties, for older seasons). For example, Arkansas' straight record for 2006 is 10-4 (10 wins and 4 losses). When you look at the AP poll, teams tend to be ranked mostly by their straight record, though teams from minor conferences are generally treated as though they have an extra loss or two. But a team's record goes beyond its straight record: it is the full chronological list of whom they played, where each game was played, and the final score of each game. Arkansas' full record for 2006 can be found here (among other places).

Relevant Record

A complete assessment of a team's full record takes into account strength of schedule above all, and to a much lesser extent performance (how much the team won or lost by in each game), as well as improvement over the course of the season (how that team fared in the last month and especially in a bowl game). But a good and simple place to start is with what I will call a team's relevant record. In fact, the bulk of your ranking decisions can be based on just this. A team's relevant record is a list of the relevant teams they defeated (i.e. those teams that you will rank in the Top 25 or nearly in it) as well as all their losses.
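To make the idea concrete, here is a minimal sketch of how a relevant record could be pulled out of a season's game list. The data format and the `relevant_record` function are my own illustration, not anything official; the rank cutoff of 30 is an assumption standing in for "Top 25 or nearly in it":

```python
# Hypothetical game log format: (opponent, opponent_final_rank or None, won)
# The cutoff of 30 is an assumed stand-in for "Top 25 or nearly in it".
def relevant_record(games, cutoff=30):
    """Keep every loss, but only the wins over relevant (ranked-ish) teams."""
    wins = [(opp, rank) for opp, rank, won in games
            if won and rank is not None and rank <= cutoff]
    losses = [(opp, rank) for opp, rank, won in games if not won]
    return wins, losses

# Arkansas 2006, reduced to the games named in the text (final AP ranks);
# unranked wins are omitted here for brevity.
arkansas = [
    ("USC", 4, False), ("Auburn", 9, True), ("Tennessee", 25, True),
    ("LSU", 3, False), ("Florida", 1, False), ("Wisconsin", 7, False),
]
wins, losses = relevant_record(arkansas)
# wins   -> ranked teams Arkansas beat; losses -> all four losses
```

The point of the filter is that wins over weak teams carry almost no information, while every loss does.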

Arkansas' relevant record for 2006 is as follows (the rankings are from the AP poll): they defeated the #9 and #25 teams, as well as an 8-5 unranked team, and they lost to the #1, #3, #4, and #7 teams. Just looking at this list, you can logically conclude that Arkansas should be ranked about #8. All of their losses came to teams ranked higher than that, and they defeated the #9 team. A no-brainer. But of course, the chances of a 4-loss team being ranked that high in the AP Poll are beyond low. So what was Arkansas' rating in the final AP poll that year? They were #15. That was the best for any 4-loss team, but it was lower than a number of teams with much worse relevant records (but better straight records).

This can be more easily seen by comparing teams' relevant records side-by-side. As an example, here are the relevant records for Arkansas and the two teams that were ranked directly ahead of them in 2006:

Team                   Wins                                      Losses
#13 Texas (10-3)       #11 team; 9-5 and 8-5 unranked teams      #2 team; 9-4 and 7-6 unranked teams
#14 California (10-3)  #21 team; a 9-4 unranked team             #4 and #25 teams; a 6-6 unranked team
#15 Arkansas (10-4)    #9 and #25 teams; an 8-5 unranked team    #1, #3, #4, and #7 teams

It is pretty easy to see here that Texas and California have no business whatsoever being ranked ahead of Arkansas. They have better straight records (one less loss), but they played far easier schedules, and their relevant records are not close. Arkansas defeated the highest-ranked opponent (#9) of any of the three, and also beat 2 ranked opponents to 1 each for Texas and Cal. Texas and Cal both lost to 2 lower-ranked teams, whereas all of Arkansas' losses were to top 7 teams. Another way to look at it is this: you can effectively disregard all of Arkansas' losses because none of these three teams defeated a top 7 team-- Arkansas just played far more of them. In this light, Arkansas can be thought of as comparatively 10-0, while Cal and Texas are each 10-2. And that gets much closer to the comparative truth.
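That "comparatively 10-0 versus 10-2" arithmetic can be sketched directly. The `effective_record` function and the cutoff of 7 are my own illustration of the argument above: since none of the three teams beat a top-7 team, losses to top-7 teams can be excused when comparing them.

```python
def effective_record(straight_wins, loss_ranks, excuse_cutoff=7):
    """Drop losses to top-`excuse_cutoff` teams as excusable; losses to
    anyone else (including unranked teams, rank None) still count."""
    real_losses = [r for r in loss_ranks if r is None or r > excuse_cutoff]
    return straight_wins, len(real_losses)

# Final-AP ranks of each team's 2006 losses (None = unranked opponent)
arkansas = effective_record(10, [1, 3, 4, 7])     # -> (10, 0)
texas    = effective_record(10, [2, None, None])  # -> (10, 2)
cal      = effective_record(10, [4, 25, None])    # -> (10, 2)
```

Under this comparison Arkansas is effectively unbeaten against the standard the other two teams were held to.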

Mitigating Factors

Factors such as performance and improvement are of lesser importance, and should generally only be used to break apart and rate teams that have similar relevant records. And Arkansas is so far ahead of Texas and Cal in their relevant records for 2006 that mitigating factors are highly unlikely to make a difference.

Still, when you look at Arkansas, you might conclude that they fell apart at the end, since they lost their last three games, and that that is the reason they are ranked so low in the AP poll. But was Arkansas worse in those three games than they had been before? Those losses came to teams that finished #1, #3, and #7. Of course Arkansas lost those games, but it wasn't because they got worse. Given the rankings of those opponents, Arkansas should have lost to them regardless of when they played them.

Properly Assessing Wins and Losses

One major problem with the AP poll is that voters tend to simply drop teams a few places each week when they lose, with little regard to whom they lost. A team will drop more for losing to a lower-rated opponent than for losing to a higher-rated opponent, but they will almost always drop after a loss. For example, if the #12 team plays the #6 team, the #12 team will almost always drop a few places if they lose the game. And that doesn't actually make much logical sense. You already knew the #6 team was better than the #12 team. Why would you drop the #12 team when that fact is simply confirmed?

Let's look at the final regular season AP poll for 2006. Arkansas was ranked #12, and would next play #6 Wisconsin (11-1) in the Capital One Bowl. Texas was ranked #18, and would next play unranked Iowa (6-6) in the Alamo Bowl. In those bowls, Arkansas lost 17-14 (to a 12-1 Big 10 team), and Texas won 26-24 (over a 6-7 Big 10 team). Arkansas dropped three places in the final poll, and Texas so impressed voters with their 2-point win over 6-7 Iowa that they actually rose five places, leapfrogging Arkansas! This is so transparently foolish that it would seem to preclude the necessity of comment, and yet this sort of thing happens every year, and therefore apparently requires some comment. So here it goes.

When your #12 team loses to your #6 team by 3 points, that is actually a positive performance. It is possible, depending on what all the other teams did, for that #12 team to drop, but most of the time it shouldn't. In fact, that performance actually calls more for a rise than for a fall in the ratings. Why would you drop them when the outcome was exactly what you should have expected? And when your #18 team beats a 6-6 team by 2 points (and in a bowl game played in its home state), that is actually a very poor performance. Again, it is possible, depending on what all the other teams did, for that #18 team to rise, but most of the time it shouldn't. In fact, that performance actually calls far more for a drop in the ratings than for a rise.
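The principle here is that a result only carries new information when it contradicts your rankings. A minimal sketch of that test (the `surprise` function is my own illustration, and treating an unranked team as rank 99 is an assumption):

```python
def surprise(team_rank, opp_rank, won):
    """A result is informative only if it contradicts the rankings:
    the better-ranked (lower-numbered) team is expected to win."""
    expected_win = team_rank < opp_rank
    return won != expected_win

# The two 2006 bowls discussed above:
surprise(12, 6, won=False)   # Arkansas losing to Wisconsin: expected, no surprise
surprise(18, 99, won=True)   # Texas beating unranked Iowa: expected, no surprise
```

Neither result, by itself, justifies moving either team; the margin of each game (a 3-point loss to #6 versus a 2-point win over a 6-6 team) is what should move them, and it points the opposite way from what the voters did.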

Properly Assessing Performance and Improvement

The fact is, while Arkansas lost their last 3 games in 2006, if you weigh the later part of a season more heavily than the early part of the season (as most people do), it should actually work in Arkansas' favor-- if you look beyond mere wins and losses. To reiterate, that 3-point bowl loss to #7 (final rating) Wisconsin is a strong performance. It indicates that Arkansas is just behind #7. Ditto for their 5-point loss to #3 LSU in their regular season finale. They lost to #1 Florida by 10 points in the SEC title game, which may not seem as impressive-- until you consider that #2 Ohio State lost to Florida in the national championship game by 27 points! And #3 LSU lost to Florida by 13 points. So it was actually a strong finish for Arkansas performance-wise. They were just playing better teams, and thus lost.

By contrast, they looked mediocre to poor in the first month of the season. In their season opener, they lost to #4 USC 50-14. Then they defeated two teams that finished with losing records by only 2 points and 1 point in their 3rd and 4th games. After that, they started stomping on opponents, including 27-10 at #9 Auburn and 31-14 against #25 Tennessee. They improved after a rough start, and in most of their games, their performance was that of a top ten team-- including their last 3 losses.

Conclusion

In the case of Arkansas 2006, you didn't need to get into the details of improvement and performance. The proper ranking for Arkansas was right there in their relevant record: their only losses were to teams ranked in the top 7, and they defeated #9. The details just serve to corroborate what that simple fact tells you to begin with: Arkansas 2006 should be a top ten team. Arkansas was basically punished for playing a very tough schedule that year. And that is another major problem with the AP poll-- it tends to punish teams that play tough schedules, and it rewards teams that play weak schedules. AP poll voters do take strength of schedule into account, but not nearly enough.

Let's finish up with one more example of a team's relevant record, also from 2006. Michigan's straight record for that season is 11-2. Their relevant record is as follows:

Wins: The #7, #17, and #24 teams     Losses: The #2 and #4 teams

So you can see that they should likely be ranked #5 or #6. The AP Poll, however, had them #8, directly behind a team they defeated (12-1 Wisconsin). This leads to what is likely the AP poll's single biggest problem, and the one people tend to focus on the most: accounting for head-to-head results.

Next: Head to Head Results

Sections that follow "Head to Head Results":
Strength of Schedule
Other Criteria: Performance, Improvement, and Common Opponents
How to Rank the Top 25 Teams
How Not to Rank Teams
