Republished with permission from


We thought now would be an appropriate time to revisit the weekly NFL picks of the prognosticating class. We have compiled all the picks made by the ESPN, Yahoo, and CBS Sports pundits through the divisional round games. To their credit, all three sites do a terrific job keeping scorecards.

Here is how the group stacks up on a hit rate basis (# of correct calls divided by # of total calls):

2011-12 Hit Rate

Pundit             Hit Rate   Site
Yahoo Users        68.1%      YAHOO
Mike Silver        67.7%      YAHOO
ESPN Users         66.5%      ESPN
Clark Judge        66.5%      CBS
Les Carpenter      66.2%      YAHOO
Dave Richard       65.8%      CBS
Mark Schlereth     65.4%      ESPN
Mike Freeman       65.0%      CBS
Seth Wickersham    64.7%      ESPN
Eric Allen         63.9%      ESPN
Merril Hoge        63.9%      ESPN
Jason Cole         63.2%      YAHOO
Pete Prisco        62.8%      CBS
Mike Golic         62.4%      ESPN
Adam Schefter      62.0%      ESPN
Chris Mortensen    61.3%      ESPN
Ron Jaworski       60.8%      ESPN
Will Brinson       60.5%      CBS

The so-called experts did not fare well this season. With the exception of Mike Silver (Yahoo), all of them underperformed both Yahoo's and ESPN's users, as well as Accuscore (a prediction service that uses game simulations).


Let's look at the data from a different angle. If you placed $1 on each of the pundits' picks—based on "moneyline" odds—how much would you have ended up with, on average? We removed the sportsbook's commission ("vig") from the odds when calculating these figures:

$1 Bet Yield

Pundit             Yield      Site
Mike Silver        $1.08      YAHOO
Les Carpenter      $1.06      YAHOO
Mike Freeman       $1.05      CBS
Seth Wickersham    $1.04      ESPN
Yahoo Users        $1.04      YAHOO
Clark Judge        $1.01      CBS
Dave Richard       $1.00      CBS
ESPN Users         $1.00      ESPN
Eric Allen         $0.99      ESPN
Mark Schlereth     $0.99      ESPN
Merril Hoge        $0.98      ESPN
Ron Jaworski       $0.97      ESPN
Will Brinson       $0.96      CBS
Pete Prisco        $0.96      CBS
Mike Golic         $0.96      ESPN
Jason Cole         $0.96      YAHOO
Chris Mortensen    $0.95      ESPN
Adam Schefter      $0.94      ESPN

Here, the group looks a smidge better. Four of the members outperformed the Yahoo user base. Still, 10 of the 16 pundits were in the red. We should note that all of these picks are simply based on whom the pundits expected to win; their selections may well have been different had they been factoring in the odds. (The CBS Sports pundits also make picks against the spread, which CBS tracks on a separate scorecard.)
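The bet-yield figures above can be reproduced from moneyline odds. Here is a minimal Python sketch, assuming standard two-way American moneylines; the function names and the sample odds are illustrative, not taken from the article's data:

```python
def implied_prob(moneyline):
    """Convert American moneyline odds to an implied win probability."""
    if moneyline < 0:
        return -moneyline / (-moneyline + 100)
    return 100 / (moneyline + 100)

def fair_decimal_odds(ml_pick, ml_opponent):
    """Remove the vig by normalizing the two sides' implied probabilities,
    then convert the picked team's fair probability to decimal odds."""
    p_pick = implied_prob(ml_pick)
    p_opp = implied_prob(ml_opponent)
    fair_p = p_pick / (p_pick + p_opp)  # strip the sportsbook's margin
    return 1 / fair_p

def bet_yield(picks):
    """Average payout per $1 staked across all picks.
    picks: list of (won, ml_pick, ml_opponent) tuples."""
    total = sum(fair_decimal_odds(ml, opp) if won else 0.0
                for won, ml, opp in picks)
    return total / len(picks)

# Hypothetical example: a winning pick of a -150 favorite (vs. +130),
# then a losing pick of a +120 underdog (vs. -140).
sample = [(True, -150, 130), (False, 120, -140)]
print(round(bet_yield(sample), 2))  # roughly $0.86 per $1 staked
```

The devigging step matters: without it, even a perfectly calibrated picker would show a yield below $1, since the raw implied probabilities of the two sides sum to more than 100 percent.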

In our "Scoring Challenges" post, we noted the following:



The [hit rate] is useless without context. If I predict each day that the sun is going to rise tomorrow, I am (hopefully) going to have a perfect hit rate. Using this system, I receive a score twice as high as the pundit who predicted the 2008 financial collapse but then missed a trivial call the following year. That hardly seems fair, which suggests that predictions should somehow be calibrated for "boldness."

The "$1 Bet Yield" captures this notion of boldness. Pundits with low hit rates can still have high bet yields (greater than $1) if they get some bold calls right. These pundits may not be the most accurate prognosticators, but the calls they do get right are sufficiently out of consensus to yield a positive net payout. Meanwhile, the daily sunrise predictor's perfect accuracy would generate virtually no winnings.


(While we used the ESPN and Yahoo users as a proxy for crowd consensus, one of our readers pointed out that a more representative dataset would be the teams favored by Las Vegas oddsmakers. Simply picking the Vegas favorites would have resulted in a 66.2 percent hit rate this season, better than all but three of the pundits we tracked. The $1 bet yield of this group was $0.96. Meanwhile, choosing all of the underdogs would have resulted in a 33.8 percent hit rate and $1.02 bet yield.)

This idea is fairly intuitive in sports, given the prevalence of sportsbook odds, but it can be employed in all categories of predictions. In fact, PunditTracker will explicitly incorporate consensus odds into its scoring system; we will elaborate on our method in an upcoming post.

PunditTracker's mission is to bring accountability to the prediction industry by cataloging and scoring the predictions of pundits. The site, which is currently running as a blog, is slated to officially launch in the spring at