Republished with permission from PunditTracker.com.
We thought now would be an appropriate time to revisit the weekly NFL picks of the prognosticating class. We have compiled all the picks made by the ESPN, Yahoo, and CBS Sports pundits through the divisional round games. To their credit, all three sites do a terrific job keeping scorecards.
Here is how the group stacks up on a hit rate basis (# of correct calls divided by # of total calls):
[Table: 2011-12 Hit Rate]
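The hit rate above is just correct calls over total calls. A minimal sketch in Python (the team picks shown are invented for illustration):

```python
def hit_rate(picks, winners):
    """Fraction of a pundit's picks that matched the actual winner."""
    correct = sum(p == w for p, w in zip(picks, winners))
    return correct / len(picks)

# Hypothetical slate: the pundit got 2 of 3 games right.
rate = hit_rate(["NE", "GB", "NO"], ["NE", "NYG", "NO"])  # 2/3 ≈ 0.667
```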
The so-called experts did not fare well this season. With the exception of Mike Silver (Yahoo), all of them underperformed both Yahoo's and ESPN's users, as well as Accuscore (a prediction service that uses game simulations).
Let's look at the data from a different angle. If you placed $1 on each of the pundits' picks—based on "moneyline" odds—how much would you have ended up with, on average? We removed the sportsbook's commission ("vig") from the odds when calculating these figures:
[Table: $1 Bet Yield]
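Stripping the vig from moneyline odds can be sketched as follows. This is a simplified model, not necessarily the exact procedure used for the table: the implied probabilities of the two sides sum to more than 1 (the sportsbook's margin), so we normalize them and convert back to fair payouts. The odds in the example are hypothetical.

```python
def decimal_payout(american):
    """Total dollars returned on a winning $1 stake at American moneyline odds."""
    return 1 + (american / 100 if american > 0 else 100 / -american)

def fair_payouts(fav_odds, dog_odds):
    """Remove the commission: normalize the implied probabilities
    (which sum to more than 1) and convert back to $1 payouts."""
    p_fav = 1 / decimal_payout(fav_odds)
    p_dog = 1 / decimal_payout(dog_odds)
    overround = p_fav + p_dog  # > 1 because of the vig
    return overround / p_fav, overround / p_dog

# Hypothetical line: favorite at -200, underdog at +170.
fav, dog = fair_payouts(-200, +170)
# A correct $1 bet returns `fav` (or `dog`) dollars; an incorrect one returns 0.
```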
Here, the group looks a smidge better. Four of the members outperformed the Yahoo user base. Still, 10 of the 16 pundits were in the red. We should note that all of these picks are simply based on whom the pundits expected to win; their selections may well have been different if they were factoring in the odds (The guys from CBS Sports also predict against the spread – see that scorecard here).
In our "Scoring Challenges" post, we noted the following:
The [hit rate] is useless without context. If I predict each day that the sun is going to rise tomorrow, I am (hopefully) going to have a perfect hit rate. Using this system, I receive a score twice as high as the pundit who predicted the 2008 financial collapse but then missed a trivial call the following year. That hardly seems fair, which suggests that predictions should somehow be calibrated for "boldness."
The "$1 Bet Yield" captures this notion of boldness. Pundits with low hit rates can still have high bet yields (greater than $1) if they get some bold calls right. These pundits may not be the most accurate prognosticators, but the calls they do get right are sufficiently out of consensus to yield a positive net payout. Meanwhile, the daily sunrise predictor's perfect accuracy would generate virtually no winnings.
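The contrast can be made concrete with a toy comparison (all numbers invented for illustration): a "sunrise" picker who always backs near-certain favorites versus a contrarian who is wrong more often but cashes longshot payouts.

```python
def bet_yield(results, payouts):
    """Average return per $1 staked: a win returns the fair payout,
    a loss returns nothing."""
    returns = [pay if won else 0.0 for won, pay in zip(results, payouts)]
    return sum(returns) / len(returns)

# Always right, but backing huge favorites paying ~$1.02 per $1:
safe = bet_yield([True] * 10, [1.02] * 10)              # yields 1.02

# Right only 4 of 10 times, but at ~$3.00 underdog payouts:
bold = bet_yield([True] * 4 + [False] * 6, [3.0] * 10)  # yields 1.20
```

Despite a 40 percent hit rate, the bold picker's yield beats the perfect-accuracy picker's, which is exactly why hit rate alone is a misleading scorecard.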
(While we used the ESPN and Yahoo users as a proxy for crowd consensus, one of our readers pointed out that a more representative dataset would be the teams favored by Las Vegas oddsmakers. Simply picking the Vegas favorites would have resulted in a 66.2 percent hit rate this season, better than all but three of the pundits we tracked. The $1 bet yield of this group was $0.96. Meanwhile, choosing all of the underdogs would have resulted in a 33.8 percent hit rate and $1.02 bet yield.)
This idea is fairly intuitive in sports, given the prevalence of sportsbook odds, but it can be employed in all categories of predictions. In fact, PunditTracker will explicitly incorporate consensus odds into its scoring system; we will elaborate on our method in an upcoming post.
PunditTracker's mission is to bring accountability to the prediction industry by cataloging and scoring the predictions of pundits. The site, which is currently running as a blog, is slated to officially launch in the spring at PunditTracker.com.