To go beyond my own biased week one storylines, I looked at some actual data. I've wondered for some time how detrimental the Primacy Bias really is. What if it's not harmful at all? What if, most times, what you see in week 1 is what you get, more or less, for the rest of the season?
So I asked: do week 1 stats actually predict future performance in a meaningful way? I started with the 10 best and worst performances at each position in week 1 from one of my PPR leagues, including only players with reasonably high ownership who didn't almost immediately land on some kind of IR. I then plotted those week 1 totals against each player's season average (see below).
There was significant regression at every position: the best week 1 performances proved unsustainable across the board. Even the league's leading passer, represented by the Denver blue line above the rest, has averaged far fewer points per week than week 1 would have had you hoping for. The worst week 1 performances by heavily owned players, on the other hand, have tended to improve over the season so far. Sticking with Denver examples, Eric Decker opened the season with 5 fpts but has averaged 16.2 fpts/game overall thanks to a steady string of good showings. I was frankly surprised that there weren't more straight lines. Jamaal Charles is the only guy whose top 10 start actually translated into a slightly better season average in PPR fantasy points.

The bottom line is that those memorably good or bad week one fantasy totals by and large failed to predict a player's subsequent value. We hear about regression all the time in fantasy baseball, owing to its longer season and much greater sample size, but these data show it has to be expected in fantasy football too. Even though we can't help but place a higher emphasis on the week one data, there is no reason to believe it predicts the future performance of any given skill player.
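For anyone who wants to try this on their own league, here's a minimal sketch of the comparison described above. The player names and point totals are entirely made up for illustration; the real inputs would be your league's week 1 scores and season averages.

```python
# Hypothetical sketch: did week 1 performance hold up over the season?
# All names and numbers below are invented, not the actual league data.

def regression_summary(players):
    """Given (name, week1_pts, season_avg) tuples, count how many
    players' season averages fell below or rose above their week 1 line."""
    declined = sum(1 for _, wk1, avg in players if avg < wk1)
    improved = sum(1 for _, wk1, avg in players if avg > wk1)
    return declined, improved

# Illustrative "top 10 week 1" group: big openers that mostly faded.
top_week1 = [
    ("QB A", 35.0, 22.1),
    ("RB B", 28.4, 29.0),  # the rare start that actually held up
    ("WR C", 24.7, 14.3),
]

# Illustrative "worst week 1" group: slow starts that bounced back,
# e.g. a Decker-like 5 fpts opener followed by a strong average.
bottom_week1 = [
    ("WR D", 5.0, 16.2),
    ("RB E", 3.1, 11.8),
]

print(regression_summary(top_week1))     # prints (2, 1)
print(regression_summary(bottom_week1))  # prints (0, 2)
```

With real data, a mostly lopsided count in each group is exactly the regression-toward-the-mean pattern the plot shows.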