***All numbers here are courtesy of Corsica.hockey (thankfully I downloaded the numbers before it went down for the summer)
I guess the first real question here is: how do you grade a projection? There are a lot of ways to do this; here's how I did it:
The first thing I had to do was rescale the projections, which were based on the 2016-2017 averages. The league numbers change a bit each year, so you have to account for that. My actual projections predict how much better/worse than average you would expect a goalie to do; when I made them I provided a Google Doc that had them scaled to the previous year's average. For the sake of comparing the projections to what actually happened, I'll scale them here so they're on equal footing with this past year's numbers (I could have instead expressed this year's numbers as better/worse than average, but I prefer it this way).
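As a minimal sketch, the rescaling might look like this (the function name and the averages passed in are hypothetical placeholders of mine, not anything from the original spreadsheet):

```python
import pandas as pd

def rescale_projection(proj: pd.Series, avg_2016_17: float, avg_2017_18: float) -> pd.Series:
    """Re-center a projected Sv% column onto this season's scale.

    The projection encodes how much better/worse than average a goalie
    should be, so we strip out last year's league average and add back
    this year's, keeping the better/worse-than-average part intact.
    """
    delta = proj - avg_2016_17   # better/worse than the 2016-2017 average
    return avg_2017_18 + delta   # the same delta on the 2017-2018 scale
```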
Next, for each goalie who had a projection and played this year (even if they faced just one shot), you take the absolute value of the difference between their projection and their observed Sv%. You then weight each goalie's absolute difference by how many shots they faced and take the weighted average across all goalies, so goalies who faced fewer shots matter less than those who faced more.
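In code, this grading metric is just a shot-weighted mean absolute error. A sketch, with hypothetical array names:

```python
import numpy as np

def shot_weighted_mae(projected: np.ndarray, observed: np.ndarray, shots: np.ndarray) -> float:
    """Shot-weighted mean absolute error between projected and observed Sv%.

    Weighting by shots faced means a goalie who saw one shot barely
    moves the number, while a full-time starter counts in full.
    """
    abs_error = np.abs(projected - observed)
    return float(np.average(abs_error, weights=shots))
```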
So is that it? No, because while checking how far my projections deviated is nice, without some sort of reference we have no idea how good they actually were. We need to check how other ways of predicting this past year's numbers would have done. Ideally I'd include projections made by other people, but I don't know of anyone who made projections for Low/Mid/High Sv%, so I'll just include a few simple baselines. Besides my projections, I'll also test: the previous year's numbers, each player's career numbers (from 2007-2008 onward), and league average.
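Running all four predictors through the same metric might look like the sketch below. It builds on the `shot_weighted_mae` helper above, and the DataFrame and its column names are made up for illustration:

```python
import numpy as np

# df holds one row per goalie who both had a projection and played
# in 2017-2018; all column names here are hypothetical.
predictors = {
    "Projections":    df["projected_sv"],
    "Previous Year":  df["sv_2016_17"],
    "Career":         df["career_sv"],                     # 2007-2008 onward
    "League Average": np.full(len(df), league_avg_2017_18),
}
for name, pred in predictors.items():
    err = shot_weighted_mae(pred, df["sv_2017_18"], df["shots_2017_18"])
    print(f"{name}: {err:.4f}")
```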
With all that out of the way, here are the numbers (lower is better):
| Type           | LSv%   | MSv%   | HSv%   |
|----------------|--------|--------|--------|
| Projections    | 0.0053 | 0.0123 | 0.0208 |
| Previous Year  | 0.0110 | 0.0193 | 0.0317 |
| Career         | 0.0069 | 0.0147 | 0.0276 |
| League Average | 0.0052 | 0.0121 | 0.0225 |
Let's first look at how the projections did overall across the three danger zones. Basically, LSv% looks better than MSv%, which looks better than HSv%. You might be tempted to conclude that we're better at predicting them in that order. That's not true, though: remember that LSv% and MSv% both contain less "talent" than HSv%, so their observed spread is smaller, and a smaller observed standard deviation mechanically produces a smaller average difference. It may look better, but it isn't. A better way of doing this would probably have been to express the errors in terms of standard deviations (this isn't really a big deal, though).
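For what it's worth, that standard-deviation version would just divide each zone's weighted error by the weighted spread of the observed numbers. A quick sketch, reusing the same hypothetical arrays as above:

```python
import numpy as np

def sd_scaled_mae(projected: np.ndarray, observed: np.ndarray, shots: np.ndarray) -> float:
    """Shot-weighted MAE expressed in units of the observed spread.

    Dividing by the (shot-weighted) standard deviation of the observed
    Sv% puts LSv%, MSv%, and HSv% on a comparable scale, since HSv%
    simply varies more from goalie to goalie.
    """
    abs_error = np.abs(projected - observed)
    mean_obs = np.average(observed, weights=shots)
    sd_obs = np.sqrt(np.average((observed - mean_obs) ** 2, weights=shots))
    return float(np.average(abs_error, weights=shots) / sd_obs)
```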
OK, now let's look at how the four "projections" did. For each category there's a clear order. Just using last year's numbers is clearly the worst in every category (which is of course why you shouldn't trust just one year of data). Next come the player's career numbers (from 2007-2008 onward), then league average, then my projections. My projections and league average are virtually identical for LSv% and MSv%, but mine edge it out slightly for HSv%.
You might also be surprised by how well league average does, but I think it makes sense. Goalie performance contains a lot of randomness, so projecting average is a good bet. This is especially true for LSv% and MSv% (and to a lesser extent HSv%), which are both mostly random.
Conclusion
To conclude, my projections for this past year turned out to be better than a player's previous year's numbers and (to a lesser extent) his career numbers, with the edge being largest in HSv% and smallest in LSv%. Compared to league average, my projections were essentially the same for LSv% and MSv% and slightly better for HSv%.