Statistics are fun, and some people make their living off of them. "Stat guys" like Elliot Kalb and John Hollinger like to break the game of basketball down into its component ones and zeroes. But this isn't The Matrix; this is a sport. Statistics are interesting, often revealing, and can help you recognize trends that might otherwise go unnoticed.
But there is a human element involved that cannot be quantified or measured in any way. Bill Russell is the perfect example. He didn't dominate the stat sheet (except in rebounding), but he did dominate the league, winning 11 championships in his 13 NBA seasons. Yet no one really feels comfortable ranking him as the best player ever. Guys like Michael Jordan, Shaq, and Wilt Chamberlain are more popular choices, essentially because they won championships and dominated statistically. Because they won while putting up legendary (and in some cases unreachable) numbers, it is easier to justify their all-time ranking. But it takes all three of those men together to equal Russell's championships, the ultimate statistic.
It's too easy to rely on numbers. The general consensus is that, the deeper we dig, the more detailed the analysis, the closer we can come to The Truth. Hollinger, and sites like 82games.com, have developed a mind-numbing array of ways to measure basketball players and the game itself. True Shooting Percentages, Player Efficiency Ratings, "Hands" Ratings, on and on...these things are supposed to provide us, the fans, with a pure and unbiased view of what really happens on the basketball court.
While I find it interesting to learn what percentage of Wang Zhi-zhi's shots were blocked during the 2003-04 season, I'm not sure it's a true indicator of Wang's worth -- or lack thereof -- as a basketball player. Don't you get the feeling that we're starting to overthink things just a little? Take Hollinger's latest article, Failure To Stop. According to him, the worst defensive team of all time is this year's version of the Seattle Supersonics. He uses something called the Defensive Efficiency Rating to prove that Seattle is the worst ever, and he even gives us the five worst defensive teams of all time.
The thing is, a few months ago we posted an article called Worst Defensive Team Ever (that's right, John; we did it first). We didn't use anything as convoluted as a DER to prove our point. We just used bottom-line numbers...stats that are just as compelling as Hollinger's -- and a hell of a lot easier to relate to.
Consequently, I sent the following e-mail message to Hollinger in response to his article:
I understand that you're a statistical wizard. I'm sure the Defensive Efficiency Rating is a powerful and fairly accurate formula. However, it must contain some sort of flaw, considering the true "Worst Defensive Team Ever" wasn't even mentioned in your article.

I'll be interested to see whether he responds, and what he'll have to say.
I'm talking about the 1990-91 Denver Nuggets. They surrendered 130.8 points per game, which is 34 points more than the best defensive team that season (Detroit allowed only 96.8 ppg) and 24.4 points more than the league average of 106.4 ppg. By comparison, the Sonics (106.3 ppg) allow an average of 17.7 points more than the league's best defensive team, San Antonio (88.6). And the Sonics' gap versus the league average is even smaller.
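For anyone who wants to check my math, the gaps above are simple subtractions from the figures in this paragraph; here's a quick sketch (the team labels and dictionary layout are just mine for illustration):

```python
# Points-allowed gaps: a team's ppg allowed minus the league's best
# defense and minus the league average, using the numbers cited above.
teams = {
    # name: (ppg_allowed, best_defense_ppg, league_avg_ppg)
    "1990-91 Nuggets": (130.8, 96.8, 106.4),
    "Sonics (this year)": (106.3, 88.6, None),  # league average not cited here
}

for name, (allowed, best, avg) in teams.items():
    print(f"{name}: {round(allowed - best, 1)} ppg worse than the best defense")
    if avg is not None:
        print(f"{name}: {round(allowed - avg, 1)} ppg worse than the league average")
```

Run it and you get the same 34, 24.4, and 17.7 point gaps quoted in the paragraph above.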
Furthermore, the '91 Nuggets let their opponents shoot 51.2 percent, while the Sonics' opponents are shooting 48.9 this year. The Nuggets never held a single opponent under 100 points all season. They held opponents under 110 points only four times! They allowed the opposition to score 130 points or more a mind-boggling 37 times...almost half the season! And this team doesn't even warrant a mention in your article about the worst defensive teams? Great googly moogly.