I just got back to Austin from St. Louis today, so Wednesday's post goes up now, as I have no guarantee about when I'm getting up tomorrow. Thus, it's me, my dog, and my iPod until this is up.
Before I start, I should apologize to stlcardinals.com for the comment made in the first line of my post last week. The
official site did run a piece looking at steroids, Mac, and the Hall a month earlier. I missed this piece. Sorry, guys. I should have looked closer before making a snide comment.
Now for what I want to talk about today. It has been documented that the Cardinals value veteran players at about $2M per marginal win. What I'm going to look at today is whether or not that is a reasonable rate in the aggregate. My focus will be on team data rather than player data*, however, so this will be of only limited use in evaluating individual players. Regardless, let's look at what teams actually paid for their success in 2006.
First, let's not worry about the definition of 'marginal.' Let's just look at what teams paid for a win in 2006. Payrolls ranged from the Yankees' $194M down to the Marlins' $15M, with the average MLB team carrying a payroll of $77.5M. Win totals ranged from the Yankees' 97 down to the Devil Rays' 61. The average team obviously won 81 games (to keep all of the teams at an equal number of games played, I gave the Giants and Cardinals each a half game for their unplayed rainout). This puts the average value of a win at $946,148. The Yankees were, unsurprisingly, the least economical, spending $2M for each of their wins, while the Marlins, also unremarkably, were the most thrifty, spending only $192,288 per win. If we look at Pythagorean data, we get almost identical results, so I won't bother to cite the numbers here.
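The raw cost-per-win figure is just payroll divided by wins. A minimal sketch of that arithmetic, using the rounded 2006 payrolls quoted above (the Marlins' 78-win total is my addition; small differences from the $192,288 figure above come from rounding the payroll to the nearest million):

```python
def cost_per_win(payroll, wins):
    """Raw dollars spent per win: payroll divided by win total."""
    return payroll / wins

# 2006 figures quoted in the text (payrolls rounded to the nearest $1M):
print(cost_per_win(194_000_000, 97))  # Yankees: 2,000,000.0 per win
print(cost_per_win(15_000_000, 78))   # Marlins (78 wins): ~192,308 per win
```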
But this analysis leaves something to be desired. If a team were to staff itself with twenty-five rookies, each making the league minimum, it could hardly be expected to lose every game. Really, losing 100 games is a pretty big accomplishment in itself. So what you're really paying for is having more wins than the worst team. Thus, I take a team's win total minus that of the worst team as the team's number of marginal wins**.
In terms of actual wins, the worst team in 2006 was the Devil Rays. And if we look at marginal wins, we see that they are significantly more expensive. The Yankees look a lot better in this analysis, and the teams with low win totals look a lot worse. Using this methodology, the median MLB team spent $3.3M on a marginal win in 2006. The Yankees were less efficient than the median, spending $5.1M per marginal win, but they didn't end up spending excessively. The Cubs, however, look really horrible, spending $16.9M on each of their five marginal wins, and the Royals ended up spending $37M to get one more win than the D-Rays. The thriftiest teams were the A's ($1.6M per mw), Twins ($1.5M per mw), and Marlins ($294K per mw). This verifies some of the common sense on this topic: it shows the big-market teams paying market rate for their wins, while the successful small-payroll teams actually received the value they were looking for.
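The marginal-win arithmetic can be sketched as follows. Per the footnote, a $10M floor (25 players at the $400K league minimum) comes off each payroll, and the worst team's 61 wins come off each win total; the team figures below are the rounded 2006 numbers quoted in the text:

```python
LEAGUE_MIN_FLOOR = 25 * 400_000  # $10M every team must spend (see footnote)
WORST_TEAM_WINS = 61             # 2006 Devil Rays

def cost_per_marginal_win(payroll, wins):
    """Dollars of above-minimum payroll per win above the worst team."""
    marginal_payroll = payroll - LEAGUE_MIN_FLOOR
    marginal_wins = wins - WORST_TEAM_WINS
    return marginal_payroll / marginal_wins

# Yankees: ($194M - $10M) / (97 - 61) wins ≈ $5.1M per marginal win
print(cost_per_marginal_win(194_000_000, 97))
```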
Meanwhile, the same analysis applied to Pythagorean wins gives about the same median value ($3.6M), but doesn't make the losing teams with low numbers of marginal wins look quite so bad (the Cubs are at $11.8M per mw). Using Pythagorean wins helps the Indians enormously, as they had an excellent run differential despite a losing season. The Marlins ($273K) remain the thriftiest, followed by the Rockies ($1.6M) and the Indians ($1.6M).
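For readers unfamiliar with the term: a team's Pythagorean record estimates wins from runs scored and allowed rather than from actual results. A sketch using the classic exponent of 2 (the run totals below are hypothetical, chosen only to show the shape of the formula, not any team's actual 2006 numbers):

```python
def pythagorean_wins(runs_scored, runs_allowed, games=162):
    """Expected wins from run totals via the Pythagorean expectation."""
    win_pct = runs_scored**2 / (runs_scored**2 + runs_allowed**2)
    return win_pct * games

# A hypothetical team outscoring its opponents 850-750 projects to ~91 wins:
print(round(pythagorean_wins(850, 750)))
```

A team whose actual wins exceed this estimate (like the 2006 Cardinals mentioned below) is said to have outperformed its Pythagorean record.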
Now, you must be wondering how the Cardinals did in all of this. The answer is that they ended up right in the middle of the pack, spending $3.5M per marginal actual win and $3.6M per marginal Pythagorean win (amazingly, the 2006 Cardinals outperformed their Pythagorean record), right in line with the league medians in both categories. Considering how much so many of the players on that team underperformed during the regular season, this probably indicates that the 2006 Cardinals were actually a pretty good deal at the beginning of the season. Without the total collapse of Marquis and Mulder, the team would probably have been good for at least three to six more wins.
Anyway, what have we learned? The main lesson is that a team should expect to pay somewhere in the neighborhood of $3.3M per win over the last-place team in the league. Given that even the Devil Rays have players like Carl Crawford and Scott Kazmir, it's certainly fair to say that they aren't entirely staffed by replacement-level players. Thus, you should probably be able to get away with paying a little less per win over a replacement-level player. Even so, this analysis suggests that expecting to pay as little as $2M per marginal win, even over a replacement-level player, is unrealistic. I don't think it's possible (and for the 2006 Cardinals, it wasn't) to do much better than $2.5M consistently.
If you have any suggestions regarding how I might improve this analysis, or if you want to see the data for an individual team, feel free. I'll probably be asleep until noon, though. I hope that I at least gave people something interesting to think about.
*I really find translating a player's stats into a number of marginal wins a somewhat dicey and very model-dependent business. It has obviously been done, but team wins are team wins. Aggregate team data is very easy to interpret and can be analyzed much more straightforwardly.
**By this logic, each team is automatically obligated to spend the league minimum of $400,000 on each of 25 players (a total of $10M), so I subtracted this amount from each team's payroll.