FanPost

Maybe Bert's Off-Base?


The TwinsGeek has put together three very thoughtful essays on the subject of pitch counts, analyzing work by Rany Jazayerli and Keith Woolner to suggest that the 100-pitch limit used by just about every major league manager these days is actually woefully conservative for the stated goal -- keeping young pitchers' arms healthy.

I think there's an alternate explanation for the popularity of the 100-pitch limit, though, and it has little if anything to do with Jazayerli's and Woolner's work.

None of this is meant to be a knock on TwinsGeek or his analysis, which is typically spot-on. He shows that the 100-pitch limit is largely an artifact of old-school baseball thinking, and that J&W's work suggests the risk of serious damage to a pitcher's long-term health is very slim even at counts well above 100 pitches. It's a nice piece of sabermetric debunking. Sadly, though, I don't think managers have been falling all over each other to limit their starters to 100 pitches because of J&W's work**; I think it's much more of a mass-CYA maneuver that can be traced back to the Third Law of Management.

** - If I'm wrong, and major league managers are using Jazayerli's creation as an excuse to avoid keeping their starters in the game longer than absolutely necessary, then I'd argue that Jazayerli has only himself to blame. Let's say you're a major-league manager and you've just been told about this new statistic called Pitcher Abuse Points that counts all pitches above 100 and converts them into an Abuse Score. How many Pitcher Abuse Points do you think would be too much, or might get you in hot water with your own management or the local press? Five?

If anything, the fact that Woolner's assistance helped Jazayerli refine his Pitcher Abuse Points to show that a pitcher could rack up as many as a thousand of them without being adversely affected would convince most baseball men to ignore the stat -- how can you take seriously a stat that says you can pile on a thousand 'Abuse Points' per week without any significant negative impact on your pitcher? Had Jazayerli simply called them Pitcher Fatigue Points from the beginning, the measure would likely have been much more acceptable to baseball men.
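Just to make the footnote concrete, here's a toy version of the idea as described above -- only pitches beyond the 100-pitch threshold count toward the score. This is a back-of-the-napkin sketch for the sake of argument, not Jazayerli's actual published formula; the function name (which nods to the "Fatigue Points" rename suggested above) and the sample pitch counts are made up.

```python
# Toy version of the idea described above: only pitches beyond the
# 100-pitch threshold count toward the score.  This is NOT Jazayerli's
# actual published Pitcher Abuse Points formula, just an illustration.

def fatigue_points(pitches_thrown, threshold=100):
    """Count only the pitches thrown beyond the threshold."""
    return max(0, pitches_thrown - threshold)

# A handful of hypothetical pitch counts:
starts = [112, 98, 124, 105, 91]
total = sum(fatigue_points(p) for p in starts)
print(total)  # 41 -- a long way from the thousand-point level the footnote mentions
```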

Now, let me follow up by saying there's no doubt that the 100-pitch limit observed by big-league managers has had a significant impact on the game's statistics. It's easy to look through baseball history and point out all of the following:

- Starting pitchers receive fewer decisions per start today than they did thirty-five years ago.

As a corollary to this point, starting pitchers reach 20 wins far less often than they used to, though that's a combination of the effect of the 100-pitch limit and the move from a four- to a five-man rotation.

- Starting pitchers pitch a smaller percentage of innings today than they did thirty-five years ago.

From 1974 to 1978, pretty much the height of Bert Blyleven's career, starters averaged between 71% and 74% of all innings pitched in the American League. (National League numbers are slightly lower, because pitchers are in the batting order.) From 2005 to 2009, starters averaged about 66% of all innings pitched in the AL. Five or six percentage points may not sound like a whole lot, but keep in mind that there are over 20,000 innings pitched each year in the AL; over 1,000 innings per year have moved from the starting rotation to the relief staff in the past 35 years.
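For anyone who wants to check that back-of-the-envelope figure, here's the arithmetic spelled out. The 20,000-inning total and the 72% / 66% shares are the round numbers from the paragraph above, not exact league totals.

```python
# Back-of-the-envelope version of the innings-share math above.
# These are the round figures cited in the text, not exact league totals.
al_innings_per_year = 20_000   # roughly, for a full AL season

starter_share_1970s = 0.72     # midpoint of the 71-74% range from 1974-78
starter_share_2000s = 0.66     # the 2005-09 figure

shift = (starter_share_1970s - starter_share_2000s) * al_innings_per_year
print(shift)  # ~1,200 innings per year moved from the rotation to the bullpen
```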

It's gotten to the point where there are only two situations in which you'll routinely see a pitcher go over 100 pitches: either he's chasing a milestone (perfect game, no-hitter, shutout), or he's a veteran who's already demonstrated his effectiveness beyond 100 pitches. Roger Clemens, Randy Johnson, and Nolan Ryan are just the top-end examples of pitchers who, in their primes, never worried about pitch counts -- and though Ryan's career ended before pitch counts became as common as they are today, it's hard to imagine any manager enforcing one on him now.

Add in a purely economic argument: a significant number of teams simply don't believe they have the resources to sign a talented young pitcher to a long-term contract once he establishes himself as outstanding. For those teams, it makes little sense to husband a future the team can't afford to take advantage of, and so, from a purely economic perspective, you'd expect teams like the Royals, Pirates, and the like to routinely exceed 100 pitches -- getting the maximum value out of a young pitcher now and letting some other team wildly overpay for his reduced long-term potential. That teams don't do this, despite blatantly pursuing other economic strategies (mid-season call-ups to maximize a player's contract time before arbitration, for instance), suggests that some other factor is at work in enforcing the 100-pitch limit, even for clubs that don't necessarily benefit from it. To me, that factor is the Third Law of Management.

The Third Law of Management, which applies to all management, not just baseball management, can be stated like this:

Management makes policy in order to divert blame away from management.

So, for instance, when your manager denies your request for time off to go to Hawaii for your sister's wedding, it's not his fault for refusing to allow you to attend a family function, but your fault for not satisfying the company policy that all requests for paid time off have to be submitted at least five days prior to the start of the paid time off period. (Or, it's your sister's fault for deciding on a last-minute elopement to Hawaii instead of a church wedding in Faribault.)

I've long been a believer that major league managers, as much or more than any other managerial type, do things from a CYA perspective first, and try to justify the move later. In this case, the 100-pitch limit has eliminated what was once one of the more difficult tasks a manager and his pitching coach would face: at what point in the game is your starter too tired to remain effective?

You can't ask the pitcher himself: even Bert has admitted that he might have given up homers to three men in a row, but when the manager comes out to get the ball he'll protest that he feels fine. And observational cues are few and far between: Bill James, in a rare bit of observational (rather than statistical) analysis, asserted that one of the best 'tells' for a fatiguing pitcher is the behavior of his trailing leg, but depending on a pitcher's mechanics that may be easier or harder to spot.

By moving the decision of when to pull a starter from the manager's and pitching coach's guess as to whether the starter is tired to an 'objective' pitch count, the manager in theory always makes sure that a better-rested pitcher is in the game in a key situation, and in practice pushes the blame for any resulting blow-up off his own decision and onto the summoned reliever -- if the reliever gives up the runs, it's his own fault for not doing his job.

But is it true that managers are making their staffs more effective by making this change? Can we find an advantage to be gained by removing potentially fatigued starters and putting fresh relief arms into the game? Not by raw ERA -- there are too many other, more significant effects on team ERA than whether the man on the mound is a fatigued starter or a rested reliever, and those effects explain why league ERAs are up nearly a run from Blyleven's career peak.

Consider this: if it's true that starting pitchers give up more runs when fatigued, but that fresh relievers give up runs at the same rate regardless of when they enter the game (at least from a fatigue perspective), then we should expect to see a trend in the data -- in eras where starters stayed on the mound until they fell apart, they'd have higher ERAs relative to their relief staffs than in an era where the starter is hurried off the field before he can fall apart. Do we see that? Well...

Starter/Reliever ERA (AL):

1974 - 3.69/3.43
1975 - 3.89/3.50
1976 - 3.59/3.34
1977 - 4.14/3.90
1978 - 3.83/3.64

2005 - 4.52/4.03
2006 - 4.73/4.25
2007 - 4.61/4.33
2008 - 4.48/4.13
2009 - 4.62/4.17

In a word, no -- if anything, the gap between starters' ERA and relievers' ERA has grown since Bert's heyday, though without further analysis it's hard to say whether that's a real effect or just a side effect of league ERAs being higher today than they were then.
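To put a rough number on that caveat, here's a quick sketch using only the figures in the table above. It compares the raw starter-minus-reliever gap with the ratio between the two; treating the ratio as a crude control for the higher-scoring modern era is my assumption, not a proper adjustment.

```python
# Starter/reliever ERA pairs from the table above (AL).
era_data = {
    1974: (3.69, 3.43), 1975: (3.89, 3.50), 1976: (3.59, 3.34),
    1977: (4.14, 3.90), 1978: (3.83, 3.64),
    2005: (4.52, 4.03), 2006: (4.73, 4.25), 2007: (4.61, 4.33),
    2008: (4.48, 4.13), 2009: (4.62, 4.17),
}

def summarize(years):
    """Average raw gap and starter/reliever ratio over a span of seasons."""
    gaps = [era_data[y][0] - era_data[y][1] for y in years]
    ratios = [era_data[y][0] / era_data[y][1] for y in years]
    return sum(gaps) / len(gaps), sum(ratios) / len(ratios)

print(summarize(range(1974, 1979)))  # roughly (0.27, 1.08): starters ~8% worse
print(summarize(range(2005, 2010)))  # roughly (0.41, 1.10): starters ~10% worse

# The raw gap widens by about 0.15 runs, but the relative gap grows less
# dramatically -- which is why it's hard to separate a real effect from the
# higher run environment without more work.
```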

The best we can say based on this tiny sample of data is that the trend of removing starters at 100 pitches isn't obviously helping pitching staffs prevent runs, and might even be hurting teams by putting sub-standard relievers into games where only slightly fatigued starters might be more effective.

There's still a lot of work to do on this question.