Saturday, May 07, 2011

Clutch hitting and getting killed by a puck

There was something someone said about clutch hitting a few months ago -- I think it was Tango -- that took a while to sink in for me.

It went something like this: saying that clutch hitting ability "does not exist" is silly. Humans are different, and different people react to pressure in different ways. So, *of course* there must be differences in clutch hitting ability. The question isn't whether or not clutch hitting exists, because we know it must, but *to what extent* it exists.

It turns out that extent is small. The best we can say is that clutch hitting studies have found that the SD of individual clutch tendencies is about 3 percent of the mean. So for players who are normally .250 hitters, two out of three of them will be between .242 and .258 in the clutch, and 19 out of 20 of them will be between .235 and .265. ("The Book" study on the topic used wOBA, rather than batting average, but this is probably still roughly true.)
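(If you want to check that arithmetic, here's a quick Python sketch. The only inputs are the .250 baseline and the "3 percent of the mean" figure from the paragraph above.)

```python
# Rough sketch of the clutch spread described above.  The .250 baseline and
# the SD of 3 percent of the mean are just the figures quoted in the text.
mean = 0.250
sd = 0.03 * mean                                           # about .0075

print("1 SD: %.3f to %.3f" % (mean - sd, mean + sd))       # roughly .242 to .258 (about 2 in 3 players)
print("2 SD: %.3f to %.3f" % (mean - 2*sd, mean + 2*sd))   # roughly .235 to .265 (about 19 in 20 players)
```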

That's pretty minor, especially compared to other factors like platoon advantage, and so on. More importantly, there's not nearly enough data to know which are the real clutch hitters and which are the real clutch chokers. When your favorite pundit pronounces player X as "clutch," that's still completely pulled out of his butt.

So the conclusion remains that calling specific players "clutch" is silly, but the stronger statement "clutch hitting does not exist" is unjustified.

------

As I said, that took a while to sink in for me. I'm willing to agree with it. But I still have reservations. Because, you can take this "humans are different" stuff to extremes.

Suppose you found that a certain bench player hits much better than usual on the first day of the month. Some announcer notices and says the manager should always play him on those days. The sabermetricians step in and say, "that's silly: there's no reason to believe it's a real effect, and the guy's not very good on all the other days."

But, by the same logic ... all humans are different. For me personally, every day on the first of the month, I'm amazed how fast the past month flew by. It seems to me that time is going faster as I get older, and it makes me a little sad. On the other hand, other people might be happier on the first of the month. Maybe that's when they get paid, and they're feeling rich.

So, since humans are different, and circumstances are different, why couldn't there be a *real* "first of month" effect? The same logic says there *has* to be.

But, the thing is ... any such effect is probably very, very small. Too small to measure. There's no way it'll affect player performance (in my estimation) even one one-hundredth as much as clutch.

It's real, but it's too small to matter.

In cases like that, is it OK to say that "there's no such thing" as "first of month hitting talent"? Maybe it's not technically true, but I don't think that'll always stop me from saying it anyway. But, if I remember, I'll say "if it exists, it's probably infinitesimal."

For clutch, the Andy Dolphin study mentioned in "The Book" came up with an SD of clutch talent estimated at .008 in wOBA. I'm not 100% willing to accept that, mostly because, as Guy points out in a recent clutch thread on Tango's blog, there might be a natural clutch difference due to batter handedness or batting style that doesn't really reflect "clutch talent" as it's normally understood.

But, what I might choose to say is that there's weak evidence of a small clutch effect, and add that there isn't nearly enough evidence to know who's weakly clutch and who's weakly choke.

Or, I might say that there's "no real evidence of a meaningful clutch effect," which says the same thing, but with a more understandable spin.

------

Anyway, it occurred to me, while thinking about this, that we like to think about things as "yes/no" when what's really important is "how much". By that, I don't mean the moral argument about "black and white" versus "shades of grey." What I mean is something more quantitative -- "zero or not zero" versus "how much?"

Take, for example, Omega 3 fatty acids. I hear they're good for you. You read about them in the paper, on milk cartons, and on fish products.

But, *how* good for you are they? Isn't it important to quantify that? The media doesn't seem to think so.

Here, for instance, is an article from the CBC. It talks about which Omega 3s are better than others, and ends with a recommendation that people eat 150 grams of fish a week. But ... how big are the benefits? The CBC article doesn't tell us. You'd expect better from the Wikipedia entry, but even that doesn't tell us.

So, how are we supposed to decide whether it's worth it?

I mean, suppose you really, really don't like fish. Won't your decision on whether or not to eat fish anyway depend on how big the benefit is?

And, don't other foods have benefits too? If I eat fish for lunch, that means I won't be eating oat bran instead. How can I decide which is better for me without the numbers?

Or, suppose I have a choice ... go to a fish restaurant for lunch, or go for a jog around the block. Both might help my heart. Which one will help more? Or, suppose a nice piece of salmon in my favorite restaurant is $5 more than a piece of chicken. Could I do better with the $5? Maybe I could save up and use the money to switch to a more expensive gym, where I'll go more often. Or, I could put it in a travel fund, to buy better travel health insurance next trip I go abroad. Which option for my $5 is best for my long-term health?

There are lots of people who hear "fish is good for your heart," so they start buying fish. What they're really buying is security and good feelings. Because if you ask them to quantify the benefits, they just look at you blankly, or they quote some authority saying it's good for you. They treat it as a yes/no question -- is it good for you? -- when the important question is "how much is it good for you?"

Maybe I'm just cynical, but when someone tells me a food is good for me, without quantifying the benefits, I just assume the benefits are tiny, like clutch hitting. After all, for a study to make a journal, all you need is 5 percent significance, which means that most of the claims are probably not true. And, even if they *are* true, the study probably found that the benefits are small (though possibly statistically significant).

In his book "Innumeracy," back in the 80s, John Allen Paulos suggested there be a logarithmic scale for risk, so when the media tell you something is risky, they could also use that number to tell you how much. I'd like to see that for everything, not just risk. And, I'd like to see the scale changed. Because, if it's just a logarithm, people will still ignore it. If someone tells me I should have had the salmon, I might say, "Why? It's only a 0.3 on the Paulos scale, which is almost nothing." And they'll say, "0.3 is better than nothing. It all adds up. You should look out for your health."

But, what if you expressed the benefits in terms of something real, like exercise? Everyone intuitively understands the benefits of exercise, so the scale would make sense. And, it would make the insignificance of small numbers harder to rationalize away. So when someone insists I eat the salmon, I can say, "look, it's $5, and it's only the equivalent of 30 meters of jogging." They can still say, "30 meters is better than nothing!" And I'll say, "look, I'll have the chicken, and when we get outside the restaurant, I'll jog the 30 meters to the car, and save the $5."

They could use that scale in the supermarket, too. I buy milk with Omega 3 (not for the health benefits, but because it stays fresh longer). Wouldn't it be great if it said on the carton, "Each glass gives you the benefits of 0.3 push-ups"? That would be awesome.

I'm not sure if exercise is the absolute best reference point. Maybe "days of life" is better. ("Each floret of broccoli adds 3.2 seconds to your lifespan!") But, still, it should be possible to come up with *something* that'll work.

------

For risk, a good unit of measure would be "miles of driving". That works well because it's widely recognized that driving is dangerous, and we all know people who died in car accidents, so we have an idea of the risk. But, there's no moral stigma associated with it (unlike, say, smoking), so we can be fairly rational about it.

In a comment thread yesterday on Tango's blog, there was a discussion about putting a protective barrier down the lines of baseball stadiums, to prevent people from getting hurt by foul balls. That would be similar to what the NHL did, when they installed a mesh partition behind each net after the death of a spectator in 2002.

Suppose the hockey netting wasn't there. What would the risk be?

From 1917 to the end of the 2002 season, there were 37,480 regular season NHL games. Assuming 15,000 fans per game, that's 562 million fans. Suppose one-quarter of those fans are sitting in high-risk seats; that's 140 million fans. Finally, suppose that players shoot a lot harder now, so today's risk is double the historical average. That means it only takes 70 million of today's fans to shoulder the same risk as throughout the NHL's history. (We could add a bit for playoffs and pre-season, but never mind.)

So, that's one death per 70 million fans, or a risk of 1 / 70,000,000 of dying at any given game. Is that big or small? It's hard to say. What is it in terms of driving?

In 2010, there were 1.09 deaths per 100 million miles travelled. Let's round that down to 1.00, just to make the calculation easier. So there's 1 / 100,000,000 of a death per mile.

That means the hockey game is the equivalent of 1.43 miles of driving.


So that's how I'd say it: putting the mesh up at hockey games makes each fan behind the net safer by 1.43 miles of driving.
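For anyone who wants to follow along, here's the whole back-of-the-envelope calculation as a short Python sketch. Every input is just one of the guesses above -- 15,000 fans a game, a quarter of them in high-risk seats, doubled modern risk, and roughly one driving death per 100 million miles.

```python
# The hockey-vs-driving arithmetic from the post, step by step.
games = 37480                          # NHL regular-season games, 1917-2002
fans = games * 15000                   # ~562 million fan-games
high_risk = fans / 4                   # ~140 million in high-risk seats
todays_equivalent = high_risk / 2      # assume today's shots are twice as risky: ~70 million

risk_per_game = 1.0 / todays_equivalent     # one death in ~70 million fan-games

deaths_per_mile = 1.0 / 100000000           # 1.09 deaths per 100M miles, rounded down

print(risk_per_game / deaths_per_mile)      # ~1.4 miles of driving per game
```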

Doesn't that give you a really good intuitive idea of the risk involved?

Of course, that's death only, and not injury. But I'm sure you could find injury data for car accidents, and for puck injuries, and come up with some kind of reasonable scale. I'm guessing that if you did that, you'd still find it's on the same order of magnitude -- a couple of miles. But I don't know for sure.

And, hey, now that I think about it, you could treat *healthy* things as "miles of driving saved." If something saves you one minute of lifespan, that's easily converted to driving. 100 million miles, at an average of 30 miles an hour, is about 380 years of driving. (At six hours of driving (or passenging) a week, that's about 10,000 person-years of driving per fatal accident. That would mean that about one American in 200 eventually winds up dying in a car accident. Sound about right?)

Suppose the average driver has 40 years of life left. Then every 380 years of driving wipes out 40 years of life. That works out to 9.5 years of driving per year of life. Round that to 10. That means that every hour you drive -- 30 miles -- costs you six minutes of life. So five miles of driving costs you a minute of life.
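Here's that conversion laid out as a quick Python sketch, using the same rounded numbers as above -- one death per 100 million miles, 30 miles an hour, and 40 years of remaining life for the average victim.

```python
# Converting miles of driving into minutes of life, with the rounded
# inputs from the text.
miles_per_death = 100000000
mph = 30

hours_per_death = miles_per_death / mph              # ~3.3 million hours behind the wheel
years_of_driving = hours_per_death / (24 * 365)      # ~380 years of nonstop driving

years_lost = 40
driving_per_life = round(years_of_driving / years_lost)   # ~9.5, rounds to 10

minutes_lost_per_hour = 60 / driving_per_life        # 6 minutes of life per hour of driving
miles_per_minute = mph / minutes_lost_per_hour       # 5 miles of driving per minute of life
print(miles_per_minute)                              # 5.0
```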

(Again, that's death only, and not injury. In terms of quality of life, you'd probably want to bump up the "5 miles" figure, because bad health usually makes you miserable before you die, but car accidents often kill you instantly. But let's stick with five miles for now.)


So if eating salmon for a week saves you one minute of lifespan, it's like cutting five miles off your commute one day. I made that "one minute" number up; if anyone knows how to figure out what the real number is, let me know.

------

Let's do cigarettes. Actually, let's just do lung cancer, to make it easier.

According to Wikipedia, 22.1% of male smokers will die of lung cancer before age 85. Let's assume that entire amount is from cigarettes.

Since this is a back-of-the-envelope calculation, let's just make some reasonable guesses. I'll assume the average male smoker starts at age 18 and smokes a pack a day. I'll also assume that when a smoker dies of lung cancer, it's at age 70 on average, and it cuts 15 years off his lifespan.

So: 52 years times 365 days times 20 cigarettes equals ... 379,600 cigarettes. 15 years of lost life equals 7,884,000 minutes. So each cigarette equals 21 minutes.

Multiply the 21 minutes by 22.1% and you get 4.6 minutes.


Google "cigarette minutes of life" and you get figures ranging from 3 to 11 minutes ... and that's of *all causes*, not just lung cancer. So, we're in the right range.

4.6 minutes equals 23 miles.

If you're a pack-a-day smoker, your risk is the same as if you drove from New York to Los Angeles every week or so.
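Here's the whole cigarette chain as a quick Python sketch, again using only the rough guesses from above, plus the five-miles-per-minute figure from the driving section.

```python
# The cigarette arithmetic from the post, step by step.
years_smoking = 70 - 18                          # 52 years, pack a day
cigarettes = years_smoking * 365 * 20            # 379,600 cigarettes
minutes_lost = 15 * 365 * 24 * 60                # 7,884,000 minutes if lung cancer strikes

per_cigarette = minutes_lost / cigarettes        # ~21 minutes per cigarette
expected = per_cigarette * 0.221                 # ~4.6 minutes, weighted by the 22.1% chance

miles_per_minute = 5                             # from the driving conversion above
print(expected * miles_per_minute)               # ~23 miles of driving per cigarette
print(expected * miles_per_minute * 20 * 7)      # ~3,200 miles a week at a pack a day
```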

------

If I were made evil dictator of the world, I would insist that every media report on risks and benefits tell you *how much*. Every report of a health scare, every quote from a safety group, every recommendation from a nutritionist, would need to include a number. Because, really, when someone tells you "vegetables are healthy," that's useless. Even if it's true, how true? Is it true like "the platoon advantage exists"? Is it true like "clutch hitting exists"? Or is it true like "first of month hitting exists"?

The difference matters. We need the numbers.



10 Comments:

At Saturday, May 07, 2011 12:38:00 PM, Anonymous Anonymous said...

Definitely a useful way to provide some perspective.

On your cigarette example, to get the 21 minutes/cigarette, I don't see where you factored in the 22.1%. The calculation you made looks like all male smokers lose an average of 15 years of their life. So 22.1% of 21 minutes would give about 4.5 minutes/cigarette, which agrees better with the range.

 
At Saturday, May 07, 2011 12:40:00 PM, Blogger Phil Birnbaum said...

Right! Will fix the post. Thanks!

 
At Saturday, May 07, 2011 12:43:00 PM, Blogger Phil Birnbaum said...

Post fixed. For those getting here late, I originally forgot to factor the 22.1% into the lung cancer calculation, so the result was about 4.5 times too high.

 
At Sunday, May 08, 2011 12:46:00 AM, Blogger j holz said...

In cases like this, nobody wants to be politically incorrect and say that one death is acceptable if it improves the viewing experience of millions of fans by some small amount. Yet this is exactly how we should look at it. Baseball fans obviously realize that seats down the foul lines carry the greatest risk of being struck by a batted ball, but they continue to pay premium prices to sit there because the overall experience is worth it to them.

I really like the post, but you don't seriously think you can accurately estimate the risk of dying at a hockey game from one death, right? Other than that, we could definitely use a simple metric for translating risk into everyday terms.

 
At Sunday, May 08, 2011 1:27:00 AM, Blogger Wheell said...

This is your best post ever. What is the marginal value of fish oil? Broccoli? A Snickers bar? 50 sit-ups? That we don't have answers makes us question the claims, both positive and negative.

 
At Sunday, May 08, 2011 10:25:00 AM, Blogger Phil Birnbaum said...

>"you don't seriously think you can accurately estimate the risk of dying at a hockey game from one death, right?"

Well, depends what you mean by "accurately". There's a fairly large confidence interval, sure.

When googling, I found a woman who died at a non-NHL game in New York in the 40s, and someone who said it happened three times in Canada (if I remember correctly). So 1 NHL death in 85 years, or whatever, seems in line with those others.

But, yeah, you could make a case for another reasonable number, if you wanted to.

 
At Tuesday, May 10, 2011 12:13:00 AM, Blogger BobboFitos said...

I loved this entry, thank you

 
At Saturday, May 14, 2011 4:43:00 PM, Blogger Musical Daddy said...

This was great! Thank you!

 
At Saturday, May 14, 2011 10:31:00 PM, Blogger Molly said...

This is great. I love this post because I hate statistics.

 
At Saturday, May 28, 2011 11:07:00 PM, Blogger King Yao said...

brilliant!

 
