Today several news outlets reported the World Health Organization's (WHO) claim that 4% of all deaths worldwide are attributable to alcohol consumption (most reports were little more than a reprinting of the WHO press release). There is no doubt that overconsumption of alcohol is an important public health problem, and it has reached horrifying levels in Russia. But 4% of all deaths is a rather extraordinary claim, both in magnitude and precision, particularly since it was often phrased even more precisely, as "just under 4%" or 3.8%. It should evoke some skepticism in anyone hearing it, even someone who does not share my experience of discovering that the official WHO claims in my two areas of greatest current expertise are dead wrong. You might think that a reporter would ask for some explanation, at least a hint of how the number was derived if not a defense of the actual analysis, but apparently this was not the case.
Some questions that came to my mind: Which causes of death are you counting? (Some reports did actually list those.) What portion of the total deaths from each of those causes are you counting? (A high percentage of cirrhosis deaths makes sense, but the portion of breast cancer deaths – one of the endpoints that was generally listed – would have to be very small.) Are you trying to estimate a total that includes everyone who dies even so much as a day earlier than they would have as a result of their drinking? (That is quite likely a high number, including most everyone who dies of degenerative disease and ever drank too much, perhaps well over 4%, but not really what we think of when we attribute someone's death to a particular cause.) If not that "even one day" standard, what is the cutoff?
It turns out that the full WHO report did address the second of these questions. If you take a few minutes to trace back to their underlying report, it actually has a few pages (out of the 85 total) that unpack the calculations a bit. They report what percentage of the deaths from various diseases they attribute to alcohol. Figuring out these numbers seems ridiculously difficult: you have to make numerous estimates -- about exposure levels, rates of each relevant disease in the absence of alcohol, and how much someone's risk increases as a result of drinking a particular amount -- and repeat that exercise for every country, because each of these numbers varies radically from one population to another. They claim to have done it but do not explain how.
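To give a sense of the kind of calculation being described, here is a minimal sketch using Levin's population attributable fraction formula, a standard tool for this sort of estimate. The report does not say this is the method WHO used, and the exposure prevalence and relative risk below are made-up illustrative numbers, not figures from the report.

```python
def attributable_fraction(prevalence, relative_risk):
    """Levin's formula: the share of cases in a population that would not
    have occurred in the absence of the exposure, given the fraction of
    the population exposed and the relative risk among the exposed."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Hypothetical inputs: suppose 30% of a population drinks at a level
# that doubles the risk of dying from some disease.
paf = attributable_fraction(prevalence=0.30, relative_risk=2.0)
print(round(paf, 3))  # 0.231 -- about 23% of those deaths attributed
```

Note that both inputs would themselves have to be estimated separately for each exposure level, each disease, and each population, which is where the difficulty described above comes from.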
I cannot judge most of the summary numbers they report, but a few of them certainly gave me pause. More than a quarter of the deaths from oral and esophageal cancers were attributed to alcohol. But since those who calculate deaths attributable to tobacco (including at the WHO) attribute about three quarters of the deaths from those diseases to tobacco use, and it is clear that human papillomavirus (HPV) is responsible for a lot (half would be plausible, especially among non-geriatric cases), and many cases are not caused by any of these, there seems to be some disagreement that needs to be resolved. (A single death can have more than one of these as a cause, but the overcount seems rather too great.) I am not an expert on violence statistics, but the attribution of one third of all deaths by violence to alcohol seems rather unlikely given recent events in Iraq, Afghanistan, Sudan, Congo, Mexico, etc.
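The overcounting worry can be made concrete with back-of-the-envelope arithmetic on the rough fractions mentioned above (these are the approximate shares quoted in the text, not exact figures from any report):

```python
# Rough attributed shares of oral/esophageal cancer deaths:
alcohol = 0.27   # "more than a quarter"
tobacco = 0.75   # "about three quarters"
hpv     = 0.50   # "half would be plausible"

total = alcohol + tobacco + hpv
print(total)  # 1.52 -- the attributions sum to 152% of all such deaths
```

Summing past 100% is only defensible if most of these deaths have two or more of the causes acting together, and the text's point is that leaving no room at all for deaths caused by none of them strains credulity.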
Attribution of almost a quarter of road traffic deaths to alcohol probably sounds right to most people, based on what we have always been told. But a few years ago my colleagues and I got intrigued and researched the basis for such claims. We found plausible criticisms of those statistics that basically said that if someone died on the road and there was any alcohol involved anywhere (e.g., a sober driver blatantly violated the law and killed a tipsy pedestrian) it was counted as "alcohol related". While I cannot be sure these accusations (what else should they be called?) are accurate, we never found any rebuttal of them. This kind of exaggeration could also lead to a grossly exaggerated estimate of deaths from violence (child soldiers are often kept drunk, but that does not mean the alcohol killed them).
In any case, I am not likely to become sufficiently expert to sort out all the suspicious bits and inevitable errors in the analysis. No one could provide a critical review based on the tiny bit of information WHO provided; I could not find anything that addressed some of my questions. So it is always possible that the claim is right. But it is a bad epistemic strategy to observe someone making several claims that seem rather dubious, and which they do not even try to defend, and then assume that they are otherwise correct.
I recently pointed out, in a critique I was writing about a supposed expert review of a topic, that an "expert" on a topic cannot legitimately say "I only considered peer reviewed information" nor "the WHO said X, so X must be true". I pointed out that an expert on a topic is someone who can independently assess something that has been written, whether peer reviewed or not -- i.e., provide a genuine peer review of it -- so it matters little whether the material was peer reviewed by someone else (a major theme of my posts recently, which I hope to address again tomorrow). I might have added that, based on my experience, the qualifications for expertise also seem to include being good enough to figure out what the WHO got wrong.
I am not sure what they got wrong about alcohol, so I understand that only the rarest of health reporters could hope to sort out the specific claims. But you would think that it would not be so rare for someone to have sufficient expertise to know that 4% is an extraordinary claim, that it would be unimaginably difficult to make good enough estimates for most numbers needed to calculate the result to two significant figures of precision, and that the people in charge of estimating the toll from a particular health problem generally have a vested interest in exaggerating the numbers.
The problem is one of an addictive cycle and co-dependence. Not the alcohol problem, the reporting problem. Researcher-pundits like those at the WHO wow governments and others, who read their precise estimates and long reports that include everything except the basis for the calculations, and believe that the numbers must come from some magically correct process. Governments eat up those numbers because they need something, and would rather have fake certainty than have to deal with difficult uncertainty. Reporters too. And since governments and reporters would rather accept the claims than challenge them, those producing the claims have their behavior reinforced. As for other researchers, even those inclined to be more honest discover that if they do not play the same game, what they say will be overshadowed by the indefensible claims of those who do. I would not be surprised if someone, somewhere, has a better estimate of mortality attributable to alcohol than the WHO does, but unless you go looking for it yourself, I suspect you will never hear it.