03 June 2011

Unhealthful News 154 - Three random thoughts on bad epidemiology

(1) It was reported today that the indoor smoking ban in Britain may have motivated an uptick in prescriptions for pharmaceutical nicotine products during the period leading up to it, though the rate returned to baseline by about the time the ban took effect.  One possible explanation is that smokers were trying out the pharma products in advance to see whether they might be an adequate substitute for cigarettes for the times they could not get outside to smoke.  The reporter and researchers seem to have overlooked that possibility, and attributed all the purchases to quit attempts.

The story did, however, include the sensible suggestion that perhaps those who attempted to quit were just those who were on the verge of quitting anyway, given a little extra push by the impending ban.  The implication of this is a recognition (by a hardcore tobacco control person, btw) that active anti-smoking (or anti-tobacco or anti-nicotine) interventions may well produce a large "harvest effect".  That term refers to a cause that is technically the proximate cause of someone's death, but that only kills people who are already quite sick or old, and so should not be treated like other causes.  Peak air pollution events are a good example of this.

If anti-smoking interventions mostly just harvest those who were going to quit soon, they are not working nearly as well as they appear.  It is quite remarkable to see a hint that someone understands that most epidemiology of smoking cessation is junk science.  Among the multiple reasons for this is that the research does nothing to measure whether an intervention really changes people's preferences and behavior rather than just harvesting.  (Note that I am not suggesting it is ethical to take actions to make people change their preferences and behavior, especially without their consent.  I am just saying that the cessation researchers consistently claim that this is what they have done, and they are quite likely wrong.)

I am currently having a discussion in which I am trying to convince a friend that interventions like telephone "quit lines" are not nearly as effective as is claimed because they are mostly just selecting for the almost-quitters and nudging them along.  I actually had a thought about this when I was doing university research:  I could send out a few of my students and researchers – perhaps the best looking ones – to walk up to smokers and say "we are working on encouraging cessation, and it would really make me happy if you would quit; here is my card, and can I get your contact information and check back with you in a month to see if you have quit".  My hypothesis was that this would be about as effective a cessation intervention as any other, because most of the effect of most tools is just to give those who are seriously considering quitting soon a reason to quit now rather than getting around to it eventually.
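The selection problem can be made concrete with a toy simulation.  All the numbers below are invented for illustration: a "quit line" with zero causal effect, which simply enrolls almost-quitters at a much higher rate than everyone else, will still show a quit rate far above the background rate.

```python
import random

random.seed(0)

# Hypothetical numbers, purely for illustration.
N = 100_000
P_ALMOST = 0.05        # fraction of smokers already on the verge of quitting
QUIT_IF_ALMOST = 0.60  # chance an almost-quitter quits this month anyway
QUIT_OTHERWISE = 0.02  # background monthly quit rate for everyone else

smokers = [random.random() < P_ALMOST for _ in range(N)]

# A quit line with ZERO causal effect that disproportionately
# enrolls almost-quitters (they are the ones motivated to call).
def enrolls(is_almost):
    return random.random() < (0.40 if is_almost else 0.01)

def quits(is_almost):
    # Quit probability does not depend on enrollment at all.
    return random.random() < (QUIT_IF_ALMOST if is_almost else QUIT_OTHERWISE)

enrolled = [quits(a) for a in smokers if enrolls(a)]
others = [quits(a) for a in smokers if not enrolls(a)]

rate_enrolled = sum(enrolled) / len(enrolled)
rate_others = sum(others) / len(others)

print(f"quit rate among quit-line users: {rate_enrolled:.1%}")
print(f"quit rate among everyone else:  {rate_others:.1%}")
# The large gap is pure selection: the program changed no one's behavior.
```

With these made-up parameters the enrolled group quits at several times the rate of everyone else, even though enrollment has no effect on anyone.  That is the naive comparison cessation programs typically report.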

(2) A story today in the New York Times about teenagers getting into bicycle accidents in the San Francisco area was one of the most absurd examples of leaving out the denominator and similar errors in anything I have seen recently.  The headline is:
In Bay Area, Youngsters Are More Prone to Bicycle Accidents
The evidence?
The analysis shows that in the Bay Area cyclists ages 10 to 19 were involved in more traffic collisions — more than 3,200 from 2005 to 2009 — than any other age group.
That would be a lot more interesting if (a) we knew what the age groups were and (b) we knew how many cyclists there were in each.  The best they offer is:
In a region filled with thousands of adult cyclists, including daredevils who barrel through congested cities at high speeds, data showing that youngsters are most prone to accidents surprised even bicycle advocates. 
That suggests the denominator is larger for the older groups, so the rate is higher for the kids.  But it does not actually say that.  Can we get some numbers?
According to the data, San Jose had 434 collisions involving teenagers, the most of any Bay Area city. Oakland was second with 193.
Yes, those are numbers, but no, that is not useful at all.  Can we get some numbers about how many riders there are?
“I would have thought it would be males in their 20s” who would have the highest accident rates, said Renee Rivera, head of the East Bay Bike Coalition. “Anecdotally, I see mostly young adults cycling.”
I will take that for a "no".  So you really do not know whether the kids are suffering disproportionately, do you?  Perhaps it is just that the older cyclists spend their time riding in places that have few kids, like near campuses and business districts rather than residential neighborhoods.  So, can the reporter tell us anything about older riders?
In fact, cyclists in their 20s had the second-most collisions with motorists — about 3,100 from 2005 to 2009.
Well, at least we have a hint (not an actual statement) that the age groups are each a decade of ages.  At least that is something.  But we also now know that the headline rested on the kids being ahead by a mere 100 collisions, far lower than the measurement error we would expect for a statistic like this.  The article goes on for another web page and a half, but no further numbers are forthcoming.
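To see how much the missing denominator matters, take the article's two counts and attach some invented rider totals.  The rider numbers below are made up for illustration only; depending on which made-up denominator you pick, the same counts support opposite headlines.

```python
# Collision counts from the article (2005-2009); rider totals are INVENTED
# to show why "more prone" cannot be judged from raw counts alone.
collisions = {"10-19": 3200, "20-29": 3100}

# Scenario A: far fewer teen riders  -> teens really are more accident-prone.
# Scenario B: far more teen riders   -> teens are actually safer per rider.
scenarios = {
    "A": {"10-19": 40_000, "20-29": 120_000},
    "B": {"10-19": 120_000, "20-29": 40_000},
}

for label, riders in scenarios.items():
    rates = {g: collisions[g] / riders[g] for g in collisions}
    print(f"scenario {label}: " +
          ", ".join(f"ages {g}: {r:.3f} collisions per rider"
                    for g, r in rates.items()))
```

Same numerators, opposite conclusions.  Until someone counts the riders in each age group, the headline is unsupported.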

So is this a problem we should be worried about?  Yes:  Innumeracy among health reporters is a serious problem that demonstrates a "need for early education", as reported in the article.  Well, that was actually reported about the need for childhood education about traffic laws, but I think I have better evidence on my side.

(3) I have been keeping this one on file since I just had to say something about it.  A little over a week ago it was reported,
Italian government officials have accused the country's top seismologist of manslaughter, after failing to predict a natural disaster that struck Italy in 2009, a massive devastating earthquake that killed 308 people. ….  Enzo Boschi, the president of Italy's National Institute of Geophysics and Volcanology (INGV), will face trial along with six other scientists and technicians, after failing to predict the future and the impending disaster.
You might be thinking this was a joke – I was not sure I should believe it when I read it – but here is an independent report from another news source.

It is not entirely clear what to say about this in Unhealthful News other than, perhaps, to note how good a job the reporters did in going on to explain why this was scientifically absurd:  No one is capable of predicting earthquakes with enough certainty to offer timely warnings.  If you want "sometime in the next twenty years, you might not want to be standing near a large window in San Francisco" they might be able to help, but acting on such information is not all that practical.

But it is even worse because these guys are not going on trial for reporting something that was wrong (even though no one else could have done any better), but rather merely for not reporting the right information.  That is, they did not issue an incorrect report or warning; they merely issued no warning at all.  Even if they had thought that the earthquake was somewhat likely to occur that week and failed to report it, they would have, at worst, been uncaring people who let down their countrymen.  But failing to be a Good Samaritan is not criminal, so long as you did not cause the problem yourself, unless you are some kind of official acting under a duty to provide real-time service.  Italy is a bit strange, but I cannot imagine that the geophysics people there swear an oath to protect their country against all enemies, foreign and tectonic.  Indeed, you can see why they would not want to issue a warning even if they were 10% sure (which would be very impressive), since it would cause a costly panic they would be condemned for when they were wrong.

Yet as troubling as this is, it would be just so cool if someone would indict health scientists for getting things wrong in a way that killed a few hundred people.  I would not suggest it for honest mistakes or impossible demands like the seismologists faced – we could just restrict it to those who were overtly dishonest in their analysis, or at least grossly negligent with regard to modern methods.  Better still, we could indict health reporters on the same grounds.

Yes, I realize there would be some unforeseen consequences.  But one of them might be a positive: we would probably have to end the War on Drugs to free up enough prison space.
