You know the cliché: "if you spent less time criticizing others and more time just [doing X] yourself, then [something better would happen]"? Well, in epidemiology and health news reporting, the opposite is true. If epidemiology researchers spent even 10% as much time critically analyzing and commenting on what is churned out in the field as they spend churning out more of it, the field would not be dominated by junk science. As for health reporters, they generally seem to have missed the day of class in journalism school when it was explained that reporters should not just believe everything someone tells them.
The New York Times headline read, "Low-Salt Diet Ineffective, Study Finds. Disagreement Abounds." WebMD headlined "Study Shows Salty Diet Good; Heart Group Disagrees". Another was more direct about who was leading this supposed parade of disagreement: "Study: Low Salt Diet Not Helpful, CDC Disagrees".
To avoid overstating this rare burst of skepticism in the health press, it should be noted that rather more common were headlines like "Low-salt diet may be risky" (ABC), "Sodium won't kill you? Scientists shake up what we know about salt" (CBS), and "New study: Low-salt diet kills" (Canada Free Press). Meanwhile, some others offered balance in their phrasing, though the quality varied: "New Study Suggests Salt May Not Be as Bad as Once Thought" (Fox) did not really capture the "it may be bad for some people" message that was picked up, albeit perhaps too subtly, by the Boston Globe: "Study questions value of salt reduction in healthy people".
Most of the articles, whatever the headline, reported the background conventional wisdom that salt is bad for CVD risk, the basic results of the new study, and, in varying detail, the various American initiatives to try to force people to eat less salt. Many of them then chose to emphasize reasons why we should not over-interpret the new result.
Unfortunately, as you may have guessed by now, this is not a good-news story about skeptical reporting or measured scientific assessment of the study. Rather, the reporters just dutifully reported that public health authorities – emphasis on the authority part of that, not the health part – launched an attack on the new study. Employees of the U.S. CDC raced to the press to denounce the study. They were clearly motivated by their aggressive support for anti-salt policy interventions, as were pretty much all of the other critical commentators who were quoted. Yes, of course, there were limits to the value of the study, including possible measurement error and the choice of a healthy study population that might miss those at highest risk. But it was clear that a desire to doubt the result motivated the identification of weaknesses, not the other way around.
So even when there appears to be critical analysis in epidemiology and the health news, it is almost worse than the usual absence of critical thought: it amounts to acting as a hired gun in defense of a threatened position.
When working in litigation or other adversarial settings, it is typical for one side to dislike what some or all of the scientific evidence says and to try to poke holes in the studies by hyping their inevitable limitations. That is not a proper role for a testifying science expert, who is brought in to assess the overall weight of the evidence and is supposed to provide an honest assessment. I realize that not all of us adhere to those standards – many are just hired guns who will say whatever their clients want them to – but an honest assessment is what we are supposed to provide. However, when the overall weight of the science does not support a particular side, it is sometimes a central part of the legal strategy to attack inconvenient studies individually, pointing out flaws and suggesting that those flaws render the results completely uninformative. In case the comparison is not obvious, the anti-salt campaigners are behaving like litigators who do not have the facts on their side.
Of course, the evidence as a whole might support the anti-salt efforts, but honest public policy is based on analyzing all the evidence and making a balanced assessment. Any public official who has the urge to attack a new study rather than say "we need to incorporate this information into our decision making, considering its value and limits" should arouse distrust.
Equally sadly, the press reports that covered the controversy seemed mostly just to create confusion about the practical implications of the evidence. Credit to the Reuters report, which emphasized the sensible words of the new study's author:
It's clear that one should be very careful in advocating generalized reduction in sodium intake in the population at large. There might be some benefits, but there might also be some adverse effects.

The NYT report, by contrast, descended into a bizarro-world discussion of how we should want randomized trials, which (a) would not address the problems discussed about the new study (no one suggested confounding was its major limitation) and (b) would not be possible. And (b) is not for the reason that the various quoted commentators suggested: that people would not volunteer to be assigned to a lifetime of a particular diet. It is actually more fundamental even than that: if you assigned a cohort of people to many decades of controlled-salt diets, then when you got to the end of the study you would have... well, mostly you would have to hope that some sort of time travel had been invented by then, so you could send the results back to when they were relevant. You see, the effect of an exposure usually varies across relevantly-different populations, and the population that will exist fifty years from now will differ a lot from today's population – in terms of dietary choices other than salt and medicines in particular, but also a host of other environmental differences. So the study would have been about a population that no longer exists. It would not be useless – no study that bears on a question is useless, including the new one – but it would have major limitations, like every study that addresses this question, including the new one.
Anyone who understands epidemiology would know that. It is just such a shame that no one who really understands epidemiology has a phone or email address that would allow them to be contacted by reporters.