One somewhat painful explanation is that saving newborns from immediate death by getting them breathing, apparently the main benefit of the training, is not the most beneficial health intervention for extremely poor African societies, which are more worried about losing their productive adults or almost-productive older children. Still, Westerners seem particularly interested in saving the poor children, at least judging from the pathetic ads from high-overhead charities that beg for donations on basic cable. Moreover, some of the training (helping with breastfeeding, diagnosing diseases that are not immediately fatal) both saves lives and improves the health and welfare of the next generation.
So what has international public health funding gone to that was considered a better use of the money than this? More pointedly, I wonder how much more money was spent in Zambia over the last ten years on anti-tobacco efforts than on coming up with the 20K it took to train the midwives.
Finally, I cannot help but say: come on, people, understand your numbers a little bit (I refer primarily to the research report, though the news story blindly quoted the obviously over-precise claims). Exactly 97 babies saved, not 100? Random sampling error alone (i.e., the confidence interval for the risk reduction) makes clear the estimate is not accurate beyond +/-10 or even 20, and that ignores other factors that might have affected the death rate over time (the researchers compared before- and after-training statistics). Even worse, exactly $208 per baby saved, not $200? That figure includes any error in estimating the number saved, as well as uncertainty about how much was spent. Someone knows what the grant was, but exchange rates, in-kind contributions of local resources, and the like make this number imperfect too.
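To make that concrete, here is a back-of-the-envelope sketch. The before and after death counts below are made up purely for illustration (I am not reproducing the study's actual denominators); the only assumption is that counts of deaths behave roughly like Poisson counts, which is the usual approximation for this kind of before/after comparison.

```python
import math

# Illustrative numbers only: suppose the before/after comparison saw
# roughly 250 newborn deaths before the training and 153 after, for an
# estimated 97 babies saved. (These counts are invented; only the
# difference matches the reported figure.)
deaths_before = 250
deaths_after = 153
saved = deaths_before - deaths_after

# Treating each count as Poisson, the standard error of the difference
# is sqrt(deaths_before + deaths_after).
se = math.sqrt(deaths_before + deaths_after)
ci_low, ci_high = saved - 1.96 * se, saved + 1.96 * se

print(f"estimated babies saved: {saved}")
print(f"standard error from sampling alone: about {se:.0f}")
print(f"95% CI: {ci_low:.0f} to {ci_high:.0f}")
# -> roughly 58 to 136 babies; the trailing "7" in "97" is pure noise.
```

Plug in any plausible counts you like; the sampling error swamps the last digit every time, which is the whole point.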
And over-the-top absurd was this (quoting from the abstract):
"The intervention costs were $208 per life saved and $5.24 per disability-adjusted life-year averted."

I am not just criticizing the apparent goal of averting life years. If the editors of Pediatrics (yes, it was in Pediatrics) cannot even figure out what is utter junk science, how can we expect them to make sure the wording in their abstracts is correct? (Just for the record, I did not know this was from Pediatrics when I decided to write about it today. I search the news for prominent interesting stories, not bad journals for easy targets.)
But let's assume that the life years were indeed saved, or the loss of life years averted. The $5.24 assumes that the cost per baby saved is correct down to the last dollar, and that we know how many healthy years of life a Zambian born now can expect. It would be optimistic to think that could be guessed within 20%. No, that is way too optimistic; a factor of 2 seems more like it. That is, if the "5" part of $5.24 is correct, it is due to blind luck; the "2" part is silly; the "4" part is innumeracy.
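Here is what even a modest amount of that uncertainty does to the $5.24 figure. The ranges are my own illustrative assumptions (plus or minus 20% on the cost, a factor of 2 on the life years), not anything reported in the study; the only number taken from the abstract is the implied 208 / 5.24, or about 40 DALYs averted per baby saved.

```python
# Rough error propagation for the $5.24-per-DALY figure. The abstract's
# own numbers imply about 208 / 5.24 = ~40 DALYs averted per baby saved.
# The uncertainty ranges below are illustrative guesses, not study data.
cost_per_life = 208.0            # dollars per baby saved (itself uncertain)
dalys_per_life = 208.0 / 5.24    # ~39.7, implied by the abstract

cost_low, cost_high = cost_per_life * 0.8, cost_per_life * 1.2   # +/- 20%
dalys_low, dalys_high = dalys_per_life / 2, dalys_per_life * 2   # factor of 2

best_case = cost_low / dalys_high     # cheapest cost, most life years
worst_case = cost_high / dalys_low    # priciest cost, fewest life years
print(f"cost per DALY could plausibly be anywhere from "
      f"${best_case:.2f} to ${worst_case:.2f}")
# -> roughly $2 to $13 per DALY; quoting $5.24 to the penny is false precision.
```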
Moreover, this is just a dumb measure for this intervention, and the authors, editors, and reviewers apparently do not understand why. Statistics like quality-adjusted life years saved are designed to make sense of interventions where simply adding up "lives saved" (i.e., deaths delayed) is not meaningful. Examples of such situations are when the intervention improves quality of life but not longevity, when deaths are only delayed briefly, or when we want to compare two interventions that avoid deaths at very different ages. Saving a baby from dying immediately is completely unambiguous in its implications, and converting to life years (let alone quality- or disability-adjusted life years) only makes the statistic less meaningful. This is not just because of the impossibility of estimating future life years, but also because the implications of saving a baby (the cost of raising a child, the effect on future childbearing, etc.) are completely different from those in the situations where the statistic is useful (comparing additional years of adulthood). The point of the measure is that "saving a life" means something very different when someone is disabled and will die a year later than when someone is healthy and will live decades more if saved now. But "life years" are at least that incomparable when we compare all of a newborn's life to the same number of years added to the lives of a half dozen middle-aged people with families to support; those are obviously not the same. So using the conversion in this case completely defeats its purpose, changing a simple, meaningful statistic into one that is hopelessly misleading.
So, the final score is: good news for Zambian newborns; very bad news for international aid priorities; further evidence that public health researchers are not good with numbers; and a suggestion that the editors of Pediatrics might actually view the loss of quality-adjusted life years as a goal, which would explain a lot about the policy recommendations they make.