12 June 2011

Unhealthful News 163 - Bad McKinsey study about health financing: it stooped to the standards of health science

Last week, a report from the business consultancy McKinsey claimed that their survey found that 30% of American employers planned to eliminate health insurance for their employees when the "Obamacare" plan (the lame watered-down plan that will sort of provide a way to get health insurance for some people, though probably still leaving people like me screwed) finally takes effect.  This claim, a number much higher than other research had suggested, created a minor flurry of news and then criticism.

To understand the importance of this, it helps to know that in the US there is an obsession with government action crowding out private sector provision of goods, coupled with the notion that such crowding out is universally bad.  It is true that if government provides something at low or no charge it can take business away from beneficial private markets.  Even if the market is providing a better product at a lower social cost, if the government starts giving it away people might abandon the private alternative.  But crowding out can be beneficial, since for many goods, like medical care financing, government can do a better job than the private sector.  Frankly, it would be great if employer-provided health insurance, which is incredibly inefficient (it costs more than government provision would and cripples many businesses with a huge competitive disadvantage), were entirely wiped out by a government alternative. 

But the obsessive worry about crowding out is so crazy that there are those on the political right who argue that if only the government would just stop providing highways and schools, the private sector could do a better job.  So the press in the US, which skews to the right, picked up on the McKinsey study and touted it as evidence that medical financing reform was bad.

It quickly became apparent that McKinsey could not back up their claim.  Greg Sargent of the Washington Post posted an early summary:
But as a number of critics were quick to point out, McKinsey’s finding is at odds with many other studies — and the company did not release key portions of the study’s methodology, making it impossible to evaluate the study’s validity.
There’s now been a new twist in this story.  I’m told that the White House, as well as top Democrats on key House and Senate committees, have privately contacted McKinsey to ask for details on the study’s methodology. According to an Obama administration official and a source on the House Ways and Means Committee, the company refused.
Based on that, Krugman commented:
One has to assume that there was something terribly wrong with the study. At any rate, nobody should be citing it until or unless McKinsey comes clean.  Oh, and if you ask me, this is a lot more important than some sex scandal.
Krugman went on to note Brian Beutler's post:
But multiple sources both within and outside the firm tell TPM the survey was not conducted using McKinsey’s typical, meticulous methodology. …. And that’s created a clamor within the firm at high levels to set the record straight.  “This particular survey wasn’t designed in a way that would allow it to be peer review published or cited academically,” said one source familiar with the controversy.  ….  Reached for comment today, a McKinsey spokesperson once again declined to release the survey materials….
At The Incidental Economist blog (which I recently picked on in my not-yet-finished comments about the cost of smoking, but will quote positively here), one author called it "Dangerous faux research" and wrote:
 Look, anybody can say what they like on a topic. They can put out a glossy report. They can claim they did a “survey” to make it sound scientifically rigorous. They can talk to the media all about it. They can stand behind their good name and reputation, if they have one. But when what they’re saying runs counter to previous experience and other credible estimates, they’d better have a good explanation.  But, McKinsey has no explanation. None. They’re stonewalling. 
….You know what would happen to me if I tried that? Suppose I sent my new results to a journal, results that were very different from that of others, and said, “Trust me. They’re good.” Well, my paper would be laughed out of the editorial office.  And that’s as it should be. That would not be research. That would be the opposite of research. That would be indistinguishable from making things up.
The primary reason I am writing about this is not to point out a bit of bad research about health economics, but to contrast the reaction with the typical reaction to research in health science itself, which often produces results that are equally unexpected, consistently fails to report key bits of the methodology, and is written by authors who are rarely honest enough to respond to requests to fill in what is missing.  You are familiar with the reaction to that:  nothing. 

We generally have about as much idea about what produced health science research results as we do about what McKinsey did.  And though I completely agree with that last quote, I would note that there is plenty of junk science published in health economics journals, that lots of study inputs come from black-box sources, and that a study with mystery methodology is not exactly the same as making things up, though we cannot know for sure.  It is certainly true that anyone can put out a glossy report, trade on their reputation, and make a bad survey sound like science – that pretty much sounds like health science to me.  The only thing missing in health science is the apparent consternation of an institution concerned with its reputation that is evident in the previous quote; I can only think of a handful of such cases ever.

In fairness, the missing information from the McKinsey study is not as subtle as the problems with most health research.  McKinsey omitted extremely basic information, given that their study was basically the answer to a single yes-or-no question.  For a list, see this story in Time (which another Incidental Economist blogger quite amusingly described with:  "Kate Pickert committed an actual act of journalism, and tried to get McKinsey to give her the necessary information. So far, they have refused. Her whole piece is worth reposting, so just go read it now.").

Still, health science reports are sometimes missing this same information, and they are almost always missing equally critical information.  Any expert can recognize this, but the reaction is pretty much nothing:  reporters just report the new bit of "truth".  Subject-matter expert bloggers might point out that the result is odd, but they ignore the failure to report the methods.  As for the government demanding more information, yeah right.  The government accepts black-box health analyses as readily as the press does and, indeed, produces more of them than anyone else.  It seems the only time they bother to probe is when the study result has no practical implications but might affect their political bickering.

A more subtle point is just as damning.  Notice that the fight over the McKinsey results stems from the fact that their number is substantially larger than previous estimates.  When health research is reported, this is generally not even noticed.  The review of previous studies in a new research report, let alone in the press reports about it, almost never distinguishes between big and small effects.  The common statement, "this result is consistent with previous studies that found an elevated risk…", might mean that the other studies estimated a completely inconsistent level of risk (much lower or much higher); only the fact that it is also elevated is considered.  By the standards of health science, the McKinsey report would not have been controversial; it would have been "consistent with" previous studies showing that some employers will cut coverage.

It is pretty clear that the current fight is being carried out by pundits interested in how we pay for health, who have no clue that the problems with the McKinsey study are pretty much de rigueur for the studies used to decide what to do with that money.

2 comments:

  1. Reading that post made me think that health news is like daytime television for the 9-5'ers. It's become fluff, entertainment, something to talk about at the water cooler the next day. People get wide-eyed over new horrors reportedly produced by alcohol, they get excited when some good news comes out about drinking coffee, and some of them (usually with less to do than the others) really get into it and start fan sites - er, lobby groups to end whatever the latest and trendiest scourge on our health is reported to be. Just like those who watch the daytime soaps, they perhaps don't want to be told that their stories aren't real; they just want to have something entertaining to think about during the day. Too bad that attitude helps perpetuate totally uncritical thinking about things that affect real lives.

  2. Catherine,
    I agree that health news, like most news, is fluff for most people. The Weiner scandal is what passes for following lawmaking for most people. Even the New York Times labels that "ethics", not bothering to think any harder about real ethics. I might need a bit more convincing that disease groups are like groupie groups, though I am not saying I disagree.

    But your last point is definitely part of what motivates me. I fear that the Great Experiment of freedoms and government for the people is on a rather bad trend, and the cause is that most people do not bother to think critically. It makes them easy prey for the oligarchs. Yes, there are thousands of people writing blogs and books to try to address that directly, but that seems to be saturated and dead-end. But perhaps if people realized they could think critically about their health and retake control of that sector of their lives, they might learn how to not just be someone's sheep with regard to tax policy, etc. Not that I am suggesting that the health part alone is not a sufficient goal, but there might be more to it.

