If You Read The Lancet, The Terrorists Will Have Won

Here’s a leftover pic from my recent trip to Trinidad. Photographer Andrew Currie had left it out of his initial set because my face appears a bit blurry. I’m actually thankful for this, since the blurriness subtracts five years from my appearance!

The gentleman with me is, of course, the great Austin Clarke. It was one of the highlights of my young writing career, sharing the stage with Austin.

There is more debate to be had regarding the Lancet study, which estimates 650,000 dead as a result of the war in Iraq. I’m particularly riled up over a comment made by a visitor on Rondi’s blog, essentially that the Lancet is “siding with the Jihadis”. This is the sort of hypocritical, discourse-suppressing crap that makes my blood boil. Apparently, unless a science journal publishes data that supports the pro-war agenda, it’s “siding with the Jihadis”.

Here is a very readable article that summarizes the dimensions of the study well. It points out that the take-home message isn’t the exact number of estimated excess deaths, but the magnitude of the estimate; it far exceeds Bush’s “off the cuff” estimate of 30,000.
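
To make that point about magnitude concrete, here is a back-of-the-envelope sketch (my own illustration, not the study’s calculation) of how an excess-deaths figure arises from mortality rates. The rates, population, and time span are assumed round numbers, roughly in the neighbourhood of what the study reports:

    # Back-of-the-envelope sketch, not the study's actual computation.
    # Excess deaths = (post-invasion rate - pre-invasion rate) x population x years.
    # All inputs below are assumed, round figures for illustration.

    pre_rate = 5.5 / 1000       # assumed pre-invasion deaths per person per year
    post_rate = 13.0 / 1000     # assumed post-invasion deaths per person per year
    population = 26_000_000     # assumed rough population of Iraq
    years = 40 / 12             # roughly March 2003 to mid-2006

    excess_per_year = (post_rate - pre_rate) * population
    excess_total = excess_per_year * years

    print(f"Excess deaths per year: {excess_per_year:,.0f}")  # about 195,000
    print(f"Excess deaths in total: {excess_total:,.0f}")     # about 650,000

The arithmetic is the easy part; the hard part, and the reason for the expert quotes below, is estimating those two mortality rates in a war zone. Note too that even if the gap between the rates were several times smaller, the total would still dwarf the 30,000 figure.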

Someone (I think it was D-Mack or Mischa) asked whether the study’s methodology is a commonly used one. In response, here are some quotes from people who should know. I got them from this source.

“This is the most practical and appropriate methodology for sampling that we have in humanitarian conflict zones.” –Richard Brennan, head of health programmes, International Rescue Committee

“The methodology used is consistent with survey methodology that has long been standard practice in estimating mortality in populations affected by war.” –Professor Mike Toole, Centre for International Health

“I don’t think there’s anyone who’s been involved in mortality research who thinks there’s a better way to do it in unsecured areas. I have never heard of any argument in this field that says there’s a better way to do it.” –Professor Richard Garfield, Columbia University

“The sampling is solid. The methodology is as good as it gets. It is what people in the statistics business do.” –John Zogby, Zogby International

“[Study design is] rigorous, [with] well-justified analysis of the data.” –Professor Frank Harrell Jr, Chair of Biostatistics, Vanderbilt University

“Given the conditions [in Iraq], it’s actually quite a remarkable effort. I can’t imagine them doing much more in a much more rigorous fashion.” –Steve Heeringa, Director of the Statistical Design Group at the Institute for Social Research at the University of Michigan

“[The study is] statistically reliable.” –Sir Richard Peto, Professor of Medical Statistics at the University of Oxford

“They have enhanced the precision this time around and it is the only scientifically based estimate that we have got where proper sampling has been done and where we get a proper measure of certainty about these results.” –Professor Sheila Bird of the Biostatistics Unit at the Medical Research Council
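
For readers wondering what “cluster sampling” means in practice, here is a toy simulation (my own sketch, not the study’s code) of the basic idea: pick a modest number of neighbourhoods at random, interview every household in each, and scale the observed mortality rate up to the population. The cluster and household counts echo the roughly 47 clusters of 40 households reported for the study; everything else is an assumption:

    import random

    # Toy cluster-sampling sketch, not the study's code.
    random.seed(42)
    TRUE_RATE = 0.013       # assumed "true" national deaths per person per year
    CLUSTERS = 47           # clusters visited (roughly the study's figure)
    HOUSEHOLDS = 40         # households interviewed per cluster
    HOUSEHOLD_SIZE = 7      # assumed average people per household

    deaths = people = 0
    for _ in range(CLUSTERS):
        # each cluster's local rate varies around the national rate
        local_rate = max(0.0, random.gauss(TRUE_RATE, 0.006))
        for _ in range(HOUSEHOLDS):
            people += HOUSEHOLD_SIZE
            deaths += sum(random.random() < local_rate
                          for _ in range(HOUSEHOLD_SIZE))

    estimate = deaths / people
    print(f"Estimated rate: {estimate * 1000:.1f} per 1,000 per year "
          f"(true rate used: {TRUE_RATE * 1000:.1f})")

Run it many times and the spread of the estimates is what the published confidence interval is meant to capture; clustering buys feasibility in a war zone at the cost of a wider interval than a simple random sample of the same size.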

Let us not forget that the authors of the study aren’t a bunch of freshly graduated Master’s students or freshman professors looking to build a career. Rather, they include world-class public health researchers from the Bloomberg School of Public Health at Johns Hopkins University. During my two years consulting to the NIH in Washington, DC, I was daily humbled by the brilliant epidemiologists who came out of that school.

None of this means the study is free of flaws. I reserve final judgement for when the full methodology becomes available. I just wanted to point out three things: that some very credible leading minds in the field approve of the methodology; that the authors are themselves leading public health researchers; and that the study was peer-reviewed before being allowed onto the pages of the Lancet.

Moreover, there are competing sources of bias at play:

  1. The much-discussed “main street bias”, which would overestimate deaths, if indeed this bias is in play;
  2. The convenience-sample bias that I mentioned yesterday, the one that requires researchers to stay within safe areas, which would drastically underestimate deaths;
  3. The fact that when an entire household or family is destroyed, there is no one left to report the deaths or produce death certificates for the researchers; this would underestimate deaths;
  4. The researchers’ inability to interview families that have fled Iraq, which could bias the results in either direction. If they fled because some had already died, then this would underestimate deaths. If they fled before any of them could be killed, then this would overestimate deaths.

Any fair and rational discussion of the study’s relevance and rigour must touch upon all of these biases, not just the ones that skew the results in a direction unfavourable to a preconceived agenda.

Slate’s Fred Kaplan, in an article highly critical of the study, summarizes the upshot this way:

“This point should be emphasized. Let’s say that the study is way off, off by a factor of 10 or five—in other words, that the right number isn’t 655,000 but something between 65,500 and 131,000. That is still a ghastly number—a number that, apart from all other considerations, renders this war a monumental mistake. Here’s the key question: Had it been known ahead of time that invading Iraq would result in the deaths of 100,000 Iraqis (or 50,000, or pick your own threshold number), would the president have made—would Congress have voted to authorize, would any editorial writer or public figure have endorsed—a decision to go to war?

“Here lies the danger of studies that overstate a war’s death toll. The war’s supporters and apologists latch on to the inevitable debunkings and proclaim that really ‘only 100,000’ or ‘only 200,000’ people have died. It’s obscene—it sullies and coarsens the political culture—to place the word ‘only’ in front of such numbers.”

The source article for the quotes above is definitely worth reading, as it clearly spells out the media’s tendency to focus on the minimal controversy while ignoring the study’s message. One telling quote comes from a BBC reporter who was asked why he quoted non-experts while ignoring true experts. He said: “I quoted those people because they are players.”

Does that make you as nauseous as it does me? I’m reminded of the so-called “controversy” surrounding Climate Change. There is no controversy. Pretty close to 100% of the global scientific community agrees that Climate Change is real. But the media, in an effort to be “fair and balanced”, gives equal time to industry mouthpieces who deny the science, giving rise to the illusion of disagreement among scientists.
