Cost-effectiveness

We seek charities that are "cost-effective" in the sense of changing lives as much as possible for as little money as possible. There are many limitations to cost-effectiveness estimates, and we do not assess charities only – or primarily – based on their cost-effectiveness. However, because different approaches to helping people can have extremely different costs – and because we want donors to have some sense of what they're "getting" for their donations – we include cost-effectiveness estimates for our top charities.

We feel that the following are important points to keep in mind when considering cost-effectiveness. (The rest of this page elaborates on these points.)

  • Charities frequently cite misleading and overly optimistic figures for cost-effectiveness.
  • Our cost-effectiveness estimates include all costs (including administrative costs) and generally look at the cost per life or life-year changed (death averted, year of blindness averted, etc.). However, there are many ways in which they do not account for all possible costs and benefits of a program.
  • Because of the many limitations of cost-effectiveness estimates, we give estimated cost-effectiveness only limited weight in recommending charities. Confidence in the organization's track record generally carries heavier weight when differences in estimated cost-effectiveness are not extremely large.
  • The impact we are most often able to estimate is the cost per life saved. There are many kinds of impact besides saving lives; we try to do our best to quantify the different ways in which a program may be improving lives, and to help donors decide (sometimes with the aid of formal frameworks such as the disability-adjusted life-year) how to compare these.

Charities frequently cite misleading cost-effectiveness figures

In The Life You Can Save, Peter Singer discusses the fact that many common claims about cost-effectiveness are misleading. We quote at length from the book, with his permission. (Note that our excerpt does not include footnotes, which are in the original.)1

"Organizations often put out figures suggesting that lives can be saved for very small amounts of money. WHO, for example, estimates that many of the 3 million people who die annually from diarrhea or its complications can be saved by an extraordinarily simple recipe for oral rehydration therapy: a large pinch of salt and a fistful of sugar dissolved in a jug of clean water. This lifesaving remedy can be assembled for a few cents, if only people know about it. UNICEF estimates that the hundreds of thousands of children who still die of measles each year could be saved by a vaccine costing less than $1 a dose. And Nothing But Nets, an organization conceived by American sportswriter Rick Reilly and supported by the National Basketball Association, provides anti-mosquito bed nets to protect children in Africa from malaria, which kills a million children a year. In its literature, Nothing But Nets mentions that a $10 net can save a life: "If you give $100 to Nothing But Nets, you've saved ten lives."

"If we could accept these figures, GiveWell's job wouldn't be so hard. All we would have to do to know which organization can save lives in Africa at the lowest cost would be to pick the lowest figure. But while these low figures are undoubtedly an important part of the charities' efforts to attract donors, they are, unfortunately, not an accurate measure of the true cost of saving a life.

"Take bed nets as an example. They will, if used properly, prevent people from being bitten by mosquitoes while they sleep, and therefore will reduce the risk of malaria. But not every net saves a life: Most children who receive a net would have survived without it. Jeffrey Sachs, attempting to measure the effect of nets more accurately, took this into account, and estimated that for every one hundred nets delivered, one child's life will be saved every year (Sachs estimated that on average a net lasts five years). If that is correct, then at $10 per net delivered, $1,000 will save one child a year for five years, so the cost is $200 per life saved (this doesn't consider the prevention of dozens of debilitating but nonfatal cases). But even if we assume that these figures are correct, there is a gap in them – they give us the cost of delivering a bed net, and we know how many bed nets "in use" will save a life, but we don't know how many of the bed nets that are delivered are actually used. And so the $200 figure is not fully reliable, and that makes it hard to measure whether providing bed nets is a better or worse use of our donations than other lifesaving measures.

"[GiveWell] found similar gaps in the information on the effect of immunizing children against measles. Not every child immunized would have come down with the disease, and most who do get it, recover, so to find the cost per life saved, we must multiply the cost of the vaccine by the number of children to whom it needs to be given in order to reach a child who would have died without it. And oral rehydration treatment for diarrhea may cost only a few cents, but it costs money to get it to each home and village so that it will be available when a child needs it, and to educate families in how to use it."
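Singer's bed-net arithmetic can be made explicit. The sketch below (purely illustrative) reproduces the figures from the excerpt ($10 per net delivered, one life saved per year per 100 nets in use, a five-year net lifespan) and adds a hypothetical usage rate, which the excerpt notes is unknown, to show how the gap between nets delivered and nets actually used moves the cost per life saved.

```python
def cost_per_life_saved(cost_per_net, nets_per_life_per_year,
                        net_lifespan_years, usage_rate=1.0):
    """Cost per life saved from Singer/Sachs-style bed-net figures.

    usage_rate is a hypothetical fraction of delivered nets actually
    used; the excerpt above does not estimate it.
    """
    # Lives saved over a net's lifetime, per net delivered:
    # (1 life / nets_per_life_per_year nets) * lifespan * usage rate
    lives_per_net = (net_lifespan_years / nets_per_life_per_year) * usage_rate
    return cost_per_net / lives_per_net

# Singer's figures: $10/net, 100 nets in use save one life per year, 5-year lifespan
print(cost_per_life_saved(10, 100, 5))                  # 200.0 -> the $200 figure
print(cost_per_life_saved(10, 100, 5, usage_rate=0.5))  # 400.0 if only half are used
```

Note how a single unmeasured assumption (the usage rate) doubles the estimate; this is the kind of gap that makes the headline $200 figure "not fully reliable."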

The cost-effectiveness figures we use

Early in our history, we relied largely on cost-effectiveness estimates provided by the Disease Control Priorities in Developing Countries report (DCP2).2 In 2011, we did a deep-dive investigation into one of these estimates and found major errors that caused the published figure to be off by ~100x.3 We have therefore changed our approach to cost-effectiveness: we now perform our own cost-effectiveness estimates for any charity that is a serious contender to become one of our top charities. The DCP2 estimates have still shaped some of our strategic priorities in deciding which areas to explore (for example, our list of priority programs), but we do not rely on them for final recommendation decisions. Instead, whenever we are estimating the cost-effectiveness of a contender for a top charity spot, we perform our own analysis and publish the full details online.

Below we list some strengths of our estimates as compared to commonly cited figures, and then some weaknesses of our estimates that explain why we place only limited weight on their conclusions.

Strengths compared to commonly cited figures:

As discussed above, many commonly cited figures are misleading because they (a) account for only a portion of costs (for example, citing the cost of oral rehydration treatment but not the cost of delivering it); and/or (b) cite "cost per item delivered" figures as opposed to "cost per life changed" figures (for example, equating bed nets delivered with deaths averted, even though there are likely many bed nets delivered for each death averted).

The cost-effectiveness estimates we use avoid these problems:

  • Our estimates are given in terms of life impact, and specify what sort of life impact can be expected. We generally make at least some attempt to convert impact into units of disability-adjusted life-years (DALYs), a common metric in public health, though we do not always find these units helpful or make them a key input into our recommendations.
  • Estimates include all direct costs associated with interventions, from all involved funders. Thus, planning costs, management costs, distribution costs, etc. are accounted for.
  • Estimates are generally based on actual costs and actual impact from past projects, to the extent this is possible; when we make projections, we attempt to gather all the information we can to inform such projections.
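To illustrate the DALY conversion mentioned in the first bullet, here is a minimal sketch of how different kinds of life impact might be combined into a single cost-per-DALY figure. The life-expectancy and disability-weight values below are illustrative placeholders, not GiveWell's or WHO's actual figures.

```python
# Hypothetical conversion factors (illustrative only):
YEARS_PER_DEATH_AVERTED = 30        # assumed remaining life expectancy of a beneficiary
BLINDNESS_DISABILITY_WEIGHT = 0.2   # assumed DALYs per year of blindness averted

def dalys_averted(deaths_averted, years_of_blindness_averted):
    """Combine two kinds of impact into disability-adjusted life-years."""
    return (deaths_averted * YEARS_PER_DEATH_AVERTED
            + years_of_blindness_averted * BLINDNESS_DISABILITY_WEIGHT)

def cost_per_daly(total_cost, deaths_averted, years_of_blindness_averted):
    """All-in program cost divided by total DALYs averted."""
    return total_cost / dalys_averted(deaths_averted, years_of_blindness_averted)

# A hypothetical $100,000 program averting 2 deaths and 500 years of blindness
print(cost_per_daly(100_000, 2, 500))  # 625.0 dollars per DALY
```

The point of the exercise is not the particular numbers but the structure: every kind of impact must be converted into a common unit before programs can be compared, and each conversion factor is itself an assumption.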

Limitations

The estimates we use do not capture all considerations for cost-effectiveness. In particular:

  • We generally draw effectiveness estimates from studies, and we would guess that studies often involve particularly well-executed programs in particularly suitable locations. While we make efforts to assess how representative studies are of average conditions, our ability to do so is often limited.
  • Estimates consider only direct impact, and do not attempt to incorporate the many ways in which a program may affect people beyond (or as a result of) its immediate impact on health/income. This issue is discussed specifically as it relates to GiveWell in a paper by Leif Wenar.4
  • Estimates are generally based on extremely limited information and are therefore extremely rough.

Because of the many limitations of cost-effectiveness estimates, we give estimated cost-effectiveness only limited weight in recommending charities (consistent with the appropriate approach to Bayesian adjustments to such estimates). Confidence in an organization's track record often carries heavier weight when differences in estimated cost-effectiveness are not extremely large.

The weight we feel it is appropriate to place on cost-effectiveness estimates has fallen over time. As discussed above, we previously made substantial use of the Disease Control Priorities report, but our 2011 investigation and correction of its work led to general concerns about its approach. Our own cost-effectiveness estimates also seem to us to be fairly non-robust, i.e., sensitive to relatively small changes in assumptions, and our continuing work on refining them has led to an increasing sense of their non-robustness.

How cost-effective is cost-effective?

As stated above, we feel that common claims of donors' ability to save a life for a few dollars, or even a few cents, are generally overly optimistic. We list the most promising developing-world direct-aid programs we know of, with links to detailed write-ups including cost-effectiveness estimates, where available (note that most of these programs were identified using sources that focus heavily on cost-effectiveness).

Based on our knowledge of these programs:

  • The impact we are most often able to estimate is the cost per life saved. There are many kinds of impact besides saving lives; we try to do our best to quantify the different ways in which a program may be improving lives, and to help donors decide (sometimes with the aid of formal frameworks such as the disability-adjusted life-year) how to compare these.
  • We consider anything under $5,000 per life saved (or equivalent, according to one's subjective values about how to compare other sorts of impacts to lives saved) to be excellent cost-effectiveness. We consider anything over $50,000 per life saved (or equivalent) to be excessive for the cause of international aid, as it implies more than an order of magnitude higher costs than the strongest programs.
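The thresholds in the last bullet can be written down directly. The cutoffs below come from the text; the classification function itself is just an illustration of how they partition the range of estimates.

```python
EXCELLENT_THRESHOLD = 5_000    # $/life saved (or equivalent), per the text above
EXCESSIVE_THRESHOLD = 50_000   # beyond this, over 10x the cost of the strongest programs

def classify(cost_per_life_saved):
    """Label an estimated cost per life saved using the thresholds above."""
    if cost_per_life_saved < EXCELLENT_THRESHOLD:
        return "excellent"
    if cost_per_life_saved > EXCESSIVE_THRESHOLD:
        return "excessive for international aid"
    return "intermediate"

print(classify(2_000))    # excellent
print(classify(10_000))   # intermediate
print(classify(200_000))  # excessive for international aid
```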

More on GiveWell's views on cost-effectiveness estimates


Sources
