# The GiveWell Blog: All Categories

Exploring how to get real change for your dollar.

### Allocation of discretionary funds from Q2 2019

Wed, 08/21/2019 - 11:38

In the second quarter of 2019, donors gave a combined $2.3 million to GiveWell for granting to recommended charities at our discretion. We greatly appreciate this support, which enables us to direct funding where we believe it can be used most effectively. We grant this funding to one or more of our top charities each quarter. We decided to allocate all $2.3 million to the Against Malaria Foundation (AMF). AMF is a GiveWell top charity that provides support for the distribution of long-lasting insecticide-treated nets to prevent malaria. AMF has been named a GiveWell top charity seven times. We chose to allocate the second-quarter funding to AMF because we believe AMF has a highly cost-effective and time-sensitive opportunity to spend it.

Our bottom line

We continue to recommend that donors giving to GiveWell choose the option on our donation form for “grants to recommended charities at GiveWell’s discretion” so that we can direct the funding to the top charity or charities with the most pressing funding needs. For donors who prefer to give to a specific charity, we note that if we had additional funds to allocate at this time, we would very likely allocate them to AMF, which we believe could use additional funding for highly cost-effective work, even after receiving the $2.3 million in funding mentioned above.

Summary

In this post, we discuss:

• what AMF will do with additional funding. (More)
• other possibilities we considered. (More)
• our process for deciding where to allocate funds. (More)

What will AMF do with additional funding?

AMF told us that it will use additional funding to support a distribution of nets scheduled for 2020 in the Democratic Republic of the Congo (DRC). Distributions are often delayed by a few months. Our best guess is that these nets will be delivered in late 2020 or in 2021. DRC has a higher malaria burden than most of the other countries where AMF supports distributions. We model AMF’s work in DRC to be more than 1.5 times as cost-effective as AMF’s past work, on average—we estimate that a donation of roughly $2,000 to support work in DRC will avert a death, compared to $3,600 for AMF’s work overall.[1]

We consider this to be the most promising funding need among our top charities, in terms of timeliness and cost-effectiveness. Our process for comparing top charities’ needs each quarter is described in greater detail below.

Open questions and uncertainties

Although we see this as a very promising opportunity, we are somewhat unsure how AMF will actually allocate the funding it receives. AMF’s role in net distributions is to:

1. identify countries with funding gaps (funding needs that aren’t otherwise expected to be met) for nets;
2. find distribution partners (in-country non-profit organizations or government agencies) to carry out the distributions;
3. purchase nets; and
4. work with distribution partners to monitor the distribution and use of nets.

While AMF has told us that it will allocate additional funding to DRC, it is possible that AMF will deviate from its funding plans in the face of changing circumstances, primarily changes in the status of discussions with governments and changes in the amount of funding it has available to allocate. The most common changes in AMF’s plans in recent years have been (a) delays in distributions, often due to governments taking longer to sign agreements than AMF had originally estimated, and (b) changes in the quantity of nets purchased by AMF due to larger population numbers being found during registration than the government had estimated at earlier stages of planning. According to AMF, the total funding gap in DRC over the next two years (2020-2021) is $55 million.
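The footnote’s “more than 1.5 times as cost-effective” figure can be reproduced with a few lines of arithmetic. This is an illustrative sketch using only the two cell values quoted in the text, not GiveWell’s actual spreadsheet model:

```python
# Figures quoted in the footnote (2019 version 4 cost-effectiveness model, cell B125).
cost_per_death_averted_overall = 3554  # USD, "Overall" country selection
cost_per_death_averted_drc = 2072      # USD, "DRC" country selection

# Relative cost-effectiveness of AMF's DRC work vs. its work overall.
ratio = cost_per_death_averted_overall / cost_per_death_averted_drc
print(round(ratio, 1))  # ~1.7, i.e. more than 1.5x as cost-effective
```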
In addition to its plans to fund work in DRC, AMF currently holds $39 million to fund distributions in three other countries. Although AMF is in discussions about funding these distributions, it has not yet signed formal agreements to do so. If any of the discussions fall through, we expect AMF to reallocate the funding it has set aside. In addition, AMF raised $38 million in 2018,[2] and we estimate that AMF will continue to raise at least half of that amount annually, independent of whether GiveWell allocates additional discretionary funding to AMF. This suggests that AMF will raise enough funding in the next year to substantially reduce the size of the funding gap in DRC, though the timing of when funding is received may affect the timing of distributions. If AMF fully fills the DRC funding gap, it seems intuitively likely that there would be other bottlenecks that might impede its progress, such as its ability to find partner organizations with the capacity to implement the distributions and fulfill AMF’s reporting and monitoring requirements. We do not know where or when AMF would choose to fund nets if it had more funding than it could allocate to DRC in 2019 to 2021.

We incorporate our uncertainty about where AMF will use additional funding into our cost-effectiveness estimate of its work. When we made our first-quarter discretionary funding allocation, which also went to AMF, we modeled an 87 percent chance that AMF’s additional funding would support nets in DRC.[3] As we considered where to grant second-quarter discretionary funding, we made a minor downward adjustment to 75 percent due to AMF’s continued lack of signed agreements with other countries and thus our greater uncertainty over how funds will be spent. Even with this uncertainty incorporated, we model AMF’s funding gap in DRC as a highly cost-effective opportunity.
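One simple way to see how the 87-to-75-percent adjustment feeds into a blended estimate is a probability-weighted average of deaths averted per dollar. This is a simplified sketch, not GiveWell’s actual (unpublished) model: the $2,072 DRC figure comes from the footnote above, while the $4,500 non-DRC figure is a made-up placeholder:

```python
def expected_cost_per_death(p_drc, cost_drc, cost_elsewhere):
    """Expected cost per death averted when a marginal dollar funds DRC nets
    with probability p_drc and other distributions otherwise."""
    # A dollar averts 1/cost deaths; take the probability-weighted average
    # of deaths-per-dollar, then invert back to cost-per-death.
    deaths_per_dollar = p_drc / cost_drc + (1 - p_drc) / cost_elsewhere
    return 1 / deaths_per_dollar

# cost_drc = $2,072 (from the footnote); $4,500 is a hypothetical
# placeholder for AMF's less cost-effective alternatives.
for p in (0.87, 0.75):
    print(p, round(expected_cost_per_death(p, 2072, 4500)))
```

Lowering the DRC probability pushes the blended cost per death averted upward (here from about $2,228 to about $2,395), which is the direction of the downward cost-effectiveness adjustment described above.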

Other possibilities we considered

Malaria Consortium’s seasonal malaria chemoprevention (SMC) program

When we granted the discretionary funding we received in the first quarter, we focused on AMF and another top charity, Malaria Consortium’s SMC program, as the most promising recipients. Our decision centered on our comparison of the two organizations:

• Against Malaria Foundation
  • We modeled additional funding to AMF as more cost-effective than additional funding to Malaria Consortium’s SMC program. Our best guess, which we did not subject to our formal internal review process, was that AMF was 38 percent more cost-effective than Malaria Consortium’s SMC program.
  • We viewed AMF’s funding gap in DRC as time-sensitive because we expected that receiving funding now would allow AMF to distribute nets sooner than if it received the same amount of funding later this year.
• Malaria Consortium’s SMC program
  • We viewed Malaria Consortium’s SMC program as likely to have more overall impact per dollar based on unmodeled qualitative factors described in “Principle 2” here.
  • We did not expect that directing additional funding to Malaria Consortium would influence its spending on 2019 and 2020 programs—in other words, we didn’t see providing funding to Malaria Consortium as being particularly time-sensitive.

Weighing these factors, we ultimately chose AMF over Malaria Consortium based on its somewhat higher modeled cost-effectiveness and more time-sensitive funding need.

We now model additional funding to AMF as roughly 33 percent more cost-effective than additional funding to Malaria Consortium’s SMC program, as a result of adjusting the chance of additional funding supporting nets in DRC from 87 percent to 75 percent. We have not received any new information to update us on the time sensitivity of Malaria Consortium’s funding needs, and we continue to view Malaria Consortium as stronger than AMF on unmodeled qualitative factors.

We don’t view the comparison of the two organizations as meaningfully different than in the previous quarter, and we thus chose to prioritize AMF over Malaria Consortium again.

Other top charities

As far as we know, our six other top charities have not had any major changes in their funding needs or cost-effectiveness since March. We did not update our cost-effectiveness model since making our last quarterly allocation decision, nor did we receive any updates on our top charities’ room for more funding, beyond the $4.7 million in first-quarter discretionary funds that we allocated to AMF.

Process for deciding where to allocate funds

We follow the principles described in this blog post when deciding between funding opportunities. We ask our top charities to alert us throughout the year if they learn of any new funding opportunities that we should consider in our discretionary funding decisions. None of our top charities informed us of such an opportunity for second-quarter funding.

With no new funding opportunities presented to us, we returned to our first-quarter funding recipient, AMF. When we granted first-quarter funding to AMF, we noted that AMF had a time-sensitive and cost-effective funding opportunity in DRC and a funding gap that was much larger than we were able to fill. As we considered where to allocate second-quarter funding, we asked AMF for information to help us assess whether that continued to be true. We asked AMF about its progress in signing net-distribution agreements, its ability to absorb additional funding for work in DRC, and whether additional funding sent to AMF in the next few months would contribute to filling the funding gap in DRC.

Notes

1. You can see our calculations by making a copy of our 2019 version 4 cost-effectiveness model; this will enable you to edit the sheet and change the values in the drop-down menu as described below:
   1. Our estimate of the cost-effectiveness of AMF’s work in general: Go to the “Nets” tab. In cell B125, you’ll see the “Median cost per death averted (after accounting for leverage and funging)” for AMF. The value is $3,554.
   2. Go to the “Country selection” tab, and change the value for the Against Malaria Foundation in cell B7 from “Overall” (which includes all countries AMF works in) to “DRC” on the drop-down menu.
   3. Our estimate of the cost-effectiveness of AMF’s work in DRC: Go back to the “Nets” tab. The value in cell B125 is now $2,072.
   4. $3,554 divided by $2,072 = ~1.7.
2. More specifically, this is AMF’s total revenue between February 1, 2018 and January 31, 2019, which is GiveWell’s 2018 “metrics year.”
3. This model has not been published.

The post Allocation of discretionary funds from Q2 2019 appeared first on The GiveWell Blog.

### Experiments in GiveWell communication

Tue, 07/23/2019 - 12:38

One of our top priorities is to increase the amount of money we direct to our recommendations. As part of our effort to do this, we’re planning to try new kinds of communication. We hope to reach people who haven’t heard of or connected with GiveWell in the past, and to increase retention of our current donors by making the experience of donating through GiveWell more compelling. We are experimenting on our homepage and in emails with using images and making our cost-effectiveness estimates more prominent. Our goal is to improve people’s connection to our work without compromising the accuracy of what we share. There are potential downsides to this approach.
We expect to balance our goal of communicating in a way that is emotionally compelling with our commitment to honesty and not misleading donors or overstating the case for our recommendations. We’re not planning a major overhaul of GiveWell’s website or other communications in the near term, and we are unsure if we will make major changes in the future. Most of GiveWell’s communications will look as they always have. Our hope in the coming months is to learn whether there are new ways we can communicate about our work to increase our impact. We’re writing this post to share with you the context behind these experiments.

Summary

In this post, we discuss:

• Our communication experiments. (More)
• Challenges and potential downsides of our approach. (More)
• How you can help us improve. (More)

Our communication experiments

We’re initially experimenting with using images and emphasizing cost-effectiveness information in our communications. We selected these experiments based on our intuition, our understanding of best practices in the nonprofit sector, and the feedback we’ve received from GiveWell’s donors and others. Over the years, we’ve heard from a number of our supporters that they wish GiveWell had more emotionally oriented content so that they could more easily share GiveWell with their peers or feel more connected to their own gifts. We also understand that most charity fundraisers make emotional appeals tied to specific individuals or projects. Fundraisers may use cost figures to promote their causes, although these figures can be misleading (for example, claiming you can save a child’s life by donating $0.50—we estimate that even a very cost-effective program requires closer to $2,400 to avert the death of a child). Taking these considerations into account, we want to move in the direction of sharing content that people can easily connect to without sacrificing honesty and accuracy.
We plan to make these changes gradually and to see what works before committing to a long-term path. We’re starting by making the following changes to our homepage and certain email content we share:

1. Adding images. We believe we can create a closer connection to our top charities by showing pictures of the work they do—either to illustrate how the program is carried out or to show the people they have helped. We want to do this respectfully.
2. Featuring our cost-effectiveness figures more prominently. Although we don’t advise taking our cost-effectiveness estimates literally, we do think they are one of the best ways we can communicate the rough magnitude of the expected impact of donations to our recommended charities.

A few years ago, we decided not to feature our cost-effectiveness estimates prominently on our website. We had seen people using our estimates to make claims about the precise cost to save a life that lost the nuances of our analysis; it seemed they were understandably misinterpreting concrete numbers as conveying more certainty than we have. After seeing this happen repeatedly, we chose to deemphasize these figures. We continued to publish them but did not feature them prominently.

Over the past few years, we have incorporated more factors into our cost-effectiveness model and increased the amount of weight we place on its outputs in our reviews (see the contrast between our 2014 cost-effectiveness model and our latest one). We thus see our cost-effectiveness estimates as important and informative. We also think they offer a compelling motivation to donate. We aim to share these estimates in such a way that it’s reasonably easy for anyone who wants to dig into the numbers to understand all of the nuances involved.

We’ve chosen two places to run our initial experiments with using images and emphasizing cost-effectiveness estimates: GiveWell’s homepage and certain email content.
Homepage updates

Our intuition is that someone spending a few minutes on GiveWell’s homepage would not come away with a clear understanding of what GiveWell does or an emotional connection to our work. We hypothesize that adding images, illustrations, and cost-effectiveness estimates will help new visitors better understand and connect to GiveWell’s work. We are also planning to link to a citations page that provides sources for the calculations we use on the homepage and enables readers to easily access the details of our research if they want to vet or understand our claims.

We plan to test two new versions of the homepage this summer; you can see preliminary versions here and here. We’ll be testing these pages against each other and against our current homepage. We expect to update our homepage pending the results of this experiment—we’ll likely be looking at visits to our top charities page from the homepage, the bounce rate on the homepage (the percentage of visitors who leave the page without going to other parts of our website), and the duration of time spent on the homepage.

Impact emails

We hypothesize that drawing clearer connections between our donors’ support and what it enables charities to achieve will increase retention of GiveWell’s donors. We think doing so will make the experience of giving through GiveWell more meaningful and memorable. One way we think we can do this is by reporting to donors what we expect the impact of their gifts to be. Information about impact has always been available on our website via our cost-effectiveness model, but it has neither been linked to individual donation amounts nor sent directly to donors. Until recently, if a GiveWell donor was interested in the impact of their gift, they’d have to track down the relevant part of our cost-effectiveness model and do their own calculation. Now, we’re experimenting with sharing this information more proactively.
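The homepage metrics named above reduce to simple ratios over raw analytics counts. A minimal sketch with invented numbers (the variant names and counts are hypothetical placeholders, not real GiveWell data):

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Share of sessions that leave without viewing another page."""
    return single_page_sessions / total_sessions

def top_charities_clickthrough(clicks, total_sessions):
    """Share of homepage sessions that continue to the top charities page."""
    return clicks / total_sessions

# Hypothetical counts for the control homepage and one test variant.
variants = {
    "control":   {"sessions": 10000, "bounces": 6200, "top_charities_clicks": 900},
    "variant_a": {"sessions": 10000, "bounces": 5800, "top_charities_clicks": 1100},
}

for name, v in variants.items():
    print(name,
          round(bounce_rate(v["bounces"], v["sessions"]), 3),
          round(top_charities_clickthrough(v["top_charities_clicks"], v["sessions"]), 3))
```

In a real test, differences of this kind would also need a significance check before updating the homepage; the sketch only shows how the metrics themselves are computed.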
In the fall, we sent an email to a group of our donors who gave to the Against Malaria Foundation (AMF), a GiveWell top charity that distributes insecticide-treated nets to prevent malaria. This email explained how AMF works, using photographs of net distributions, and included our best estimate of the impact each individual’s donation would have, in terms of the nets purchased and deaths averted. You can see an example of this email here. We ended this experiment after a few weeks due to a technical glitch.

This year, we piloted sending an email to donors who supported “Grants to recommended charities at GiveWell’s discretion.” We grant these discretionary funds each quarter to the GiveWell top charity or charities that we believe have the most pressing funding needs. When we made these grants in 2019, we sent an email to donors who contributed to the discretionary funds. The email announced where we chose to grant the discretionary funds and why, along with a description of the charity that received the funds—including images of its work—and a calculation of each donor’s expected impact based on our cost-effectiveness analysis. You can see an example of this email here.

Anecdotally, these emails have been positively received. Over a dozen recipients of the “Grants to recommended charities” emails have contacted us (unprompted) to let us know they appreciated the information. We do not yet feel confident extrapolating the impact of these emails on donor retention, as most donors give on an annual basis. We plan to continue sending these emails each quarter when we decide where to grant discretionary funds and to assess over the long term whether they affect donor retention.

Challenges and potential downsides

A major challenge we face with this project is striking the right balance between communicating clearly, creating a connection to our work, and honoring our values. We anticipate that:

1. the use of images could fail to treat beneficiary populations with the respect they deserve (for a discussion of some simplistic narratives about the relationship between donors and beneficiaries, see this blog post). We plan to be particularly careful about our selection of images and to avoid depictions that do not respect the dignity of our beneficiaries.
2. the use of images might make our marketing harder to distinguish from typical charity outreach.
3. the use of cost-effectiveness figures may make it harder to distinguish GiveWell’s carefulness (the hundreds of hours our researchers collectively spend per year on cost-effectiveness analysis) from charities’ often unjustified claims about cost per impact.
4. the use of cost-effectiveness figures may cause donors to take these estimates literally rather than as a rough sense of the magnitude of expected impact of donations to our recommended charities. To mitigate this and item (3) above, we plan to make links to the detailed analysis behind our cost-effectiveness figures readily available.

The upside of moving more money to top charities and increasing donors’ engagement with our work seems worth tackling this challenge and its potential downsides.

How you can help us

We’re planning to move relatively slowly in this direction and to adjust our actions based on the feedback we receive. If you have feedback about how our new communications are changing your view of the GiveWell brand (positively or negatively), please let us know by emailing info@givewell.org. We’re excited to be on this new path and hopeful it will lead to more funding for our top charities.

The post Experiments in GiveWell communication appeared first on The GiveWell Blog.

### June 2019 open thread

Mon, 06/17/2019 - 12:03

Our goal with hosting quarterly open threads is to give blog readers an opportunity to publicly raise comments or questions about GiveWell or related topics (in the comments section below).
As always, you’re also welcome to email us at info@givewell.org or to request a call with GiveWell staff if you have feedback or questions you’d prefer to discuss privately. We’ll try to respond promptly to questions or comments. You can view our March 2019 open thread here.

The post June 2019 open thread appeared first on The GiveWell Blog.

### Allocation of discretionary funds from Q1 2019

Wed, 06/12/2019 - 12:45

In the first quarter of 2019, donors gave a combined $4.7 million for granting to recommended charities at our discretion.

We really appreciate the generosity of our supporters in making it possible for us to regularly allocate funding to the top charity or charities that we believe can best use additional funding. Thank you!

In this post, we discuss our decision to allocate this $4.7 million to the Against Malaria Foundation (AMF), as well as the process we followed to arrive at this decision. We continue to recommend that donors giving to GiveWell choose the option on our donation form for “grants to recommended charities at GiveWell’s discretion” so that we can direct the funding to the top charity or charities with the most pressing funding needs. For donors who prefer to give to a specific charity, we note that if we had additional funds to allocate at this time, we would very likely allocate them to AMF, which we believe could use additional funding for highly cost-effective work, even after receiving the$4.7 million in funding mentioned above.

Our bottom line

As we did last quarter, we focused our efforts on deciding between allocating funding to Malaria Consortium vs. AMF. We currently believe that AMF has a more time-sensitive funding need than Malaria Consortium, and our best guess is that it will have equivalent impact per dollar to Malaria Consortium. This led us to allocate the funding to AMF.

What changed since last quarter?

In March 2019, we modeled AMF as somewhat more cost-effective than Malaria Consortium’s seasonal malaria chemoprevention (SMC) program, believed that both organizations had time-sensitive funding opportunities, but believed that Malaria Consortium would have more overall impact per dollar, when taking into account unmodeled qualitative factors (see “Principle 2” here). For this round of grantmaking, we updated slightly positively on AMF’s cost-effectiveness and believed that AMF had a time-sensitive funding opportunity while Malaria Consortium did not. These factors were sufficient to tip the balance in favor of allocating this funding to AMF.

What AMF will do with additional funding

AMF expects to allocate all funding that it receives in the near future toward distributing insecticide-treated nets in the Democratic Republic of Congo (DRC) in 2020.

The vast majority of the funding AMF currently has on hand is set aside for distributions in a series of other countries (we are aware of which countries but have been asked not to name them while discussions are continuing) in 2020, and in DRC in 2019 and 2020; this funding totals $57 million. AMF has made verbal agreements with those countries for those distributions, but has not yet signed contracts to commit the funding. There is a chance that one or more of these agreements will fall through, which could change when or how AMF uses the additional funding it receives now; we think the risk is real, but AMF’s other options (particularly putting more funding into DRC distributions in 2020-2021) are good, so we don’t see this as a major concern. According to AMF, agreements that have reached this stage with countries AMF has worked with before (as is the case with each of these four countries) have not fallen through in the past. AMF also has an additional $5.5 million in uncommitted funds on hand, which it plans to allocate to DRC for 2020 distributions.

AMF estimates it could use up to an additional $12.0 million in DRC in 2020, after the $4.7 million we are granting to it, and up to an additional $36.8 million in DRC in 2021.

Our process

Our process for making this granting decision was less intensive than the process we used for the funds we received in the last quarter of 2018. We focused on making some updates to the information we had relied on last quarter, including:

1. Estimating Malaria Consortium’s room for more funding for SMC, in light of its receiving the $10.1 million in discretionary funds we granted last quarter.
2. Speaking with AMF and Malaria Consortium to discuss how additional funds would be used. We did not speak with other top charities for updates, as we believed, based on our work in late 2018 and earlier this year, that either Malaria Consortium or AMF could most cost-effectively use additional funding.
3. Updating our estimates of the cost-effectiveness of additional funds to AMF and Malaria Consortium. We applied the same changes as are discussed in the first footnote in this blog post to the latest version of our published cost-effectiveness model, which was updated in March to correct an error in our model of insecticide-treated nets. These updates to the published model have not been published or vetted, and so are more likely to contain errors than our published cost-effectiveness model. We took this shortcut to enable us to pursue other research work, and because modest changes in the cost-effectiveness analysis (as a result of minor errors) would not have changed our conclusion here. We thoroughly revisit our comparisons between top charities once per year for our annual recommendations refresh in November; we are also developing a process to make it easier to update our published model throughout the year to reflect new information about how top charities would spend additional funding. The difference between our unvetted best guess and our published estimate for AMF is largely due to modeling the majority of marginal funding as going to DRC (which has a higher malaria burden than most of the other countries where AMF operates).
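The room-for-more-funding figures in this post compose by simple addition; for example, the 2020 DRC capacity AMF reported before this grant can be recovered from the numbers above. A sketch using only figures quoted in the post:

```python
# Figures quoted in this post, in USD millions.
q1_grant = 4.7                  # discretionary funds granted to AMF this quarter
remaining_gap_drc_2020 = 12.0   # what AMF could still use in DRC in 2020, after the grant
remaining_gap_drc_2021 = 36.8   # what AMF could use in DRC in 2021

# Implied 2020 DRC absorptive capacity before this grant:
capacity_2020_pre_grant = q1_grant + remaining_gap_drc_2020
print(round(capacity_2020_pre_grant, 1))   # 16.7

# Total remaining DRC gap across 2020-2021 after the grant:
print(round(remaining_gap_drc_2020 + remaining_gap_drc_2021, 1))  # 48.8
```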

We ultimately relied on the same six principles as are described in this blog post.

We wrote in March 2019 about our decision to allocate the funding we received in the fourth quarter of 2018 to Malaria Consortium’s SMC program. Below, we discuss updates to our understanding of the value of allocating marginal funds to AMF and Malaria Consortium’s SMC program since March 2019.

AMF

We identified and corrected an error in our cost-effectiveness analysis of insecticide-treated nets, which increased the estimated cost-effectiveness of AMF. As a result of this update, our (unvetted and unpublished) best guess is that marginal funding to AMF is approximately 40% more cost-effective than marginal funding to Malaria Consortium’s SMC program; in March, our best guess was that marginal funding to AMF was approximately 20% more cost-effective.

Malaria Consortium’s SMC program

Our understanding is that additional funding received at this point in time would be unlikely to influence Malaria Consortium’s spending on 2019 and 2020 programs, and that decisions regarding 2021 spending will likely be made in early 2020, after we have made our (larger) year-end funding recommendations.

This is a change from the previous quarter: due to our allocation of $10.1 million to Malaria Consortium last quarter, Malaria Consortium currently has enough funding to fund its work on SMC through 2020, even if it were to expand to the maximum scale it is considering reaching in 2020. (Our calculations, which were reviewed by Malaria Consortium, are available here.) We expect that Malaria Consortium will have a large funding gap for SMC work in 2021, and we may fill some of that funding gap later in 2019.

The post Allocation of discretionary funds from Q1 2019 appeared first on The GiveWell Blog.

### Evidence Action is shutting down No Lean Season

Thu, 06/06/2019 - 11:59

This post discusses a set of issues with Evidence Action’s No Lean Season program. No Lean Season is a former GiveWell top charity and GiveWell Incubation Grant recipient. It is now shutting down. Evidence Action discusses its decision in this blog post. Here, we share a significant amount of detail about this decision and the factors that contributed to it.

Proactively sharing detailed information about a charity’s shortcomings may be unusual, but it is core to GiveWell’s mission. We are dedicated to transparency about our recommendations—the good and the bad. Evidence Action has reviewed this post, and we’ve discussed our thinking at length with its senior leadership; however, the views expressed are our own. We have been impressed with Evidence Action’s commitment to transparency and continue to support its other work. These updates have not substantially changed our view of Evidence Action; we expect large programs to experience problems, to a certain extent. We believe Evidence Action responded to these problems responsibly, although we have several open questions.

Summary

Evidence Action is shutting down No Lean Season, a former GiveWell top charity that distributed no-interest subsidies to support seasonal migration in Bangladesh.
As we have discussed previously, a study of the No Lean Season program in 2017 found disappointing results; this led to our removal of No Lean Season from our list of top charities, in agreement with Evidence Action.

In early 2019, Evidence Action’s senior leadership received allegations that a junior employee of the government agency in Bangladesh responsible for approving the No Lean Season program had forged the government approval, in collaboration with an employee of the program’s implementing partner. The government agency later allegedly asked the implementing partner for a bribe to grant approval of the program. Senior leadership at Evidence Action then began an investigation, which was largely unsuccessful in its attempts to learn more due to a lack of full cooperation from the implementing partner. Evidence Action terminated its relationship with the implementing partner as a result. Evidence Action’s senior leadership also found that some Evidence Action program staff who worked directly with the partner did not fully cooperate with its investigation and had violated internal Evidence Action policy. Evidence Action decided to shut down No Lean Season because the cost of finding and supporting a new implementing partner was too high, given the disappointing 2017 study results.

Separately, Evidence Action also informed us of a tragic accident involving migrants from households that had received No Lean Season subsidies. We do not believe this contributed to the decision to shut down No Lean Season, but we are sharing it in this post because the investigation into the accident recently concluded.

We will provide additional information on the following in this blog post:

• We outline below the factors contributing to Evidence Action’s decision to shut down its No Lean Season program (More):
• The disappointing 2017 study of the program at scale. (More)
• Evidence Action’s termination of its partnership earlier this year with the organization implementing the program in Bangladesh. After learning of the alleged improprieties (referenced above) in February 2019, senior leadership at Evidence Action began an investigation, conducted by external, independent legal counsel. Given the seriousness of the original allegations, Evidence Action also terminated its contract with the partner. The implementing partner largely refused to cooperate with the investigation, and as a result Evidence Action will not reengage with this partner in the future. (More)
• In the course of its investigation, senior leadership at Evidence Action found evidence of an approximately $400 payment by an Evidence Action program staff member that violated its internal policies, as well as contradictory and potentially misleading statements made by some program staff members to investigators. This finding was not material in the decision to shut down the program, as the implementing partner’s lack of cooperation was already known at that point. (More)
• We summarize the findings of the investigation into the accident involving migrants whose families had received subsidies from the program. (More)
• We do not see any of the above issues as a significant update on Evidence Action as an organization. We expect challenges when working in international development, and think senior leadership at Evidence Action responded responsibly to address these challenges. We do retain open questions about Evidence Action’s selection of implementing partners and its process for hiring and evaluating staff. Finally, we and Evidence Action agree that it should continue to strengthen its financial controls going forward. (More)
• Evidence Action expects No Lean Season to have some funding remaining after the program fully closes out. We expect to ask Evidence Action to redirect the remaining funding it received from GiveWell for No Lean Season to Evidence Action’s Deworm the World Initiative, a GiveWell top charity. We (and Evidence Action) will also take into account donors’ preferences for reallocating this funding; we provide instructions for donors who supported No Lean Season to communicate their preferences to us below. (More)
Shutting down No Lean Season

Mixed evidence of impact for program

We removed our top-charity recommendation of No Lean Season after reviewing the results of a large randomized controlled trial (RCT) of its program during the 2017 “lean season.” Evidence Action agreed with GiveWell’s decision.

The study found that the program did not increase rates of migration in 2017, the first year in which the program was implemented at scale. This implied that, at a minimum, the impact of the program was sensitive to details of implementation, and, potentially, the program was not effective at scale. In either case, the results reduced our and Evidence Action’s expectations about the program’s future cost-effectiveness. As of late 2018, Evidence Action had stopped seeking donations for the program but was continuing to operate it and collect additional data on its impact. Evidence Action ran another large trial of the program in 2018 (for which data collection is ongoing in 2019), and we planned to reassess No Lean Season as a potential top charity in 2019, upon receiving the results from that trial.

Investigation into allegations against Evidence Action’s implementing partner

Termination of partnership with implementing partner

In early 2019, Evidence Action terminated its partnership with its implementing partner in Bangladesh, which bore primary responsibility for implementing the program. Evidence Action reports that the termination of this partnership accelerated consideration of shutting down the program, rather than waiting for results from the 2018 trial, and tipped the balance in favor of ending it.

According to multiple Evidence Action program staff (and reported to GiveWell by the law firm that subsequently led an investigation into what happened), the implementing partner told these Evidence Action program staff in February 2019 that it had discovered that the government licenses for it to operate the program had been improperly granted, and that there was a possibility that bribes were paid to government officials to obtain the improper licenses. According to these Evidence Action program staff, the implementing partner said that it had approached the government agency to rectify the issue with the licenses and a high-level official asked for a bribe to issue the licenses. The Evidence Action program staff said that the implementing partner had asked Evidence Action for authorization to pay the bribe. The Evidence Action program staff reported this immediately to senior leadership at Evidence Action, who shared it with us and other major donors to the program soon after.

Evidence Action terminated its contract with the implementing partner as a result, shortly after learning of this event. Senior leadership at Evidence Action hired DLA Piper, a global law firm, to lead an investigation into the issue. DLA Piper told us that it was not able to learn significantly more about the circumstances around the allegedly improperly granted licenses because the implementing partner did not cooperate with the investigation; the implementing partner provided only incomplete financial records and declined to participate in interviews. What we know about these circumstances, therefore, is based on DLA Piper’s interviews with the Evidence Action program staff who were in the February meeting; these staff were consistent in their reports that the implementing partner said that a government official had requested a bribe, but they were inconsistent on some other details.

Payment in violation of internal policies and contradictory and potentially misleading statements by Evidence Action program staff

In the course of its investigation, DLA Piper found that some Evidence Action program staff made contradictory and potentially misleading statements to the investigators, including statements about a payment one of them made in 2018 (of approximately $400) that went against Evidence Action policy. According to the investigation findings, these staff members had learned that the implementing partner’s application to a government agency for licenses to operate the program in 2018 had been questioned, and two of the staff discussed hiring a consultant to help with obtaining the licenses. The staff members requested approval from Evidence Action’s finance team to make a payment to hire a consultant; the finance team responded that such consultant payments were not allowed by Evidence Action policy. The staff members later submitted a reimbursement request to Evidence Action for a cash payment for the same purpose. It is unclear whether a consultant was hired or how the funding was used. Evidence Action’s finance department sent the reimbursement.

Senior leadership at Evidence Action has told us that the reimbursement should not have been sent and that its financial oversight practices were not adequate to detect this payment; it expects to make some changes to its controls and finance staff training as a result of this experience.

We are unsure what to make of these findings. The investigation established only that these staff received reimbursement for a use of funds they had been informed was not permissible under Evidence Action policy, and that they made misleading statements to the investigators. There are missing pieces in the story that we are unable to account for (due to the implementing partner’s lack of cooperation with the investigation and conflicting and potentially misleading statements from some Evidence Action program staff).
Senior leadership at Evidence Action has taken what we see to be appropriate corrective action in its staffing; we have chosen not to discuss details about individuals.[1] We understand that charities may on occasion experience malfeasance by staff, and we believe Evidence Action could not have prevented all possible scenarios where malfeasance might occur. We were disappointed to learn that Evidence Action staff appear to have made a payment in violation of Evidence Action policy and that this payment was approved; we are also disappointed that these staff made misleading statements to DLA Piper. We also believe that senior leadership at Evidence Action responded to the issues described above in a timely, thorough manner by investigating what occurred.

While it is disappointing to learn of improper behavior at one of our top charities, our ultimate focus is whether the program accomplishes good in the world.[2] Of course, if we learned of wrongdoing that was not responsibly handled, and/or reflected a large and serious gap in internal controls, that could lead us to remove a top charity from our list. But broadly speaking, we would not expect to be aware of every instance of fraud (nor do we believe it would be cost-effective for most organizations to put in place controls that would absolutely prevent all malfeasance).[3]

From discussions with senior leadership at Evidence Action, we do not believe the improper behavior by some Evidence Action program staff contributed to the decision to shut down the program, though it was uncovered as part of the investigation into the implementing partner.

Decision to end the program

In light of the 2017 RCT results, the need to find a new implementing partner, and the costs of doing so, Evidence Action decided to terminate the No Lean Season program. We agree with Evidence Action’s decision.
This is a nuanced position: we agree with the decision to shut down the program, but we do not believe that the program is “bad” or “ineffective.” Instead, we believe that continued investment in No Lean Season—taking into account all of the challenges this would involve—is unlikely to be one of the best opportunities we or Evidence Action have to cost-effectively save or improve lives. We discuss how the 2017 study changed our assessment of No Lean Season’s cost-effectiveness here.

The program was completed for the 2018-2019 season, with the exception of some subsidy repayment at the end of the season. Data collection and analysis for the 2018 RCT is ongoing and will be completed; we plan to write about the results once they are available.

Cumilla accident

A few weeks before the conversation that led to the termination of Evidence Action’s partnership with its implementing partner, there was an accident involving migrants from households that had received No Lean Season subsidies. We were saddened to hear of this tragic event. The tragedy did not trigger the shutdown of the program, but the investigation that Evidence Action conducted into the circumstances surrounding the accident recently concluded, so we include it here for completeness.

According to news reports[4] (and as reported to us by Evidence Action), in January 2019, a coal-laden truck struck a shed at a brick kiln in Cumilla District in Bangladesh. The shed collapsed, killing 13 individuals sleeping inside. Five of the individuals were from households that had received migration subsidies from No Lean Season. Four of those were between the ages of 15 and 17. It is No Lean Season’s policy to provide subsidies only to individuals over the age of 18, and No Lean Season had a number of protocols in place to enforce this policy[5]; senior leadership at Evidence Action told us that it believes the teenagers did not receive subsidies directly.
The teenagers may have migrated independently of the program, or individuals from their households who were over the age of 18 may have accepted the subsidies and given them to the teenagers to use to migrate. Senior leadership at Evidence Action hired investigators to look into the circumstances around the use of No Lean Season subsidies by underage individuals. The investigators conducted a limited investigation (limited, at least in part, because the implementing partner did not provide requested documentation or interviews) and concluded that Evidence Action had “robust safeguards” in place for preventing underage migration, including checking birth dates on personal documents before issuing loans. We don’t see good reason to think that the program systematically increases overall risks of this type of accident.

Has our view of Evidence Action changed?

Taken together, these updates have raised questions about Evidence Action (see below). In general, malfeasance at a charity or its implementing partner could lead us to change our opinion of the charity. However, the details of this case have not led us to significantly reduce our confidence in Evidence Action.

The decision to scale up No Lean Season was reasonable: high-quality evidence from when the program was operated at a small scale indicated that it had the potential to be cost-effective. Evidence Action decided to scale up based on that evidence, and ran another high-quality study to test the program’s impact at scale. While there were ethical and managerial lapses by some Evidence Action program staff and its implementing partner, as well as a failure of financial controls to catch an improper payment, we broadly believe that senior leadership at Evidence Action responded quickly, transparently, and responsibly when the issues were uncovered, both to rectify the lapses and to consider how it might prevent such lapses in the future.
Overall, our high-level view of Evidence Action is very similar to what it was before we learned of these developments. We’ve written before that we see Evidence Action as a group we are highly aligned with and that we are excited to support its growth and development (see, for example, here). We have recommended GiveWell Incubation Grants to support Evidence Action’s operations as well as its work to develop potential new GiveWell top charities, and we count two of its programs, the Deworm the World Initiative and Dispensers for Safe Water, as a top and standout charity, respectively.

We expect large programs to experience problems, to some extent. We think senior leadership at Evidence Action took quick, thorough action to address the situation by launching an investigation and sharing updates with its major funders, as well as by terminating its work with its implementing partner and taking corrective action with program staff who did not comply with Evidence Action’s policies.

We have the following open questions about Evidence Action deriving from these developments:

• Evidence Action’s selection of implementing partners. Should Evidence Action do more to vet its partners? Evidence Action has told us that it plans to make changes to its vetting practices as a result of this experience.
• Evidence Action’s processes for hiring and evaluating staff. Evidence Action staff members violated its financial policies. How should Evidence Action improve its processes for hiring and evaluating staff?
• Evidence Action’s financial oversight. While we think prevention of all malfeasance would be challenging (and may not be the best use of resources), are there cost-effective ways to reduce the likelihood of funds being misused in the future? What changes to its financial controls should Evidence Action implement?

We plan to continue discussions with Evidence Action to better understand its work in these areas.

What will happen with unused funds?
Over the course of operating No Lean Season, Evidence Action received funding earmarked for this program specifically and spent down a portion of that funding. Evidence Action expects to have funding remaining that is designated for No Lean Season when it has fully closed out its work on the program. A large portion of the remaining funding is from Good Ventures, a large foundation with which we work closely, which donated to No Lean Season as a result of GiveWell’s recommendation.

We expect to ask Evidence Action to redirect the remaining funding it received from GiveWell (donations made through our website or to GiveWell via check, wire transfer, or other means), including funding from Good Ventures, for No Lean Season to Evidence Action’s Deworm the World Initiative, which is a GiveWell top charity.

If you made a donation to support No Lean Season and prefer that your donation (less the portion of total revenue that No Lean Season has spent) go to an Evidence Action program other than Deworm the World, please contact us at donations@givewell.org by July 31. We will also be emailing donors whose contact information we have. By default, if we don’t hear anything, funding will be directed to Deworm the World. Note that by “remaining funding,” we mean the original donation size multiplied by the percentage of total revenue for No Lean Season that will remain when the program is fully closed out.

We then expect to recommend that Good Ventures reduce its next annual grant to Deworm the World (assuming Deworm the World remains a GiveWell top charity, which we expect it to) in December by the same amount, so that Deworm the World does not receive more GiveWell-directed funding in 2019 than it would have in the absence of No Lean Season’s remaining funds. We will post an update on our blog about how remaining GiveWell-directed funding for No Lean Season was reallocated.
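The pro-rata definition of “remaining funding” above (the original donation size multiplied by the percentage of total revenue remaining at close-out) can be sketched as a short calculation. The function name and all dollar figures below are hypothetical, chosen purely for illustration; they are not actual No Lean Season numbers:

```python
# Hypothetical sketch of the pro-rata "remaining funding" calculation.
# All amounts are made up for the example.

def remaining_share(original_donation, total_revenue, unspent_at_closeout):
    """Portion of a single donation counted as 'remaining funding':
    the donation multiplied by the fraction of the program's total
    revenue still unspent when the program fully closes out."""
    return original_donation * (unspent_at_closeout / total_revenue)

# Example: a $1,000 donation to a program that raised $5,000,000 in
# total and has $2,000,000 unspent at close-out leaves $400 eligible
# for redirection.
print(remaining_share(1_000, 5_000_000, 2_000_000))  # 400.0
```

Under this definition, every donor’s redirected amount scales with the same unspent fraction, so the redirected amounts sum exactly to the program’s remaining funds.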
Evidence Action’s implementing partner holds some program funding, primarily the grant from Evidence Action to the implementing partner for the subsidies for migrants that was collected for the 2018 implementation of the program. It is unclear whether it will return these funds to Evidence Action; it has not yet agreed to do so.

Notes

1. Evidence Action has told us that there is complexity in discussing in general terms the situations with the staff members involved because of the differences in the structure of their employment across several legal jurisdictions.
2. We generally agree with the “results not receipts” approach advocated by the Center for Global Development in this paper.
3. We did learn about two cases of staff fraud at GiveDirectly, another GiveWell top charity, in the past. You can read more in our review of GiveDirectly’s work here.
4. News reports of the accident are available here and here (among other places).
5. From Evidence Action’s blog: “The investigation found that the safeguards we had in place were robust, though ultimately could not fully eliminate the risk of an adult recipient choosing to pass their cash transport subsidy to a teenager in his place, contrary to program rules and protocols.
These protocols were multilayered, and included verbally informing subsidy recipients of the condition that migrants must be at least 18 years of age; requiring subsidy recipients to sign or thumbprint an acknowledgement that both recipients and migrants (where different individuals) must be at least 18 years of age; reviewing national identification cards to verify that the subsidy recipient and any person that the recipient says plans to migrate from the household is at least age 18; and utilizing mobile data collection software that is programmed to prohibit field staff from including individuals reporting to be under the age of 18, in order to prevent accidental enrollment.”

The post Evidence Action is shutting down No Lean Season appeared first on The GiveWell Blog.

### GiveWell’s plans for 2019

Thu, 05/16/2019 - 12:50

Our top priorities this year support our goals to (a) increase the impact per dollar of the funds we direct and (b) increase our money moved. In 2019, we are focused on:

• Building research capacity. (More)
• Experimenting with approaches to outreach to find ones that we can scalably use to drive additional money moved. (More)
• Exploring new areas of research. (More)
• Improving GiveWell’s organizational strength. (More)
• Ongoing research. (More)

Building research capacity

We announced earlier this year our plans to hire researchers at three levels of seniority, listed here from most junior to most senior: Research Analyst, Senior Research Analyst, and Senior Fellow. Our goal is to have 3-5 signed offer letters in hand from new research staff by the end of 2019. We’re hoping that additional research capacity will enable us to expand the scope of GiveWell’s research, with the aim of finding opportunities that are more cost-effective than our current top charities. We’re planning to roughly double the size of the research team over the next few years.
Outreach experimentation

We plan to expand our outreach to current and potential donors going forward, with the aim of increasing the amount of money we direct to our recommended charities. As part of this work, we recently hired Stephanie Stojanovic as our first Major Gifts Officer. Our goal is to decide by the end of the first quarter of 2020 whether to scale our staff capacity further in the area of major gifts, based on Stephanie’s initial work.[1]

We’re also planning to conduct experiments in 2019 related to how we message about our work to reach more people. These experiments could include work on search engine optimization and building landing pages that aim to communicate what GiveWell does and why it’s valuable, among other possibilities. We expect to have results by early 2020, as the bulk of donations we receive are made in December.

Finally, we’re planning to search for a VP of Marketing to oversee work across outreach domains (including major gifts, donor retention, advertising, marketing, and written communications). We guess there is a 50 percent chance we make a hire for this role in 2019.

Exploring new areas of research

As mentioned above, we’re in the early stages of expanding the scope of GiveWell’s research. We plan to look into several new areas in 2019, including public health regulation and possible paths to support government aid agencies. This work is new for GiveWell, and as noted in our 2018 review post, we failed to make as much progress as we hoped in 2018 on our work on public health regulation.

In 2019, we’re aiming to get substantially closer to the point where we have the staff structure to support grantmaking in new areas, though given how early we are in this work, we don’t yet have concrete goals we’re confident that we’ll achieve. A stretch goal for 2019 is to settle on a structure that we believe will support grantmaking in public health regulation and to begin recommending grants in that area.
We also plan to continue our investigation into possible paths to support government aid agencies; in particular, we plan to complete an investigation into an opportunity to do so in the area of results-based financing.

Improving GiveWell’s organizational strength

We expect to need additional operations capacity to maintain critical functions as GiveWell grows and to improve the organization going forward. We plan to hire one Operations Associate this year to assist with general operations needs, such as improving HR practices.

We plan to hire many new staff over the coming years. In preparation, we plan to improve our procedures and information for recruiting, vetting, and onboarding staff to GiveWell this year, such as by improving inclusive recruitment practices and updating the substantive content of our onboarding activities. We also plan to improve our systems for soliciting feedback from staff about how GiveWell can improve as an organization, in order to give management better insight into how things are going.

To accommodate our planned expansion, we plan to move to a new office that better suits our expected size and staff requirements.

Ongoing research

We have a number of ongoing research projects, detailed here. These include:

• Completing a full draft of qualitative assessments of our top charities. In theory, we aim to maximize one thing with our top charity recommendations—total improvement in well-being per dollar spent—and this is what our cost-effectiveness estimates intend to capture. In practice, there are costs and benefits that we do not observe and are not estimated in our models, and so we allow for qualitative adjustments to affect our recommendations. We’re in the process of laying out a framework for qualitatively assessing relative organizational strength.
• Updating key inputs into our cost-effectiveness estimates, such as:
  • How we use vitamin A deficiency data.
  • Using new malaria prevalence and child mortality data.
  • Using new data to update our estimates of costs incurred and target population reached for five of our top charities: Malaria Consortium’s seasonal malaria chemoprevention program, the Against Malaria Foundation, Helen Keller International’s vitamin A supplementation program, Sightsavers’ deworming program, and Evidence Action’s Deworm the World Initiative.
• Better understanding the counterfactual to the work Evidence Action’s Deworm the World Initiative has done in India. This goal is one we hope to achieve if we have time, but it is not critical to our assessment.

Conclusion

The concrete goals we aim to achieve in 2019 follow. We plan to revisit this list in early 2020 to assess our progress relative to our expectations, and to publish a blog post accounting for our work:

• Building research capacity
  • Have 3-5 signed offer letters in hand from new research staff.
• Outreach experimentation
  • Decide whether to scale up Major Gifts work.
  • Conduct experiments related to messaging about our work to reach more people.
  • Complete search for a VP of Marketing.
• Exploring new areas of research
  • Look into several new areas, including public health regulation and possible paths to support government aid agencies. Get substantially closer to the point where we have the staff structure to support grantmaking in new areas.
  • Stretch goal: Settle on a structure that we believe will support grantmaking in public health regulation and begin recommending grants in that area.
• Improving GiveWell’s organizational strength
  • Hire one Operations Associate.
  • Improve the staff onboarding process at GiveWell.
  • Improve systems for soliciting feedback from staff.
  • Move to a new office.
• Ongoing research
  • Our full list of concrete research goals for 2019 is in this document.

Notes

1. The majority of donations in support of GiveWell’s recommended charities are made in the fourth quarter of the year, and we generally don’t have a clear sense of the total amount given to GiveWell directly until the first quarter of the following year (and the second quarter for direct-to-charity donations that are reported to us), so we think this is the right time frame on which to assess major gifts work.

The post GiveWell’s plans for 2019 appeared first on The GiveWell Blog.

### Review of GiveWell’s work in 2018

Wed, 05/15/2019 - 12:47

2018 was a successful year for GiveWell. We achieved most of our goals, and our money moved (donations made to our recommended charities due to our research) increased significantly. Each year, we look back at the goals we set the previous year and reflect on how our progress compared to our expectations. This post will briefly discuss our key achievements and failures in 2018. We describe in detail our progress on the goals we outlined in 2018 here.

In 2018, we:

• Directed an estimated $65 million in donations to our top charities, not including the contributions of Good Ventures, a large foundation with which we work closely.
• Added senior hires in operations and outreach: a Director of Operations (Whitney Shinkle) and Head of Growth (Ben Bateman). We expect Whitney and Ben to make major contributions to our work in these domains.
• Continued to improve and expand our core research product, completing new intervention reports, deepening our analysis for several key inputs into our cost-effectiveness model, and providing more transparent explanations for how we decided to allocate funds between top charities.
Key achievements

Donations made to top charities as a result of our research

We currently estimate that the amount of money we directed to our top charities in 2018 was more than $65 million, not including the contributions from Good Ventures, a large foundation with which we work closely. This represents an increase of more than$20 million over 2017. The increase largely came from two multi-million dollar donations from donors who had supported GiveWell and/or our recommended charities in the past.

We plan to publish a full report on our 2018 donations and web traffic shortly.

Outreach and operations

We made two key senior hires in 2018: (1) Whitney Shinkle, who joined us in April as our new Director of Operations, and (2) Ben Bateman, who joined us in June as our first-ever Head of Growth.

We expect Whitney and Ben to play critical roles in laying the foundation to increase the amount of funding we can direct to our top charities. Whitney’s team, for example, is responsible for processing donations to our recommended charities, and for preparing GiveWell to increase the size of its staff. Ben is leading experiments to evaluate different ways we might increase the amount of funding we direct to our top charities via marketing and outreach.

Full details of our performance against our 2018 outreach and operations goals are here.

Research

We completed several projects that improved the quality of our cost-effectiveness estimates and how we write about them, and that we believe led to better decisions about where to allocate funds. For example, we made a major change to how we calculate worm intensity in the areas where our top charities work.

We also improved our transparency about these decisions, breaking our blog posts announcing our top charities into component parts to make them easier to follow (see 1, 2, and 3) and delving into more detail on our principles and funding gap analyses.

We published five new intervention reports, two of which were on the evidence for community-based management of acute malnutrition and syphilis screening and treatment during pregnancy, and recommended five new GiveWell Incubation Grants and two grant renewals. Two of our new grants supported Evidence Action Beta’s incubator and J-PAL’s Innovation in Government Initiative, respectively.

Full details of our performance on our 2018 research goals are here.

Key failures

Outreach and operations

We took a number of steps to improve our outreach to GiveWell’s existing donors. We had hoped this would lead to material improvements in retention of our donors as well as the amount of funding we were able to direct to our top charities from our donors. We haven’t completed a careful assessment of this work, but our belief at this point is that the steps we took last year are unlikely to have had a significant impact on donor retention.

Research

We made relatively little progress in exploring new areas of research (i.e., policy-oriented causes).

This page has more details on our progress toward the goals we laid out in early 2018.

We plan to publish a post soon detailing our plans for 2019.

The post Review of GiveWell’s work in 2018 appeared first on The GiveWell Blog.

### Allocation of discretionary funds from Q4 2018

Fri, 03/29/2019 - 12:32

In the fourth quarter of 2018, donors gave a combined $7.6 million to GiveWell for making grants at our discretion. In this post, we discuss the process we used to decide how to allocate this $7.6 million, as well as an additional $0.8 million designated for grants at GiveWell’s discretion held by the Centre for Effective Altruism and $1.7 million in the EA Fund for Global Health and Development (which is managed by GiveWell Executive Director Elie Hassenfeld), for a total of $10.1 million in funding. We’re grateful to have a community of supporters that relies on our work and trusts us to allocate funding to the top charity or charities we believe need it most.

We noted in November 2018 that we would use funds received for making grants at our discretion to fill the next-highest-priority funding gaps among our top charities. At the time, we wrote: “If we had additional funds to allocate now, the most likely recipient would be Malaria Consortium to scale up its work providing seasonal malaria chemoprevention.”

Based on our analysis in 2018, as well as updates we have received from our top charities since that time, we have decided to allocate this $10.1 million to Malaria Consortium’s seasonal malaria chemoprevention (SMC) program. The SMC program treats children with a course of preventive antimalarial drugs during the time of year when malaria transmission is greatest.

We continue to recommend that donors giving to GiveWell choose the option on our donation form for “grants to recommended charities at GiveWell’s discretion” so that we can direct the funding to the top charity or charities with the most pressing funding needs. For donors who prefer to give to a specific charity, we note that if we had additional funds to allocate at this time, we would very likely allocate them to Malaria Consortium’s seasonal malaria chemoprevention program, which we believe could use additional funding for highly cost-effective work, even after receiving the $10.1 million in funding mentioned above.

What Malaria Consortium will do with additional funding

We wrote in detail about Malaria Consortium’s room for additional funding for its SMC program as of November 2018 here. We also spoke with Malaria Consortium for an update in early 2019. Our understanding of what Malaria Consortium will do with additional funding for its SMC program (including this $10.1 million), in order of priority, is as follows:

1. Contribute to filling a potential funding gap in Burkina Faso, the existence of which depends on the actions of other funders. If the gap materializes, filling it could require up to $3 million in addition to the $5 million that Malaria Consortium expects to have remaining on hand after what’s currently budgeted for 2019 and 2020.
2. Scale up further in Nigeria and Chad in 2020. Our impression is that, given drug production constraints and the length of time needed to plan for the implementation of a campaign, receiving additional funding now rather than in late 2019 (when we plan to make our next recommendation to Good Ventures to fund top charities) increases the likelihood that Malaria Consortium can use the funding for 2020 programs.
3. Fund the continuation of programs into 2021. Malaria Consortium has received enough funding to maintain its programs through 2020, but has not allocated funding to maintain programs beyond 2020. To maintain the 2019 program scale in 2021, Malaria Consortium would require an additional $14.8 million in funding, assuming no unbudgeted costs (e.g., additional scale-up) are incurred before then. Our impression is that there is little difference between receiving funding now and in late 2019 in terms of Malaria Consortium’s ability to use it to fund 2021 programs.

Overview of our decision-making process

In early 2019, we checked in with each of our top charities that seemed like plausible recipients of this funding, based on our assessment of their funding needs in late 2018. In general, these check-ins indicated that there weren’t updates in the marginal funding opportunities at our top charities. More details follow in the rest of this post.

We refer below to “funding gaps,” which we use to describe the amount of additional funding that we believe could be used effectively (the gap between what charities could use and what they have on hand). After considering each funding opportunity, we came to believe that the two most promising funding gaps are Malaria Consortium’s for SMC and the Against Malaria Foundation’s. The Against Malaria Foundation (AMF), which distributes insecticide-treated nets to prevent malaria, currently has the opportunity to fund nets in the Democratic Republic of Congo (DRC); we expect a high level of cost-effectiveness for this opportunity due to high malaria rates in DRC. We discuss the comparison between these two funding opportunities in the next section.

We followed the six principles described in this post in deciding between these two opportunities and ultimately decided to grant these funds to Malaria Consortium’s SMC program.
Comparing Malaria Consortium and AMF

What AMF would do with additional funding

In February 2019, AMF told us it had $62.8 million in uncommitted funds, which it plans to commit to a few 2020 net distributions (these are not yet formal commitments—as of February, AMF had not yet signed agreements with government partners to fund these distributions). AMF told us that if it had additional funding at this time, it would allocate those funds toward closing the gap in funding for nets in DRC for 2020. AMF has also shared more detailed information with us about its plans for the funds it holds and its negotiations with country governments; that information is confidential at this time. AMF reports that the total need for funding in DRC for a universal coverage campaign across eight provinces is between $35 million and $45 million.

Comparison using our principles

Principle 1: Put significant weight on our cost-effectiveness estimates.

We estimate that Malaria Consortium’s SMC program and AMF are similar in cost-effectiveness but that AMF is somewhat more cost-effective on the margin.

The most recent version of our published cost-effectiveness model at the time we made this decision (2019 version 2) estimates that Malaria Consortium is 8.5 times as cost-effective as unconditional cash transfers (“8.5x cash” for short) and AMF’s work in DRC is 10x cash (calculated by making a copy of the spreadsheet and selecting DRC in the “Country selection” tab for AMF).

Our best guess of the cost-effectiveness of these two opportunities incorporates several additional adjustments. See footnote 1 for details.

With these updates, our best guess of the cost-effectiveness of these two opportunities is that additional funding to Malaria Consortium is 8.3x cash and to AMF is 10.0x cash, implying that AMF is 21% more cost-effective.
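The percentage comparison above is simple arithmetic on the two “multiples of cash” figures. A minimal sketch using the rounded figures quoted in this post (with these rounded inputs the ratio comes out near 20%; the post’s 21% presumably reflects unrounded estimates):

```python
# Cost-effectiveness expressed as multiples of unconditional cash transfers
# ("x cash"), per the post's adjusted best guesses for marginal funding.
mc_smc = 8.3    # Malaria Consortium's SMC program
amf_drc = 10.0  # AMF's net distributions in DRC

# Relative advantage of AMF over Malaria Consortium on the margin.
advantage = amf_drc / mc_smc - 1
print(f"AMF is ~{advantage:.0%} more cost-effective at the margin")
```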

This estimate has not yet been vetted, so it is more likely to contain errors than our published cost-effectiveness model. To enable us to pursue other research work throughout the year, we thoroughly revisit our comparisons between top charities once per year for our annual recommendations refresh in November. When making recommendations at other times of year, we ask ourselves, “Have there been any major changes that should lead us to reconsider what we concluded last November?” In this case, we adjusted some of the inputs into our cost-effectiveness model to reflect what we have learned since November and found that the results were broadly similar to our published model. At this level of difference in estimated cost-effectiveness, which is small in relation to the uncertainty in the model, we are inclined to put substantial weight on the other principles discussed below, and particularly on Principle 2.

We are also somewhat concerned that funding AMF may create an incentive for AMF to prioritize less cost-effective spending opportunities over more cost-effective ones, thus reducing AMF’s overall cost-effectiveness in the long run. We estimate that the three other countries AMF is in negotiations with are less cost-effective places to work than DRC. If we were to provide funding to AMF for work in DRC, we could be signaling that a “gaming” strategy—in which an organization tells us that marginal funds would go to a more cost-effective opportunity because its funds on hand have been allocated to less cost-effective opportunities—results in additional funding beyond what the organization would receive if it allocated funding to more cost-effective opportunities first. We don’t want to create an incentive for organizations to prioritize funding less cost-effective opportunities ahead of more cost-effective ones. We haven’t estimated the potential impact of this factor quantitatively.

Principle 2: Consider additional information about an organization that we have not explicitly modeled.

While we incorporate many subjective factors into our cost-effectiveness models, there are additional costs and benefits that we believe may affect the true cost-effectiveness and that we do not believe are adequately captured by our models. Such uncaptured factors might include, for example: information that charities have and we lack about how best to allocate funding among different locations; beneficiary experiences with the program that affect how much they benefit from it; and the degree to which charities have indirect impact through conducting research, acting as leaders in their fields, or bringing in new sources of funding.

As we generally do not have the opportunity to observe or measure these costs and benefits directly, we consider them qualitatively through proxies. Such proxies include: our perception of how thoughtfully charities answer our questions; whether they are transparent about mistakes they make; how successful they have been in meeting operational goals (such as hiring, geographic expansion, and instituting new technical systems); whether they conduct and publish research; the frequency of errors in the information they share with us; and whether they meet agreed-upon timelines for sharing information.

We plan to write more about factors that we consider outside of our CEA model in the next few months, as well as assessments of each of our top charities on the proxies we use.

Overall, we assess Malaria Consortium as consistently stronger on the above qualitative proxies than AMF. Both organizations stand out from the vast majority of organizations we have considered for their transparency about both positive and negative results and their track record of collecting information about how their programs are performing. They have both spent a large number of hours over several years (for Malaria Consortium) or over a decade (for AMF) responding to our questions and document requests. This comparison is a relative one, and one that we have not fully justified publicly (but plan to shortly). Based on our experiences working with both organizations, we believe that Malaria Consortium has shown signs of having stronger organizational management.

Principle 3: Assess charities’ funding gaps at the margin, i.e., where they would spend additional funding, where possible.

We’ve accounted for what Malaria Consortium and AMF are likely to do with marginal funding in our cost-effectiveness estimates, above.

Principle 4: Default towards not imposing restrictions on charity spending.

On this principle, there’s no difference between the two opportunities. Funding provided by GiveWell to either program would not be restricted.

Principle 5: Fund on a three-year horizon, unless we are particularly uncertain whether we will want to continue recommending a program in the future.

On this principle, there’s no difference between the two opportunities.

Principle 6: Ensure charities are incentivized to engage with our process.

This principle favors Malaria Consortium, which has consistently provided requested information that aids us in understanding and evaluating its program. AMF has more often been delayed or inconsistent in providing the information we’ve requested.

Other options we decided against (our other six top charities)

Schistosomiasis Control Initiative

The Schistosomiasis Control Initiative (SCI)’s room for additional funding is highly dependent on how much funding it receives from the UK’s Department for International Development (DFID) over the next three years. As of the time we were making this decision, we had not yet received an update on the level of funding that DFID plans to provide. More information is available in our review.

Helen Keller International’s vitamin A supplementation program

Helen Keller International (HKI) told us that it plans to use the funding it has already received for vitamin A supplementation as we expected: to continue its work in Mali, Burkina Faso, Guinea, and Côte d’Ivoire and to restart work in Niger. With additional funding it would prioritize work in:

• Kenya, where it could spend about $2 million over three years.
• Cameroon, where it could spend about $4.2 million over three years.
• Nigeria, where it could spend $0.6 million to conduct a study of the impact of technical assistance work.
• DRC, where it could spend about $9 million to reopen a country office and fund vitamin A supplementation over three years.

In November 2018, we estimated that these opportunities were less cost-effective than Malaria Consortium’s SMC program (see footnote 2). We did not revisit those calculations as part of the quarterly allocation process.

Evidence Action’s Deworm the World Initiative

Deworm the World has told us that it plans to follow the prioritization laid out in our recommendation to Good Ventures. That prioritization leaves the following opportunities unfunded:

• Extending its funding runway beyond 2020 to 2021.
• Holding sufficient funding for 2020 programming in India that is currently supported by other funders.
• Improving financial stability via increased reserves.
• Expanding to new locations (two states in India and one state in Nigeria).

At the end of 2018, we estimated that these opportunities were 15.0x cash on average; however, that average was largely driven by the opportunity to expand to two new states in India, which is relatively low priority for Deworm the World because it is prioritizing financial stability over further expansion. With that in mind, we prefer to allocate funding to Malaria Consortium.

Sightsavers’ deworming program

Sightsavers indicated to us that it plans to follow the funding priorities it presented in 2018, with the exception of one area where there is no longer room for more funding. As a result of that change, Sightsavers has sufficient funding for all remaining opportunities to fund deworming that it currently has capacity to implement.

END Fund’s deworming program

We didn’t ask the END Fund for an update on its funding needs in early 2019, as we didn’t expect that an update would lead us to allocate discretionary funding to its deworming program. More context for this decision is available here.

GiveDirectly

We didn’t ask GiveDirectly for an update on its funding needs in early 2019, as we didn’t expect that an update would lead us to allocate discretionary funding to its work. More context for this decision is available here.

Notes

1. ↑ We adjust for our guess about how factors that are not formally modeled would change the results. For details, see column AB of this spreadsheet, sheet “Consolidated funding gaps.” This adjustment replicates what we did to arrive at our recommendations at the end of 2018. (More in this blog post.)

For both AMF and Malaria Consortium, we update the country-specific malaria mortality data to be more recent (2017 instead of 2016 figures). For Malaria Consortium, we correct what we believe to be an error in our model (which makes a roughly 5% difference in the final cost-effectiveness estimate), and we have also used an updated method (compared to what we used previously) to account for the fact that the age range of children targeted for SMC differs slightly from the age ranges given in the available age-specific mortality data (3 to 12 months vs. 1 to 12 months). We plan to incorporate these changes into the published model in the future.

– We use DRC-specific cost data and adjustment for insecticide resistance. Our published cost-effectiveness model uses average data for these two parameters when a specific country is selected in the “Country selection” tab.

– We adjust the lifespan of a net downward by 10% for DRC. This is a rough guess based on findings from AMF’s past monitoring in DRC that suggested that nets wore out more quickly than in other locations where AMF has funded nets.

– We use a smaller fungibility adjustment than we do for other countries to capture the lower probability (compared to other countries where AMF operates) that DRC would reallocate funding that it receives from the Global Fund to Fight AIDS, Tuberculosis and Malaria to cover part of the funding gap for nets if AMF did not fund the distribution. Our understanding from conversations with AMF and the Global Fund is that DRC is relatively underfunded by the Global Fund, due to caps on how much it can spend in a single country and DRC’s large malaria burden, and so our guess is that there is less scope for reallocating funds from other malaria interventions to nets.

– We model most marginal funding as going to DRC, with some funding going to other countries. We do so firstly because we believe having additional funding on hand may lead AMF to commit more funding to other countries than it otherwise might, and secondly because of the possibility of AMF deciding not to commit additional funding or to cap the amount it provides to DRC if it has concerns about the quality of the 2019 distributions it is funding in DRC.

– We adjust AMF’s cost-effectiveness downward by 5% to account for the fact we recently learned that AMF has skipped some post-distribution surveys, leading us to update our estimate of potential misappropriation given missing monitoring results (see this spreadsheet).
2. ↑ For HKI’s programs, see this spreadsheet, sheet “Consolidated funding gaps,” column AB. For Malaria Consortium’s overall SMC program, see the same spreadsheet, sheet “Cost-effectiveness results,” row 6.

The post Allocation of discretionary funds from Q4 2018 appeared first on The GiveWell Blog.

### March 2019 open thread

Mon, 03/11/2019 - 13:53

Our goal with hosting quarterly open threads is to give blog readers an opportunity to publicly raise comments or questions about GiveWell or related topics (in the comments section below). As always, you’re also welcome to email us at info@givewell.org or to request a call with GiveWell staff if you have feedback or questions you’d prefer to discuss privately. We’ll try to respond promptly to questions or comments.

You can view our December 2018 open thread here.

The post March 2019 open thread appeared first on The GiveWell Blog.

### What is it like to work at GiveWell?

Thu, 03/07/2019 - 13:54

We (GiveWell) recently announced that we’re planning to expand the scope of our research and to roughly double the size of our full-time research staff (from approximately 10 to 20) over the next three years. I (James) am writing this post because I think GiveWell is an awesome place to work and I think now is a particularly good time to join.

I’ll start by telling the story of how I started working with GiveWell’s research team. Then I’ll explain why I think it’s a great place to work and how you can decide if you’d like to work here. Finally, I’ll add some notes on what the application process looks like, and how much time it’s likely to take if you reach the later stages.

If there’s anything you want to learn about that I’ve missed, please let me know in the comments and I’ll do my best to get back to you.

I should acknowledge that I was asked to write this post because I like my job a lot. I hope you’re willing to put this publication bias to one side for a few minutes.

My career before GiveWell

I started my career in consulting. It was OK, but I couldn’t shake the feelings that (a) I wasn’t doing anything useful, and (b) the research we did wasn’t always motivated by needing to get to the right answer. So after a few years I took an early career break, and went to do a master’s degree (in philosophy and economics). This was when I got really interested in figuring out where I should give money in order to most effectively help people.

I thought about applying to GiveWell during my master’s degree, but decided not to because my partner and I both lived and worked in London, and GiveWell is based in San Francisco. With hindsight, this was probably a mistake. I’ve done work remotely for GiveWell for the last two years, and—even though remote work does come with its challenges—it’s turned out just fine. Two years later, GiveWell applied for a visa for me, and I will join the staff this spring.

But back then, instead of applying to GiveWell, I joined the research team at the Centre for Effective Altruism (CEA). There, I realized that working out which charities help people the most was a question of incredible importance, depth, and difficulty. I decided that I’d like to spend a good chunk of my life trying to answer it better.

As part of CEA’s research into cost-effective giving opportunities, I’d started looking into preventing pesticide suicide as a potential high-impact area for philanthropy. However, before I’d completed my investigation, CEA decided to discontinue its philanthropic research activities. Fortunately, my manager sent my preliminary work to GiveWell, who interviewed me, asked me to do a work trial (20 hours, paid), and then offered me a position as a research consultant. Five months later, GiveWell made a grant of $1.3 million to the Centre for Pesticide Suicide Prevention as a direct result of my research. That felt great.

Why do I think GiveWell’s a great place to work?

When I was considering whether to join GiveWell, my main questions were:

1. How much does this job help people? (more)
2. Is the work intellectually stimulating? (more)
3. Is the work something I’m likely to be good at? (more)
4. Will I be working with people who are excellent at what they do, share my values, and are nice to be around? (more)
5. Will I be able to work remotely? (more)

I’ll go through each of these questions in turn.

You can help people a lot by working at GiveWell.

When you’re working as a philanthropic funder, your impact is a function of (i) how much funding you influence, and (ii) how much you can improve the allocation of that funding.

GiveWell influences a lot of funding. In 2017, we influenced between $133 million and $150 million. ($133 million includes (i) donations to our top charities through GiveWell, (ii) donations directly to our top charities where donors explicitly indicated their donations were a result of GiveWell’s recommendation, and (iii) Incubation Grants funded by Good Ventures. $150 million includes our best guess of donations that were a result of our recommendations but for which donors did not explicitly indicate as much.)
We have 25 staff across the research, operations, and outreach teams, meaning that, on average, each staff member influences roughly $5-6 million each year. That’s more than individual staff influence at the Bill and Melinda Gates Foundation, the largest private foundation in the world. (The Gates Foundation made $4.7 billion in grants in 2017 with 1,541 employees, or roughly $3 million per employee.) We also have a lot of control over how those funds are granted, subject to being able to clearly explain the rationale for those grants to our colleagues and donors who rely on our research.
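The per-staff figures follow directly from the totals given above. A quick check of the arithmetic, using the post’s stated figures:

```python
# Money GiveWell influenced in 2017, per the post (in $ millions).
moved_low, moved_high = 133.0, 150.0
givewell_staff = 25

# Gates Foundation comparison, per the post's footnote (in $ millions).
gates_grants = 4700.0  # $4.7 billion in grants in 2017
gates_staff = 1541

per_staff_low = moved_low / givewell_staff    # ~$5.3M per staff member
per_staff_high = moved_high / givewell_staff  # $6.0M per staff member
gates_per_staff = gates_grants / gates_staff  # ~$3.0M per employee

print(f"GiveWell: ${per_staff_low:.1f}M-${per_staff_high:.1f}M per staff member")
print(f"Gates Foundation: ${gates_per_staff:.1f}M per employee")
```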

Taking the conservative estimate of the portion of that funding that went to our top charities (as opposed to Incubation Grants), we estimate that, in expectation, this $117 million prevented 19,000 deaths, administered 50 million deworming treatments, and gave cash to 8,300 poor households.

So how much have I personally influenced that funding? I’ve been the lead investigator on three grants: a $1.3 million grant to the Centre for Pesticide Suicide Prevention, a $1 million grant to J-PAL’s Innovation in Government Initiative, and a $300,000 grant to Fortify Health. The first two of these grants likely would not have happened without my work.

I led the discussion of how to allocate $64 million of funding from Good Ventures in 2018 by developing principles for making this decision. I’ve contributed to methodological improvements in our cost-effectiveness analysis, completed internal evidence reviews of tens of different programs, reviewed new research relevant to our top charities, and managed other researchers.

Today, I’m leading our research into new types of interventions that fall outside of our traditional top charity criteria, and am exploring opportunities to help aid agencies spend their money more cost-effectively. I think that both of these projects have the potential to massively increase GiveWell’s impact. They’re still at a very early stage, and we want to devote more capacity to them longer term, so I see this as an enormous opportunity for new people at GiveWell to help shape the organization’s future research agenda.

I don’t think this kind of impact is unusual for a GiveWell researcher. If you do well here, you’ll be given the opportunity to take direct ownership over a lot of your work, taking the lead on important decisions (with input from your manager and the rest of the team). Without a detailed cost-effectiveness analysis, I can’t confidently state that GiveWell is the single most impactful place you could possibly work. But if you think improving the lives of people living in extreme poverty is of the utmost importance, I think it’s near the top.

The work is intellectually stimulating.

GiveWell’s work starts with the question: where should our donors give their money to maximize their impact on people living in the poorest parts of the world? We break this question down into its constituent parts, and answer each part to the best of our abilities.
For example, I’m currently looking into whether the effective regulation of lead paint might be a cost-effective way to improve childhood development outcomes.[3] This project involves:

1. Critically reviewing the academic literature on the links between (i) exposure to lead-based paint and high blood lead levels, (ii) high blood lead levels and cognitive function, and (iii) cognitive function and earnings.
2. Estimating the proportion of houses in low-income countries that are currently painted by using random Google Maps street views, and estimating the proportion of paint that contains lead by using paint studies.
3. Interviewing academic experts about the impact of lead on childhood development.
4. Interviewing implementers and doing online research to understand which organizations work on lead paint regulation, how much funding they currently receive, and how much an advocacy campaign costs.
5. Critically reviewing case studies of past campaigns to understand the factors that lead to successful regulation and enforcement.
6. Building a rough cost-effectiveness model using all of the above information.
7. Explaining and justifying my conclusions privately to the GiveWell team, and publicly to donors who rely on our work.

This work isn’t just taking the headline results of some studies and plugging them into a spreadsheet. It requires thinking carefully and critically about how to interpret the entirety of the evidence available to us. What sources of bias or variance exist, and how should they affect our best guess of what is actually true? And, ultimately, what should we do?

**How do I know if this is right for me?**
You won’t know until you try it, and I’d recommend applying to be a GiveWell researcher even if you’re unsure. Our current recruiting process includes a 20-hour paid work trial for candidates in the later stages of the process, which is a great opportunity for both sides to work out if it’s a good fit. But I can offer some pointers about what might indicate you’ll like the work:

1. You should enjoy and be competent at understanding academic evidence. You don’t have to have a PhD (unless you’re applying for the Senior Fellow position), but you should understand, or be able to quickly learn, what to look for in a study in order to interpret its results and assess its merits.
2. You should be excited by broad, thorny questions with no obvious answers. Most of the important questions in the world haven’t been answered decisively by rigorous academic studies.
3. You should enjoy making clear arguments and critically assessing the arguments of others.
4. You should prefer working quickly (relative to academia) to get the best answer you can to guide your decisions, rather than spending lots of time diving deep on a narrow question that isn’t going to change your decision.
5. You should be OK with most of the work being desk-based. I’ve attended workshops and built relationships by traveling to meet people when they’ve been important for achieving our objectives, but we’re not interested in publicity or relationships for their own sake. The majority of research work involves reading and writing at your computer.

You don’t have to have all of these interests to succeed at GiveWell. I’m medium on 1 and 5, but close to maximum on 2, 3, and 4. Another way to work out if you might enjoy the work is to read this post by Rachel Glennerster comparing academic and policy jobs. If your reaction is “these both sound great, but policy jobs sound better,” that’s a good sign.
GiveWell researchers are more academically minded and technical than typical grantmakers, but the work here is still closer to a policy job than an academic one. It’s this combination of academic rigor and practical recommendations that makes the job quite unique.

**Will I be working with people who are excellent at what they do, share my values, and are nice to be around?**

The people at GiveWell are among the most competent, kind, and thoughtful people I’ve ever worked with. Some specific things I’ve observed about working at GiveWell:

• Managers put a lot of effort into helping their reports improve. The management philosophy generally focuses on making the most of your strengths, ahead of mitigating your weaknesses. Managers also share feedback frequently to stay in sync with reports on how things are going.
• Managers are very open to receiving feedback. During the first year I worked with GiveWell, I was constantly being asked what I disliked about my work. My disappointingly positive responses soon necessitated a switch to the un-dodgeable, “What’s the worst thing about working with GiveWell?”
• Staff at GiveWell are remarkably conscious of other people’s feelings. I’ve seen plenty of disagreement, but when I’ve disagreed with my colleagues, I’ve generally felt like we’re all on the same side, trying to get to a better answer.
• Staff are very passionate about their work and take their jobs seriously, but GiveWell is flexible with working hours. We’re encouraged to work the hours in which we’re most productive, or to fit our working hours around family commitments. Staff rarely feel pressure to work late into the evening, although they sometimes choose to do so.

**Is it hard to work remotely?**

Because I consult remotely from the UK, I don’t see as much of my colleagues as I’d like. A lot of people have asked me what it’s like working remotely and whether I have any tips.
I do:

• If you’re going to work remotely, I’d recommend spending a few weeks in California as soon as possible (GiveWell is happy to pay for remote staff to visit four times a year). Remote meetings feel a lot better when you’ve met the person on the other side of the screen in person before.
• Make the most of the time you have for communication. The time difference between California and the UK has been a bigger issue for me than not being in the same location, because I only have a few hours of overlap with most of GiveWell’s staff each day. There’s not really an easy solution to this, but it’s manageable if you’re efficient with that time.
• Consider relocating if you can. GiveWell is open to staff working remotely on a long-term basis (just under half of our researchers currently work remotely). This works fine when you’re largely doing independent research, but it’s harder when you’re managing people. GiveWell sponsors international visas, although these can take a long time to obtain.

**How can I find out more and apply?**

• If you haven’t already, read the job description for our open research positions here.
• Listen to this podcast interview that I did with the organization 80,000 Hours for more details about the kinds of questions we grapple with.
• If you think you could contribute at GiveWell, but don’t fit neatly into any of the researcher roles, email jobs@givewell.org with a copy of your resume, a cover letter, and a demonstration of what you could contribute to our work.
• If you have questions about working here which aren’t answered in this post, feel free to ask them in the comments and I’ll do my best to get back to you.

If you’re excited about working at GiveWell, you can apply for the researcher positions here.

**Some notes on the application process**

Hiring is one of the most important decisions GiveWell makes, so we want to do everything we can to ensure we hire the right people.
While work trials take a lot of time, we think they’re the only reliable way for both GiveWell and applicants to figure out if it’ll be a good fit long term. They also give people the opportunity to demonstrate what they can do, even if they don’t come from a stereotypical academic background. As such, the application process has six stages, three of which involve doing work trials, and generally takes between 35 and 55 hours for people who reach the latest stages.

1. Initial application: Upload your resume, answer some brief questions, and take an online test. (~90 minutes)
2. Conversation notes: Listen to a recording of an interview we conducted and take formal conversation notes. (3-8 hours, compensated)
3. Case study interview: Answer a question GiveWell has previously worked on. (~2 hours)
4. Work assignment: Critically review some evidence to reach a considered conclusion in limited time. (~10 hours, compensated)
5. Remote trial: Work closely with a senior member of our research team. (10-20 hours, compensated)
6. Interview day: Spend 1-2 days in the San Francisco office meeting the team and attending interviews. (7-14 hours, travel and accommodation reimbursed)

We recognize this is a fairly heavy time commitment for people who reach the later stages. To some extent, we think this is necessary. But to try to mitigate that cost, we:

• minimize the amount of time spent on the first stage of the process, subject to it still giving us relevant information.
• let people know as soon as we think it’s not going to work out. Only people who get to the next stage need to complete that stage’s task.
• compensate people for time spent on major work trial tasks, and for travel expenses when they visit the office.
• are flexible around people’s schedules for coming to visit the office.

Notes

1. $133 million includes (i) donations to our top charities through GiveWell, (ii) donations directly to our top charities where donors explicitly indicated their donations were a result of GiveWell’s recommendation, and (iii) Incubation Grants funded by Good Ventures. $150 million includes our best guess of donations which were a result of our recommendations but for which donors did not explicitly indicate their donations were a result of GiveWell’s recommendation.
2. The Bill and Melinda Gates Foundation made $4.7 billion in grants in 2017, with 1,541 employees, or roughly $3 million per employee.
3. This project is still in progress and hasn’t yet been published on our website.

The post What is it like to work at GiveWell? appeared first on The GiveWell Blog.

### Announcing a call for grant applicants in Southeast Asia and Bangladesh

Wed, 02/13/2019 - 12:42

Today, we announced a grantmaking process to look for outstanding organizations operating in Southeast Asia and Bangladesh.
We’re working with Affinity Impact, a social impact initiative founded by the children of a Taiwanese entrepreneur, to provide three grants—one $250,000 grant and two $25,000 grants—to organizations that are operating programs in global health and development in any of the following countries: Bangladesh, Cambodia, East Timor, Indonesia, Laos, Myanmar, the Philippines, and Vietnam.

One of the goals of this grantmaking process is to help us better understand the giving opportunities in a geography on which we haven’t previously focused, and to learn from the grantmaking process whether doing so is an effective way to engage with philanthropists who don’t plan to support our current top charities.

An overview of this process is available here. Details of the application process are here. Applications are due on April 1, 2019. If you represent an organization applying or considering applying for the grant and have any additional questions, please contact us directly via email at applications@givewell.org and mention that you’re applying for the “2019 GiveWell Grants for Global Health and Development in Southeast Asia and Bangladesh.” We will try to respond as quickly as possible.

### How GiveWell’s research is evolving

Thu, 02/07/2019 - 12:26

To date, most of GiveWell’s research capacity has focused on finding the most impactful programs among those whose results can be rigorously measured. This work has led us to recommend, and direct several hundred million dollars to, charities improving health, saving lives, and increasing income in low-income countries. One of the most important reasons we have focused on programs where robust measurement is possible is that this approach largely does not rely on subject-matter expertise.
When Holden and I started GiveWell, neither of us had any experience in philanthropy, so we looked for charities that we could evaluate through data and evidence that we could analyze, to make recommendations that we could fully explain. This led us to focus on organizations whose impacts were relatively easy to measure. The output of this process is reflected in our current top charities and the programs they run, which are analyzed in our intervention reports.

GiveWell has now been doing research to find the best giving opportunities in global health and development for 11 years, and we plan to increase the scope of giving opportunities we consider. We plan to expand our research team and scope in order to determine whether there are giving opportunities in global health and development that are more cost-effective than those we have identified to date.

We expect this expansion of our work to take us in a number of new directions, some of which we have begun to explore over the past few years. We have considered, in a few cases, the impact our top and standout charities have through providing technical assistance (for example, Deworm the World and Project Healthy Children), supported work to change government policies through our Incubation Grants program (for example, grants to the Centre for Pesticide Suicide Prevention and the Innovation in Government Initiative), and begun to explore areas like tobacco policy and lead paint elimination.

Over the next several years, we plan to consider everything that we believe could be among the most cost-effective (broadly defined) giving opportunities in global health and development. This includes more comprehensively reviewing direct interventions in sectors where impacts are more difficult to measure, investigating opportunities to influence government policy, and exploring other areas. Making progress in areas where it is harder to determine causality will be challenging.
In my opinion, we are excellent evaluators of empirical research, but we have yet to demonstrate the ability to make good judgments about giving opportunities when less empirical information is available. Our values, intellectual framework, culture, and the quality of our staff make me optimistic about our chances, but all of us at GiveWell recognize the difficulty of the project we are embarking on.

Our staff does not currently have the capacity or the capabilities to make enough progress in this direction, so we are planning to significantly increase the size of our staff. We have a research team of ten people, and we are planning to more than double in size over the next three years. We are planning to add some junior staff but are primarily aiming to hire people with relevant experience who can contribute as researchers and/or managers on our team.

GiveWell’s top charities list is not going to change dramatically in the near future, and it may always include the charities we recommend today. Our top charities achieve outstanding, cost-effective results, and we believe they are some of the best giving opportunities in global health and development. We expect to conclude that many of the opportunities we consider in areas that are new for us are less cost-effective than those we currently recommend, but we also think it is possible that we will identify some opportunities that are much more cost-effective. We believe it is worth a major effort to find out.

**What areas will we look into?**

As with any exploration into a new area, we expect the specifics of the work we will undertake to shift as we learn more. Below we discuss two major areas of work that we are currently embarking on and building our team to pursue. In the long term, we are open to considering making grants or recommendations in all areas of global health and development.
We have not yet comprehensively considered what those areas might be, but they could include (for example) research and development, or social entrepreneurship.

**Using reasoned judgment and less robust evidence to come to conclusions about additional direct-delivery interventions**

In the past, we have often asked, “does this intervention meet our criteria?” rather than “what is our best guess about how promising this intervention is relative to our top charities?” Our intervention report on education is a good example of asking the question, “does this meet our criteria?” It reviews all randomized controlled trials of education programs that measure long-term outcomes, but it does not attempt to reach a bottom line about how cost-effective education in developing countries is. We plan to more deeply explore how we can reach conclusions about how areas such as nutrition, agriculture, education, reproductive health, surgical interventions, mental health, and non-communicable diseases compare to our current top charities.

**Investigating opportunities to improve government spending and influence government policy**

Some of the areas we will consider exploring to leverage government resources and affect government policy are:

| Broad thematic area | Examples | Brief rationale |
| --- | --- | --- |
| Public health regulation | Tobacco control; lead paint regulation; road traffic safety; air pollution regulation; micronutrient fortification and biofortification; sugar control; salt control; trans-fats control; legislation to reduce counterfeit drugs; soil pollution; pesticide regulation; occupational safety laws | Some regulatory interventions to improve public health have had a large impact in high-income countries. Low-income countries can lack the government capacity or political will to implement these regulations. Charities can advocate or provide technical assistance to accelerate regulation and improve implementation. |
| Improving government program selection | Innovation in Government Initiative; Innovations for Poverty Action; IDinsight; Center for Effective Global Action | Low-income country governments may not have the capabilities to select good programs to support with their limited budgets. Charities can directly assist governments to make better decisions in the short term, or help improve their capabilities to do so independently over the longer term. |
| Improving government implementation | Results for Development; Deworm the World in India | Low-income countries may not have the capabilities to implement programs effectively. Charities can directly assist governments to improve the reach or quality of programs in the short term, or help improve their capabilities to do so independently over the longer term. |
| Improving non-programmatic government capabilities | Building State Capability | Improving the administrative capabilities of a government can result in broad improvements in the way countries function. |
| Improved or increased aid spending | Center for Global Development; ONE Campaign; Overseas Development Institute; Brookings Institution | Spending by high-income countries on global health and development accounts for a large portion of total spending in this area. There are groups who advocate for, and provide technical assistance to improve, aid spending. |
| Advocating for increased spending on highly cost-effective, direct-delivery programs | Malaria No More; Uniting to Combat Neglected Tropical Diseases | GiveWell’s money moved is a small proportion of total global spending on aid. We believe these dollars would go further if a portion were redirected to the highly cost-effective, direct-delivery programs we recommend. |
| Increasing economic growth and redistribution | Charter cities; infrastructure programs; trade liberalization; macroeconomic policy; International Growth Centre; tax reform | Economic growth is an important driver of economic well-being over the long term. Government policies can be an important determinant of the rate of economic growth and the degree to which growth translates into well-being for the population. There may be opportunities for charities to assist in promoting growth and better distributional outcomes. |
| Negative externalities of high-income country policies | Immigration reform; trade liberalization; reducing carbon emissions | Governments of high-income countries are incentivized to select policies which are popular with their own voters. These policies can impose substantial costs on low-income countries. Charities can advocate for these policies to be changed. |
| Improving governance | Election monitoring; anti-corruption; good governance awards; term limits; peace programs | There are particular characteristics of the governance of a country (e.g., democratic accountability, stability, human rights, lack of corruption) which are strongly associated with the well-being of its people. Charities can advocate for these characteristics to be adopted or strengthened. |
| Reducing the cost of health commodities | Clinton Health Access Initiative | Reductions in the cost of medical commodities can result in improved coverage and improved economic well-being for low-income households. |
| Improving data collection | Institute for Health Metrics and Evaluation | Improved data can be used by a variety of actors to make better decisions. |
| One-off big bets | Mosquito gene drives advocacy and research | We may come across promising projects that do not fit neatly into one of the above categories. |

**How will our analysis change? How will it be the same?**

Writing up and publishing the details of the reasoning behind the recommendations we make is a core part of GiveWell. We will remain fully transparent about our research. Judgment calls that are not easily grounded in empirical data have long been a part of GiveWell’s research.
For example, we make difficult, decision-relevant judgment calls about moral weights, about interpreting conflicting evidence on deworming, and about estimating the crowding-out and crowding-in effects of our donations on other actors (what we call leverage and funging). As we move into areas where measuring outcomes and attributing causal impact is more difficult, we expect subjective judgments to play a larger role in our decision making. For examples of the approach we have taken to date, see our writeup of our recent recommendation for a grant to the Innovation in Government Initiative, a grantmaking entity within the Abdul Latif Jameel Poverty Action Lab (J-PAL), or our page evaluating phase I of our 2016 grant to Results for Development (R4D). While writing about such judgments will be a challenge of this work, we are fully committed to sharing what has led us to our decisions, with only limited exceptions due to confidential or sensitive information.

**What does this mean for staffing and organizational growth?**

We need to grow our team to achieve our goals. Repeatedly this past year, we had to make the difficult choice not to take on a research project or investigate a grant opportunity that seemed promising because we did not have the capacity. We are planning to roughly double our research team over the next few years, primarily by adding researchers who have experience and/or an academic background in global health and development. We are looking to add both individual contributors and research managers to the team. We expect that the people we hire in the next few years will play a critical role in shaping GiveWell’s future research agenda and will be some of the leaders of GiveWell in the future. For more information about the research roles we’re hiring for, see our jobs page.
### Schedule a quick call to make giving easier

Thu, 12/20/2018 - 12:25

If you’re thinking about where to give to charity this year and it would be helpful to speak with a member of GiveWell’s staff about your decision, please let us know. We’re happy to answer questions sent to info@givewell.org or to schedule a call via the form here. On a call, we’d be glad to:

• Provide an overview of our recommendations. We know it can be time-consuming to read and digest all of the content on our website. We’re glad to share a quick summary of our top charities list.
• Assist with the logistics of making a donation and discuss different options for donating, such as appreciated securities, checks, and wire transfers.
• Answer any questions about our research or recommendations.

Due to limited staff capacity, it’s possible we won’t be able to speak with everyone who requests a call, although based on past experience we hope to be able to connect with anyone who gets in touch. We look forward to hearing from you!

### December 2018 open thread

Wed, 12/12/2018 - 12:33

Our goal with hosting quarterly open threads is to give blog readers an opportunity to publicly raise comments or questions about GiveWell or related topics (in the comments section below). As always, you’re also welcome to email us at info@givewell.org or to request a call with GiveWell staff if you have feedback or questions you’d prefer to discuss privately. We’ll try to respond promptly to questions or comments. You can view our September 2018 open thread here.

### Staff members’ personal donations for giving season 2018

Mon, 12/10/2018 - 13:30

For this post, GiveWell staff members wrote up the thinking behind their personal donations for the year. We made similar posts in previous years (see our staff giving posts from 2017, 2016, 2015, 2014, and 2013).
Staff are listed in order of their start dates at GiveWell.

**Elie Hassenfeld**

This year, I’m planning to donate to GiveWell for granting to top charities at its discretion. I feel the same way I did last year, when I wrote, “GiveWell is currently producing the highest-quality research it ever has, which has led to more thoroughly researched, higher-quality recommendations that have been compared to more potential alternatives than ever before.”

I asked Holden Karnofsky, GiveWell’s co-founder, whether he thought there were promising opportunities for individuals with long-termist views; after checking with him, I believed that the Open Philanthropy Project and other donors were covering most of the opportunities I would find most promising. I also considered giving to animal welfare organizations. I looked briefly at Animal Charity Evaluators’ research but ultimately didn’t feel like I had enough time to think through how their recommendations compared to giving to GiveWell, so I defaulted to GiveWell. I hope to give this more consideration in the future.

**Natalie Crispin**

I will be giving my annual gift to GiveWell for granting at its discretion to top charities. We expect that all of our top charities will be constrained by funding in the next year and that several will have unfunded opportunities to spend funds in highly cost-effective ways (at least 5 times as cost-effective as cash transfers). Our current best guess is that GiveWell will grant the funds it receives for granting at its discretion to Malaria Consortium, which would allow it to expand its work preventing child deaths from malaria in Nigeria or other countries.
There is also a possibility that we will identify an opportunity that is more cost-effective than how Malaria Consortium would use funding at the current margin. Over the next few months, we will be discussing with our top charities how they plan to use funding from Good Ventures and other funders and what that means for how they would use additional funding. Giving to GiveWell for granting at its discretion allows for flexibility to take advantage of those opportunities. I am very grateful for all the work, thoughtfulness, and hours of debate that my colleagues put into GiveWell’s recommendations this year. I am excited to support the most effective charities I know of.

**Josh Rosenberg**

I’m planning to give the same way that I did last year:

• 80% to GiveWell for granting at its discretion to top charities. GiveWell’s top charities are the most cost-effective ways to help people that I know of. I see Malaria Consortium’s work on seasonal malaria chemoprevention (the current default option for discretionary funding) as a robust and highly effective giving opportunity.
• 10% to animal welfare charities. I believe that animal welfare is a particularly important and neglected problem.
• 10% to long-term future-oriented causes. I have not yet chosen a donation target in this cause area. If I do not find an opportunity I am satisfied with after a small amount of additional research, I will enter this portion of my giving into a donor lottery.

I focused most of my giving on global health and development since GiveWell’s top charities have the most pressing funding gaps I am aware of. If I knew of an especially strong case for a particular giving opportunity in another cause area, I would be open to changing my allocation in the future.

**Devin Jacob**

I plan on making approximately 80% of my charitable donations in 2018 to GiveWell, with 100% of that money allocated to GiveDirectly.
Compared to my colleagues at GiveWell, I value near-term improvements in material well-being more than I value reducing deaths. Donating to GiveDirectly is the best means of supporting this goal that I know of. I struggle each year when attempting to assess whether I should bet on the possible long-term income effects of deworming. To date, I have been unable to convince myself I should make this bet, even though I find little to argue with in our work on the expected value of donations to charities implementing deworming programs. I am choosing to set aside the difference in expected value between a donation to a deworming charity and a donation to GiveDirectly because of the greater certainty of impact via the latter. I think my approach to charitable giving is conservative relative to other staff at GiveWell and many of our donors, but I also think that my approach is reasonable given my specific ethical commitments.

I also support other organizations with gifts each year. This year, approximately 10-15% of my giving will go to organizations that do not meet GiveWell’s criteria. These organizations work in a number of areas, including:

• Immigration policy, activism, and legal aid: the International Refugee Assistance Project, RAICES, and the National Immigration Law Center
• Nonprofit news: primarily CALmatters, the Center for Investigative Reporting, and ProPublica
• Local issues I care about, such as transit infrastructure (e.g., Bike East Bay)
• Other political causes

I choose to keep the political contributions I make private, as some of the causes I support are controversial and I would not want my political beliefs to have any potential impact on GiveWell’s work. In the course of my day-to-day work duties at GiveWell, I also frequently make small donations to our charities when testing various payment platforms. To date, these donations account for approximately 5-10% of my remaining planned gifts in 2018.
These gifts are distributed among our recommended and standout charities haphazardly. I could refund these transactions, but I choose not to, as I think all of our recommended charities do excellent work and I am happy to support them.

Catherine Hollander

I plan to give 75% of my total charitable giving to Malaria Consortium’s seasonal malaria chemoprevention program. I value averting deaths quite highly, and I believe, based on GiveWell’s assessment, that contributing toward filling Malaria Consortium’s funding gap will accomplish a lot of good in the world. In previous years (2017, 2016, and 2015), the majority of my gift has been directed to the Against Malaria Foundation (AMF), but I believe Malaria Consortium currently has a more pressing funding gap for its seasonal malaria chemoprevention work.

I plan to give 10% of my total giving to AMF to continue supporting its work. I understand that giving predictably is helpful for organizations’ planning, and I don’t wish to abruptly alter my support for AMF. I also think that AMF continues to represent an outstanding giving opportunity as one of GiveWell’s top charities.

I plan to give 5% of my total giving to StrongMinds, an organization focused on treating depression in Africa. I have not vetted this organization anywhere near as closely as GiveWell’s top charities have been vetted, though I understand that a number of people in the effective altruism community have a positive view of StrongMinds within the cause area of mental health (although I have no reason to think it is more cost-effective than GiveWell’s top charities). Intuitively, I believe mental health is an important cause area for donors to consider, and although we do not have GiveWell recommendations in this space, I would like to learn more about this area by making a relatively small donation to an organization that focuses on it.
I plan to give the remaining 10% of my charitable giving this year, in conjunction with my partner, to an organization working on criminal justice reform in the United States. We are going to discuss and review organizations together between now and the end of the year and make a joint gift in this space. I plan to consult previous recommendations made by the Open Philanthropy Project’s program officer focused on criminal justice reform, Chloe Cockburn, as well as check with friends who are better informed about the needs in this space than I am.

Andrew Martin

I think there’s a strong case for donating to GiveWell to grant to top charities at its discretion this year. Our top charities have substantial funding gaps for highly cost-effective programs, even after taking into account the $63.2 million that we’ve recommended Good Ventures allocate between our top charities. These funding gaps include expanding Malaria Consortium’s work on seasonal malaria chemoprevention in Nigeria, Chad, and Burkina Faso; extending HKI’s vitamin A supplementation programs in several countries over the next three years; and extending Deworm the World’s programs in Pakistan and Nigeria.

As Natalie and James have noted, it seems likely that donations given to GiveWell at the end of 2018 to allocate at its discretion will be directed to Malaria Consortium’s seasonal malaria chemoprevention program. I’m planning to donate to GiveWell to allocate at its discretion because I expect that GiveWell will direct those funds either to Malaria Consortium or to another funding gap it judges to be even more valuable to fill.

Christian Smith

I’m planning to make my year-end donation to Malaria Consortium for its seasonal malaria chemoprevention (SMC) program. As my colleagues have mentioned, Malaria Consortium appears to be in a great position to scale up a highly effective intervention in areas with substantial malaria burdens.

I decided not to give to GiveWell for granting at its discretion because I think there’s a chance GiveWell will decide deworming programs look more worthwhile than SMC on the margin. I take a more skeptical stance than most of my colleagues on the value of deworming programs. While I’m not confident, I would guess that our process for modeling the value of deworming relative to malaria prevention puts deworming in too favorable a light.

Isabel Arjmand

My giving this year looks very similar to last year’s. It’s important to me for the bulk of my giving to go to organizations where I’m confident that my donation will have a substantial impact, and I don’t know of any giving opportunities in that vein that are as strong as GiveWell’s top charities. Each year I also give to a handful of other organizations, some in international development and others operating in the United States. I intend each of those donations to be large enough to be meaningful to me and to signal support for these programs, while still leaving the vast majority for GiveWell-recommended charities. In all, 80% of my charitable budget is going to GiveWell’s top charities and 20% to other causes, which is the same as my donation last year.

I’m giving 75% of my total year-end donation to grants to recommended charities at GiveWell’s discretion. I strongly considered designating my donation to Malaria Consortium’s seasonal malaria chemoprevention (SMC) program instead. I’m very excited about Malaria Consortium’s opportunity to provide SMC in Nigeria; I’ve been particularly impressed by Malaria Consortium as an organization over the past year; and I have more confidence in SMC as an intervention than I do in some others. It’s hard for me to imagine preferring that my donation go elsewhere when it’s time for GiveWell to grant out its discretionary funding from the fourth quarter of 2018. But I believe that if GiveWell does decide to give the next round of discretionary funding elsewhere, I’m more likely than not to agree with that decision. I hold this belief in part because my moral weights and overall outputs in our cost-effectiveness analysis are quite similar to those of the median staff member, and while I’m concerned about the evidence for deworming, I think that concern is adequately reflected in my cost-effectiveness analysis inputs.

An additional 5% of my donation will go to GiveDirectly. I look forward to continuing to follow the work they do, particularly their cash benchmarking project, their work with refugees, and their continual research to improve the effectiveness of their programs.

I plan to distribute the remaining 20% of my donation across the following organizations:

• International Refugee Assistance Project, which advocates for refugees and displaced people with a focus on those from the Middle East.
• StrongMinds, which is the most promising organization I know of focused on mental health in low- and middle-income countries.
• Planned Parenthood Action Fund, which takes a comprehensive, intersectional view of women’s health and reproductive justice.
• Cool Earth, which works with local communities to protect rainforests and reduce carbon dioxide emissions.

As I wrote last year, I’d be somewhat surprised if these organizations were competitively cost-effective with GiveWell’s top charities, and I haven’t vetted them with an intensity that comes anywhere close to the rigor of GiveWell evaluations. I choose to support these programs in order to promote more justice-focused causes, further my own civic engagement, and signal support for work I think is important.

I also make small donations throughout the year to grassroots organizations working in the Bay Area like Causa Justa :: Just Cause, Initiate Justice, and the Sogorea Te Land Trust. These donations, which are motivated primarily by community engagement and relationship-building, come out of my personal discretionary spending, rather than what I budget for charitable giving.

As always, I’m grateful for the thoughtfulness of my colleagues, the work that went into producing this year’s recommendations, and the conversations we’ve had that have informed my own giving.

James Snowden

I’m planning to donate to GiveWell for allocating funds at its discretion because (i) I prefer GiveWell to have the flexibility to react to new information, and (ii) in the absence of new information, I expect additional funds will be allocated to Malaria Consortium, the charity I would have given to. I expect Malaria Consortium would use those funds to scale up seasonal malaria chemoprevention in Nigeria, Chad and Burkina Faso. According to the Global Burden of Disease, Nigeria has the most deaths from malaria of any country, and Burkina Faso has the highest rate of deaths from malaria given its population size. This drives my view that donations to Malaria Consortium are likely to be more cost-effective than donations to the Against Malaria Foundation, which sometimes distributes nets in countries with a lower malaria burden.

I may also continue to give a smaller proportion of my donations to organizations working on improving animal welfare and to organizations focused on the long-term future, but I haven’t yet decided whether to do so, or where to give.

Dan Brown

I will give 75% of my 2018 charity donation to GiveWell to allocate to recommended charities at its discretion. This is my first year working for GiveWell and I’ve been very impressed with the quality of work that goes into our recommendations. My moral values seem to be quite close to the median values across staff members in our cost-effectiveness analysis, and so I see no reason to deviate from GiveWell’s choice on that basis. As Natalie and James note, our best guess is that these funds will be allocated to Malaria Consortium to scale up its seasonal malaria chemoprevention programs.

I will give 15% of my donation to No Means No Worldwide, a global rape prevention organisation. I spent a reasonable amount of time during my PhD researching gender-based violence. This encouraged me to donate to an organisation tackling sexual violence, particularly because the frequency of sexual violence globally is staggering. I have not vetted No Means No Worldwide with anything like the rigor of a GiveWell evaluation, but I have been impressed by what I have read so far (e.g., they are evaluating their program using RCTs, and I like that part of their approach is to promote positive masculinity amongst boys).

I will give 6% of my donation to Stonewall (UK), an organisation tackling discrimination against LGBT people. Whilst I have focused most of my donation on global health and development, I would also like to support a more justice-focused cause. I have fairly limited information with which to choose amongst charities in this area as I’m not aware of a GiveWell-type organisation to help direct my donation. However, I would like to see more done to tackle homophobia in sport, and the main organisation I am aware of that has tried to do this is Stonewall (UK) (through its Rainbow Laces campaign).

I will give the remaining 4% of my donation to Afrinspire. I have donated to this charity for a number of years. To my knowledge, the money I donate is used to help pay for school costs for orphaned children in Kampala (through the Jaguza Initiative). I do not expect this to be as cost-effective as other charitable giving opportunities, but I do not think it would be responsible to unexpectedly decrease this donation now that I am paying more attention personally to cost-effectiveness.

Olivia Larsen

This year, I plan to give 95% of my year-end donation to GiveWell for granting at its discretion. This is my first year working at GiveWell full-time, and it will be my first time contributing to GiveWell’s discretionary fund.

In previous years, I have chosen to support specific top charities among GiveWell’s recommendations. Knowing which charity I was supporting in advance of my donation helped me more clearly conceptualize the impact I was making. Since starting at GiveWell, however, I’ve seen the level of detail and thought that the research team puts into analyzing each top charity’s funding gaps and identifying where a marginal dollar will have the largest impact. I’m convinced that the additional good associated with GiveWell being able to adapt to additional information and allocate my donation to the highest-impact charity we see when the grants are disbursed outweighs my desire to know where my donation will go ahead of time.

I also expect to allocate 5% of my year-end donation to helping factory-farmed animals. This will be my first donation to an animal-focused charity, and it is a decision I went back and forth on. I believe that animals suffer, and I believe that I should act to alleviate that suffering; for example, by not eating animal products. Given the scale and intensity of factory farming, and the neglectedness of the cause, I think it’s reasonable that interventions there might be orders of magnitude more cost-effective at averting the suffering of animals than GiveWell’s charities are at averting the suffering of humans. However, I’m very uncertain about how to compare helping animals to helping humans. I’m uncomfortable with the idea of allowing a human to suffer, even if I can alleviate the suffering of many animals with the same donation. I haven’t fully engaged with this discomfort yet, but I’m planning to make a donation targeted at helping animals this year to help me both clarify my own values and learn more about the effective animal advocacy space. I haven’t yet decided how to allocate this donation, but I expect that I’ll either donate to the Animal Welfare Fund through Effective Altruism Funds or outsource the decision to a trusted friend who knows more about effective animal advocacy than I do.

This year, I plan to give 75% of my donations to GiveWell to allocate at its discretion. I believe that this will ensure that my donations go the furthest in global health and development. In previous years, I have given either to one of GiveWell’s top charities or to the Global Health Effective Altruism fund. This year, my greater understanding of the advantages of allowing my donations to be channelled at GiveWell’s discretion, coupled with my U.S. taxpayer status, has caused me to prefer to give to GiveWell for regranting.

I plan to give the remaining 25% of my donations to an organization working on animal welfare but have not yet decided which one. It will likely be one of Animal Charity Evaluators’ top charities, and I expect to rely on the advice of a friend who has thought about effective animal charities far more than I have. I also considered giving some money to organizations focusing on the long-term future, but my view is that these organizations are not funding-constrained.

Notes

1. ↑ See our staff giving posts from 2017, 2016, 2015, 2014, and 2013.

The post Staff members’ personal donations for giving season 2018 appeared first on The GiveWell Blog.

### We’ve added more options for cryptocurrency donors

Fri, 12/07/2018 - 13:03

We’ve updated our donations processing to better meet the needs of those who want to give via cryptocurrencies. Last year, after we began to accept Bitcoin, we received over $290,000 in Bitcoin donations. By accepting more types of cryptocurrency donations, we’re enabling donors to realize tax deductions and to contribute more funding to their chosen charity based on gains in the cryptocurrencies they hold.

We’re now accepting donations in the following cryptocurrencies:

• Bitcoin (BTC)
• Bitcoin Cash (BCH)
• Ethereum (ETH)
• Ethereum Classic (ETC)
• Litecoin (LTC)
• 0x (ZRX)

We’ve built different pages for donating based on where you’d like to direct your support. To donate cryptocurrency, click the option you prefer:

If you have any questions or would like to donate in a currency not listed above, please reach out to us at donations@givewell.org. If you have questions about the different options for directing your donation (top charities, standout charities, or operating expenses), please let us know.

The post We’ve added more options for cryptocurrency donors appeared first on The GiveWell Blog.

### Response to concerns about GiveWell’s spillovers analysis

Thu, 12/06/2018 - 14:02

Last week, we published an updated analysis of the “spillover” effects of GiveDirectly‘s cash transfer program: i.e., effects that cash transfers may have on people who don’t receive cash transfers but who live near those who do.1For more context on this topic, see our May 2018 blog post. We concluded: “[O]ur best guess is that negative or positive spillover effects of cash are minimal on net.” (More)

Economist Berk Özler posted a series of tweets expressing concern over GiveWell’s research process for this report.
We understood his major questions to be:

1. Why did GiveWell publish its analysis on spillover effects before a key study it relied on was public? Is this consistent with GiveWell’s commitment to transparency? Has GiveWell done this in other cases?
2. Why did GiveWell place little weight on some papers in its analysis of spillover effects?
3. Why did GiveWell’s analysis of spillovers focus on effects on consumption? Does this imply that GiveWell does not value effects on other outcomes?

These questions apply to GiveWell’s research process generally, not just our spillovers analysis, so the discussion below addresses topics such as:

• When do our recommendations rely on private information, and why?
• How do we decide which evidence to review in our analyses of charities’ impact?
• How do we decide which outcomes to include in our cost-effectiveness analyses?

Finally, this feedback led us to realize a communication mistake we made: our initial report did not communicate as clearly as it should have that we were specifically estimating spillovers of GiveDirectly’s current program, not commenting on spillovers of cash transfers in general. We will now revise the report to clarify this.

Note: It may be difficult to follow some of the details of this post without having read our report on the spillover effects of GiveDirectly’s cash transfers.

Summary

In brief, our responses to Özler’s questions are:

• Why did GiveWell publish its analysis on spillover effects before a key paper it relied on was public? One of our major goals is to allocate money to charities as effectively as possible. Sometimes, research we learn about cannot yet be made public but we believe it should affect our recommendations. In these cases, we incorporate the private information into our recommendations and we are explicit about how it is affecting our views.
We expect that private results may be more likely to change but nonetheless believe that they contain useful information; we believe ignoring such results because they are private would lead us to reach less accurate conclusions. For another recent example of an important conclusion that relied on private results, see our update on the preliminary (private) results from a study on No Lean Season, which was key to the decision to remove No Lean Season as a top charity in 2018. We discuss other examples below.

• Why did GiveWell place little weight on some papers in its analysis of spillover effects? In general, our analyses aim to estimate the impact of programs as implemented by particular charities. The goal of our spillovers analysis is to make our best guess about the size of spillover effects caused by GiveDirectly’s programs in Kenya, Uganda, and Rwanda. We are not trying to communicate an opinion on the size of spillover effects of cash transfers in other countries or in development economics more broadly. Therefore, our analysis places substantially more weight on studies that are most similar to GiveDirectly’s program on basic characteristics such as geographic location and program type. Correspondingly, we place little weight on papers that do not meet these criteria. However, we’d welcome additional information that would help us improve our future decision-making about which papers to put the most weight on in our analyses.

• Why did GiveWell’s analysis of spillovers focus on effects on consumption? Our cost-effectiveness models focus on key outcomes that we expect to drive the bulk of the welfare effects of a program. In the case of our spillovers analysis, we believe the two most relevant outcomes for estimating spillover effects on welfare are consumption and subjective well-being.
We chose to focus on consumption effects in large part because (a) this is consistent with how we model the impacts of other programs, such as deworming, and (b) distinguishing effects on subjective well-being from effects on consumption in a way that avoids double-counting benefits was too complex to do in the time we had available. It is possible that additional work on subjective well-being measures would meaningfully change how we assess the benefits of programs (for this program and potentially others). This is a question we plan to return to in the future.

As noted above, our current best guess is that negative or positive spillover effects of GiveDirectly’s cash transfers are minimal on net. However, we emphasize that our conclusion at this point is very tentative, and we hope to update our views next year if there is more public discussion or research on the areas of uncertainty highlighted in our analysis and/or if public debate about the studies covered in our report raises major issues we had not previously considered.

Details follow.

Why did GiveWell publish its analysis on spillover effects before a key paper it relied on was public?

In our analysis of the spillover effects of GiveDirectly’s cash transfer program, we place substantial weight on GiveDirectly’s “general equilibrium” (GE) study (as we noted we would do in May 2018,2“We plan to reassess the cash transfer evidence base and provide our updated conclusions in the next several months (by November 2018 at the latest). One reason that we do not plan to provide a comprehensive update sooner is that we expect upcoming midline results from GiveDirectly’s “general equilibrium” study, a large and high-quality study explicitly designed to estimate spillover effects, will play a major role in our conclusions. Results from this study are expected to be released in the next few months.” (More.)
prior to seeing the study’s results) because:

• it is the study with the largest sample size,
• its methodology was designed to estimate both across-village and within-village spillover effects, and
• it is a direct study of a version of GiveDirectly’s program.

The details of this study are currently private, though we were able to share the headline results and methodology when we published our report. This represents one example of a general policy we follow, which is to be willing to compromise to some degree on transparency in order to use the best information available to us to improve the quality of our recommendations. More on the reasoning behind this policy:

• Since our recommendations affect the allocation of over $100 million each year, the value of improving our recommendations by factoring in the best information (even if private) can be high. Every November we publish updates to our recommended charities so that donors giving in December and January (when the bulk of charitable giving occurs) can act on the most up-to-date information.
• We have ongoing communications with charities and researchers to learn about new information that could affect our recommendations. Private information (both positive and negative) has been important to our views on a number of occasions. Beyond the example of our spillovers analysis, early private results were key to our views on topics including:
• No Lean Season in 2018 (negative result)3“In a preliminary analysis shared with GiveWell in September 2018, the researchers did not find evidence for a negative or positive impact on migration, and found no statistically significant impact on income and consumption.” (More.)
• Deworming in 2017 (positive result)4“We have seen preliminary, confidential results from a 15-year follow-up to Miguel and Kremer 2004. We are not yet able to discuss the results in detail, but they are broadly consistent with the findings from the 10-year follow-up analyzed in Baird et al. 2016.” (More.)
• Insecticide resistance in 2016 (modeling study)5“We have seen two modeling studies which model clinical malaria outcomes in areas with ITN coverage for different levels of resistance based on experimental hut trial data. Of these two studies, the most recent study we have seen is unpublished (it was shared with us privately), but we prefer it because the insecticide resistance data it draws from is more recent and more comprehensive.” (More.)
• Development Media International in 2015 (negative result)6“The preliminary endline results did not find any effect of DMI’s program on child mortality (it was powered to detect a reduction of 15% or more), and it found substantially less effect on behavior change than was found at midline. We cannot publicly discuss the details of the endline results we have seen, because they are not yet finalised and because the finalised results will be embargoed prior to publication, but we have informally incorporated the results into our view of DMI’s program effectiveness.” (More.)
• Living Goods in 2014 (positive result)7“The researchers have published an abstract on the study, and shared a more in-depth report with us. The more in-depth report is not yet cleared for publication because the authors are seeking publication in an academic journal.” (More.)
• Note that in all of the above cases we worked with the relevant researchers to get permission to publicly share basic information about the results we were relying on, as we did in the case of the GE study.
• In all cases, we expected that full results would be made public in the future. Our understanding is that oftentimes early headline results from studies can be shared publicly while it may take substantially longer to publicly release full working papers because working papers are time-intensive to produce. We would be more hesitant to rely on a study that has been private for an unusually long period of time unless there were a good reason for it.
• However, relying on private studies conflicts to some extent with our goal to be transparent. In particular, we believe two major downsides of our policy with respect to private information are (a) early private results are more likely to contain errors, and (b) we are not able to benefit from public scrutiny and discussion of the research. We would have ideally seen a robust public discussion of the GE study before we released our recommendations in November, but the timeline for the public release of GE study results did not allow that. We look forward to closely following the public debate in the future and plan to update our views based on what we learn.
• Despite these limitations, we have generally found early, private results to be predictive of final, public results. This, combined with the fact that we believe private results have improved our recommendations on a number of occasions, leads us to believe that the benefits of our current policy on using private information outweigh the costs.

A few other notes:

• Although we provide a number of cases above in which we relied on private information, the vast majority of the key information we rely on for our charity recommendations is public.
• When private information is shared with us that implies a positive update about a charity’s program, we try to be especially attentive about potential conflicts of interest. In this case, there is potential for concern because the GE study was co-authored by Paul Niehaus, Chairman of GiveDirectly. We chose not to substantially limit the weight we place on the GE study because (a) a detailed pre-analysis plan was submitted for this study, and (b) three of the four co-authors (Ted Miguel, Johannes Haushofer, and Michael Walker) do not have an affiliation with GiveDirectly. We have no reason to believe that GiveDirectly’s involvement altered the analysis undertaken. In addition, the GE study team informed us that Paul Niehaus recused himself from final decisions about what the team communicated to GiveWell.
• When we published our report (about one week ago), we expected that some additional analysis from the GE study would be shared publicly soon (which we still expect). We do not yet have an exact date and do not know precisely what content will be shared (though we expect it to be similar to what was shared with us privately).
Why did GiveWell place little weight on some papers in its analysis of spillover effects?

Some general context on GiveWell’s research that we think is useful for understanding our approach in this case is:

• We are typically estimating the impact of programs as implemented by particular charities, not aiming to publish formal meta-analyses about program areas as a whole. As noted above, we believe we should have communicated more clearly about this in our original report on spillovers and we will revise the report to clarify.
• We focus our limited time on the research that we think is most likely to affect our decisions, so our style of analysis is often different from what is typically seen in academia. (We think the difference in the kind of work we do is captured well by a relevant Rachel Glennerster blog post.)

Consistent with the above, the goal of our spillovers analysis was to make a best guess about the size of the spillover effect of GiveDirectly’s (GD’s) program in Kenya, Uganda, and Rwanda specifically.8This program provides $1,000 unconditional transfers and treats almost all households within target villages in Kenya and Uganda (though it still treats only eligible households in Rwanda). We are not trying to communicate an opinion on the size of spillover effects of cash transfers in other countries or in development economics more broadly. If we were trying to do the latter, we would have considered a much wider range of literature.

We expect that studies that are most similar to GD’s program on basic characteristics such as geographic location and program type will be most useful for predicting spillovers in the GD context. So, we prioritize looking at studies that 1) took place in sub-Saharan Africa, and 2) evaluate unconditional cash transfer programs (further explanation in footnote).9On (1): Our understanding is that the nature and size of spillover effects is likely to be highly dependent on the context studied, for example because the extent to which village economies are integrated might differ substantially across contexts (e.g., how close households are to larger markets outside of the village in which they live, how easily goods can be transported, etc.). On (2): We expect that providing cash transfers conditional on behavioral choices is a fairly different intervention from providing unconditional cash transfers, and so may have different spillover effects.
We would welcome additional engagement on this topic: that is, (a) to what extent should we believe that effects estimated in studies not meeting these criteria would apply to GD’s cash transfer programs, and (b) are there other criteria that we should have used? A further factor that causes us to put more weight on the five studies we chose to review deeply is that they all study transfers distributed by GD, which we see as increasing their relevance to GD’s current work (though the specifics of the programs that were studied vary from GD’s current program). We believe that studies that do not meet the above criteria could affect our views on spillovers of GD’s program to some extent, but they receive lower weight in our conclusions since they are less directly relevant to GD’s program. We saw further review of studies that did not meet the above criteria as lower priority than a number of other analyses that we think would be more likely to shift our bottom-line estimate of the spillovers of GD’s program. Even though we focused on the subset of studies most relevant to GD’s program, we were not able to combine their results into a reasonable explicit model of spillover effects, because we found that key questions were not answered by the available data (our attempt at an explicit model is in the following footnote).[10]
One fundamental challenge is that we are trying to apply estimates of “within-village” spillover effects to predict across-village spillover effects.[11] Additional complications are described here.

More on why we placed little weight on particular studies that Özler highlighted in his comments:[12]

• We placed little weight on the following papers in our initial analysis for the reasons given in parentheses: Angelucci & De Giorgi 2009 (conditional transfers, study took place in Mexico), Cunha et al.
2017 (study took place in Mexico), Filmer et al. 2018 (conditional transfers, study took place in the Philippines), and Baird, de Hoop, and Özler 2013 (mix of conditional and unconditional transfers).
• In addition, the estimates of mental health effects on teenage schoolgirls in Baird, de Hoop, and Özler 2013 seem relatively less useful for predicting the impacts of spillovers from cash transfers given to households, particularly in villages where almost all households receive transfers, as is often the case in GD’s program.[13]

Why did GiveWell’s analysis of spillovers focus on effects on consumption? Does this imply that GiveWell does not value effects on other outcomes?
Some general context on GiveWell’s research that we think is useful for understanding our approach in this case:

• When modeling the cost-effectiveness of any program, there are typically a large number of outcomes that could be included in the model. In our analyses, we focus on the key outcomes that we expect to drive the bulk of the welfare effects of a program.
• For example, our core cost-effectiveness model primarily considers various programs’ effects on averting deaths and increasing consumption (either immediately or later in life). This means that, e.g., we do not include the benefits of averting vision impairment in our cost-effectiveness model for vitamin A supplementation (in part because we expect those effects to be relatively small as a portion of the overall impact of the program).
• This does not mean that we think excluded outcomes are unimportant. We focus on the largest impacts of programs because (a) we think they are a good proxy for the overall impact of the relevant programs, and (b) having fewer outcomes simplifies our analysis, which leads to less potential for error, better comparability between programs, and a more manageable time investment in modeling.
• For a deeper assessment of which program impacts we include in and exclude from our core cost-effectiveness model and why, see our model’s “Inclusion/exclusion” sheet.[14]

We aim to include outcomes that can be justified by evidence, can feasibly be modeled, and are consistent with how we handle other program outcomes.
We revisit our list of excluded outcomes periodically to assess whether any of them could lead to a major shift in our cost-effectiveness estimate for a particular program.

In our spillovers analysis, we applied the above principles to try to identify the key welfare effects. Among the five main studies we reviewed on spillovers, the two most relevant outcomes appear to be consumption and subjective well-being. We chose to focus on consumption for the following reasons:

• Assessing the effects of cash transfers on consumption (rather than subjective well-being) is consistent with how we model the welfare effects of other programs that we think increase consumption in expectation, such as deworming.
• Distinguishing effects on subjective well-being from effects on consumption in order to avoid double-counting benefits was too complex to do in the time we had available. It seems intuitively likely that standards of living (proxied by consumption) affect subjective well-being. In the Haushofer and Shapiro studies and in the GE study, the spillover effects act in the same direction for both consumption and subjective well-being. We do not think it would be appropriate to simply add subjective well-being effects into our model over and above effects on consumption, since that risks double-counting benefits.
• We do not have a strong argument that consumption is a more robust proxy for “true well-being” than subjective well-being, but given that consumption effects can be more easily compared across our programs, we have chosen consumption as the default option at this point. We hope to broadly revisit in the future whether we should place more weight on measures of subjective well-being across programs. It is possible that additional work on subjective well-being measures would meaningfully change how we assess the benefits of programs (this one and potentially others).
Examples of our questions about how to interpret subjective well-being effects in the cash spillovers literature include:

• In the Haushofer and Shapiro studies, how should we interpret each of the underlying components of the subjective well-being indices? For example, how does self-reported life satisfaction map onto utility, as compared with self-reported happiness?
• In Haushofer, Reisinger, & Shapiro 2015, there is a statistically significant negative spillover effect on life satisfaction, but there are no statistically significant effects on happiness, depression, stress, cortisol levels, or the overall subjective well-being index (column (4) of Table 1). How should we interpret these findings?

Next steps

• We hope that there is more public discussion of some of the policy-relevant questions we highlighted in our report and of the other points of uncertainty highlighted throughout this post. Our conclusions on spillovers are very tentative and could be affected substantially by further analysis, so we would greatly appreciate any feedback or pointers to relevant work.[15]
• We are planning to follow up with Dr. Özler to better understand his views on spillover effects of cash transfers. We have appreciated his previous blog posts on this topic and want to ensure we are getting multiple perspectives on the relevant issues.

Notes

1. ↑ For more context on this topic, see our May 2018 blog post.
2. ↑ “We plan to reassess the cash transfer evidence base and provide our updated conclusions in the next several months (by November 2018 at the latest).
One reason that we do not plan to provide a comprehensive update sooner is that we expect upcoming midline results from GiveDirectly’s “general equilibrium” study, a large and high-quality study explicitly designed to estimate spillover effects, will play a major role in our conclusions. Results from this study are expected to be released in the next few months.” (More.)
3. ↑ “In a preliminary analysis shared with GiveWell in September 2018, the researchers did not find evidence for a negative or positive impact on migration, and found no statistically significant impact on income and consumption.” (More.)
4. ↑ “We have seen preliminary, confidential results from a 15-year follow-up to Miguel and Kremer 2004. We are not yet able to discuss the results in detail, but they are broadly consistent with the findings from the 10-year follow-up analyzed in Baird et al. 2016.” (More.)
5. ↑ “We have seen two modeling studies which model clinical malaria outcomes in areas with ITN coverage for different levels of resistance based on experimental hut trial data. Of these two studies, the most recent study we have seen is unpublished (it was shared with us privately), but we prefer it because the insecticide resistance data it draws from is more recent and more comprehensive.” (More.)
6. ↑ “The preliminary endline results did not find any effect of DMI’s program on child mortality (it was powered to detect a reduction of 15% or more), and it found substantially less effect on behavior change than was found at midline. We cannot publicly discuss the details of the endline results we have seen, because they are not yet finalised and because the finalised results will be embargoed prior to publication, but we have informally incorporated the results into our view of DMI’s program effectiveness.” (More.)
7. ↑ “The researchers have published an abstract on the study, and shared a more in-depth report with us.
The more in-depth report is not yet cleared for publication because the authors are seeking publication in an academic journal.” (More.)
8. ↑ This program provides $1,000 unconditional transfers and treats almost all households within target villages in Kenya and Uganda (though still treats only eligible households in Rwanda).
9. ↑ On (1): Our understanding is that the nature and size of spillover effects is likely to be highly dependent on the context studied, for example because the extent to which village economies are integrated might differ substantially across contexts (e.g. how close households are to larger markets outside of the village in which they live, how easily goods can be transported, etc.).
On (2): We expect that providing cash transfers conditional on behavioral choices is a fairly different intervention from providing unconditional cash transfers, and so may have different spillover effects.
10. ↑ We tried to create such an explicit model here (explanation here).
11. ↑ GiveDirectly treats almost all households within target villages in Kenya and Uganda (though still treats only eligible households in Rwanda).
12. ↑ Note on terminology: In our spillovers analysis report, we talk about studies in terms of “inclusion” and “exclusion.” We may use the term “exclude” differently than it is sometimes used in, e.g., academic meta-analyses. When we say that we have excluded studies, we have typically lightly reviewed their results and placed little weight on them in our conclusions. We did not ignore them entirely, as may happen for papers excluded from an academic meta-analysis. To try to clarify this, in this blog post we have used the term “place little weight.” We will try to be attentive to this in future research that we publish.
13. ↑ We expect that local spillover effects via psychological mechanisms are less likely to occur with the current spatial distribution of GD’s program. In GD’s program in Kenya and Uganda, almost all households are treated within its target villages. In addition, the majority of villages within a region are treated in a block. Baird, de Hoop, and Özler 2013 estimate spillover effects within enumeration areas (groups of several villages), and the authors believe that the “detrimental effects on the mental well-being of those randomly excluded from the program in intervention areas is consistent with the idea that an individual’s utility depends on her relative consumption (or income or status) within her peer group” (p. 372). The spatial distribution of GD’s program in Kenya and Uganda makes it more likely that the majority of one’s local peer group receives the same treatment assignment.
14.
↑ We have not yet added it, but we plan to add “Subjective well-being” under the list of outcomes excluded in the “Cross-cutting / Structural” section of the sheet, since it may be relevant to all programs.
15. ↑ If you are aware of relevant analyses or studies that we have not covered here, please let us know at info@givewell.org.

The post Response to concerns about GiveWell’s spillovers analysis appeared first on The GiveWell Blog.

### Our updated top charities for giving season 2018

Mon, 11/26/2018 - 11:59

We’re excited to share our list of top charities for the 2018 giving season. We recommend eight top charities, all of which we also recommended last year.

Our bottom line

We recommend three top charities implementing programs whose primary benefit is reducing deaths. They are:

Five of our top charities implement programs that aim to increase recipients’ incomes and consumption. They are:

These charities represent the best opportunities we’re aware of to help people, according to our criteria. We expect GiveWell’s recommendations to direct more than $100 million to these organizations collectively over the next year. We expect our top charities to be able to effectively absorb hundreds of millions of dollars beyond that amount.

Our list of top charities is the same as it was last year, with the exception of Evidence Action’s No Lean Season. We removed No Lean Season from the list following our review of the results of a 2017 study of the program.

We also recognize a group of standout charities. We believe these charities are implementing programs that are evidence-backed and may be extremely cost-effective. However, we do not feel as confident in the impact of these organizations as we do in our top charities. We provide more information about our standout organizations here.

Where do we recommend donors give?

• We recommend that donors choose the “Grants to recommended charities at GiveWell’s discretion” option on our donation forms. We grant these funds quarterly to the GiveWell top charity or top charities where we believe they can do the most good.
• If you prefer to give to a specific charity, we believe that all of our top charities are outstanding and will use additional funding effectively. If we had additional funds to allocate now, the most likely recipient would be Malaria Consortium, to scale up its work providing seasonal malaria chemoprevention.
• If you have supported GiveWell’s operations in the past, we ask that you maintain your support. If you have not supported GiveWell’s operations in the past, we ask that you consider designating 10 percent of your donation to help fund GiveWell’s operations.

How should donors give?

Conference call to discuss recommendations

We’re holding a conference call on Tuesday, December 4, at 12pm ET/9am PT to discuss our latest recommendations and to answer any questions you have.
Sign up here to join the call.

Additional details

Below, we provide:

• An overview of the research we conducted in 2018 that was directly relevant to these recommendations. (More)
• An explanation of changes to our recommended charity list and of major updates in the past year. (More)
• The funding allocation that we are recommending to Good Ventures and our top charities’ remaining room for more funding. (More)
• Our recommendations for people interested in supporting our top charities. (More)

Our research process in 2018

We plan to summarize all of the research we completed this year in a future post as part of our annual review process. A major focus of 2018 was improving our recommendations in future years, in particular through our work on GiveWell Incubation Grants and completing intervention reports on promising programs. Below, we highlight the key research that led to our current charity recommendations. This page describes our general process for conducting research.

• Following existing top charities. We followed the progress and plans of each of our 2017 top charities. We had several conversations with each organization and reviewed documents they shared with us. We published updated reviews of each of our top charities. Key information from this work is available in the following locations:
  • Our page summarizing changes at each of our top charities and standouts in 2018.
  • Our workbook with each charity’s funding needs and our estimates of the cost-effectiveness of filling each need.
  • Our full reviews for each charity, which are linked from this page.
• Staying up to date on the research on the interventions implemented by our top charities. Details on some of what we learned are in the section below.
• Making extensive updates to our cost-effectiveness model and publishing 14 updates to the model over the course of the year.
In addition to updating our cost-effectiveness model with information from the intervention research described above, we added a “country selection” tab to our cost-effectiveness analysis (so that users can toggle between overall and country-specific cost-effectiveness estimates) and an “inclusion/exclusion” tab, which lists different items that we considered whether or not to account for in our cost-effectiveness analysis; we also explicitly modeled factors that could lead to wastage (charities failing to use the funds they receive to implement their programs effectively).
• Completing a review of Zusha! We completed our review of the Georgetown University Initiative on Innovation, Development, and Evaluation—Zusha! Road Safety Campaign and determined that it did not meet all of our criteria to be a top charity. We named Zusha! a standout charity.

Major updates from the last 12 months

Below, we summarize major updates across our recommended charities over the past year. For detailed information on what changed at each of our top and standout charities, see this page.

• We removed Evidence Action’s No Lean Season from our top charity list. At the end of 2017, we named No Lean Season, a program that provides loans to support seasonal migration in Bangladesh, as one of GiveWell’s top charities. This year, we updated our assessment of No Lean Season based on preliminary results we received from a 2017 study of the program. These results suggested the program did not successfully induce migration in the 2017 lean season. Taking this new information into account alongside previous studies of the program, we and Evidence Action no longer believe No Lean Season meets our top charity criteria. We provide more details on this decision in this blog post.
• We received better information about Sightsavers’ deworming program.
In previous years, we had limited information from Sightsavers documenting how it knew that its deworming programs were effectively reaching their intended beneficiaries. This year, Sightsavers shared significantly more monitoring information with us. This additional information substantially increased our confidence in Sightsavers’ deworming program. This spreadsheet shows the monitoring we received from Sightsavers in 2018.
• We reviewed new research on the priority programs implemented by our top charities and updated our views and cost-effectiveness analyses accordingly. Examples of such updates include:

Recommended allocation of funding for Good Ventures and top charities’ remaining room for more funding

Allocation recommended to Good Ventures

Good Ventures is a large foundation with which GiveWell works closely; it has been a major supporter of GiveWell’s top charities since 2011. Each year, we provide recommendations to Good Ventures regarding how we believe it can most effectively allocate its grants to GiveWell’s recommended charities, both in terms of the total amount donated (within the constraints of Good Ventures’ planning, based in part on the Open Philanthropy Project’s recommendations on how to allocate funding across time and across cause areas) and in terms of the distribution between recipient charities.

Because Good Ventures is a major funder that we expect to follow our recommendations, we think it’s important for other donors to take its actions into account; we also want to be transparent about the research that leads us to make our recommendations to Good Ventures. That said, Good Ventures has not finalized its plans for the year and may give differently from what we’ve recommended. We think it’s unlikely that any differences would have major implications for our bottom-line recommendations for other donors.

This year, GiveWell recommended that Good Ventures grant $64.0 million to our recommended charities, allocated as shown in the table below.

| Charity | Recommended allocation from Good Ventures | Remaining room for more funding[1] |
| --- | --- | --- |
| Malaria Consortium (SMC program) | $26.6 million | $43.9 million |
| Evidence Action (Deworm the World Initiative) | $10.4 million | $27.0 million |
| Sightsavers (deworming program) | $9.7 million | $1.6 million |
| Helen Keller International (VAS program) | $6.5 million | $20.6 million |
| Against Malaria Foundation | $2.5 million | $72.5 million |
| Schistosomiasis Control Initiative | $2.5 million | $16.9 million |
| The END Fund (deworming program) | $2.5 million | $45.8 million |
| GiveDirectly | $2.5 million | >$100 million |
| Standout charities | $800,000 (combined) | |

[1] This column displays our top charities’ remaining room for more funding, or the amount we believe they can use effectively, for the next three years (2019-2021), after accounting for the $64.0 million we’ve recommended that Good Ventures give (our recommendation won’t necessarily be followed, but we think it’s unlikely that differences will be large enough to affect our bottom-line recommendation to donors) and an additional $1.1 million from GiveWell’s discretionary funding.

We discuss our process for making our recommendation to Good Ventures in detail in this blog post.

Allocation of GiveWell discretionary funds

As part of reviewing our top charities’ funding gaps to make a recommendation to Good Ventures, we also decided how to allocate the $1.1 million in discretionary funding we currently hold. The latter comes from donors who chose to donate to “Grants to recommended charities at GiveWell’s discretion” in recent months. We decided to allocate this funding to Malaria Consortium’s seasonal malaria chemoprevention program, because of how large and cost-effective we believe Malaria Consortium’s funding gap to be.
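As a quick arithmetic check, the recommended grants in the allocation table can be summed to confirm they match the stated $64.0 million total. This is a minimal sketch, not GiveWell code; the figures (in millions of USD) are taken from the table above.

```python
# Sanity check: the recommended Good Ventures grants (millions of USD) from the
# allocation table above should sum to the stated $64.0 million total.
allocations = {
    "Malaria Consortium (SMC program)": 26.6,
    "Evidence Action (Deworm the World Initiative)": 10.4,
    "Sightsavers (deworming program)": 9.7,
    "Helen Keller International (VAS program)": 6.5,
    "Against Malaria Foundation": 2.5,
    "Schistosomiasis Control Initiative": 2.5,
    "The END Fund (deworming program)": 2.5,
    "GiveDirectly": 2.5,
    "Standout charities (combined)": 0.8,
}

# Round to one decimal place to avoid floating-point noise in the sum.
total = round(sum(allocations.values()), 1)
print(f"Total recommended: ${total} million")  # → Total recommended: $64.0 million
```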

Top charities’ remaining room for more funding

Although we expect to direct a significant amount of funding to our top charities ($65.1 million between Good Ventures and our discretionary funding), we believe that nearly all of our top charities could productively absorb considerably more funding than we expect them to receive from Good Ventures, our discretionary funding, and additional donations we direct based on our recommendation. This spreadsheet lists all of our top charities’ funding needs; rows 70-79 show total funding gaps by charity.

Our recommendation for donors

The bottom line

• We recommend that donors choose the option to support “Grants to recommended charities at GiveWell’s discretion” on our donation forms. We grant these funds quarterly to the GiveWell top charity or top charities where we believe they can do the most good. We take into account charities’ funding needs and donations they have received from other sources when deciding where to grant discretionary funds. (The principles we outline in this post are indicative of how we will make decisions on what to fund.) We then make these grants to the highest-value funding opportunities we see among our recommended charities. This page lists the discretionary grants we have made since 2014.
• If you prefer to give to a specific charity, we believe that all of our top charities are outstanding and will use additional funding effectively. See below for information that may be helpful in deciding between the charities we recommend.
• If we had additional funds to allocate, the most likely recipient would be Malaria Consortium, to scale up its work providing seasonal malaria chemoprevention.

Comparing our top charities

If you’re interested in donating to a specific top charity or charities, the following information may be helpful as you compare the options on our list. The table summarizes key facts about our top charities; column headings are defined below.
Note: the cost-effectiveness estimates we present in this post differ from those in our published cost-effectiveness analysis for a number of reasons.[2]

| Organization | Modeled cost-effectiveness (relative to cash transfers) at the present margin[3] | Primary benefits of the intervention | Quality of the organization’s communication | Ongoing monitoring and likelihood of detecting future problems |
| --- | --- | --- | --- | --- |
| Malaria Consortium (SMC program) | 8.8 | Averting deaths of children under 5 | Strong | Strong |
| Evidence Action (Deworm the World Initiative) | See footnote[4] | Possibly increasing income in adulthood | Strong | Strong |
| Helen Keller International (VAS program) | 6.4 | Averting deaths of children under 5 | Strong | Moderate |
| Against Malaria Foundation | 7.3 | Averting deaths | Moderate | Moderate |
| Schistosomiasis Control Initiative | 8.3 | Possibly increasing income in adulthood | Moderate | Relatively weak |
| Sightsavers (deworming program) | See footnote[5] | Possibly increasing income in adulthood | Moderate | Moderate |
| The END Fund (deworming program) | 5.4 | Possibly increasing income in adulthood | Moderate | Relatively weak |
| GiveDirectly | 1 | Immediately increasing income and assets | Strong | Strong |

Definitions of column headings follow:

• Estimated cost-effectiveness (relative to cash transfers) at the present margin. We recommended that Good Ventures give $64.0 million to our top and standout charities, prioritizing the funding gaps that we believe are most cost-effective. The table above shows our estimates for the cost-effectiveness of additional donations to each charity, after accounting for the $64.0 million we’ve recommended that Good Ventures give (our recommendation won’t necessarily be followed, but we think it’s unlikely that differences will be large enough to affect our bottom-line recommendation to donors).
• Primary benefits of the intervention. This column describes the major benefit we see to supporting a charity implementing this intervention.
• Quality of the organization’s communication. In most cases, we have spent dozens or hundreds of hours interacting with our top charities. Here, we share our subjective impression of how well each organization has communicated with us.
Our assessment of the quality of a charity’s communications is driven by whether we have been able to resolve our questions—particularly our less straightforward questions—about the organization’s activities, impact, and plans; how much time and effort was required to resolve those questions; how often the charity has sent us information that we later learned was inaccurate; and how direct we believe the charity is in acknowledging its weaknesses and mistakes. The organizations that stand out for high-quality communications are those that have most thoughtfully and completely answered our questions, brought problems with the program to our attention, and communicated clearly with us about timelines for providing additional information. High-quality communications reduce the time that we need to spend answering each question and therefore allow us to gain a greater degree of confidence in an organization. More importantly, our communication with an organization is one of the few ways that we can directly observe an organization’s general competence and thoughtfulness, so we see this as a proxy for unobserved ways in which the organization’s staff affect the impact of the program.
• Ongoing monitoring and likelihood of detecting future problems. The quality of the monitoring we have received from our top charities varies widely, although we believe it stands out from that of the majority of charities. Ideally, the monitoring data charities collect would be representative of the program overall (by sampling all or a random selection of locations or other relevant units); would measure the outcomes of greatest interest for understanding the impact of the program; and would use methods that result in a low risk of bias or fraud in the results. In assessing the quality of a charity’s monitoring, we ask ourselves, “How likely is it that there are substantive problems with the program that are not detected by this monitoring?” Monitoring results inform our cost-effectiveness analyses directly. In addition, we believe that the quality of an organization’s monitoring gives us information that is not fully captured in these analyses. Similar to how we view communication quality, we believe that understanding how an organization designs and implements monitoring is an opportunity to observe its general competence and degree of openness to learning and program improvement.

Other key factors donors might want to consider when making their giving decision:

• As shown in the table above, our top charities implement programs with different primary benefits: some primarily avert deaths; others primarily increase incomes or consumption. Donors’ preference for programs that avert deaths relative to those that increase incomes (i.e., how one weighs the value of averting a death at a given cost against increasing incomes by a certain amount at a given cost) depends on their moral values. The cost-effectiveness estimates shown above rely on the GiveWell research team’s moral values. For more on how we (and others) compare the “good” accomplished by different programs, see this blog post. Donors may make a copy of our cost-effectiveness model to input their own moral weights and see how that affects the relative cost-effectiveness of our top charities.
• The table above shows cost-effectiveness estimates for different charities. We put significant weight on cost-effectiveness figures, but they have limitations. Read more about how we use cost-effectiveness estimates in this blog post.
• Ultimately, donors are faced with a decision about how to weigh estimated cost-effectiveness (incorporating their moral values) against additional information about an organization that we have not explicitly modeled. We’ve written about this choice in the context of choosing between GiveDirectly and SCI in this 2016 blog post.
• Four of our top charities implement deworming programs. We recommend the provision of deworming treatments to children for its possible impact on recipients’ incomes in adulthood. We work in an expected-value framework; in other words, we’re willing to support a higher-risk intervention if it has the potential for higher impact (more in this post about our worldview). Deworming is such an intervention. We believe that deworming may have very little impact, but that risk is outweighed by the possibility that it has a very large impact, and it is very cheap to implement. We describe our assessment of deworming in this summary blog post as well as this detailed post. Donors who have lower risk tolerance may choose not to support charities implementing deworming programs.
• The table above lists our views on the quality of each of our top charities’ monitoring. This 2016 blog post describes our view of AMF’s monitoring and may give donors more insight into how we think about monitoring quality.

Giving to support GiveWell’s operations

GiveWell is currently in a financially stable position. Over the next few years, we plan to significantly increase our spending, driven by hiring additional research and outreach staff. We project that our revenue will approximately equal our expenses over the next few years; however, this projection includes an expectation of growth in the level of operating support we receive. We retain our “excess assets policy” to ensure that if we fundraise for our own operations beyond a certain level, we will grant the excess to our recommended charities. In June of 2018, we applied our excess assets policy and designated $1.75 million in unrestricted funding for grants to recommended charities.

We cap the amount of operating support we ask Good Ventures to provide to GiveWell at 20 percent of our operating expenses, for reasons described here. We ask that donors who use GiveWell’s research consider the following:

• If you have supported GiveWell’s operations in the past, we ask that you maintain your support. Having a strong base of consistent operations support allows us to make valuable hires when opportunities arise and to minimize staff time spent on fundraising for our operating expenses.
• If you have not supported GiveWell’s operations in the past, we ask that you designate 10 percent of your donation to help fund GiveWell’s operations. This can be done by selecting the option to “Add 10% to help fund GiveWell’s operations” on our credit card donation form or letting us know how you would like to designate your funding when giving another way.
Questions?

We’re happy to answer questions in the comments below. Please also feel free to reach out directly with any questions.

This post was written by Andrew Martin, Catherine Hollander, Elie Hassenfeld, James Snowden, and Josh Rosenberg.

Notes

1. ↑ This column displays our top charities’ remaining room for more funding, or the amount we believe they can use effectively, for the next three years (2019-2021), after accounting for the $64.0 million we’ve recommended that Good Ventures give (our recommendation won’t necessarily be followed, but we think it’s unlikely that differences will be large enough to affect our bottom-line recommendation to donors) and an additional $1.1 million from GiveWell’s discretionary funding.
2. ↑ “The cost-effectiveness estimates in this sheet, which we used to inform our recommended allocation differ from those in our published cost-effectiveness analysis because (1) we apply a number of adjustments to incorporate additional information (2) we apply different weightings to each program (which affects the weighted average of cost-effectiveness).” Source: Giving Season 2018 – Allocation (public), “Cost-effectiveness results” tab, row 17. Additional details at the link.
3. ↑ For sources on the estimates included in this table, see this spreadsheet, “Cost-effectiveness results” tab. The estimates presented here differ from the estimates presented in our recommendation to Good Ventures because they estimate cost-effectiveness on the margin, if Good Ventures were to follow our recommendations.
4. ↑ At the margin, we expect additional funding to Deworm the World Initiative to support its programs in Pakistan and Nigeria in 2021 as well as Deworm the World’s general reserves. We think these are broadly good uses of funds, but our cost-effectiveness model is not currently built to meaningfully model the cost-effectiveness of reserves. In the absence of more information, we would guess that additional funding to Deworm the World would be roughly in the range of our estimate for Deworm the World’s overall organizational cost-effectiveness (~15x as cost-effective as cash transfers), but we have not analyzed the details of additional spending at the current margin enough to be confident in that estimate. However, if Good Ventures generally follows our recommended allocation, we expect that Deworm the World will have sufficient funding to continue its most time-sensitive work and we can decide whether to fund other marginal opportunities at a later date.
5. ↑ We do not have a strong sense of the cost-effectiveness of additional funds to Sightsavers at the current margin. Our cost-effectiveness estimate of Sightsavers’ remaining funding gap is 15.4x as cost-effective as cash transfers, but this fails to capture a number of features particular to the program Sightsavers would fund on the margin. We would guess that the value of marginal funding to Sightsavers is roughly in the range of our overall estimate for Sightsavers of ~12x as cost-effective as cash transfers. One major reason for our uncertainty follows. As discussed here, Sightsavers’ prioritization of how to spend additional funds differed substantially from what would be implied by our cost-effectiveness analysis, but we think that this discrepancy may largely be due to factors that our model does not capture or ways our model may be inaccurate; therefore, it is difficult to rely on our model to assess the cost-effectiveness of specific remaining country funding gaps.


The post Our updated top charities for giving season 2018 appeared first on The GiveWell Blog.

### Our recommendation to Good Ventures

Mon, 11/26/2018 - 11:59

Today, we announce our list of top charities for the 2018 giving season. We expect to direct over $100 million to the eight charities on our list as a result of our recommendation. Good Ventures, a large foundation with which we work closely, is the largest single funder of our top charities. We make recommendations to Good Ventures each year for how much funding to provide to our top charities and how to allocate that funding among them. As this funding is significant, we think it’s important for other donors to take into account the recommendation we make to Good Ventures. This blog post explains in detail how we decide what to recommend to Good Ventures and why; we want to be transparent about the research that leads us to our recommendations to Good Ventures. If you’re interested in a bottom-line recommendation for where to donate this year, please view our post with recommendations for non-Good Ventures donors. Note that Good Ventures has not finalized its plans for the year and may give differently from what we’ve recommended. We think it’s unlikely that any differences would have major implications for our bottom-line recommendations for other donors.

Summary

In this post, we discuss:

• How we decided how much funding to recommend Good Ventures provide to our top charities.
• Our recommendation for how Good Ventures should allocate that funding among our top charities, and how we arrived at that allocation.

How we decided how much funding to recommend to Good Ventures

This year, GiveWell recommended that Good Ventures grant $64.0 million to our top charities and standout charities. The amount Good Ventures gives to our top charities is based in part on how the Open Philanthropy Project plans to allocate funding across time and across cause areas. (Read more about our relationships with Good Ventures and the Open Philanthropy Project here.)

The Open Philanthropy Project currently plans to allocate around 10% of its total available capital to “straightforward charity,” which it currently allocates to global health and development causes based on GiveWell’s recommendations. This 10% allocation includes two “buckets”—a fixed percentage of total giving each year of 5% and another “flexible” bucket of 5%, which can be spent down quickly (over a few years) or slowly (over many years). GiveWell’s recommendation that Good Ventures grant $64.0 million this year puts the flexible bucket on track to be spent down within the next 14 years. We’re recommending $64.0 million this year to balance two considerations:

• As the world gets richer, giving opportunities in global health and development generally seem likely to get worse over time. This implies that giving now has a larger impact.
• In the coming years, GiveWell may find opportunities that are considerably more cost-effective than our current recommendations (e.g., among policy advocacy organizations). This would make spending in future years have a larger impact.
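To make the spend-down arithmetic concrete, here is a minimal sketch with purely hypothetical numbers; the post does not publish the size of the flexible bucket or the annual draw on it, so the figures below are illustrative assumptions only:

```python
# Illustrative sketch of the "flexible bucket" spend-down horizon described above.
# All dollar figures (in millions) are hypothetical assumptions, not published numbers.

def years_to_spend_down(flexible_bucket_m: float, annual_draw_m: float) -> float:
    """Years until the flexible bucket is exhausted at a constant annual draw."""
    if annual_draw_m <= 0:
        raise ValueError("annual draw must be positive")
    return flexible_bucket_m / annual_draw_m

# Hypothetical: total capital of $10,000M implies a flexible bucket of
# 5% * $10,000M = $500M; a hypothetical ~$36M/year draw from that bucket
# would exhaust it in roughly 14 years, the horizon mentioned in the post.
total_capital_m = 10_000
flexible_bucket_m = 0.05 * total_capital_m
print(round(years_to_spend_down(flexible_bucket_m, 36), 1))  # → 13.9
```

The point of the sketch is only that the chosen annual grant level determines the spend-down horizon; a larger recommendation this year would shorten it, a smaller one would lengthen it.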
Our recommended allocation for Good Ventures

The table below summarizes how much funding we recommend Good Ventures grant to each of our top charities, along with our explicit cost-effectiveness estimate for each organization and organizational factors we don’t model explicitly that affect our assessment of impact.

As always, cost-effectiveness figures should be interpreted with caution.

Note: the cost-effectiveness estimates we present in this post differ from those in our published cost-effectiveness analysis for a number of reasons.[1]

| Charity | Modeled cost-effectiveness (relative to cash transfers)[2] | Organizational factors we don’t model explicitly[3] | Recommended allocation |
| --- | --- | --- | --- |
| Malaria Consortium (SMC program) | 8.8 | Very strong | $26.6 million |
| Evidence Action (Deworm the World Initiative) | 14.6 | Very strong | $10.4 million |
| Sightsavers (deworming program) | 12.0 | Moderate | $9.7 million |
| Helen Keller International (VAS program) | 7.0 | Strong | $6.5 million |
| Against Malaria Foundation | 7.3 | Moderate | $2.5 million |
| Schistosomiasis Control Initiative | 8.3 | Relatively weak | $2.5 million |
| The END Fund (deworming program) | 5.4 | Relatively weak | $2.5 million |
| GiveDirectly | 1 | Very strong | $2.5 million |
| Standout charities | | | $800,000 (combined) |
| Sum | | | $64.0 million |
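As a quick consistency check on the allocation above, the recommended grants do sum to the $64.0 million total; a minimal sketch, with all figures copied from the table:

```python
# Sanity-check that the recommended allocations in the table above sum to the
# $64.0 million total. All figures are in millions of dollars, copied from the table.
allocation_m = {
    "Malaria Consortium (SMC program)": 26.6,
    "Evidence Action (Deworm the World Initiative)": 10.4,
    "Sightsavers (deworming program)": 9.7,
    "Helen Keller International (VAS program)": 6.5,
    "Against Malaria Foundation": 2.5,
    "Schistosomiasis Control Initiative": 2.5,
    "The END Fund (deworming program)": 2.5,
    "GiveDirectly": 2.5,
    "Standout charities (combined)": 0.8,
}
total_m = sum(allocation_m.values())
print(round(total_m, 1))  # → 64.0
```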

The underlying objective of GiveWell’s allocation is to direct as much money as possible to the most cost-effective giving opportunities over the long run. (We aim to optimize cost-effectiveness, as defined broadly—we recognize the limitations of our cost-effectiveness model and consider additional factors in our assessment.) We relied on modeled cost-effectiveness figures as well as the organizational factors described above to inform our recommendations.

To meet this objective, our allocation this year was driven by the principles described below.

Principles we followed in arriving at this allocation

Principle 1: Put significant weight on our cost-effectiveness estimates. Our cost-effectiveness estimates incorporate a substantial amount of information relevant to our decisionmaking. While we recognize the high levels of uncertainty around our cost-effectiveness estimates, they are the single largest factor we take into consideration. More on how we use cost-effectiveness to inform our decisions here.

Principle 2: Consider additional information about an organization that we have not explicitly modeled. While our cost-effectiveness estimates are the best tool we know of to estimate the amount of good a charity accomplishes, we believe it’s infeasible to try to incorporate all relevant considerations into a single quantitative estimate. Subjective assessments that aren’t included in our cost-effectiveness calculations but affect how much impact a charity has include:

• A charity’s ability to make good decisions on how to prioritize. Our top charities often take factors that aren’t included in our cost-effectiveness estimates into account when deciding how to spend their limited budgets. We use our subjective assessment of how well charities answer our questions about their activities as a proxy for how well they make these decisions.
• Upside. Our top charities often perform activities that go beyond the scope of their direct work, such as conducting and sharing research that influences others, or raising funds for their programs from funders that would otherwise give to less cost-effective programs.

For the most part, we do not have the opportunity to directly observe these factors. Our subjective assessments of these factors are based on the observed but unmodeled factors that we discuss in this post: the quality of the organization’s communication and ongoing monitoring, and the likelihood of detecting future problems.

Principle 3: Assess charities’ funding gaps at the margin, i.e., where they would spend additional funding, where possible. We try to understand how charities’ funding would be spent among different programs or locations. Our cost-effectiveness estimates for charities’ projects often vary substantially (depending, for example, on the underlying disease burden in a particular country the charity plans to work in). Where possible, we compare our best guess of how funding would be used on the margin, rather than on average. As part of assessing charities’ marginal cost-effectiveness, we intend to capture whether there are diminishing returns to their receiving additional funding.

Principle 4: Default towards not imposing restrictions on charity spending. While we rely on our expectation of how charities would prioritize funding gaps to estimate marginal cost-effectiveness, we do not plan to impose any restrictions on how the funding is actually used in practice. (There is one exception to this: in cases where a top charity implements multiple global health and development programs and our recommendation is restricted to one of those programs, we do restrict funding to the priority program we recommend, such as deworming or vitamin A supplementation.) We believe our top charities are often better placed to make decisions about which projects to fund than we are, and we want to ensure maximum flexibility for them to do so.

Principle 5: Fund on a three-year horizon, unless we are particularly uncertain whether we will want to continue recommending a program in the future. Our top charities have communicated to us that there are often substantial benefits to knowing that funding for a program is secure for the future. As a general rule, we aim to provide funding for three years for each program we choose to fund. The exception is when we are more uncertain whether we would want to renew funding for a third year (e.g. because our estimated cost-effectiveness of a program is close to the marginal program we decided not to fund).

Principle 6: Ensure charities are incentivized to engage with our process. We recognize that our charity review process requires deep engagement from senior members of charities’ staff. We want to ensure that charities are incentivized to keep engaging with our process. To this end, since 2016, we have recommended that Good Ventures provide a minimum “incentive grant” to top charities ($2.5 million) and standout charities ($100,000).

We hope that providing significant incentive grants increases the chances that charities are motivated to compete for a GiveWell recommendation. We fear that without ensuring that every top charity or standout receives a substantial amount of funding, some charities might be deterred from applying for a GiveWell recommendation or from making changes to their programs to potentially become top charities.

Our process for determining our recommended allocation for Good Ventures

In line with the principles above, we used the following process to arrive at our recommended allocation for Good Ventures:

1. We recommended that Good Ventures provide each charity with an incentive grant ($2.5 million per top charity and $100,000 per standout charity).
2. We identified the most cost-effective gap we were unable to entirely fill with the $64.0 million we recommended to Good Ventures (noting again that Good Ventures has not finalized its plans for the year and may give differently from what we’ve recommended): Malaria Consortium’s seasonal malaria chemoprevention program in Nigeria, Burkina Faso, and Chad. Our cost-effectiveness analysis suggests this gap is about 8.8x as cost-effective as cash transfers, and that Malaria Consortium could absorb about $70 million in additional funding to support this work. We have a high opinion of Malaria Consortium as an organization, and this qualitative assessment supports our consideration of this gap as highly cost-effective to fill.

Our best guess is there are limited diminishing marginal returns over the interval of this funding gap.

3. Remaining funding gaps were compared to the Malaria Consortium funding gap in Nigeria, Burkina Faso, and Chad based on (i) their estimated cost-effectiveness, (ii) our subjective assessment of the organization’s quality, and (iii) particular arguments relevant to that funding gap but not captured elsewhere in our analysis (e.g., whether our decision to not fund a particular gap would be disproportionately disruptive to an organization’s activities).

This spreadsheet lists all of our top charities’ funding needs; rows 70-79 show total funding gaps by charity. We relied on this list of funding needs in determining our recommendation to Good Ventures, as well as in assessing how much additional funding our top charities can absorb after taking into account our recommendation to Good Ventures.

In brief, we concluded that some charities’ funding gaps compared favorably to Malaria Consortium’s seasonal malaria chemoprevention gap, which led us to recommend a total of ~$6-10 million in funding to each of Deworm the World Initiative, Sightsavers’ deworming program, and Helen Keller International’s vitamin A supplementation program. We did not see compelling reasons to recommend funding to the other top charities ahead of Malaria Consortium’s funding gap, so we only recommended that those charities receive the $2.5 million incentive grant.

We explain our recommended allocation to Good Ventures for each of our top charities in more detail on this page.

Questions?

We’re happy to answer questions in the comments below. Please also feel free to reach out directly with any questions.

This post was written by Andrew Martin, Catherine Hollander, Elie Hassenfeld, James Snowden, and Josh Rosenberg.

Notes

1. ↑ “The cost-effectiveness estimates in this sheet, which we used to inform our recommended allocation differ from those in our published cost-effectiveness analysis because (1) we apply a number of adjustments to incorporate additional information (2) we apply different weightings to each program (which affects the weighted average of cost-effectiveness).” Source: See Giving Season 2018 – Allocation (public), “Cost-effectiveness results” tab, row 17. Additional details at the link.
2. ↑ “We typically won’t move forward with a charity in our process if it appears that it won’t meet the threshold of at least 2-3x as cost-effective as cash transfers. We think cash transfers are a reasonable baseline to use due to the intuitive argument that if you’re going to help someone with Program X, Program X should be more cost-effective than just giving someone cash to buy that which they need most.” June 1, 2017, GiveWell blog, How GiveWell uses cost-effectiveness analyses. The estimates presented here differ from the estimates presented in our recommendation to donors because they estimate weighted average cost-effectiveness over the whole funding gap, rather than on the margin.
3. ↑ We take into account an organization’s strength of communication with us and the comprehensiveness of its program monitoring. We factor this into our broad assessment of the organization’s cost-effectiveness. Read more: November 26, 2018, GiveWell blog, Our updated top charities for giving season 2018.

The post Our recommendation to Good Ventures appeared first on The GiveWell Blog.

### Update on No Lean Season’s top charity status

Mon, 11/19/2018 - 13:00

At the end of 2017, we named Evidence Action’s No Lean Season one of GiveWell’s nine top charities. Now, GiveWell and Evidence Action agree that No Lean Season should not be a GiveWell top charity this year, and Evidence Action is not seeking additional funding to support its work at this time.

This post will discuss this decision in detail. In brief, we updated our assessment of No Lean Season, a program that provides loans to support seasonal migration, based on preliminary results from a study of the 2017 implementation of the program (hereinafter the “2017 RCT”), which Evidence Action began discussing with us in July. These results suggested that the program, as implemented in 2017, did not successfully induce migration. Taking this new information into account alongside previous studies of the program, we and Evidence Action do not believe No Lean Season meets our top charity criteria at this time.

Evidence Action’s post on this decision is here.

GiveWell’s mission is to identify and recommend charities that can most effectively use additional donations. While it may be disappointing for a top charity to be removed from our list of recommendations, we believe that adding and removing top charities from our list is an important part of our process. If our top charities list never changed, we would guess we were (a) acting too conservatively (i.e. not being open enough to adding new top charities), or (b) not being critical enough of groups once they’ve been added to our list (i.e. not being open enough to removing existing top charities).

We believe this decision speaks positively of Evidence Action and demonstrates our mutual commitment to updating our views based on new evidence. GiveWell has interacted with hundreds of organizations in our history, and very few have subjected their programs to a rigorous study in the way that Evidence Action did last year and, at smaller scale, in 2014. We’re excited to work with a group like Evidence Action that is committed to rigorous study and openness about results.

Summary

In this post, we will discuss:

• The history of GiveWell and No Lean Season. (More)
• How the 2017 RCT updated our views of No Lean Season. (More)
• What did the 2017 RCT find? (More)
• How did we interpret the RCT results? (More)
• What does the future of No Lean Season look like? (More)
• Conclusion
GiveWell and No Lean Season

No Lean Season provides support for low-income agricultural workers in rural Bangladesh during the time of seasonal income and food insecurity (“lean season”). The program provides small, interest-free loans to support workers’ temporary migration to seek employment. No Lean Season is implemented by RDRS Bangladesh; Evidence Action provides strategic direction, conducts program monitoring, and provides technical assistance, among other functions. Evidence Action developed No Lean Season as part of its Beta portfolio, which is focused on prototyping and scaling cost-effective programs.

GiveWell began engaging with No Lean Season as a potential top charity in 2013, when we began to explore making an Incubation Grant to support its scale-up. We saw No Lean Season as a promising program that lacked the track record to be considered for a top charity recommendation at that time. We describe our initial interest in the program in a February 2017 blog post:

We approached Evidence Action in late 2013 to express our interest in supporting the creation of new GiveWell top charities.

In March 2014, Good Ventures made a $250,000 grant to Evidence Action to support the investigation and scale-up of promising programs. Since then, Good Ventures has made three additional grants totaling approximately$2.7 million to support the program’s scale-up.

Evidence Action continued to test and scale the program with this and other support. We decided to recommend No Lean Season as a top charity in late 2017. We based our recommendation on three randomized controlled trials (RCTs) of the program. (We generally consider RCTs to be one of the strongest types of evidence available; you can read more about why we rely on RCTs here.)

Two of the RCTs (conducted in 2008 and 2014) indicated increased migration, income, and consumption for program participants. The third RCT, conducted in 2013 and not yet published, found that the program failed to induce migration, potentially due to political violence that year. We discuss the RCT evidence in greater depth in our intervention report on conditional subsidies for seasonal labor migration in northern Bangladesh.

Weighing the evidence, the cost of the program, and the potential impacts, we decided No Lean Season met our criteria to be named a top charity in November 2017. We summarized our reasoning in our blog post announcing our 2017 list of top charities, and noted the risks of this recommendation:

Several randomized controlled trials (RCTs) of subsidies to increase migration provide moderately strong evidence that such an intervention increases household income and consumption during the lean season. An additional RCT is ongoing. We estimate that No Lean Season is roughly five times as cost-effective as cash transfers (see our cost-effectiveness analysis).

Evidence Action has shared some details of its plans for monitoring No Lean Season in the future, but, as many of these plans have not been fully implemented, we have seen limited results. Therefore, there is some uncertainty as to whether No Lean Season will produce the data required to give us confidence that loans are appropriately targeted and reach their intended recipients in full; that recipients are not pressured into accepting loans; and that participants successfully migrate, find work, and are not exposed to major physical and other risks while migrating.

As indicated above, No Lean Season conducted an additional RCT to evaluate its program during the 2017 lean season (approximately September to December), the preliminary results of which indicate the program failed to induce migration. With the evidence from the 2017 RCT, the case for the program’s impact and cost-effectiveness looks weaker.

Our updated perspective on No Lean Season

The 2017 RCT was a key factor in the decision to remove No Lean Season from our top charities list. Below, we discuss what the 2017 RCT found and how we interpreted its results.

What did the 2017 RCT find?

The 2017 RCT was a collaboration between Evidence Action, Innovations for Poverty Action, and researchers from Yale University, the London School of Economics, and the University of California, Davis. In a preliminary analysis shared with GiveWell in September 2018, the researchers did not find evidence for a negative or positive impact on migration, and found no statistically significant impact on income and consumption.[1]

However, both the implementation and the evaluation of the program during the 2017[2] lean season differed from previous iterations. No Lean Season operated at a much larger scale in the fall of 2017 than it had previously, offering loans to 158,155 households, compared with 16,268 households in 2016. Relative to earlier versions, the 2017 program involved (a) higher-intensity delivery of the intervention (offering loans to most eligible individuals) and (b) broader eligibility requirements (an eligibility rate of 77 percent in 2017, compared with 49 percent in 2016).[3]

At this point, neither GiveWell, Evidence Action, nor the researchers feel they have a conclusive understanding of why the program failed to induce migration. However, Evidence Action and the researchers are exploring various hypotheses about what may explain that failure, and they note that some suggestive evidence supports some hypotheses more than others. The researchers have posited several possibilities:

1. The way the program was targeted in 2017 was suboptimal. The Migration Organizers, who survey households for eligibility and offer and disburse loans (more detail here under “Migration Organizers”), may have focused their efforts on the individuals who seemed most likely to migrate, rather than on those who needed a loan to afford migration. The use of loan targets during implementation may have inadvertently incentivized this behavior.[4] If, for example, Migration Organizers mostly made loans to people who would have migrated regardless of receiving a loan, this could explain the lack of impact on migration found in the study.
2. The 2017 lean season was particularly bad for the program. The researchers note that severe flooding and associated implementation delays in some regions may have caused problems in 2017. The researchers plan to look more closely at the regions that experienced flooding, though they note that they don’t have the data necessary to make experimental comparisons.[5] In addition, a 2013 trial may have failed due to issues that were specific to the year of that trial, such as increased labor strikes.
3. There exists another (currently unknown) reason why this program won’t work at scale. Conditions in Bangladesh may have changed, negative spillovers (harmful impacts for individuals who did not receive loans) may cancel out gains, or pilot villages may have been strategically picked in earlier trials.[6]

The researchers are considering all of these possibilities. After considering various possible theories as well as some non-experimental data (including administrative data and data from a special-purpose survey of Migration Organizers who worked on the program in 2017), they feel that the ‘mistargeting’ theory is the most likely explanation and the explanation most consistent with the analysis.[7]

In scenario (1), No Lean Season may be able to identify and fix the problem. In scenario (2), GiveWell will need to update our estimate of the impact of the program to take into account the fact that periodic program failures due to external factors are more likely than we previously thought. In scenario (3), the program is unlikely to be effective in the future.

How did we interpret the RCT results?

We don’t know the extent to which each of the above explanations contributed to the study not finding an effect on migration.

We used the results of the 2017 RCT to update our cost-effectiveness estimate for the program. Cost-effectiveness estimates form arguably the most important single input into our decisions about whether or not to recommend charities (more on how GiveWell uses cost-effectiveness analyses here). When we calculate a program’s cost-effectiveness, we take many different factors into account, such as the administrative and program costs and the expected impact. We also make a number of educated guesses, such as the likelihood that a program’s impact in a new country will be similar to that in a country where it has previously worked. Below, we describe the mechanism by which the 2017 RCT result was incorporated into our model and how it changed our conclusion.

Prior to this year, we formed our view of No Lean Season based on the three small-scale RCTs mentioned above (conducted in 2008, 2013, and 2014). Each of these RCTs looked at a slightly different version of the program. We believed that the ‘high-intensity’ arm of the 2014 RCT was the version most likely to resemble the program at scale. We thus used the migration rate measured in this arm of the RCT as our starting point for calculating the program’s impact.

The high-intensity arm of the 2014 RCT also had the highest measured migration rate of the three RCTs we assessed, and so we wanted to give some consideration to the less-positive results found in the other two assessments. We applied a small, downward adjustment to the rate of induced migration observed in the 2014 high-intensity arm in our cost-effectiveness model; this was an educated guess, based on the information we had. Our best guess was that the program would lead, in expectation, to 80% of the induced migration seen in the 2014 high-intensity arm.[8]

Now, the preliminary 2017 RCT results show no significant impact on migration rates or incomes. Because this trial was large and very recent, we updated our expectations of the impact of the program substantially, and in a negative direction. Our best guess now is that the program will lead, in expectation, to 40% of the induced migration seen in the 2014 high-intensity arm. Holding other inputs constant, this adjustment reduces our estimate of No Lean Season’s cost-effectiveness by a factor of two.
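Holding other inputs constant, halving the replicability adjustment halves the estimated cost-effectiveness. A minimal sketch of that arithmetic, with illustrative inputs (all figures other than the 80% and 40% adjustments are hypothetical placeholders, not values from GiveWell’s model):

```python
# Hypothetical sketch of how a single replicability adjustment scales a
# cost-effectiveness estimate. Only the 80% and 40% adjustments come from
# the post; the migration rate and cost figures are illustrative.

def cost_effectiveness(rct_migration_rate, adjustment, cost_per_household):
    """Expected induced migration per dollar spent.

    rct_migration_rate: induced migration observed in the 2014 high-intensity arm
    adjustment: share of that effect we expect the program at scale to achieve
    cost_per_household: dollars spent per household offered a loan (hypothetical)
    """
    expected_migration = rct_migration_rate * adjustment
    return expected_migration / cost_per_household

old = cost_effectiveness(0.25, 0.80, 20.0)  # best guess before the 2017 RCT
new = cost_effectiveness(0.25, 0.40, 20.0)  # best guess after the 2017 RCT

# Because the adjustment enters the estimate linearly, moving from 80% to 40%
# cuts cost-effectiveness exactly in half, all else equal.
assert abs(old / new - 2.0) < 1e-9
```

This illustrates why the update is described as a factor-of-two reduction: the adjustment is a linear multiplier on the expected effect, so halving it halves the benefit attributed to each dollar.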

This reduced cost-effectiveness, along with our updated qualitative picture of No Lean Season’s evidence of effectiveness, led to the decision to remove No Lean Season from our top charities list.

What does the future of No Lean Season look like?

Although Evidence Action is not raising more funding for No Lean Season at this time, the program has over two years’ worth of funding remaining. We understand that Evidence Action made changes to the program design in 2018 based on emerging interpretations of the 2017 results, and has collected additional data to evaluate some of the hypotheses that may explain those results (including, for example, a survey of Migration Organizers who worked on the 2017 program). It plans to subject the 2018 implementation round to an additional ‘RCT-at-scale,’ with a particular focus on reassessing the program’s effects on migration, income, and consumption, as well as potential effects at migration destinations. Evidence Action will continue to explore what may have caused the 2017 program’s failure at scale and whether a solution can be found. If it identifies and addresses the problem, we will reassess the evidence and the costs to determine whether No Lean Season meets our bar for top charity status. Evidence Action believes we should have the necessary information to do so starting in mid-2019, based on the results of the RCT conducted during the 2018 lean season and other analyses it performs.

Conclusion

This is the second time since 2011 that we have removed a top charity from our list (prior to 2011, our top charities list was fairly different from today’s; we made a big-picture shift in our priorities that year that led to our more recent lists). The previous removal occurred in 2013, when we took the Against Malaria Foundation (AMF) off our list because we didn’t believe it could absorb additional funding effectively in the near term. AMF was reinstated as a top charity in 2014.

The decision to remove a top charity is never easy. But continuously evaluating GiveWell’s recommended charities is an important part of our work, and we take it seriously. It’s easy to talk about a commitment to evidence when the results are positive. It’s hard to maintain that commitment when the results are not. We’re excited to work with a group like Evidence Action that is committed to rigorous program evaluation and open discussion of the results of those evaluations. Its openness about these results has increased our confidence in Evidence Action as an organization. We look forward to seeing the results from the 2018 RCT in 2019.

Notes

[1] “At this early stage in analysis, we find no evidence that the program had an impact (positive or negative) on migration, caloric intake, food expenditure, or income.” Evidence Action, unpublished summary document, Page 1.

[2] The 2017 RCT studied a period from the fall of 2017 through early 2018.

[3] “This study has two main goals:

1. “A replication of previous findings showing positive impact of incentivized migration on seasonal migration, caloric intake, food and non-food expenditure, income, and food security. Our aim is to estimate impact of a scaled version of the No Lean Season program: intensifying program implementation within branches and expanding the provision of loans to all eligible households.”

Unpublished summary document, Page 1.

[4] “The second set of explanations focus on unintentional implementation changes caused by the change in eligibility, the vastly expanded scope of the program, or other factors. In the most recent round, it is possible that Migration Organizers (MOs) focused their efforts on those households who were most likely to migrate even without a loan to the exclusion of the target population households who need a loan to afford migration. Such behavior may have even been encouraged by the use of targets set by the NGO to manage implementation at such a large scale. We have implemented a qualitative survey to understand the incentives and actions of MOs last year, and are revising our instructions to avoid any possibility of this issue this year.” Evidence Action, unpublished summary document (with minor revision from Evidence Action), Page 11.

[5] “Most notably, the program was affected by severe flooding in many regions, and implementation was subsequently delayed as well. We are still evaluating whether these regions are the ones with the most diminished effects, although we lack the data in control areas to conduct an experimental comparison.” Evidence Action, unpublished summary document, Pages 11-12.

[6] “It is possible that what we observe this year may be the true effect of the No Lean Season program when implemented at scale. This may be because conditions in rural Bangladesh have changed since the initial years of success, spillovers at scale cancel out any gains observed in small-scale pilots, or pilot villages were selected because they were most likely to be receptive to the program.” Evidence Action, unpublished summary document, Page 11.

[7] Evidence Action, “Interpretation of 2017 Results” deck and narrative (unpublished)

[8] “This adjustment is used to account for external validity concerns not accounted for elsewhere in the CEA.

“The default adjustment value of 80% is our best guess about the appropriate value, but it is not based on a formal calculation.

“The program at scale takes place in the same region with the same implementers (RDRS and Evidence Action) as the source of our key evidence for the intervention (the 2014 RCT). The program at scale differs in some aspects of implementation, particularly the inclusiveness of the eligibility criteria and the proportion of eligible households offered an incentive. In the 2014 RCT, the subsidy was a cash transfer rather than an interest-free loan, however the 2008 RCT found a similar effect regardless of whether the subsidy was a cash transfer or an interest-free loan.

“There is some evidence (from a 2013 RCT) suggesting that the program may be ineffective when the perceived risk of migrating increases for reasons such as labor strikes and violence. The researchers estimated that these are 1-in-10 year events.

“Additional discussion related to this parameter can be found at https://www.givewell.org/charities/no-lean-season#programdifferentfromRCTs.” 2018 GiveWell Cost-Effectiveness Model — Version 10, “Migration subsidies” tab, note on cell A19.

The post Update on No Lean Season’s top charity status appeared first on The GiveWell Blog.