We have published a more recent version of this page. See our mistakes page.
This page logs mistakes we've made, strategies we should have planned and executed differently, and lessons we've learned.
A full review of progress to date – both accomplishments and shortcomings – is available here.
Please contact us with other items that should be listed here.
- Major issues
- December 2007: Overaggressive and inappropriate marketing
- 2009 to 2012: Errors in publishing private material
- December 2011: Miscommunicating to donors about fees and the deductibility of donations to our top charity
- March to June 2011: Poor research time allocation
- 2006 to 2011: Tone issues
- July 2009 to November 2010: Quantitative charity ratings that confused rather than clarified our stances
- February to November 2010: Excessive resources allocated to U.S. Equality of Opportunity report
- May 2008 to December 22, 2008: Failed to track website traffic/statistics
- June to September 2008: Business plan over-focused on marketing, under-focused on research
- June 2007 to May 2008: Research process relied excessively on open-ended grant applications
- June to November 2007: Set overly ambitious deadlines for completing our research
- June 2007: Poorly constructed "causes" led to suboptimal grant allocation
- Smaller issues
- December 2014: Errors in our cost-effectiveness analysis of Development Media International (DMI)
- November to December 2014: Lack of confidence in the cost-effectiveness analyses we relied on for our top charities recommendations
- January to December 2014: Completed fewer intervention reports than projected
- November 2014: Suboptimal grant recommendation to Good Ventures
- November 2014: Not informing candidate charities of our recommendation structure prior to publishing recommendations
- July 2014: Published an update to the intervention report on cash transfers that misstated our view
- February 2014: Incorrect information on homepage
- January to November 2013: Social (non-family, non-financial) relationship between GiveWell staff members and staff of a recommended charity not publicly disclosed
- February to September 2013: Infrequent updates on our top-ranked charity
- May to June 2013: Unpublished website pages intermittently available publicly
- April to December 2012: Taking too much of job applicants' time early in the recruiting process
- March to November 2012: Poor planning led to delayed 2012 charity recommendations release
- October 2012: Website downtime
- June 2012: Failure to discuss sensitive public communication with board member
- July 2007 to March 2012: Phone call issues
- December 2011: Poor communication to donors making larger donations (e.g., greater than $5,000) via the GiveWell website
- December 2011: Problems caused by GiveWell's limited control over the process for donating to our top charities
- March to December 2011: Problem with tracking code in Google Analytics led to erroneous web stats
- January to September 2011: Staff time spent managing volunteers
- November 22, 2010: Website down during time with media attention
- Late 2009: Misinterpreted a key piece of information about a charity to which we gave a $125,000 grant
- August 1, 2009 to December 31, 2009: Grant process insufficiently clear with applicants about our plans to publish materials
- November 25, 2009: Mishandling incentives to share information
- May 20, 2009 to August 27, 2009 and October 25, 2009 to November 11, 2009: Carelessness with updates to website; broke email update signup form
- December 2007 to December 2009: Website not sufficiently engaging, generated too little substantive feedback
- May to August 2009: Excessive time spent on policies and procedures
- May 2009: Failed to remove two private references from a recording that we published
- January to March 2009: Poor research strategy
- January to September 2008: Paying insufficient attention to professional development and support
- June 2008: Overly aggressive time estimates
- February to May 2008: Premature hiring
- December 2007 to May 2008: Research process should have incorporated more and earlier discussions with charities' staff members
December 2007: Overaggressive and inappropriate marketing
How we fell short: As part of an effort to gain publicity, GiveWell's staff (Holden and Elie) posted comments on many blogs that did not give adequate disclosure of our identities (though we did use our real first names); in a smaller number of cases, we posted comments and sent emails that deliberately concealed our identities. Our actions were wrong and rightly damaged GiveWell's reputation. More detail is available via the page for the board meeting that we held in response.
Given the nature of our work, it is essential that we hold ourselves to the highest standards of transparency in everything we do. Our poor judgment caused many people who had not previously encountered GiveWell to become extremely hostile to it.
Steps we have taken to improve: We issued a full public disclosure and apology, and directly notified all existing GiveWell donors of the incident. We held a Board meeting and handed out penalties that were publicly disclosed, along with the audio of the meeting. We increased the Board's degree of oversight over staff, particularly with regard to public communications.
2009 to 2012: Errors in publishing private material
How we fell short: There were two issues, one larger and one smaller:
- Since 2009, we've made a practice of publishing notes from conversations with charities and other relevant parties. We share the notes we take with the other party so that they can make changes to the text before publication. We only publish a version of the notes that the other party approves, and we will keep the entire conversation confidential if the party asks us to.
In November 2012, a staff member completed an audit of all conversations that we had published. He identified two instances where we had erroneously published the pre-publication (i.e., not-yet-approved) version of the notes. We have emailed both organizations to apologize and inform them of the information that we erroneously shared.
- In October 2012, we published a blog post titled, "Evaluating people." Though the final version of the post did not discuss specific people or organizations, a draft version of the post had done so. We erroneously published the draft version which discussed individuals. We recognized our error within 5 minutes of posting and replaced the post with the correct version; the draft post was available in Google's cache for several hours and was likely available to people who received the blog via RSS if they had their RSS reader open before we corrected our error (and did not refresh their reader).
We immediately emailed all of the organizations and people that we had mentioned to apologize and included the section we had written about them. Note that none of the information we published was confidential; we merely did not intend to publish this information and it had not been fully vetted by GiveWell staff and sent to the organizations for pre-publication comment.
Steps we've taken to improve: In November 2012, we instituted a new practice for publishing conversation notes. We now internally store both private and publishable versions of conversation notes in separate folders (we hope that this practice reduces the likelihood that we upload the wrong file) and have assigned a staff member to perform a weekly audit to check whether any confidential materials have been uploaded. As of this writing (December 2012) we have performed 3 audits and found no instances of publishing private material.
We take the issue of publishing private materials very seriously because parties that share private materials with us must have confidence that we will protect their privacy. We have therefore reexamined our procedures for uploading files to our website and are planning to institute a full scale audit of files that are currently public as well as an ongoing procedure to audit our uploads.
December 2011: Miscommunicating to donors about fees and the deductibility of donations to our top charity
How we fell short: Our top-rated charity in 2011 was the Against Malaria Foundation. We made two errors in the way we communicated to donors about the ramifications of donating to AMF.
- Fees: On our donate to AMF page, we told donors that "no fees are charged on donations to AMF." This was incorrect. Donors who give via AMF's website are charged normal credit card processing fees. We now understand that we miscommunicated with AMF on this issue; AMF did not intend to communicate that there are no processing fees and was unaware that we were communicating this on our site.
- Tax deductibility in Australia: On our top charities page that we published on November 29, 2011, we listed Australia as one of the countries for which donors could take tax deductions. We believed this was accurate because AMF listed Australia as one of the countries in which it is a registered charity. In early December, an Australian donor emailed us to let us know that while AMF is a registered charity and corporations can deduct donations to it, it does not have a status that allows individuals to deduct donations to it. (This issue is discussed in a 2012 blog post.)
Steps we have taken to improve:
- Fees: We changed the language on our page to clarify that credit card fees are charged on donations made via AMF's website. We also gave donors who wished to support AMF the option of donating directly to GiveWell. Because GiveWell is enrolled in Google's Grants program, Google pays credit card processing fees on those donations; GiveWell then has the ability to regrant these funds to AMF.
- Tax deductibility in Australia: We took several actions. (1) We emailed Rob Mather, AMF's CEO. He agreed that the charity status page on AMF's website was misleading. AMF edited the page to clarify its status in Australia, and Rob Mather offered to refund any donations (or parts of donations) made by Australians relying on the fact that they could receive a tax deduction. (2) On our site, we removed Australia from the list of countries in which AMF is registered for individual-donor tax deductibility. (3) We emailed all Australian donors who had given to AMF (and had found AMF via GiveWell) since we had posted that donations to AMF are tax-deductible for Australians, to let them know we had erred, and we communicated Rob Mather's offer to refund donations. AMF is in the process of applying for tax-deductible status for individuals and will inform us if and when that has been granted. AMF has also told us that the two donors who have asked for refunds have both said they will donate the same amount to AMF when the tax-deductible status is in place.
March to June 2011: Poor research time allocation
How we fell short: Our top priority in 2011 was identifying additional top-rated charities. In February, we selected approximately 10 organizations we felt were most promising and contacted them. Over the next 3 months, we engaged in extensive back and forth with all of them, spending a great deal of time on each as we tried to answer all of our questions and write full reviews even though we realized that most of these organizations seemed unlikely to receive our highest ratings. (Often, we even spent a great deal of time negotiating with particular charities about the language we would use in their review.) Because we allocated so much time to this set of organizations, we did not allocate sufficient time to engaging with and learning about new organizations which would likely have had a better chance of receiving our top ratings.
Part of the reason we did this was because we wanted to make sure we understood the charities under investigation; part of the reason was because we wanted to publish reviews that were amenable to both us and them. On reflection, however, we believe that both of these goals were less important than the goal of maximizing our odds of finding top-rated charities, and we should have been quicker to sacrifice full understanding and in-depth reviews of these groups for the more important goal.
Steps we have taken to improve: In mid-June 2011, we adjusted our research process. Instead of aiming for a full, in-depth review of each charity contacted, we committed only to completing a single phone call and document review for a charity before drafting a review based on the materials we had (choosing not to engage in further back-and-forth if we did not find the charity highly promising). This change allowed us to consider and contact close to 100 charities in 2011. Note that there are downsides to this change: as of February 2012, we have a large backlog of "pending charities" from 2011 for which we still need to draft and publish reviews.
2006 to 2011: Tone issues
How we fell short: We continue to struggle with an appropriate tone on our blog, one that neither understates nor overstates our confidence in our views (particularly when it comes to charities that we do not recommend). A recent example of a problematic tone is our December 2009 blog post, Celebrated Charities that we Don't Recommend. Although it is literally true that we don't recommend any of the charities listed in that post, and although we stand by the content of each individual blog post linked, the summaries make it sound as though we are confident that these charities are not doing good work; in fact, it would be more accurate to say that the information we would need to be confident isn't available, and we therefore recommend that donors give elsewhere unless they have information we don't.
We wish to be explicit that we are forming best guesses based on limited information, and always open to changing our minds, but readers often misunderstand us and believe we have formed confident (and, in particular, negative) judgments. This leads to unnecessary hostility from, and unnecessary public relations problems for, the groups we discuss.
Steps we have taken to improve: We do feel that our tone has slowly become more cautious and accurate over time. At the time of this writing (July 2010), we are also resolving to run anything that might be perceived as negative past the group it discusses before we publish it publicly, giving them a chance to correct both facts and tone. (We have done this since our inception for charity reviews, but now intend to do it for blog posts and any other public content as well.)
July 2009 to November 2010: Quantitative charity ratings that confused rather than clarified our stances
How we fell short: Between July 2009 and November 2010, we assigned zero- to three-star ratings to all charities we examined. We did so in response to feedback from our fans and followers - in particular, arguments that people want easily digested, unambiguous “bottom line” information that can help them make a decision in a hurry and with a clean conscience. Ultimately, however, we decided that the costs of the ratings - in terms of giving people the wrong impression about where we stood on particular charities - outweighed the benefits.
Steps we have taken to improve: By December 2010, we will have replaced our quantitative ratings with more complex and nuanced bottom lines that link to our full reviews.
- September 2010 blog post on the problems with quantitative charity ratings
- October 2010 blog post on why these ratings don't fit with our mission
February to November 2010: Excessive resources allocated to U.S. Equality of Opportunity report
How we fell short: In February 2010, we wrote that we believed we could do an in-depth report on U.S. equality of opportunity, without needing to allocate substantial resources (particularly co-Founders' time). We therefore believed that this report represented a good use of resources even though we didn't find the cause to be among the most promising.
As it turned out, we had substantially underestimated the complexity and difficulty of this report, which we eventually published in November 2010. As we wrote shortly afterward, we believe that there is relatively little value in research reports on causes we don't find to be among the most promising. Thus, we ended up putting a large amount of time and effort into a report that we don't consider highly valuable to our mission.
Steps we have taken to improve: We feel we have learned that it is never simple to do a first-time in-depth report on a given charitable sector. Going forward, we do not intend to research new sectors except as a means to finding the best giving opportunities possible (see our February 2010 post for our reasoning). The one exception we've made is for disaster relief, due to the substantial media attention around it.
May 2008 to December 22, 2008: Failed to track website traffic/statistics
How we fell short: We discovered on 12/22/2008 that we had not been tracking any visits to our site since May of that year. At the time, we were not emphasizing website traffic as a metric, and so had not been checking up on it regularly. Failing to do so meant that we lost valuable data that could have been used to measure our progress over time. Over a year later, our attempts to review our progress have been hampered by the fact that this data is missing.
Steps we have taken to improve: We reinstituted tracking immediately upon discovering our error, and have since checked up on the tracking more regularly. Going forward, we plan to review key website metrics on a quarterly basis.
June to September 2008: Business plan over-focused on marketing, under-focused on research
How we fell short: After completing our first year of research, we agreed that the focus of our second year should be on increasing the "money moved" by our research. (Details here.) After several months focusing on this goal, we felt that we were spending insufficient time on research - the core of our mission - and weren't getting enough return out of the time we were spending on marketing. (Details here.) We now believe that we over-focused on marketing, at a stage in our development where doing so was premature, and as a result did not make as much progress during these months as we should have.
Steps we have taken to improve: We created a new business plan that shifted our focus primarily to research; this business plan, along with companion documents that lay out the reasons for our change of direction, is available here.
June 2007 to May 2008: Research process relied excessively on open-ended grant applications
How we fell short: In our first year, we focused our time and effort overwhelmingly on getting information from applicants, as opposed to from academic and other independent literature. Applicants found this process extremely time-intensive and burdensome, particularly given the size of the grants. We also found that much of the information we found essential in making informed decisions was not submitted through grant applications (rather, we found it through independent research).
Steps we have taken to improve: We have modified our research process. Our basic process now (for the latest implementation, see our research process for international aid) is to (a) search independent research to identify particularly promising approaches; (b) use heuristics to identify charities that are both transparent (i.e., sharing substantial information) and promising; (c) contact charities with specific and targeted questions. We feel that this process leads both to better information and to a lower burden on charities.
June to November 2007: Set overly ambitious deadlines for completing our research
How we fell short: In July 2007, when we mailed out grant applications to charities, we told them that we would award grants by December 2007. We did not finish all necessary research by December 2007, and thus only paid out 3 of our planned 5 grants by the agreed upon date. We paid the last two grants in early March of 2008.
What we are doing to improve: Setting deadlines was particularly difficult for our first year, as we had no previous experience with grantmaking. With the first year behind us, we have a better sense of how time-consuming research is, and we have since set more conservative expectations. Our 2008-2009 report on international aid was delivered by the deadline we had set.
June 2007: Poorly constructed "causes" led to suboptimal grant allocation
How we fell short: For our first year of research, we grouped charities into causes ("Saving lives," "Global poverty," etc.) based on the idea that charities within one cause could be decided on by rough but consistent metrics: for example, we had planned to decide Cause 1 (saving lives in Africa) largely on the basis of estimating the “cost per life saved” for each applicant. The extremely disparate nature of different charities' activities meant that there were major limits to this type of analysis (we had anticipated some limits, but we encountered more).
Because of our commitment to make one grant per cause and our overly rigid and narrow definitions of "causes," we feel that we allocated our grant money suboptimally. For example, all Board members agreed that we had high confidence in two of our Cause 1 (saving lives) applicants, but very low confidence in all of our Cause 2 (global poverty) applicants. Yet we had to give equal size grants to the top applicant in each cause (and give nothing to the 2nd-place applicant in Cause 1).
Steps we have taken to improve: We have shifted our approach to "causes" so that they are defined more broadly. This gives us more flexibility to grant the organizations that appeal to us most. We now explore broad sets of charities that intersect in terms of the people they serve and the research needed to understand them, rather than narrower causes based on the goal of an “apples to apples” comparison using consistent metrics. For example, our recent research report addresses the broad area of international aid.
December 2014: Errors in our cost-effectiveness analysis of Development Media International (DMI)
How we fell short: In early 2015, we discovered some errors in our cost-effectiveness analysis of DMI. See this blog post for details.
Steps we have taken to improve: Going forward, we plan to improve the general transparency and clarity of our cost-effectiveness models, and explicitly prioritize work on cost-effectiveness throughout our research process. See this section of our 2015 annual review for more.
November to December 2014: Lack of confidence in the cost-effectiveness analyses we relied on for our top charities recommendations
How we fell short: We were not highly confident in our cost-effectiveness estimates when we announced our updated charity recommendations at the end of 2014, a fact we noted in the announcement post. This was because we finalized our cost-effectiveness analyses later in the year than would have been ideal. See this part of our 2014 annual review for more detail.
Steps we have taken to improve: We plan to improve these analyses by reworking our cost-effectiveness models to improve the general transparency and clarity of the analyses and explicitly prioritizing work on cost-effectiveness throughout our research process.
We are also experimenting with more formal project management to increase the likelihood that we complete all tasks necessary for our year-end recommendations update at the appropriate time.
January to December 2014: Completed fewer intervention reports than projected
How we fell short: We published fewer intervention reports than we had planned to at the beginning of 2014. We completed two new intervention reports in 2014, but at the beginning of 2014, we wrote that we hoped to publish 9-14 new reports during the year. On reflection, our goal of publishing 9-14 intervention reports was arbitrary and unrealistic given the amount of time that it has typically taken us to complete intervention reports in the past. See this part of our 2014 annual review for more detail.
Steps we have taken to improve: We have learned more about how much work is involved in completing an intervention report and hope to make more realistic projections about how many we can complete in the future.
November 2014: Suboptimal grant recommendation to Good Ventures
How we fell short: In 2014, we erred in our recommendation to Good Ventures about its giving allocation to our top charities. We made this recommendation two weeks before we announced our recommendations publicly so that we could announce their grants as part of our top charities announcement. If we had fully completed our analysis before making a recommendation to Good Ventures, we likely would have recommended relatively more to AMF and relatively less to GiveDirectly. See this part of our 2014 annual review for more detail.
Steps we have taken to improve: In the end, we adjusted the public targets we announced based on the grants Good Ventures had committed to, so we don’t believe that donors gave suboptimally overall. In the future we expect to make — and announce — our recommendations to Good Ventures and the general public simultaneously.
November 2014: Not informing candidate charities of our recommendation structure prior to publishing recommendations
How we fell short: In our 2014 recommendation cycle, we did not alert our candidate charities of our "Standout Charity" second-tier rating prior to announcing our recommendations publicly. Some of our candidate charities were surprised when they saw their ranking as a "Standout Charity," as they had been assuming that they would either be recommended as a top charity or not recommended at all.
Steps we have taken to improve: We will be more cognizant of how we communicate with charities in the future and will continue to solicit feedback from them so we can identify any other ways in which our communication with them is suboptimal.
July 2014: Published an update to the intervention report on cash transfers that misstated our view
How we fell short: Elie assigned a relatively new Research Analyst to the task of updating the intervention report on cash transfers. The analyst made the specific updates asked for in the task, which led him to change the report’s conclusion on the effect of cash transfers on business expenditures and revenue. A Summer Research Analyst vetted the page, and we published it. After publishing the update, another GiveWell staff member, who had worked on the page previously, noticed that the report’s conclusion on business expenditures and revenue misstated our view.
Steps we have taken to improve: We have made two changes. First, when passing off ownership of a page from one staff member to another, we now involve all staff members who had previously owned the page via an explicit "hand-off" meeting, and by getting their approval before publishing the page. Second, we are now more careful to ensure that all changes made by relatively inexperienced staff are reviewed by more experienced staff before publication.
February 2014: Incorrect information on homepage
How we fell short: On February 4, 2014, we asked our website developer to make a change to the code that generates our homepage. In the process, he inadvertently copied the homepage content from November 2013. This content had two main differences from the up-to-date content. First, it described our top charities as “proven, cost-effective, underfunded and outstanding” rather than “evidence-backed, thoroughly vetted, and underfunded,” wording we changed in late 2013 because we felt it more accurately described our top charities. Second, it listed our top charities as AMF, GiveDirectly, and SCI, rather than GiveDirectly, SCI, and Deworm the World. According to our web analytics, 98 people visited our AMF page directly after visiting the homepage, possibly believing AMF to be a top charity of ours. Note that the top of our AMF review correctly described our position on AMF at this time.
Steps we’ve taken to improve: We discovered the problem on February 25 and fixed it immediately. We have added a step to our standard process for checking the website after a developer works on it to look for content that is not up to date.
January to November 2013: Social (non-family, non-financial) relationship between GiveWell staff members and staff of a recommended charity not publicly disclosed
How we fell short: Timothy Telleen-Lawton (GiveWell staff member as of April 2013) has been friends with Paul Niehaus (GiveDirectly President and Director) for many years. When Timothy met Holden Karnofsky (GiveWell's Co-Founder and Co-Executive Director) in April 2011, he suggested that GiveWell look into GiveDirectly and introduced Holden and Paul by email. GiveWell later recommended GiveDirectly as a top charity in November 2012, before Timothy was on GiveWell staff.
In January 2013, Holden began living in a shared house with Timothy, around the same time Timothy started a work trial at GiveWell. Paul has visited and stayed at the shared house several times. We should have publicly disclosed the social connection between Paul, Holden, and Tim.
Note that this mistake solely relates to information we should have publicly disclosed to avoid any appearance of impropriety. We do not believe that this relationship had any impact on our charity rankings. Tim was not the staff member responsible for the evaluation of GiveDirectly, and Holden has had relatively little interaction with Paul (and had relatively little interaction with Tim prior to moving to San Francisco in 2013).
Steps we have taken to improve: We publicly disclosed this fact in December 2013; at that time, we also created a page to disclose conflicts of interest.
February to September 2013: Infrequent updates on our top-ranked charity
How we fell short: We aimed to publish regular updates on the Against Malaria Foundation, but we went most of the year (February to September) without any updates. This was caused by our desire to publish comprehensive updates: we allowed the expectation that new information would soon be available to delay the publication of brief updates containing meaningful but limited information.
Steps we have taken to improve: As of July 2013, we changed our process for completing top-charity updates. We began publishing notes from our conversations with these charities (as we do for many of the conversations we have more generally) which should lead to more timely updates on our top charities.
May to June 2013: Unpublished website pages intermittently available publicly
How we fell short: From May 20 to June 26, private content was intermittently available to the public on the GiveWell website. A change we made on May 20 caused pages set to be visible to staff only to appear, in some browsers, as a page with a login screen and the unpublished content below it. Unpublished content includes both confidential information and incomplete research. Confidential information on unpublished pages is generally information that we expect to be able to publish, but for which we have not yet received approval from an external party. However, there are exceptions to this, and it is possible that sensitive information was revealed. We are not aware of any cases of sensitive information being revealed.
Steps we have taken to improve: We fixed the problem a few hours after discovering it. We have added monitoring of unpublished pages to our list of regular website checks.
April to December 2012: Taking too much of job applicants' time early in the recruiting process
How we fell short: During this period, our jobs page invited applicants to apply for our research analyst role. We responded to every applicant by asking them to work on a "charity comparison assignment" in which each applicant compared three charities and discussed which charity they would support and why. This assignment took applicants between 6 and 10 hours to complete. During this period, approximately 50 applicants submitted the assignment, of which we interviewed approximately 8.
We now feel that asking all applicants to complete this test assignment likely took more of their time than was necessary at an early stage in the recruiting process and may have led some strong applicants to choose not to apply.
Steps we've taken to improve: We no longer ask all applicants to complete this assignment. In December 2012, we changed our jobs page to more clearly communicate about our hiring process.
March to November 2012: Poor planning led to delayed 2012 charity recommendations release
How we fell short: In GiveWell's early years, we aimed to release updated recommendations by December 1st in order to post our recommendations before "giving season," the weeks at the end of the year when the vast majority of donations are made. In 2011, we released our recommendations in the last week of November, but then ran into problems related to donation processing. To alleviate those problems in the future, we planned to release our recommendations in 2012 by November 1st to give us sufficient time to deal with problems before the end-of-year rush of giving.
In 2012, we did not release our recommendations until the last week of November (significantly missing our goal). We continued to publish research about the cost-effectiveness and evidence of effectiveness for the interventions run by our top charities throughout December, which meant that some donors were making their giving decisions before we had published all the relevant information. The primary cause of the delay was that we did not start work on GiveDirectly, the new 2012 top-rated charity, until mid-September, which did not give us enough time to finish its full review by the November 1st deadline.
Steps we've taken to improve: In 2013, we again aim to release our recommendations by November 1. This year, we plan to explicitly consider possible top-rated charities on July 1st and move forward with any contenders at that point.
October 2012: Website downtime
How we fell short: On October 2, 2012, we attempted to perform security updates for our website and in the process caused the website to crash. Our understanding is that this crash would have been very hard to predict. The mistakes we made were in (a) poor timing of the attempt, and (b) poor handling of the restoration.
The site was offline for about four hours, largely because we (a) attempted the updates at a time when our website developer was not available to help us troubleshoot, and (b) attempted to restore the site through a method that we had not previously tried, which failed and caused further problems. Because traffic was higher than average due to recent media attention, the outage left approximately 300-400 visitors unable to access the site.
After the site came back online, there were a number of further problems, including some content being out of date, links pointing to incorrect pages, infinite redirect loops on some pages, and missing files. Because we did not fully test the site, we failed to recognize many of these problems at the time, and they persisted until the next morning.
Internal communication about the problems was sub-optimal.
Dealing with the issues caused by the crash cost a significant amount of high-level staff time.
Steps we have taken to improve: Going forward, our website developer will directly perform security updates and other complex website changes, or, at a minimum, we will conduct all updates and changes at a time when we know the developer is available to help solve unexpected problems. Before each software update, we will create an up-to-date backup of the website server that can be easily and quickly restored in the case of a site crash.
We will limit technical changes to the website to low traffic times -- early mornings or evenings and never following significant media attention or at other peak traffic times, except in the case of a very high value change.
In the future, we will be more cognizant of the importance of checking the site for problems following changes and communicating internally and externally about issues.
June 2012: Failure to discuss sensitive public communication with board member
How we fell short: In late June 2012, we published a blog post on the partnership between GiveWell and Good Ventures. We generally discuss sensitive public communication with a board member before we post, but failed to do so in this case. The post was not as clear as it should have been about the nature of GiveWell's relationship with Good Ventures. The post caused confusion among some in our audience; for example, we received questions about whether we had 'merged.'
Steps we've taken to improve: GiveWell staff will be more attentive in the future to sharing sensitive public information with the board member responsible for public communication before posting.
July 2007 to March 2012: Phone call issues
How we fell short: Throughout GiveWell's history, we have relied on Skype and staff's individual cell phones to make phone calls. This led to instances of poor call quality or dropped calls, but because GiveWell was a startup, those we spoke with generally understood. In addition, we had not always confirmed with participants the phone number to use for a particular call, or set up and sent agendas in advance. Earlier in GiveWell's history, participants likely understood that we were a very new, small organization just getting started and aiming to control costs. But as we've grown, this is no longer a reasonable justification, and both of the problems listed here may have had implications for the professionalism we've projected to those we've spoken with.
Steps we have taken to improve: We have continued to be more vigilant about confirming that all participants are aware of the number to use for scheduled calls. In March 2012, we set up dedicated lines and handsets for our calls.
December 2011: Poor communication to donors making larger donations (e.g., greater than $5,000) via the GiveWell website
How we fell short: In giving season 2011, there were 3 major issues which we communicated poorly about to donors:
- While Google covers credit card processing fees for charities enrolled in the Google Grants program (which includes GiveWell itself and many of our top charities), many charities are not enrolled, and therefore donors who give to them via our website do pay credit card processing fees on their donations. While these fees are small in absolute terms for smaller donors, a 3% fee on a $10,000 donation is $300. Some donors may realize this and choose to give via credit card regardless. Some, however, may not have realized this and would have preferred to mail a check to save the fee.
- People making large donations very frequently run into problems with their credit card companies (because they are spending so much more on a single item than they usually do). In our experience, about half of donations over $5,000 are declined the first time a donor tries to make the gift and are only cleared after the donor speaks with his or her card company. This creates confusion and unexpected hassle for donors trying to give to charity.
- Giving via appreciated stock has beneficial implications for donors, allowing them to reduce future capital gains taxes and therefore give more to charity (without giving more "real" money). We have never broadcast this message to donors.
Steps we have taken to improve: Though GiveWell's responsibility for communicating about these points varies, communicating well about all of them furthers our mission. We plan to communicate better about these points to larger donors in 2012. (More at a 2012 blog post.)
December 2011: Problems caused by GiveWell's limited control over the process for donating to our top charities
How we fell short:
- On December 21, 2011, a representative from Imperial College Foundation (the organization receiving donations for the support of the Schistosomiasis Control Initiative, our #2-rated charity in 2011) emailed us to let us know that its Google Checkout account had been suspended. Donors who wanted to give to SCI via the GiveWell website gave via Google Checkout, and though the Google Checkout button is on the GiveWell website, the charity owns the Checkout account and donations go directly to it. GiveWell staff therefore did not know there was a problem until the ICF representative informed us of it. We still do not know how long the problem lasted or whether any donors attempted to make donations during the time the account was suspended. (We do not even know how Google communicated to them about the error.) ICF contacted Google but has not determined what led to the account suspension.
Once we learned of the problem, we reconfigured donations to go through GiveWell.
- As noted elsewhere on this page, many larger donations made via credit card are initially declined by credit card companies, because many donors give more to charity in a single transaction than they spend on any other single purchase during the year. Because donations go directly to our charities, GiveWell at times has to coordinate with charities' representatives to cancel charges so that donors feel safe resubmitting their donation. This creates confusion, wastes time, and doesn't allow donors to complete the transaction as quickly as they would like.
- Setting up trackable donation processing for our top charities requires individual communication with each charity. This means that we must spend time communicating with each charity, and each charity must spend time creating its account. Also, in the event that the charity does not have time to set up the account or sets up the account but it has a problem, the required tracking may not be in place. With several charities in 2011, tracking was either not set up at the time we released our recommendations or we needed to create a one-off workaround to track donations to them.
Steps we have taken to improve:
- We plan to better advise larger donors of their non-credit-card options for donating and potential hassles of donating via credit card.
- We are now considering switching over donations to all charities to go through GiveWell so that we are immediately aware of any problems.
- We aim to complete our recommendations earlier in 2012 than 2011 (to give us additional time to address any problems that come up).
March to December 2011: Problem with tracking code in Google Analytics led to erroneous web stats
How we fell short:
Background: In March 2011, we were accepted into Google's Grants program, which provides $10,000 per month of free AdWords advertising to enrollees. At the time we enrolled in the program, our understanding was that we needed to create a new Google account and Google Analytics login to track traffic from our Grants AdWords. We set up this account and added the necessary tracking code to our website. During 2011, we realized that Google Analytics was measuring significantly more visitors to our website than our other analytics program, Clicky. Elie Hassenfeld, the staff member primarily responsible for web metrics, wasn't able to quickly determine the cause of the discrepancy and thought it might be due to differences in the way the programs tracked visitors. In December 2011, it became apparent that the discrepancies were so great that they were unlikely to be caused by differences between the programs; instead, the cause of the problem was likely a mistake we made when adding the additional tracking code in March.
Problems caused: We over-reported our web traffic in quarterly metrics updates published in April, July, and October 2011. We have the ability to view accurate historical web stats via Clicky, but it is harder to use Clicky for historical data than Analytics, and therefore it is now more time-consuming for us to analyze historical web traffic.
Steps we have taken to improve: On December 6, 2011, we fixed the problem. We confirmed this by comparing visitors in Clicky and Google Analytics for a few weeks afterwards, and they remained consistent. Our metrics update published January 5, 2012 presented accurate web statistics.
January to September 2011: Staff time spent managing volunteers
How we fell short: Before 2010, we generally did not offer volunteer opportunities. Starting in 2010 and to a larger extent in 2011, we felt that we had productive work for volunteers and asked anyone who wanted to volunteer to complete a test assignment before moving on to more volunteer work. Near the end of 2011, we reflected and realized that we had gained limited value from the volunteers we had while spending significant time managing the process.
The staff member responsible for managing volunteers tracked approximately 75 hours managing volunteers between January and September 2011, significant time spent on work that yielded limited value. (Note that this time is distinct from managing prospective full-time employees, which we do consider time well spent.)
Steps we have taken to improve: In October 2011, we emailed all volunteers to let them know that we no longer have volunteer opportunities. We are not currently accepting additional volunteers.
More at our 2011 blog post, A Good Volunteer is Hard to Find.
November 22, 2010: Website down during time with media attention
How we fell short: On November 21, 2010, New York Times columnist Nicholas Kristof wrote a column mentioning GiveWell, which resulted in significant traffic to the GiveWell website. Due to a recent change to the GiveWell website, we knew that some visitors would see the www.givewell.org/your-charity page displayed incorrectly. On November 22, our web developer attempted to fix this. He made an error, and at approximately 10:40am EST, the website went down. Because all GiveWell staff were living in India at this time, no staff member noticed the problem. Several GiveWell fans emailed us to note that the site was down, but because we were in India it was nighttime, and we did not see the emails. Overall, the site was down for 10.5 hours, and we estimate that we lost a total of approximately 450 visitors to our website. (Note that the website was not technically "down"; it was displaying a blank page, so our system that notifies us of website outages did not send us a message.)
Steps we have taken to improve:
- We do not ask our developer to make changes to the website during times when we expect increased traffic.
- We have informed our board members and others close to the project that any visible problem with the GiveWell site is a major issue and that they should take whatever steps are necessary (i.e., email, call, text) to make sure we're aware of the problem and are working to address it.
Late 2009: Misinterpreted a key piece of information about a charity to which we gave a $125,000 grant
How we fell short: When reviewing Village Enterprise (formerly Village Enterprise Fund) in late 2009, we projected that they would spend 41% of total expenses on grants to business groups, because we misinterpreted a document they sent us which projected spending 41% of total expenses on business grants and mentorship expenses. We do not know what mentorship expenses were expected to be so we do not know the magnitude of our error. Village Enterprise ended up spending 20% of total expenses on business grants in FY 2010. We caught this mistake ourselves when we were updating the review in August 2011. Village Enterprise plans to spend 28% of total expenses on business grants in FY 2012.
Steps we are taking to improve: We have updated our review of Village Enterprise to reflect the correct distribution of expenses. Going forward, before publishing a page, at least one additional GiveWell employee will check the original source of figures that play a key role in our conclusions about a charity or program.
August 1, 2009 to December 31, 2009: Grant process insufficiently clear with applicants about our plans to publish materials
How we fell short: Between 8/1/2009 and 12/31/2009, we accepted applications for $250,000 in funding for economic empowerment programs in sub-Saharan Africa. We attempted to be extremely clear with charities that we planned on sharing the materials they submitted, and that agreeing to disclosure was a condition of applying, but in a minority of cases, we failed to communicate this. We conceded these cases and gave the charities in question the opportunity to have their materials - and even the mention of the fact that they had applied for funding - withheld.
We try to avoid keeping materials confidential unless absolutely necessary, and in this case our unclear communications led to confrontations and to confidentiality situations that could have been avoided.
Details at this blog post.
Steps we have taken to improve:
- We offered the minority of charities with whom we'd been unclear the option not only to have their materials omitted, but to have us not disclose the fact that they applied for funding from us.
- We added language to the top of our charity reviews to clarify what a "0-star rating" means.
- In the future, we may publicly publish pages on charities we consider before we accept materials from them, in order to make our intentions about disclosure and public discussion absolutely clear.
November 25, 2009: Mishandling incentives to share information
How we fell short: A blog post discussing the Acumen Fund paraphrased information we'd been given during Acumen's application for funding from us. An Acumen Fund representative told us this had come off as a "bait and switch": using the grant application as a pretense for gathering information that we could use for a negative piece. (This was not the case; we had invited Acumen to apply in the hopes that they would be a strong applicant, and would have written a similar blog post afterward if they had simply declined to speak with us.)
We try to avoid creating incentives for charities to withhold information, given how little is available currently. Therefore, we are generally careful with how we use any substantive information that is disclosed, and generally check with the charity in question before publishing anything that could be construed as "using it to make a negative point." (An example is our post on microfinance repayment rates, which uses voluntarily disclosed information to raise concerns about the repayment rate while attempting to be clear that the organization in question should not be singled out for this disclosure. We checked with the organization discussed before making this post.)
In this case, we published our post without such a check, reasoning that we were not sharing any substantive materials (only paraphrasing general statements from representatives). Doing so gave the impression that sharing more information can result in more negative coverage.
We continue to struggle with the balance between disclosing as much information as possible and avoiding disincentives to share information. We will not find a solution in every case, but feel that we mishandled this one.
Steps we have taken to improve: We have let Acumen Fund know that we regret this incident and resolved to be more careful about quoting from representatives and grant applications in the future.
May 20, 2009 to August 27, 2009 and October 25, 2009 to November 11, 2009: Carelessness with updates to website; broke email update signup form
How we fell short: In 2009, we made multiple changes to our website with the help of a contractor, and we were not sufficiently careful about checking all changes to make sure they retained all functionality. As a result, website visitors lost the ability to sign up for email updates (from the front page - other forms worked) between 5/20/09 and 8/27/09 and between 10/25/2009 and 11/11/2009.
Steps we have taken to improve: We try to make all changes on a "mirror site" first (we had a mirror site at the time but were bypassing it for multiple small changes). When we make a change directly to the live site, we test relevant functionality immediately afterward.
December 2007 to December 2009: Website not sufficiently engaging, generated too little substantive feedback
How we fell short: For the early years of our project, our reviews, while thorough, were overly dense and difficult to engage with. Feedback from our supporters often included this theme. In addition, we received less critical engagement with our analysis than we would have liked from those not directly involved in the project. We received many emails offering general support or asking us to consider a particular charity, but the number of people who critiqued the content of our reviews – through email, survey, discussion forum, or our blog – remained low.
Steps we have taken to improve: We devoted substantial time over the 8/2009-12/2009 period to revamping our website (for example, making summary information by topic easier to find) and to making frequent posts on our blog that present our research in more accessible ways. We have also actively sought out feedback from relevant experts. The level of feedback has improved.
May to August 2009: Excessive time spent on policies and procedures
How we fell short: In the process of our 2008 financial audit, we decided to draft a comprehensive set of internal policies, an employee manual, and similar documents.
Doing so turned out to be beyond our capacity as a small organization. We ultimately decided to maintain only the most essential policies, which are now posted online. All in all, the process ended up costing us significant time, and we struggled to file our return by the final deadline (although we did do so).
May 2009: Failed to remove two private references from a recording that we published
How we fell short: In May 2009, we discussed the Millions Saved project with a staff member of the project, Dr. Jessica Gottlieb, and then published a copy of the recording of the conversation to our website. Dr. Gottlieb approved making the recording public on the condition that we remove personal references that she made during the conversation. We partially removed the references, but we failed to remove one person's email address and Dr. Gottlieb's suggestion that we speak with a particular person. We noticed this error in February 2014 while preparing to use this recording as part of a test assignment for potential employees. According to our logs, no one had downloaded the audio file during the previous year.
Steps we have taken to improve: We notified Dr. Gottlieb about this mistake and apologized to her. Subsequent to (and unrelated to) this error, we implemented a formal procedure for reviewing uploaded files to confirm that all requested changes have been made.
January to March 2009: Poor research strategy
How we fell short: While conducting research for our 2008-2009 international aid report, we found ourselves very impressed with the Carter Center, and put a large amount of time into understanding - and writing up - its programs that most appealed to us. However, we had neglected to first establish the question of how its funding was allocated between programs. We had a general sense that the programs that appealed to us were also the largest programs, but after further investigation we found ourselves continually unable to verify this. As of November 2009, we still had not gotten the basic information we needed to have confidence in the Carter Center.
If we had focused on the basics (on which programs does the Carter Center focus?) first, we would have saved the substantial time we put into its review.
Steps we have taken to improve: We have become more disciplined about the order in which we ask questions about a charity. First, we need to understand where its funds are going and what it does; only then do we decide which programs to investigate deeply.
January to September 2008: Paying insufficient attention to professional development and support
How we fell short: At our board meeting in January 2008, we agreed to explore options for professional development and mentoring, in light of the relative youth and inexperience of our staff. GiveWell staff put a lower priority on this than more time-sensitive goals, and while we explored a few options, we made little progress on it between January and September. At the September Board meeting, the Board criticized this lack of progress and reiterated the need for professional development and mentoring.
Steps we have taken to improve: We now have two highly regular mentoring relationships, and two more in a "trial phase." We have also stepped up Board oversight through a monthly conference call (attendance is optional but has generally been high) and more regular calls with Board Vice-President Lindy Miller. An update on professional development was presented at our July 2009 Board meeting.
June 2008: Overly aggressive time estimates
How we fell short: The timeline presented in our year-2 plan (published 6/19/2008) called for us to finish a round of both marketing (raising GiveWell Pledges) and research (publishing our full report on developing-world aid) in December 2008. We had little to go on in forming this estimate, and largely chose this target date because of the significance of the holiday season in giving (rather than based on quantified time estimates).
Steps we have taken to improve: We have been recording our hours since January of 2008, and have gradually been improving the detail of our timesheets. We now have enough data on how our time is spent to make more detailed estimates. Our 2008-2009 international aid report was published on 7/1/2009, consistent with the goals we had set at the end of 2008.
February to May 2008: Premature hiring
How we fell short: In an attempt to increase our research capacity, we hired our strongest volunteer in January 2008, but terminated the relationship at the end of May. We mutually agreed that, at that stage of our development, we couldn't provide the training and management necessary for someone of his skill set to add significant value.
Steps we have taken to improve: We paused recruiting with the intent of picking it back up when we had a more systematized process. We went through a series of hires in early 2009, but at that time we still did not have a systematized enough process, and relationships were terminated fairly quickly. In mid-2009, we again reassessed our situation and made two hires who lasted substantially longer. One worked for us from May through December of 2009, added significant value, and left in December 2009 for an opportunity in the consulting industry; another started work in early July 2009 and is still with us (and adding significant value) as of March 2010.
December 2007 to May 2008: Research process should have incorporated more and earlier discussions with charities' staff members
How we fell short: We undervalued discussions with charities' staff during our first year; speaking directly with staff gives us the opportunity to get a clear picture of how an organization views itself, and therefore what sorts of information we should seek to assess whether its approach works as intended. Applicants encouraged us to put more of our time into personal visits, so that we could “get to know” organizations rather than thrusting pre-defined questions on them.
Steps we have taken to improve: We now begin any charity evaluation (once the initial heuristics have been passed and we've learned what we can from the website) with discussions with staff.