This page logs mistakes we've made, strategies we should have planned and executed differently, and lessons we've learned.
A full review of progress to date, both accomplishments and shortcomings, is available here.
Please contact us with other items that should be listed here.
How we fell short: As part of an effort to gain publicity, GiveWell's staff (Holden and Elie) posted comments on many blogs that did not give adequate disclosure of our identities (though we did use our real first names); in a smaller number of cases, we posted comments and sent emails that deliberately concealed our identities. Our actions were wrong and rightly damaged GiveWell's reputation. More detail is available via the page for the board meeting that we held in response.
Given the nature of our work, it is essential that we hold ourselves to the highest standards of transparency in everything we do. Our poor judgment caused many people who had not previously encountered GiveWell to become extremely hostile to it.
Steps we have taken to improve: We issued a full public disclosure and apology, and directly notified all existing GiveWell donors of the incident. We held a Board meeting and handed out penalties that were publicly disclosed, along with the audio of the meeting. We increased the Board's degree of oversight over staff, particularly with regard to public communications.
How we fell short: There were two issues, one larger and one smaller:
In November 2012, a staff member completed an audit of all conversations that we had published. He identified two instances where we had erroneously published the pre-publication (i.e., not-yet-approved) version of the notes. We have emailed both organizations to apologize and inform them of the information that we erroneously shared.
We immediately emailed all of the organizations and people that we had mentioned to apologize and included the section we had written about them. Note that none of the information we published was confidential; we merely did not intend to publish this information and it had not been fully vetted by GiveWell staff and sent to the organizations for pre-publication comment.
Steps we've taken to improve: In November 2012, we instituted a new practice for publishing conversation notes. We now internally store both private and publishable versions of conversation notes in separate folders (we hope that this practice reduces the likelihood that we upload the wrong file) and have assigned a staff member to perform a weekly audit to check whether any confidential materials have been uploaded. As of this writing (December 2012) we have performed 3 audits and found no instances of publishing private material.
We take the issue of publishing private materials very seriously because parties that share private materials with us must have confidence that we will protect their privacy. We have therefore reexamined our procedures for uploading files to our website and are planning to institute a full-scale audit of the files that are currently public, as well as an ongoing procedure to audit our uploads.
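The weekly audit described above can be approximated with a short script. The sketch below is purely illustrative (GiveWell has not published its tooling); it assumes a local mirror of the public uploads directory and a plain-text manifest listing the files approved for publication, both of which are hypothetical.

```python
# Illustrative sketch only: compares published files against an approved list.
# The directory layout and manifest format are assumptions, not GiveWell's setup.
from pathlib import Path

def find_unapproved(public_dir, manifest_path):
    """Return names of files present in the public directory but absent
    from the manifest of approved filenames (one name per line)."""
    approved = set(Path(manifest_path).read_text().split())
    published = {p.name for p in Path(public_dir).iterdir() if p.is_file()}
    return sorted(published - approved)
```

An auditor would run this weekly and investigate any filename it returns; an empty result corresponds to the "no instances of publishing private material" outcome described above.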
How we fell short: Our top priority in 2011 was identifying additional top-rated charities. In February, we selected approximately 10 organizations we felt were most promising and contacted them. Over the next 3 months, we engaged in extensive back and forth with all of them, spending a great deal of time on each as we tried to answer all of our questions and write full reviews even though we realized that most of these organizations seemed unlikely to receive our highest ratings. (Often, we even spent a great deal of time negotiating with particular charities about the language we would use in their review.) Because we allocated so much time to this set of organizations, we did not allocate sufficient time to engaging with and learning about new organizations which would likely have had a better chance of receiving our top ratings.
Part of the reason we did this was that we wanted to make sure we understood the charities under investigation; part of the reason was that we wanted to publish reviews that were acceptable to both us and the charities. On reflection, however, we believe that both of these goals were less important than the goal of maximizing our odds of finding top-rated charities, and we should have been quicker to sacrifice full understanding and in-depth reviews of these groups for the more important goal.
Steps we have taken to improve: In mid-June 2011, we adjusted our research process. Instead of aiming for a full, in-depth review of each charity contacted, we only committed to completing a single phone call and document review for a charity before drafting a review based on the materials we had (choosing not to engage in further back-and-forth if we did not find the charity highly promising). This change allowed us to consider and contact close to 100 charities in 2011. Note that there are downsides to this change: as of February 2012, we have a large backlog of "pending charities" from 2011 for which we still need to draft and publish reviews.
How we fell short: Our top-rated charity in 2011 was the Against Malaria Foundation. We made two errors in the way we communicated to donors about the ramifications of donating to AMF.
Steps we have taken to improve:
How we fell short: between July 2009 and November 2010, we assigned zero- to three-star ratings to all charities we examined. We did so in response to feedback from our fans and followers - in particular, arguments that people want easily digested, unambiguous “bottom line” information that can help them make a decision in a hurry and with a clean conscience. Ultimately, however, we decided that the costs of the ratings - in terms of giving people the wrong impression about where we stood on particular charities - outweighed the benefits.
Steps we have taken to improve: by December 2010 we will replace our quantitative ratings with more complex and ambiguous bottom lines that link to our full reviews.
How we fell short: In February 2010, we wrote that we believed we could do an in-depth report on U.S. equality of opportunity, without needing to allocate substantial resources (particularly co-Founders' time). We therefore believed that this report represented a good use of resources even though we didn't find the cause to be among the most promising.
As it turned out, we had substantially underestimated the complexity and difficulty of this report, which we eventually published in November 2010. As we wrote shortly afterward, we believe that there is relatively little value in research reports on causes we don't find to be among the most promising. Thus, we ended up putting a large amount of time and effort into a report that we don't consider highly valuable to our mission.
Steps we have taken to improve: we feel we have learned that it is never simple to do a first-time in-depth report on a given charitable sector. Going forward, we do not intend to research new sectors except as a means to finding the best giving opportunities possible (see our February 2010 post for our reasoning). The one exception we've made is for disaster relief, due to the substantial media attention around it.
How we fell short: we continue to struggle with an appropriate tone on our blog, one that neither understates nor overstates our confidence in our views (particularly when it comes to charities that we do not recommend). A recent example of a problematic tone is our December 2009 blog post, Celebrated Charities that we Don't Recommend. Although it is literally true that we don't recommend any of the charities listed in that post, and although we stand by the content of each individual blog post linked, the summaries make it sound as though we are confident that these charities are not doing good work; in fact, it would be more accurate to say that the information we would need to be confident isn't available, and we therefore recommend that donors give elsewhere unless they have information we don't.
We wish to be explicit that we are forming best guesses based on limited information, and always open to changing our minds, but readers often misunderstand us and believe we have formed confident (and, in particular, negative) judgments. This leads to unnecessary hostility from, and unnecessary public relations problems for, the groups we discuss.
Steps we have taken to improve: we do feel that our tone has slowly become more cautious and accurate over time. At the time of this writing (July 2010), we are also resolving to run anything that might be perceived as negative by the group it discusses before we publish it, giving that group a chance to correct both facts and tone. (We have done this since our inception for charity reviews, but now intend to do it for blog posts and any other public content as well.)
How we fell short: We discovered on 12/22/2008 that we had not been tracking any visits to our site since May of that year. At the time, we were not emphasizing website traffic as a metric, and so had not been checking up on it regularly. Failing to do so meant that we lost valuable data that could have been used to measure our progress over time. Over a year later, our attempts to review our progress have been hampered by the fact that this data is missing.
Steps we have taken to improve: we reinstituted tracking immediately upon discovering our error, and have since checked up on the tracking more regularly. Going forward, we plan to review key website metrics on a quarterly basis.
How we fell short: after completing our first year of research, we agreed that the focus of our second year should be on increasing the "money moved" by our research. (Details here.) After several months focusing on this goal, we felt that we were spending insufficient time on research - the core of our mission - and weren't getting enough return out of the time we were spending on marketing. (Details here.) We now believe that we over-focused on marketing, at a stage in our development where doing so was premature, and as a result did not make as much progress during these months as we should have.
Steps we have taken to improve: we created a new business plan that shifted our focus primarily to research; this business plan, along with companion documents that lay out the reasons for our change of direction, is available here.
How we fell short: For our first year of research, we grouped charities into causes ("Saving lives," "Global poverty," etc.) based on the idea that charities within one cause could be decided on by rough but consistent metrics: for example, we had planned to decide Cause 1 (saving lives in Africa) largely on the basis of estimating the “cost per life saved” for each applicant. The extremely disparate nature of different charities' activities meant that there were major limits to this type of analysis (we had anticipated some limits, but we encountered more).
Because of our commitment to make one grant per cause and our overly rigid and narrow definitions of "causes," we feel that we allocated our grant money suboptimally. For example, all Board members agreed that we had high confidence in two of our Cause 1 (saving lives) applicants, but very low confidence in all of our Cause 2 (global poverty) applicants. Yet we had to give equal-size grants to the top applicant in each cause (and give nothing to the 2nd-place applicant in Cause 1).
Steps we have taken to improve: We have shifted our approach to "causes" so that they are defined more broadly. This gives us more flexibility to grant the organizations that appeal to us most. We now explore broad sets of charities that intersect in terms of the people they serve and the research needed to understand them, rather than narrower causes based on the goal of an “apples to apples” comparison using consistent metrics. For example, our recent research report addresses the broad area of international aid.
How we fell short: In July 2007, when we mailed out grant applications to charities, we told them that we would award grants by December 2007. We did not finish all necessary research by December 2007, and thus only paid out 3 of our planned 5 grants by the agreed-upon date. We paid the last two grants in early March of 2008.
What we are doing to improve: Setting deadlines was particularly difficult for our first year, as we had no previous experience with grantmaking. With the first year behind us, we have a better sense of how time-consuming research is, and we have since set more conservative expectations. Our 2008-2009 report on international aid was delivered by the deadline we had set.
How we fell short: In our first year, we focused our time and effort overwhelmingly on getting information from applicants, as opposed to from academic and other independent literature. Applicants found this process extremely time-intensive and burdensome, particularly given the size of the grants. We also found that much of the information we found essential in making informed decisions was not submitted through grant applications (rather, we found it through independent research).
Steps we have taken to improve: We have modified our research process. Our basic process now (for the latest implementation, see our research process for international aid) is to (a) search independent research to identify particularly promising approaches; (b) use heuristics to identify charities that are both transparent (i.e., sharing substantial information) and promising; (c) contact charities with specific and targeted questions. We feel that this process leads both to better information and to a lower burden on charities.
How we fell short: Timothy Telleen-Lawton (GiveWell staff member as of April 2013) has been friends with Paul Niehaus (GiveDirectly President and Director) for many years. When Timothy met Holden Karnofsky (GiveWell's Co-Founder and Co-Executive Director) in April 2011, he suggested that GiveWell look into GiveDirectly and introduced Holden and Paul by email. GiveWell later recommended GiveDirectly as a top charity in November 2012, before Timothy was on GiveWell staff.
In January 2013, Holden began living in a shared house with Timothy, around the same time that Timothy began a trial period working at GiveWell. Paul has visited and stayed at the shared house several times. We should have publicly disclosed the social connection between Paul, Holden, and Tim.
Note that this mistake solely relates to information we should have publicly disclosed to avoid any appearance of impropriety. We do not believe that this relationship had any impact on our charity rankings. Tim was not the staff member responsible for the evaluation of GiveDirectly, and Holden has had relatively little interaction with Paul (and had relatively little interaction with Tim prior to moving to San Francisco in 2013).
Steps we have taken to improve: We publicly disclosed this fact in December 2013; at that time, we also created a page to disclose conflicts of interest.
How we fell short: We aimed to publish regular updates on the Against Malaria Foundation, but we went most of the year (February to September) without any updates. This was caused by our desire to publish comprehensive updates: we allowed the expectation that new information would shortly be available to delay publishing brief updates containing meaningful but limited information.
Steps we have taken to improve: As of July 2013, we changed our process for completing top-charity updates. We began publishing notes from our conversations with these charities (as we do for many of the conversations we have more generally) which should lead to more timely updates on our top charities.
How we fell short: From May 20 to June 26, private content was intermittently available to the public on the GiveWell website. A change we made on May 20 caused pages set to be visible only to staff to appear, in some browsers, as a login screen with the unpublished content displayed below it. Unpublished content includes both confidential information and incomplete research. Confidential information on unpublished pages is generally information that we expect to be able to publish, but for which we have not yet received approval from an external party. However, there are exceptions to this, and it is possible that sensitive information was revealed. We are not aware of any cases of sensitive information being revealed.
Steps we have taken to improve: We fixed the problem a few hours after discovering it. We have added monitoring of unpublished pages to our list of regular website checks.
How we fell short: During this period, our jobs page invited applicants to apply for our research analyst role. We responded to every applicant by asking them to work on a "charity comparison assignment" in which each applicant compared three charities and discussed which charity they would support and why. This assignment took applicants between 6 and 10 hours to complete. During this period, approximately 50 applicants submitted the assignment, of which we interviewed approximately 8.
We now feel that asking all applicants to complete this test assignment likely took more of their time than was necessary at an early stage in the recruiting process and may have led some strong applicants to choose not to apply.
Steps we've taken to improve: We no longer ask all applicants to complete this assignment. In December 2012, we changed our jobs page to more clearly communicate about our hiring process.
How we fell short: In early GiveWell years, we aimed to release updated recommendations by December 1st in order to post our recommendations before "giving season," the weeks at the end of the year when the vast majority of donations are made. In 2011, we released our recommendations in the last week of November, but then ran into problems related to donation processing. To alleviate those problems in the future, we planned to release our recommendations in 2012 by November 1st to give us sufficient time to deal with problems before the end of the year rush of giving.
In 2012, we did not release our recommendations until the last week of November (significantly missing our goal). We continued to publish research about the cost-effectiveness and evidence of effectiveness for the interventions run by our top charities throughout December, which meant that some donors were making their giving decisions before we had published all the relevant information. The primary cause of the delay was that we did not start work on GiveDirectly, the new top-rated charity for 2012, until mid-September, which did not give us enough time to finish its full review by the November 1st deadline.
Steps we've taken to improve: In 2013, we again aim to release our recommendations by November 1. This year, we plan to explicitly consider possible top-rated charities on July 1st and move forward with any contenders at that point.
How we fell short:
On October 2, 2012, we attempted to perform security updates for our website and in the process caused the website to crash. Our understanding is that this crash would have been very hard to predict. The mistakes we made were in (a) poor timing of the attempt, and (b) poor handling of the restoration.
The site was offline for about four hours largely due to (a) attempting the updates at a time when our website developer was not available to help us troubleshoot, and (b) our attempt to restore the site through a method that we had not previously tried and which failed and caused further problems. Due to higher than average traffic because of recent media attention, this caused about 300-400 visitors to be unable to access the site.
After the site came back online, there were a number of further problems, including some content being out of date, links pointing to incorrect pages, infinite redirect loops on some pages, and missing files. We did not fully test the site and therefore failed to recognize many of the problems at the time and they persisted until the next morning.
Internal communication about the problems was sub-optimal.
Dealing with the issues caused by the crash cost a nontrivial amount of high-level staff time.
Steps we have taken to improve:
Going forward, our website developer will directly perform security updates for our website and other complex website changes, or, at a minimum, we will conduct all updates and changes at a time when we know the developer is available to help solve unexpected problems. Before each software update, we will create an up-to-date backup of the website server that can be easily and quickly restored in the case of a site crash.
We will limit technical changes to the website to low traffic times -- early mornings or evenings and never following significant media attention or at other peak traffic times, except in the case of a very high value change.
In the future, we will be more cognizant of the importance of checking the site for problems following changes and communicating internally and externally about issues.
How we fell short: In late June 2012, we published a blog post on the partnership between GiveWell and Good Ventures. We generally discuss sensitive public communication with a board member before we post, but failed to do so in this case. The post was not as clear as it should have been about the nature of GiveWell's relationship with Good Ventures. The post caused confusion among some in our audience; for example, we received questions about whether we had 'merged.'
Steps we've taken to improve: GiveWell staff will be more attentive in the future to sharing sensitive public information with the board member responsible for public communication before posting.
How we fell short: Throughout GiveWell's history, we have relied on Skype and staff's individual cell phones to make phone calls. This led to instances of poor call quality or dropped calls, but given the fact that GiveWell was a startup, those we spoke with generally understood. In addition, we had not always confirmed with participants the phone number to use for a particular call, or sent agendas in advance. Earlier in GiveWell's history, participants likely understood that we were a very new, small organization just getting started and aiming to control costs. But as we've grown, this is no longer a reasonable justification, and both of the problems listed here may have had implications for the professionalism we've projected to those we've spoken with.
Steps we have taken to improve: We have continued to be more vigilant about confirming that all participants are aware of the number to use for scheduled calls. As of February 2012, we are in the process of purchasing office phones that should improve call quality issues.
How we fell short: In giving season 2011, there were 3 major issues about which we communicated poorly to donors:
Steps we have taken to improve: Though GiveWell's responsibility for communicating about the points above varies, communicating well about all of the above furthers our mission. We plan to communicate better about these points to larger donors in 2012. (More at a 2012 blog post.)
How we fell short: Before 2010, we generally did not offer volunteer opportunities. Starting in 2010 and to a larger extent in 2011, we felt that we had productive work for volunteers and asked anyone who wanted to volunteer to complete a test assignment before moving on to more volunteer work. Near the end of 2011, we reflected and realized that we had gained limited value from the volunteers we had while spending significant time managing the process.
The staff member responsible for managing volunteers tracked approximately 75 hours managing volunteers between January and September 2011, significant time spent on work that yielded limited value. (Note that this time is distinct from managing prospective full-time employees, which we do consider time well spent.)
Steps we have taken to improve: In October 2011, we emailed all volunteers to let them know that we no longer have volunteer opportunities. We are not currently accepting additional volunteers.
More at our 2011 blog post, A Good Volunteer is Hard to Find.
How we fell short:
Once we learned of the problem, we reconfigured donations to go through GiveWell.
Steps we have taken to improve:
How we fell short:
Background: In March 2011, we were accepted into Google's Grants program, which provides $10,000 per month of free AdWords advertising to enrollees. At the time we enrolled in the program, our understanding was that we needed to create a new Google account and Google Analytics login to track traffic from our Grants' AdWords. We set up this account and added the necessary tracking code to our website. During 2011, we realized that Google Analytics was measuring significantly more visitors to our website than our other analytics program, Clicky. Elie Hassenfeld, the staff member primarily responsible for web metrics, wasn't able to quickly determine the cause of the discrepancy and thought it might be due to differences in the way the programs tracked visitors. In December 2011, it became apparent that the discrepancies were so great that they were unlikely to be caused by differences between the programs; instead, the cause of the problem was likely a mistake we made when adding the additional tracking code in March.
Problems caused: We over-reported our web traffic in quarterly metrics updates published in April, July, and October 2011. We have the ability to view accurate historical web stats via Clicky, but it is harder to use Clicky than Analytics for historical data, and it is therefore now more time-consuming for us to analyze historical web traffic.
Steps we have taken to improve: On December 6, 2011, we fixed the problem. We confirmed this by comparing visitors in Clicky and Google Analytics for a few weeks afterwards, and they remained consistent. Our metrics update published January 5, 2012 presented accurate web statistics.
How we fell short: On November 21, 2010, New York Times columnist Nicholas Kristof wrote a column mentioning GiveWell, which resulted in significant traffic to the GiveWell website. Due to a recent change to the GiveWell website, we knew that some visitors would see the www.givewell.org/your-charity page displayed incorrectly. On November 22, our web developer attempted to fix this. He made an error, and at approximately 10:40am EST, the website went down. Because all of GiveWell staff were living in India at this time, no staff members noticed the problem. Several GiveWell fans emailed us to note that the site was down, but because we were in India, it was during the night, and we did not see the emails. Overall, the site was down for 10.5 hours, and we estimate that we lost a total of approximately 450 visitors to our website. (Note that the website was not "down"; it was just displaying a blank page, so our system that notifies us of website outages did not send us a message.)
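The last point explains why the outage went undetected: a monitor that only checks whether the server responds is satisfied by a blank page. A content-based check avoids that failure mode. The sketch below is illustrative only; the URL, marker string, and function names are assumptions, not a description of GiveWell's actual monitoring.

```python
# Illustrative sketch, not GiveWell's actual monitoring setup.
from urllib.request import urlopen

def body_looks_healthy(body: str, marker: str) -> bool:
    """A page that returns HTTP 200 but renders blank still fails this check,
    because the marker text is missing from the response body."""
    return marker in body

def check_page(url: str, marker: str = "GiveWell", timeout: int = 10) -> bool:
    """Fetch the page and confirm it contains text that should always appear;
    an unreachable server counts as unhealthy as well."""
    try:
        body = urlopen(url, timeout=timeout).read().decode("utf-8", "replace")
    except OSError:
        return False
    return body_looks_healthy(body, marker)
```

Running such a check on a schedule and alerting when it returns `False` would have flagged the blank page even though the server was still responding.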
Steps we have taken to improve:
How we fell short: When reviewing Village Enterprise (formerly Village Enterprise Fund) in late 2009, we projected that they would spend 41% of total expenses on grants to business groups, because we misinterpreted a document they sent us which projected spending 41% of total expenses on business grants and mentorship expenses. We do not know what mentorship expenses were expected to be so we do not know the magnitude of our error. Village Enterprise ended up spending 20% of total expenses on business grants in FY 2010. We caught this mistake ourselves when we were updating the review in August 2011. Village Enterprise plans to spend 28% of total expenses on business grants in FY 2012.
Steps we are taking to improve: We have updated our review of Village Enterprise to reflect the correct distribution of expenses. Going forward, before publishing a page, at least one additional GiveWell employee will check the original source of figures that play a key role in our conclusions about a charity or program.
How we fell short: Between 8/1/2009 and 12/31/2009, we accepted applications for $250,000 in funding for economic empowerment programs in sub-Saharan Africa. We attempted to be extremely clear with charities that we planned on sharing the materials they submitted, and that agreeing to disclosure was a condition of applying, but in a minority of cases, we failed to communicate this. We conceded these cases and gave the charities in question the opportunity to have their materials - and even the mention of the fact that they had applied for funding - withheld.
We try to avoid keeping materials confidential unless absolutely necessary, and in this case our unclear communications led to confrontations and to confidentiality situations that could have been avoided.
Details at this blog post.
Steps we have taken to improve:
How we fell short: a blog post discussing the Acumen Fund paraphrased information we'd been given during Acumen's application for funding from us. An Acumen Fund representative told us this had come off as a "bait and switch": using the grant application as a pretense for gathering information that we could use for a negative piece. (This was not the case; we had invited Acumen to apply in the hopes that they would be a strong applicant, and would have written a similar blog post afterward if they had simply declined to speak with us.)
We try to avoid creating incentives for charities to withhold information, given how little is available currently. Therefore, we are generally careful with how we use any substantive information that is disclosed, and generally check with the charity in question before publishing anything that could be construed as "using it to make a negative point." (An example is our post on microfinance repayment rates, which uses voluntarily disclosed information to raise concerns about the repayment rate while attempting to be clear that the organization in question should not be singled out for this disclosure. We checked with the organization discussed before making this post.)
In this case, we published our post without such a check, reasoning that we were not sharing any substantive materials (only paraphrasing general statements from representatives). Doing so gave the impression that sharing more information can result in more negative coverage.
We continue to struggle with the balance between disclosing as much information as possible and avoiding disincentives to share information. We will not find a solution in every case, but feel that we mishandled this one.
Steps we have taken to improve: we have let Acumen Fund know that we regret this incident and resolved to be more careful about quoting from representatives and grant applications in the future.
How we fell short: In 2009, we made multiple changes to our website with the help of a contractor, and we were not sufficiently careful about checking all changes to make sure they retained all functionality. As a result, website visitors lost the ability to sign up for email updates (from the front page - other forms worked) between 5/20/09 and 8/27/09 and between 10/25/2009 and 11/11/2009.
Steps we have taken to improve: we try to make all changes on a "mirror site" first (we had a mirror site at the time but were bypassing it for multiple small changes). When we make a change directly to the live site, we test relevant functionality immediately afterward.
How we fell short: In the process of our 2008 financial audit, we decided to draft a comprehensive set of internal policies, employee manual, etc.
Doing so turned out to be beyond our capacity as a small organization. We ultimately decided to maintain only the most essential policies, which are now posted online. All in all, the process ended up costing us significant time, and we struggled to file our return by the final deadline (although we did do so).
How we fell short: In May 2009, we discussed the Millions Saved project with a staff member of the project, Dr. Jessica Gottlieb, and then published a copy of the recording of the conversation to our website. Dr. Gottlieb approved making the recording public on the condition that we remove personal references that she made during the conversation. We partially removed the references, but we failed to remove one person's email address and Dr. Gottlieb's suggestion that we speak with a particular person. We noticed this error in February 2014 while preparing to use this recording as part of a test assignment for potential employees. According to our logs, no one had downloaded the audio file during the previous year.
Steps we have taken to improve: we notified Dr. Gottlieb about this mistake and apologized to her. Subsequently (and unrelated to this error), we implemented a formal procedure for reviewing uploaded files to confirm that all requested changes have been made.
How we fell short: While conducting research for our 2008-2009 international aid report, we found ourselves very impressed with the Carter Center and put a large amount of time into understanding - and writing up - the programs that most appealed to us. However, we had neglected to first establish how its funding was allocated across programs. We had a general sense that the programs that appealed to us were also its largest, but after further investigation we were repeatedly unable to verify this. As of November 2009, we still had not obtained the basic information we needed to have confidence in the Carter Center.
If we had focused on the basics (on which programs does the Carter Center focus?) first, we would have saved the substantial time we put into its review.
Steps we have taken to improve: we have become more disciplined about the order in which we ask questions about a charity. First, we need to understand where its funds are going and what it does; only then do we decide which programs to investigate deeply.
How we fell short: at our board meeting in January 2008, we agreed to explore options for professional development and mentoring, in light of the relative youth and inexperience of our staff. GiveWell staff put a lower priority on this than more time-sensitive goals, and while we explored a few options, we made little progress on it between January and September. At the September Board meeting, the Board criticized this lack of progress and reiterated the need for professional development and mentoring.
Steps we have taken to improve: we now have two regular mentoring relationships, and two more in a "trial phase." We have also stepped up Board oversight through a monthly conference call (attendance is optional but has generally been high) and more regular calls with Board Vice-President Lindy Miller. An update on professional development was presented at our July 2009 Board meeting.
How we fell short: The timeline presented in our year-2 plan (published 6/19/2008) called for us to finish a round of both marketing (raising GiveWell Pledges) and research (publishing our full report on developing-world aid) in December 2008. We had little to go on in forming this estimate, and largely chose this target date because of the significance of the holiday season in giving (rather than based on quantified time estimates).
Steps we have taken to improve: we have been recording our hours since January of 2008, and have gradually been improving the detail of our timesheets. We now have enough data on how our time is spent to make more detailed estimates. Our 2008-2009 international aid report was published on 7/1/2009, consistent with the goals we had set at the end of 2008.
How we fell short: In an attempt to increase our research capacity, we hired our strongest volunteer in January 2008, but terminated the relationship at the end of May. We mutually agreed that, at that stage of our development, we couldn't provide the training and management necessary for someone of his skill set to add significant value.
Steps we have taken to improve: We paused recruiting with the intent of picking it back up when we had a more systematized process. We went through a series of hires in early 2009, but at that time we still did not have a systematized enough process, and relationships were terminated fairly quickly. In mid-2009, we again reassessed our situation and made two hires who lasted substantially longer. One worked for us from May through December of 2009, added significant value, and left in December 2009 for an opportunity in the consulting industry; another started work in early July 2009 and is still with us (and adding significant value) as of March 2010.
How we fell short: For the early years of our project, our reviews, while thorough, were overly dense and difficult to engage with. Feedback from our supporters often included this theme. In addition, we received less critical engagement with our analysis than we would have liked from those not directly involved in the project. We received many emails offering general support or asking us to consider a particular charity, but the number of people who critiqued the content of our reviews – through email, survey, discussion forum, or our blog – remained low.
Steps we have taken to improve: We devoted substantial time over the 8/2009-12/2009 period to revamping our website (for example, making summary information by topic easier to find) and to making frequent posts on our blog that present our research in more accessible ways. We have also actively sought out feedback from relevant experts. The level of feedback has improved.
How we fell short: We undervalued discussions with staff during our first year. Speaking directly with staff gives us the opportunity to get a clear picture of how an organization views itself, and therefore what information we should seek in order to assess whether its approach works as intended. Applicants encouraged us to put more of our time into personal visits, so that we could “get to know” organizations rather than thrusting pre-defined questions on them.
Steps we have taken to improve: We now begin any charity evaluation (once the initial heuristics have been passed and we've learned what we can from the website) with discussions with staff.