As I sit here writing about risk, the date at the bottom of my laptop screen – September 11 – is a jarring reminder that risk analysis is both futile and indispensable. 

It’s futile if we use risk analysis to predict the future. Or come to believe that the act of reflection itself inoculates us against harm. So goes the temptation: Now that we have completed this risk analysis, what could go wrong?

It’s indispensable if we understand that success lies not in following a scripted plan but in our facility with contingency planning, situation recognition, and adaptation. And by this I mean success in anything, from sustaining a marriage to running a restaurant kitchen to, yes, philanthropy.

Let’s start with two truths.

First, the textbook explanation of risk analysis is linear and rational, beginning with assessment:

What can go wrong? What is the likelihood it could go wrong? And what are the consequences? [1]

Next we review options for managing hazards that materialize:

What can be done and what options are available? What are the associated tradeoffs in terms of costs and benefits? What are the impacts of current management decisions on future options?
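The assessment questions above boil down to pairing each "what can go wrong" with a likelihood and a consequence. As a minimal illustration only (the scenarios and dollar figures below are hypothetical, not drawn from the cited framework), the arithmetic might be sketched like this:

```python
# Sketch of the risk-assessment triplet: each scenario pairs
# "what can go wrong" with a likelihood and a consequence
# (here, dollars of grant value at stake).
# All scenarios and numbers are hypothetical.

scenarios = [
    # (what can go wrong, likelihood, consequence in $)
    ("grantee staffing turnover delays the work", 0.40, 50_000),
    ("program redesign after unexpected demand", 0.25, 30_000),
    ("outright programmatic failure", 0.05, 200_000),
]

def expected_loss(scenarios):
    """Sum likelihood x consequence across all scenarios."""
    return sum(p * c for _, p, c in scenarios)

for name, p, c in scenarios:
    print(f"{name}: expected loss ${p * c:,.0f}")
print(f"total expected loss: ${expected_loss(scenarios):,.0f}")
```

The point of such an exercise is not the precision of the numbers, which are guesses, but that writing them down forces the tradeoff questions into the open.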

Then we move on to how we communicate risk:

Emphasize information relevant to any practical actions that individuals can take. Respect the audience and its concerns. Seek strictly to inform the recipient, unless conditions clearly warrant the use of influencing techniques. [2]

Which brings us to the second truth: We do not behave rationally.

Or as behavioral economist Dan Ariely says:

My further observation is that we are not only irrational, but predictably irrational – that our irrationality happens the same way, again and again.

Ariely makes this case in a book fittingly titled Predictably Irrational, a reminder that there is no reason to expect we will apply a rational risk analysis method rationally.

In philanthropy, we are predictably irrational when our program staff puff up rose-colored grant recommendations to boards. As the late philosophy scholar Michael Hooker observed, formal and informal reward systems can amplify the irrationality. Foundation board members appreciate the game when they initiate grants to favored groups. Grant seekers are happy to overstate their case if that's the unspoken guideline for triggering funding.

Hooker described this situation as “a game of rhetorical persuasion where the rules regarding honesty and candor are suspended or subtly altered, just as they are in poker.” [3] The mutual reinforcement of the delusion leaves us all shocked when real life does not unfold as we all agreed it would in the grant agreement.

When I was leading evaluation at Knight Foundation, I reviewed 150 or so grantee reports to figure out how the grantees' experiences lined up with the original agreements.

Our biggest miss was time. Something like two-thirds of the grantees reported that activities were running behind schedule.

True to Hooker’s hunch, our grant applicants were promising they could advance the work with great speed. This, of course, made these particular applications stand out as just the kind of high performers we wanted. The staff recommended approval for what appeared to be the most promising applications.

More important than the delays were the reasons. More often than not, it wasn't programmatic failure slowing things down. It was grantee staffing changes. Or imagine an afterschool program created to serve 30 middle school students, and then 200 parents enroll their children. The program administrators may hit the pause button to revisit the approach.

We don’t always overpromise on purpose. Part of being predictably irrational is maintaining an exaggerated – and false – sense of control. Having vetted a grant proposal personally, we assign ourselves more control over future events than we actually have.

There is a way out of this pickle. It requires us to be explicit about the methods we use to make decisions, the criteria behind those decisions, and the implications of the decisions. This is how we can mitigate our predictably irrational nature and do a better job of managing risk.

When we ask a program officer, for example, for her recommendation on a grant request, do we also ask her to articulate the criteria she used in making the decision? Consider a recommendation in support of a large, multi-year grant to a Pre-K program. On its face, it’s wonderful. In taking the next step and responding to questions about the criteria used to make the decision, the program officer may report that it’s based on a positive site visit or that tests show this Pre-K program does a better job of preparing students for school than other programs. Or that a recent national grant to this Pre-K program presents an attractive opportunity for leverage. If the foundation never surfaces the criteria used to make its decisions, it stands no chance of understanding the risks.

Similarly, it’s critical to confront the implications of decisions. It would be no surprise to learn that many homeowners in a community favor a decision to support a reduction in property taxes. Yet there may be fewer homeowners in favor of the decision if certain implications were made explicit: the closure of a fire station and a community swimming pool.

The balancing act is to require deliberation that reveals the potential risks without letting that realization automatically stifle action and innovation. We have to remember that the risks were always there – whether we acknowledged them or not. And in this case it’s better to know than not to know. When we look risk in the eye, we can make a fair choice about whether the potential return outweighs what we’re putting at stake.

This is, in fact, the approach encouraged by the poet Robert Frost.

We saw the risk we took in doing good,
But dared not spare to do the best we could
Though harm should come of it

In philanthropy, we don’t want to inhibit risk taking. It’s not the risk that is bad. It’s the failure to examine our decisions, the failure to reach agreement on the criteria we will use to make decisions, and the failure to reflect on the implications of the decisions at hand. That’s what leaves us exposed. When we are clear-eyed about what we are putting at risk, the harm that may come is no surprise at all.


[1] Risk assessment and management methods are drawn from Thomas A. Longstaff, Yacov Y. Haimes, and Carol Sledge, “Are We Forgetting the Risks of COTS Products in Wireless Communications?” Risk Analysis, Vol. 22, No. 1, 2002, pp. 1-6.

[2] Paul F. Deisler, Jr., “A Perspective: Risk Analysis as a Tool for Reducing the Risks of Terrorism,” Risk Analysis, Vol. 22, No. 3, 2002, pp. 405-414.

[3] Michael Hooker, “Moral Values and Private Philanthropy,” Social Philosophy & Policy, Vol. 4, No. 2, pp. 129-141.


Editor's note: For more information on risk taking in philanthropy, see the session "Expanding Your Comfort Zone: Managing Risk in Family Philanthropy" on Friday, October 16th at the 2015 National Forum on Family Philanthropy in Seattle. To continue the conversation about this article, we invite you to comment below or via the social media hashtag #NCFP15Risk: What type of risk most concerns your board or staff? When does philanthropy feel most risky to you?