Ode to a world-saving idea

...and to the psychologist who put it on the map.

Five weeks ago the psychologist Lee Ross died, and five days ago the New York Times published his obituary. Better late than never. But, even with all that time for research, the obituary doesn’t do justice to its subject.

By “its subject” I don’t mean Ross; he comes off very well. I mean the idea he is most closely associated with, an idea that occupies much of the obituary and is no doubt the reason Ross was finally deemed to have Times obit status. The idea is called “attribution error.”

Actually, Ross called it “the fundamental attribution error,” and that’s the term the Times uses. But as the idea evolved in keeping with new research, the word “fundamental” became less apt. And, oddly, this evolution made the concept itself more fundamental. In fact, if I had to pick only one scientific finding about how the human mind works and promulgate it in hopes of saving the world, I’d probably go with attribution error.

Ross coined the term “the fundamental attribution error” in 1977, in a paper that became a landmark in social psychology. The basic idea was pretty simple: When we’re explaining the behavior of other people, we tend to put too much emphasis on “disposition”—on their character, their personality, their essential nature. And we tend to put too little emphasis on “situation”—on the circumstances they find themselves in. The Times gives an illustration:

A 2014 article in Psychology Today titled ‘Why We Don’t Give Each Other a Break’ used the example of someone who cuts into a line in front of you. You might think, “What a jerk,” when in reality this person has never skipped ahead in a line before and is doing so now only because he would otherwise miss a flight to see a dying relative.

Right after this paragraph, Alex Traub, author of the obituary, writes: “Delivering folk wisdom in multisyllabic packaging, ‘the fundamental attribution error’ became one of those academic phrases that add a whiff of sophistication to any argument they adorn.”

I can see why Traub treats the idea a bit sardonically. The original formulation of it can be rendered in ways that make it seem borderline obvious, and Traub may not be aware that it evolved into something subtler. (He’s a newspaper reporter who has to write about all kinds of stuff every week, not a psych grad student.) But, before we move on to the subtler version of the idea, it’s important to understand that even the original version has potentially radical implications.

For example: Ross and Richard Nisbett, another eminent figure in modern psychology, argued that due appreciation of the power of situation in shaping behavior should lead us to revise the way we think about categories of people. If you see a picture of a minister and then a picture of a prison inmate, you’ll probably assume they have very different characters. But, Ross and Nisbett wrote:

Clerics and criminals rarely face an identical or equivalent set of situational challenges. Rather, they place themselves, and are placed by others, in situations that differ precisely in ways that induce clergy to look, act, feel, and think rather consistently like clergy and that induce criminals to look, act, feel, and think like criminals.

It’s possible to take the point Ross and Nisbett are making too far, but I think a more common mistake is not taking it far enough. In any event, that’s the basic idea behind the fundamental attribution error: most people don’t take the power of circumstance seriously enough. Now for the subtler version of the concept:

It turns out that our tendency to attribute people’s behavior to disposition rather than situation isn’t as general as Ross and other psychologists originally thought. There are two notable exceptions to it:

(1) If an enemy or rival does something good, we’re inclined to attribute the behavior to situation. (Granted, my rival for the affections of the woman I love did give money to a homeless man, but that was just to impress the woman I love, not because he’s actually a nice guy!)

(2) If a friend or ally does something bad, we’re inclined to attribute the behavior to situation. (Yes, my golf buddy embezzled millions of dollars, but his wife was ill, and health care is expensive—plus, there was the mistress to support!)

These exceptions are the reason I say there is no single “fundamental” attribution error. We’re not as generally biased toward disposition in our attributions as was originally thought; under certain circumstances, we lean more toward situation. Attribution error, you might say, is itself situational.

Among the consequences of this fact is that attribution error reinforces allegiances within tribes and reinforces antagonisms between tribes. Sure, the people in your tribe may do bad things, but only in response to extraordinary circumstances. And, sure, the people in the other tribe may do good things, but only in response to extraordinary circumstances. So the fact remains that the people in your tribe are essentially good and the people in the other tribe are essentially bad.

So attribution error is one reason that, once a nation’s politics get polarized, they can be hard to de-polarize.

Attribution error can also have a big impact on international politics. It means that, once you’ve defined another nation as the enemy, that label will be hard to change. As the social scientist Herbert Kelman has put it:

Attribution mechanisms… promote confirmation of the original enemy image. Hostile actions by the enemy are attributed dispositionally, and thus provide further evidence of the enemy’s inherently aggressive, implacable character. Conciliatory actions are explained away as reactions to situational forces—as tactical maneuvers, responses to external pressure, or temporary adjustments to a position of weakness—and therefore require no revision of the original image.

This helps explain why Americans who argue for invading a country spend so much time emphasizing how nefarious the country’s leaders are. Once we’re convinced that a foreign government is bad, it’s hard to imagine how to fix the problem with anything short of regime change. After all, if the existing regime changes its behavior for the better, evil will still lurk within.

I hope you’re starting to see why I think attribution error is really important—why I think that, if we could dispel its more destructive influences, the world would be a much better place. But to see why I think attribution error is really, really important—why it may have more salvific potential than any other idea in psychology—you need to understand what I consider the most potent tool in the human toolkit for ending or avoiding conflict and nurturing constructive collaboration.

Regular readers of this newsletter can probably guess what I’m referring to: cognitive empathy. And regular readers know that by “cognitive empathy” I don’t mean “feeling their pain.” That’s emotional empathy. I just mean seeing how things look from another person’s point of view: perspective taking.

I believe that one of the most common reasons people and groups of people fail to solve non-zero-sum problems—fail to reach an arrangement that’s good for both parties, and instead get stuck in a lose-lose situation—is that they don’t see how things look from the other side. I also believe that the world is in deep trouble if nations don’t solve the more consequential of the non-zero-sum problems they face, ranging from environmental challenges to arms control challenges to disease control challenges to whole new kinds of technological challenges.

It follows that—as I see the world, at least—big impediments to cognitive empathy are a grave threat to the planet. And attribution error may be the biggest impediment there is. Obviously, if you’re blind to the way circumstance shapes someone’s behavior, it’s going to be hard to really appreciate how the world looks to them.

Could more awareness of attribution error actually make people better at cognitive empathy? Not in an easy, automatic way. Attribution error is a “cognitive bias,” and there’s good reason to think it was engineered by natural selection for that purpose: to bias our view of the world, to distort our perception. And a well-engineered bias can be pretty stubborn in its tendency to fool people into thinking they’re seeing things clearly when they’re not. 

Still, I do think that cognitive empathy can be cultivated. And I do think awareness of attribution error, of our tendency in most situations to downplay the role of circumstance, can help us cultivate it.

In fact, Ross’s own life offers anecdotal evidence to this effect. The Times obit reports that Nisbett considered Ross not just a collaborator but “my therapist and my guru.” Nisbett once asked Ross why he was so good at giving advice, and he replied, “Here’s why, Dick: I don’t take your point of view when you tell me what the problem is. I try to figure out how the other person or persons are viewing it.”

You might ask: If awareness of attribution error helps you exercise cognitive empathy, then why hadn’t Nisbett, who was himself quite aware of attribution error, exercised it in the first place? The answer, I’d guess, is that the people whose perspective Ross was taking were people Nisbett was in some sense at odds with—that’s why there was a problem to solve. And, of course, the problematic behavior of people we’re at odds with is behavior we’re especially likely to attribute to disposition. Since Ross wasn’t at odds with these people, he was less susceptible to that bias and so better able to see their point of view.

This is what I mean when I say that a well-engineered bias can be hard to neutralize. Nisbett’s mere awareness of attribution error doesn’t seem to have done the trick. At the same time, his experience suggests a workaround: When you’re having trouble with someone you dislike, or at least someone you find highly annoying, and you’re dying to tell someone about the problem, don’t tell someone who shares your attitude toward them, however tempting that is. Tell someone who, like Ross, isn’t at odds with them—someone with a better chance of seeing their point of view.

So that’s today’s self-help tip. As for planetary help—solving momentous non-zero-sum problems, and subduing the international and intranational antagonisms that keep us from even trying to solve them—well, that’s kind of a big subject. (That’s why it takes a whole Apocalypse Aversion Project to address it!)

To take just one chunk of the subject: Every day lots of important players—politicians, social media potentates, think tank experts, journalists—reinforce and even intensify attribution error. They describe various groups and people crudely, in ways that make it especially hard to really understand why they do what they do, hard to exercise cognitive empathy.

I’m not saying these politicians, potentates, experts, and journalists are bad people. As Ross would have been the first to point out, they’re just responding to circumstance as humans naturally do. They’re saying things that will get them elected or increase their Twitter follower count or get them on MSNBC or get them clicks, or whatever.

Besides, if we think of them as bad people—as the enemy—that may just cloud our view of their motivation at a time when understanding it is important. So, though I’d like to say something inspirational at this point, I won’t get Churchillian (“We shall fight on the beaches” and so on). I’d rather just quote William James and say that what’s needed here is careful comprehension accompanied by “the moral equivalent of war.”
