11 Comments

"Prospects won’t start looking bright for humans in the distant future until humans in the present get their shit together."

Yes, this, a thousand times over. And by deferring to some imagined future one invariably fails to meet one's appointment with the present--the moment that matters most.

I don't sense much compassion and empathy for the billions who live now on the part of those who advocate for longtermism (most prominently Elon Musk). So I think it's not unreasonable to assume that they don't truly care about people in the future. It seems more of an abstract intellectual shell game rather than a genuine attempt at improving the lot of the planet (including billions and trillions of other sentient creatures).

There are strong philosophical arguments against the notion of the three times (e.g., by Nagarjuna and Vasubandhu, both of whom were very concerned with how to reduce suffering). Plus, acting on the idea that one can adequately measure (or even quantify) well-being and thereby find an algorithm to maximize it will likely increase suffering, not lessen it. By trying to reduce the happiness and well-being of sentient beings (or charitable giving, in the case of effective altruism) to abstract math, one ends up creating a world saturated with abstractions, devoid of meaning and connection--which are the things that matter to most of us, irrespective of where we are on the evolutionary rungs.

“Meaning emerges from engagement with the world, not from abstract contemplation of it.”

-- Iain McGilchrist

Well stated. I'm listening to McGilchrist's The Master and His Emissary right now.

It's a monumental work. I'd like to sink my teeth into The Matter with Things but don't yet know when I'll be able to shoulder that even bigger tome.

Great piece, thanks Bob.

Aug 11, 2022 · Liked by Robert Wright

Short-termism, then, should be an essential part of longtermism.

author

Yes. And so far I don't think it has been. That may be partly because short-termism draws you into political/ideological questions that many longtermists think longtermism should and can transcend.

Aug 11, 2022 · Liked by Robert Wright

I guess the most rational way to look at longtermism is to let it come up with some intervention ideas, and then see how one feels about them, one by one.

"Longtermism" is the latest silly craze sweeping through the wokester philosophy set, and Bob Wright has explained some of the many obvious flaws in this anti-intellectual "hula hoop" fad. As Bob points out, people worry about themselves, their families and social groups, and the surroundings in which they live; they have no reason to worry about people who don't exist and probably never will. We have enough to worry about in the present: our immediate concerns for peace, our own happiness, and our own personal, family, and social goals.

Where Bob goes off the rails is when he starts talking about "salvation," a meaningless quasi-religious term. I think Bob is using salvation to mean "avoiding the Apocalypse." But avoiding particular threats is not "salvation," and it is careless to bring that religious/teleological concept into the already confused longtermism debate. Whatever people do, they aren't going to experience "salvation"; we are, after all, merely mortals.

I'm not saying I'm sold on Andrew Yang, but I like his non-political viewpoint. A breath of fresh air.

Apocalypse aversion? If the leaders of the world were working on those existential threats right now, there would be hope, because these are not technical problems. But they are going in the opposite direction. And accelerating.

If there were millions in the streets right now, protesting, there would be hope. But the streets are silent.

Enjoy the apocalypse.

Comment deleted · Aug 11, 2022 (edited)
author

One thing elite EA types would say in their defense is that they do try to focus philanthropic resources toward very non-elite people. EA favors buying mosquito nets over donating to art museums. At least in theory--whether actual billionaire philanthropists who purport to embrace EA really stick with that guidance is another question.
