"Prospects won’t start looking bright for humans in the distant future until humans in the present get their shit together."
Yes, this a 1000x. And by deferring to some imagined future one invariably fails to meet one's appointment with the present--the moment that matters most.
I don't sense much compassion and empathy for the billions who live now on the part of those who advocate for longtermism (most prominently Elon Musk). So I think it's not unreasonable to assume that they don't truly care about people in the future. It seems more of an abstract intellectual shell game rather than a genuine attempt at improving the lot of the planet (including billions and trillions of other sentient creatures).
There are strong philosophical arguments against the notion of the three times (e.g., by Nagarjuna and Vasubandhu, both of whom were very concerned about how to reduce suffering). Plus the idea that one can adequately measure (or even qualify) well-being and thereby find an algorithm to maximize it will likely increase suffering, not lessen it. By trying to reduce the happiness and well-being of sentient beings (or charitable giving in the case of effective altruism) to abstract math, one ends up creating a world saturated with abstractions, devoid of meaning and connection, which are the things that matter to most of us, irrespective of where we are on the evolutionary rungs.
“Meaning emerges from engagement with the world, not from abstract contemplation of it.”
Yes. And so far I don't think it has been. That may be partly because shorttermism draws you into political/ideological questions that many longtermists think longtermism should and can transcend.
I guess the most rational way to look at longtermism is to let it come up with some intervention ideas, and then see how one feels about them, one by one.
"Longtermism" is the latest silly craze sweeping through the wokester philosophy set. Bob Wright has explained some of the many obvious flaws with this latest anti-intellectual "hula hoop" craze. As Bob points out, people worry about themselves, their families and social groups and the surroundings in which they live and they have no reason to worry about people who don't exist and probably will never exist. We have enough to worry about in the present and our immediate concerns for peace, our own happiness and our own personal, family and social goals. Where Bob goes off the rails is when he starts talking about "salvation," a meaningless quasi-religious term. I think Bob is trying to use salvation to mean "avoiding the Apocalypse." But avoiding particular threats is not "salvation" and it is careless to bring that religious/teleological concept into the already confused longtermism debate. Whatever people do they aren't going to experience "salvation" as we are, after all, merely mortals.
Apocalypse aversion? If the leaders of the world would be working on those existential threats right now, there would be hope. Because these are not technical problems. But they are going in the opposite direction. And accelerate.
If there would be millions in the streets right now, protesting, there would be hope. But they are silent.
One thing elite EA types would say in their defense is that they do try to focus philanthropic resources toward very non-elite people. EA favors buying mosquito nets over donating to art museums. At least in theory--whether actual billionaire philanthropists who purport to embrace EA really stick with that guidance is another question.
"Prospects won’t start looking bright for humans in the distant future until humans in the present get their shit together."
Yes, this, 1000x. And by deferring to some imagined future, one invariably fails to meet one's appointment with the present--the moment that matters most.
I don't sense much compassion or empathy for the billions who live now on the part of those who advocate for longtermism (most prominently Elon Musk). So I think it's not unreasonable to assume that they don't truly care about people in the future either. It seems more like an abstract intellectual shell game than a genuine attempt at improving the lot of the planet (including billions and trillions of other sentient creatures).
There are strong philosophical arguments against the notion of the three times (e.g., by Nagarjuna and Vasubandhu, both of whom were deeply concerned with how to reduce suffering). Plus, acting on the idea that one can adequately measure (or even qualify) well-being, and thereby find an algorithm to maximize it, will likely increase suffering, not lessen it. By trying to reduce the happiness and well-being of sentient beings (or charitable giving, in the case of effective altruism) to abstract math, one ends up creating a world saturated with abstractions, devoid of meaning and connection--the things that matter to most of us, irrespective of where we sit on the evolutionary ladder.
“Meaning emerges from engagement with the world, not from abstract contemplation of it.”
-- Iain McGilchrist
Well stated. I'm listening to McGilchrist's The Master and His Emissary right now.
It's a monumental work. I'd like to sink my teeth into The Matter with Things but don't yet know when I'll be able to shoulder that even bigger tome.
Great piece, thanks Bob.
Shorttermism, then, should be an essential part of longtermism.
Yes. And so far I don't think it has been. That may be partly because shorttermism draws you into political/ideological questions that many longtermists think longtermism should and can transcend.
I guess the most rational way to look at longtermism is to let it come up with some intervention ideas, and then see how one feels about them, one by one.
"Longtermism" is the latest silly craze sweeping through the wokester philosophy set. Bob Wright has explained some of the many obvious flaws with this latest anti-intellectual "hula hoop" craze. As Bob points out, people worry about themselves, their families and social groups and the surroundings in which they live and they have no reason to worry about people who don't exist and probably will never exist. We have enough to worry about in the present and our immediate concerns for peace, our own happiness and our own personal, family and social goals. Where Bob goes off the rails is when he starts talking about "salvation," a meaningless quasi-religious term. I think Bob is trying to use salvation to mean "avoiding the Apocalypse." But avoiding particular threats is not "salvation" and it is careless to bring that religious/teleological concept into the already confused longtermism debate. Whatever people do they aren't going to experience "salvation" as we are, after all, merely mortals.
I'm not saying I'm sold on Andrew Yang, but I like his non-political viewpoint. A breath of fresh air.
Apocalypse aversion? If the leaders of the world were working on those existential threats right now, there would be hope, because these are not technical problems. But they are going in the opposite direction, and accelerating.
If there were millions in the streets right now, protesting, there would be hope. But the streets are silent.
Enjoy the apocalypse.
One thing elite EA types would say in their defense is that they do try to focus philanthropic resources toward very non-elite people. EA favors buying mosquito nets over donating to art museums. At least in theory--whether actual billionaire philanthropists who purport to embrace EA really stick with that guidance is another question.