11 Comments
Aug 10, 2022 · edited Aug 10, 2022 · Liked by Paul Logan

Instrumental goals will almost ALWAYS trump terminal goals in a chaotic system, because instrumental goals involve changing yourself, which is easier to predict. Instrumental goals can promise payoffs over extremely long timeframes, whereas terminal goals generally involve expending resources for some one-time gain, and they get harder to predict the further away the outcomes they're meant to change.
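To make the intuition concrete, here's a minimal toy simulation of the tradeoff I mean: a one-shot "terminal" spend on a distant, noisy outcome versus compounding "instrumental" self-investment. The horizon, growth rate, payoff, and hit rate are all invented for illustration, not claims from the post:

```python
# Toy model: one-shot terminal bet vs. compounding instrumental investment.
# Every number here is an illustrative assumption.
import random

random.seed(0)
HORIZON = 50     # steps the system runs (assumed)
GROWTH = 1.05    # per-step compounding from self-improvement (assumed)
HIT_RATE = 0.5   # chance a one-shot bet on a distant outcome lands (assumed)

def terminal_value(budget=1.0, payoff=5.0):
    # One-time spend on a faraway external outcome; the system is
    # chaotic, so the bet often just misses.
    return payoff * budget if random.random() < HIT_RATE else 0.0

def instrumental_value(budget=1.0):
    # Reinvest in capability each step; changing yourself compounds
    # reliably because it's the part of the system you can predict.
    for _ in range(HORIZON):
        budget *= GROWTH
    return budget

trials = 10_000
t = sum(terminal_value() for _ in range(trials)) / trials      # ~2.5
i = sum(instrumental_value() for _ in range(trials)) / trials  # ~11.5
print(f"one-shot terminal bet: {t:.2f}  vs  compounding instrumental: {i:.2f}")
```

Under these made-up assumptions the compounding strategy ends up worth several times the one-shot bet, which is the sense in which instrumental goals "trump" terminal ones over long timeframes.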

I think what we are seeing now is the result of around 100+ years of elites thinking official religions are a bad idea. The end result is we get unofficial religions that have most of the downsides of official religion, with little of the upside.

My most recent post was an argument that people are _going_ to try and define good, and unless it's a purely abstract category, it'll lead to bad things because people will replace true good with the vehicle for advancing good:

https://apxhard.substack.com/p/a-simplified-predictive-model-of

Aug 10, 2022 · Liked by Paul Logan

Loved this. But as my background is in STEM and I'm regretfully clueless here, I have to ask: is this a well-known phenomenon of organizational ‘evolution’, or your own observation?

Aug 12, 2022 · edited Aug 12, 2022 · Liked by Paul Logan

The fact that they are trying to quantify the benefits of seemingly self-evident moral precepts like "it's good to share food and medicine (and mosquito nets)" is not so terrible.

The fact that a group of hyper-educated elites have become self-consciously virtuous and have a "mission" is mostly terrible.

I've come across what seems to be a flurry of writing about EA (of which yours was the most pleasant to read) and after looking into it as much as I'm going to, I'm not personally terrified. There is exactly zero, nada, nothing original or new about any of this stuff. Enough people will see that the theory is just an explanation for these guys doing what they were gonna do anyway. (I'm hoping the ascetic branch wins the argument with the self-enrichment branch, although that never happens.)

Also I'd like to hope that the elaborate Wikipedia scam the woman created got saved somehow.

Hi - fairly involved EA here. I'm pretty confused about the specific criticism this post is trying to make? I see a bunch of instrumental reasoning I mostly accept, combined with absolutist language that I don't accept. Is the worry that EA's instrumental goals are going to make it slide into absolutism and thus Evil? Let me point to some specific spots.

"EA is inherently good", "any action it takes is justified", "any use of external resources is also justified", "all others must either bow or die in its stead". I think literally any EA you ask would disagree with these claims. There are definitely people (myself included) who would agree with more moderate versions, like "EA is probably good and I generally trust it to take good actions, moving resources towards EA is also probably good, and people actively opposing EA is probably bad". These can obviously be overruled by EA taking bad actions. I also think these claims are totally reasonable and not warning flags, since they're basically required to undertake any action at all that involves disagreeing with other people.

Aren't there plenty of other groups who are fighting for (subjectively) similarly high stakes and don't engage in violence or theft? My fundamentalist friend truly thinks my soul is damned and the media is satanic, but is chill and law-abiding. So many groups fit this that it feels silly to name them.
