this post was submitted on 01 Sep 2023
8 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]


Does anyone here know what exactly happened to LessWrong to make it so cult-y? I hadn't seen or heard anything about it for years; back in my day it was seen as that funny website full of strange people posting weird shit about utilitarianism, nothing cult-y, just weird. The article on TREACLES and this sub's mentions of LessWrong made me very curious: how did it go from people talking out of their asses for the sheer fun of "thought experiments" to a straight-up doomsday cult?
The one time I read LessWrong was probably in 2008 or so.

top 4 comments
[–] TerribleMachines@awful.systems 8 points 1 year ago* (last edited 1 year ago) (1 children)

Only half joking: there was this one fanfic you see...

Mainly I don't think there was any one inciting incident beyond its creation: Yud was a one-man cult way before LW, and the Sequences actively pushed all the cultish elements required to lose touch with reality. (Fortunately, my dyslexic ass only got as far as the earlier bits he mostly stole from other people rather than the really crazy stuff.)

There was definitely a step-change around the time CFAR was created; it was basically a recruitment mechanism for the cult and part of the reason I got anywhere physically near those rubes myself. An organisation made to help people be more rational seemed like a great idea, except it literally became EY/MIRI's personal sockpuppet. They would get people together in these fancy-ass mansions for their workshops and then tell them nothing other than AI research mattered. I think it was 2014/15 when they decided internally that CFAR's mission was to create more people like Yudkowsky. I don't think it's a coincidence that most of the really crazy cult stuff I've heard about happened after that.

Not that bad stuff didn't happen before either.^___^

[–] skillissuer@discuss.tchncs.de 8 points 1 year ago (1 children)

I think it was 2014/15 when they decided internally that CFAR’s mission was to create more people like Yudkowsky

the real AI doom is Eliezer cloning facility

[–] TerribleMachines@awful.systems 10 points 1 year ago

Truer words were never spoken, probably.

CFAR is the mind killer (because they kill you and replace you with a Yud clone).

[–] swlabr@awful.systems 6 points 1 year ago* (last edited 1 year ago)

prefacing this with IANALCUM (i am not a legit cult understanding mechanism)

I imagine it has been a cult from the start, or at least the primordial soup of factors before any of this hit the internet in earnest had all the right ingredients for a cult:

  • A leader claiming nigh-omnipotence, with some version of high charisma amongst potential followers (Yud, who, despite everything, is charismatic within the confines of the ratsphere)
  • A framework of lore (in this case rationalism) in which recruits are to be indoctrinated and taught how to think
  • An upwards power structure in said lore that concentrates authority at the apex of the hierarchy (IQ as the intelligence metric, blindly doing what higher-IQ people say)
  • Purity tests that create a positive feedback cycle reinforcing adherence to doctrine (either you believe Yud about many-worlds/AGI/whatever and are on the road to smartness, or you don't and will be cast out of the ratsphere)

I mean, the list goes on. As to when it became cult-y? To use a trusty thought experiment, it's the paradox of the heap.