this post was submitted on 29 Aug 2023
16 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 1 year ago
top 32 comments
[–] titotal@awful.systems 30 points 1 year ago (6 children)

As a physicist, this quote got me so mad I wrote an excessively detailed debunking a while back. It's staggeringly wrong.

[–] bitofhope@awful.systems 12 points 1 year ago (1 children)

Kudos for the effortpost. My 5-second simpleton objection went something like

YEA BECAUSE WEBCAMS COME WITH DENSITY SENSORS INCLUDED RIGHT?

[–] dgerard@awful.systems 12 points 1 year ago* (last edited 1 year ago) (1 children)

A Bayesian superintelligence, hooked up to a webcam, would generate the world's most beautiful camgirl, like a photoshop that's been photoshopped, and take over OnlyFans to raise money to wedgie rationalists, just don't look too closely at the fingers or teeth

[–] BrickedKeyboard@awful.systems 4 points 1 year ago* (last edited 1 year ago) (1 children)

I'm trying to find the twitter post where someone deepfakes eliezer's voice into saying full speed ahead on AI development, we need embodied catgirls pronto.

[–] dgerard@awful.systems 2 points 1 year ago

now that's a positive contribution to the space

Love this!

Alas, if Yud took an actual physics class, he wouldn't be able to use it as the poorly defined magic system for his OC doughnut-steal IRL bayesian superintelligence fanfic.

[–] dgerard@awful.systems 6 points 1 year ago

that was shockingly polite

[–] earthquake@lemm.ee 5 points 1 year ago

“Eliezer has sometimes made statements that are much stronger than necessary for his larger point, and those statements turn out to be false upon close examination” is something I already generically believe, e.g. see here.

I get the impression that this guy (whose job at an AGI thinkpiece institute founded by a cryptobillionaire depends on believing this) would say this about ALL of EYs statements, leaving his larger point floating in the air, "supported" by whatever EY statements you aren't currently looking at.

[–] self@awful.systems 4 points 1 year ago (2 children)

this is fantastic! if you’ve ever got another one of these in you, feel free to tag it NSFW and post it here or on MoreWrite depending on what feels right. I live to see yud get destroyed in slow motion by real expertise

[–] blakestacey@awful.systems 9 points 1 year ago (1 children)

I've more than once been tempted to write Everything the Sequences Get Wrong about Quantum Mechanics, but the challenge is doing so in a way that doesn't just amount to teaching a whole course in quantum mechanics. The short-short version is that it's lazy, superficial takes on top of cult shit — Yud trying to convince the reader that the physics profession is broken and his way is superior.

[–] self@awful.systems 4 points 1 year ago (1 children)

I’d be happy to contribute what CS material I can to a multidisciplinary effort to prove that Yud’s lazy, superficial takes and cult shit are universal

[–] blakestacey@awful.systems 5 points 1 year ago

I got as far as this blog post that I shared in the first days of new!SneerClub, but that was only a first stab.

[–] titotal@awful.systems 4 points 1 year ago

Yeah, I've been writing up critiques for a year or two now, collected over at my substack. I've been posting them to the EA forum and even Lesswrong itself and they've been generally well received.

[–] bitofhope@awful.systems 9 points 1 year ago

The word "Einstein" appears no less than eight times in this story.

Bringing up Hendrix every ten sentences doesn't make you an amazing guitarist either.

[–] sus@programming.dev 8 points 1 year ago* (last edited 1 year ago)

Never underestimate the rationalist's ability to write a 5000 word, extremely fanciful short story to make a point that could be compressed into 2 sentences, in a failed attempt to dismiss a strawman

and of course the story includes a cameo from the writer's opponents who are naive fools and proceed to doom the universe with their hubris (of disagreeing with the author)

[–] gerikson@awful.systems 7 points 1 year ago

So LW is just a fanfic appreciation forum, got it.

[–] self@awful.systems 7 points 1 year ago (1 children)

But in this world there are careful thinkers, of great prestige as well, and they are not so sure. "There are easier ways to send a message," they post to their blogs

please destroy this gate address, it leads to Reply Guy Earth

[–] self@awful.systems 8 points 1 year ago (8 children)

also, sincerely, can anyone explain to me what’s good about Yud’s writing? this shit is structured exactly like a goosebumps short except instead of being written by a likeable author targeting grade schoolers it’s written by some asshole who loves using concepts he doesn’t understand, targeting other assholes who don’t understand fucking anything because all their knowledge got filtered through Yud

[–] froztbyte@awful.systems 4 points 1 year ago

I don't think there's anything good about the writing, but a few things stand out in terms of the mechanics employed and the effect they appear to be aiming for

  • (bad) storyteller style (nerds love 'em some stories as much as the next, even those who think they don't)
  • touching on sufficiently many topics ("oh wow he's thought about this so hard")
  • going just far enough in detail to convince that there's some kind of deeper aspect/more ("wow he knows so much about this")

even this horrible essay pulled the infomercial "but wait, there's more!" at least 5 times. a terrible Plot Twist because he can't figure out how to layer his story devices any better

[–] fasterandworse@awful.systems 7 points 1 year ago (1 children)

"Imagine a world much like this one"

I don't know how you all muster the focus to read past these hallmarks of shit ideas

[–] BernieDoesIt@kbin.social 5 points 1 year ago

I imagine a world much like this one all the time!

[–] corbin@awful.systems 5 points 1 year ago (1 children)

He thinks he's discount Peter Watts.

[–] froztbyte@awful.systems 4 points 1 year ago

he probably doesn't think it's all that discount, but this comment is even funnier given how much Watts was responsible for watering down/clobbering some of the meaning of things

[–] jonhendry@awful.systems 4 points 1 year ago

What if the webcam were upside down.

[–] carlitoscohones@awful.systems 3 points 1 year ago

I am saving this to read later, but having eugenics and IQ in the very first sentence makes it look especially promising.

[–] saucerwizard@awful.systems 3 points 1 year ago

I hate the co-option of SETI these guys try.