couldn't help myself, there are seldom more perfect opportunities to use this one
istewart
I actually think it's part and parcel of Yarvin's personality. As much as he rails against "the Cathedral," PMCs, whatever, he himself is a perfect example of a pathological middle manager. Somebody who wants power without having to shoulder ultimate responsibility. He craves the childishly simplified social environment of a medieval-fantasy king's court, but he doesn't want to be the king himself. He wants to be (and has been, up until now) the scheming vizier who can run his manipulation games in the background, deciding who gets in front of the king but not having to take the heat if the king makes a bad decision. (And the "kings" he works for have made plenty of bad decisions, but consequences have only just begun to catch up.)
I suspect this newfound mainstream attention is far more uncomfortable than it is validating for him. Perhaps the NYT profile was a burst of exhilaration, but the shine has worn off quickly. That tracks with last year's story about him coming back to Urbit as a "wartime CEO." If Urbit is so damn important for building his ridiculous vision, why wasn't he running it the whole time? He doesn't actually want to be CEO of anything. Power without responsibility.
He will never stop to reflect that his "philosophy," such as it is, is explicitly tailored for avaricious power-hungry narcissists, soooooo
Obvious joke is obvious, but
The essay brims with false dichotomies, logical inconsistencies, half-baked metaphors, and allusions to genocide. It careens from Romanian tractor factories to Harvard being turned “into dust. Into quarks” with the coherence of a meth-addled squirrel.
Harvard isn't already full of Quarks?
Another thread worth pulling is that biotechnology and synthetic biology have turned out to be substantially harder to master than anticipated, and they never seemed to be a primary area of expertise for a lot of these people anyway. I don't have a copy of any of Kurzweil's books at hand to check his predicted timelines for that stuff, but they're surely way off.
Faulty assumptions about the biological equivalence of digital neural network algorithms have done a lot of unexamined heavy lifting in driving the current AI bubble, and keeping the harder stuff on the fringes of the conversation. That said, I don't doubt that a few refugees from the bubble-burst will attempt to inflate the next bubble on the back of speculative biotech, and I've seen a couple of signs of that already.
For my money, 2015/16 Adams trying to sell Trump as a "master persuader" while also desperately pretending not to be an explicit Trump supporter himself was probably the most entertaining he's ever been. Once he switched from skimmable text blogging to livestreaming, though, he demanded too much of my time to be interesting anymore.
"This Is What Yudkowsky Actually Believes" seems like a subtitle that would get heavy use in a future episode of South Park about Cartman dropping out after one semester at community college.
Yes, Kurzweil desperately trying to create some kind of a scientific argument, as well as people with university affiliations like Singer and MacAskill pushing EA, are what give this stuff institutional strength. Yudkowsky and LW are by no means less influential, but they're at best a student club that only aspires to be a proper curriculum. It's surely no coincidence that they're anchored in Berkeley, adjacent to the university's famous student-led DeCal program.
FWIW, my capsule summary of TPOT/"post-rationalists" is that they're people who thought that advanced degrees and/or adjacency to VC money would yield more remuneration and influence than they actually did. Equally burned out, just further along the same path.
I've been contemplating this, and I agree with most everyone else about leaning heavily into the cult angle and explaining it as a mutant hybrid between Scientology-style UFO religions and Christian dispensationalist Book of Revelation eschatology. The latter may be especially useful in explaining it to USians. My mom (who works in an SV-adjacent job) sent me this Vanity Fair article the other day about Garry Tan grifting his way into non-denominational prosperity gospel Christianity: https://www.vanityfair.com/news/story/christianity-was-borderline-illegal-in-silicon-valley-now-its-the-new-religion. She was wondering if it was "just another fad for these people," and I had to explain that no, not really: their AI bullshit is so outlandish that some of them feel the need to pivot back towards something more mainstream to keep growing their following.
I also prefer to highlight Kurzweil's obsession with perpetual exponential growth curves as a central point. That's often what I start with when I'm explaining it all to somebody. It provides the foundation for the bullshit towers that Yudkowsky and friends have erected. And I also think that, long-term, the historiography of this stuff will lean more heavily on Kurzweil as a source than on Yudkowsky, because Kurzweil is better-organized and professionally published. He'll most likely be the main source in the lower-division undergraduate/AP high school history texts that cover this stuff as a background trend of the 2010s/2020s. Right now, we live in the peak days of the LessWrong bullshit volcano plume, but ultimately it will probably be interpreted by the specialized upper-division texts that grow out of people's PhD theses.
Huh, 2 paradigm shifts is about what it takes to get my old Beetle up to freeway speed, maybe big Yud is onto something
It is what happened to look good in the valley between the Adderall comedown and yesterday evening's edible really starting to hit
Mesa-optimization... that must be when you rail some crushed-up Adderall XRs, boof some modafinil for good measure, and spend the night making sure your kitchen table surface is perfectly flat with no defects abrasions deviations contusions...