So seeing the reaction on lesswrong to Eliezer’s book has been interesting. It turns out that even among people who already mostly agree with him, a lot were hoping he would make their case better than he has (either because they aren’t as convinced as he is, or they are but were hoping for something more palatable to the general public).
This review (lesswrong discussion here) calls out a really obvious issue: Eliezer’s AI doom story was formed before deep learning took off, and in fact mostly focused on GOFAI rather than neural networks, yet somehow the details of the story haven’t changed at all. The reviewer is a rationalist who still believes in AI doom, so I wouldn’t give her too much credit, but she does note this is a major discrepancy for someone who espouses a philosophy that (nominally) features a lot of updating your beliefs in response to evidence. The reviewer also notes that “it should be illegal to own more than eight of the most powerful GPUs available in 2024 without international monitoring” is kind of unworkable.
This reviewer liked the book more than they expected to, because Eliezer and Nate Soares get some details of the AI doom lore closer to the reviewer’s current favored headcanon. The reviewer does complain that maybe weird and condescending parables aren’t the best outreach strategy!
This reviewer has written their own AI doom explainer, which they think is better! From their limited description, I kind of agree, because it sounds like they focus on current real-world scenarios and harms (and extrapolate them to doom). But again, I wouldn’t give them too much credit; it sounds like they don’t understand why existential doom is actually promoted (as a distraction and a source of crit-hype). They also note the 8 GPUs thing is batshit.
Overall, it sounds like lesswrongers view the book as an improvement on the sprawling mess of arguments in the sequences (and scattered across other places like Arbital), but still not as well structured as they could be, or stylistically quite right for a normie audience (i.e. the condescending parables and diversions into unrelated science-y topics). And some are worried that Nate and Eliezer’s focus on an unworkable strategy (shut it all down, 8 GPU max!) with no intermediate steps or goals or options might not be the best.
Scales remain firmly unfallen from eyes, I’d bet.
I’m still not sure they actually grasp the totalitarian implications of going ham on tech companies and research this way. He sure doesn’t get called out over his ‘solutions’, which imply that some sort of world government has to happen that will also crown him Grand Central Planner of All Technology.
It’s possible they just believe the eight [specific consumer electronic goods] per household is doable, and at worst as authoritarian as the tenured elites turning up their noses at HBD research.
You thought a right-and-proper Communist Five-Year Plan couldn’t also be a self-insert fanfic? Hold Yud’s beer.
Did you hear about Yudoslavia? They have this Eight GPU policy: if a household is about to have another GPU and they find it can’t run Crysis at max settings at 60fps, they leave it outside for the wolves. But this might just be Yudophobic propaganda.
In Eliezer’s “utopian” worldbuilding fiction concept, dath ilan, they erased their entire history just to cover up any mention of any concept that might inspire someone to think of “superintelligence” (and, as an added bonus, purge other wrong-think concepts). The ~~Philosopher Kings~~ Keepers have also discouraged investment in and improvement of computers (because somehow, despite not holding any direct power, and despite the massive financial incentives and dath ilan being described as capitalist and libertarian, the Keepers can just sort of say their internal secret prediction market predicts bad vibes from improving computers too much, and everyone falls in line). According to several worldbuilding posts, dath ilan has built an entire secret city, funded with 2% of the entire world’s GDP, to solve AI safety in utter secrecy.

In Eliezer’s “utopian” worldbuilding fiction concept, dath ilan

In Eliezer’s sweet fucking what now?
It would be only lightly cringey if he kept it as a fiction writing project not strongly linked to his other writings. But as Architeuthis says, he not only links to it frequently, he cites it like it was a philosophy paper or something that obviously everyone should have read. Example from our old reddit discussions: https://www.reddit.com/r/SneerClub/comments/x3pihv/to_be_fair_you_have_to_have_a_very_high_iq_to/
That he cites as if it were a philosophy paper, to non-rationalists.
In my ideal version of Omelas there’s a secret institution where people like me are given vast power and wealth to think about how to solve the tortured child issue.
tell me you didn’t subject yourself to glowfic
Where else am I supposed to find deep analyses of the economic implications of 1st level wizards and clerics on an early modern setting?
and analyses of Intelligence score distributions across the nations of Golarion?

glowfic: it’s like a forum, but worse™
Glowfic sounds like fan fiction written by CIA agents.
Yep, from what I can tell second hand, dath ilan worldbuilding definitely skews towards “it doesn’t count as totalitarianism if the enforced orthodoxy is in line with my Obviously Objectively Correct and Overdetermined opinions”.
It is funny that this is the best Yud, with an IQ of 191, can come up with.
Thanks for the lore, and sorry that you had to ingest all that at some point.
Like a lot of sci-fi that I will comment on without having read, that sounds like a Dune ripoff (that I also have not read). Except, rather than “ok we tried the whole AI thing and it turned out bad”, Yud is saying “my world is smarter because they predicted AI would be bad, so they just didn’t try it, neener neener neener!”
Also, iterating on the name origins of “dath ilan,” as I’ve said before, it’s an anagram of Thailand. But I have a new hypothesis. It’s formed from “Death Island” minus the letters “ESD”. What could this mean!!! Electrostatic discharge? “Eek, Small Dragon?” “Eliezer Sudden Death?” The abyss pondering continues…
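(Both letter-count claims check out, for what it’s worth; here’s a throwaway Python sanity check, nothing assumed beyond the standard library.)

```python
# Verify the anagram claims by comparing letter counts.
from collections import Counter

def letters(s: str) -> Counter:
    return Counter(c for c in s.lower() if c.isalpha())

print(letters("dath ilan") == letters("Thailand"))                       # True
print(letters("dath ilan") == letters("Death Island") - letters("esd"))  # True
```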
OG Dune actually had some complex and layered stuff to say about AI before the background lore was retconned to dollar store WH40K by the current handlers of the IP.
There was no superintelligence; thinking machines were gatekept by specialists who formed entrenched elites, overreliance on them was causing widespread intellectual stagnation, and people were becoming content with letting unknowable algorithms decide on matters of life and death.
The Butlerian Jihad was first and foremost a cultural revolution.
Wasn’t there a bit in the OG Dune lore where the person the Jihad was named after actually didn’t want to start a Jihad? (I think that was retconned into a ‘they sneakily did try to become a martyr to start the war’ bit, which sort of messes with the theme of Dune that even lofty goals can have horrible consequences, see Paul.)
The author certainly wants you to know that finding yourself as the head of a revolutionary movement means very little with regards to your abilities to steer it, but I don’t remember.
It was very much a Luddite movement that succeeded
I mean, the aftermath of the Butlerian Jihad eventually led to brutal feudalism that lasted a really long time and halted multiple lines of technological and social development, so I wouldn’t exactly call it a success for the common person.
Thanks for the lore, and sorry that you had to ingest all that at some point.
Ironically, one of the biggest lore drops about dath ilan happens in a story I initially thought was a parody of rationalists and the concept of dath ilan (Eliezer used a new penname for the story). The main dath ilan character (isekai’d into an Earth mostly similar to our own but with magic and uh… other worldbuilding conceits I won’t get into here) jumps to absurd, wild conclusions throughout basically every moment of the story, and unlike HJPEV is actually wrong about basically every conclusion she jumps to. Of course, she’s a woman, and it comes up towards the ending that she is below average for dath ilan intelligence (but still above the Earth average, obviously), so don’t give Eliezer too much credit for allowing a rationalist character to be mostly wrong for once.
I don’t know how he came up with the name… other fanfic writers in rationalist-adjacent space have complained about his amateurish attempts at conlanging, so there probably isn’t a sophisticated conlang explanation about phonemes involved. You might be on the right track guessing at weird anagrams?
other fanfic writers in rationalist-adjacent space have complained about his amateurish attempts at conlanging
that feeling when the chess club shoves you into a locker
other worldbuilding conceits I won’t get into here
My invite-only internal prediction market is offering good odds that this is referring to either math pets or tentacles, and a solid parlay opportunity on it being both combined
You are close! It is a BDSM AU (inspired by an Archive of Our Own trend of writing alternate-universe settings of a particular flavor), i.e. everyone identifies as “Dominant” or “Submissive”, and that identification is more important than gender in most ways. Ironically the dath ilan character is the one freaked out by this.
I can smell the ‘rape (play) is the best kind of sex actually’ from over here.
(Eliezer used a new penname for the story).
Stop trying to make dath ilan happen, Yud.
HJPEV
I didn’t know what this meant initially. “HJP” pointed me toward “harry james potter” but I couldn’t figure out “EV,” so ofc my gutter brain decided this was “Handjob Penis Envy Vagina”. But no, it’s just “HJP Evans Verres”, Verres being HJPEV’s adopted dad.
so don’t give Eliezer too much credit
He has a credit score of 250 from this (imagine me holding two thumbs up at myself) ratings agency.
I don’t know how he came up with the name
Yeah, his naming is wack. I’m guessing that Yud went with “Verres” for HJP’s nonbiodad because it’s close to “veritas”, the Latin word for truth. But searching “verres” gives the Latin word for boar, lol. And also Gaius Verres, a notoriously bad magistrate according to Wikipedia.
I hope somebody said: ‘If we let Eliezer write this, everybody dies’
I have not read it, but the 8 GPU max thing is funny. I’m halfway there already. Worse perhaps, I’m not sure if there’s still a Voodoo Graphics card in a box somewhere (don’t think those count as a GPU, however).
And it is funny that microsoft will now be limited to 8 gpus. But your anarchist polycule commune with 10 people can have 80 of them.
see, this also includes
- aella’s bday party location
- rationalist castle
- greater sfba area polycule
- imperial chinese harem where they will solve alignment in no time. all according to the plan

Hey, it was Caroline Ellison who wanted to be part of the Imperial Chinese Harem! Also, judging by Dragon Army and Leverage Research, the communes aren’t anarchist! They are tightly controlled by a single great-man-type leader.
Doubt they can resist the urge to wrap those communes into contracts, and so they will be organisations.
As I was going to St Ives,
I met a man with eight wives…

Yeah, in some hypothetical bizarro universe where they get their 8 (cutting edge) GPU limit actually passed, I bet the monitoring scheme would be loose in a way that Microsoft just has a box to tick, while strict enough that it imposes untenable costs on private individuals.
lol they would just get a waiver from Hegseth that they’re doing very important national security stuff with it
waiver that says: I can do whatever I want, possibly acquired with the help of agent Jack Daniels
Also, all this would do is move the processing from GPU to CPU. Microsoft commissions AMD, Nvidia, or Intel to create a technically-not-a-GPU CPU and just builds a computer that uses GDDR instead of the standard DDR.
Given the USA legislature’s incompetence, I imagine they would leave some sort of massive loopholes. Depending on the exact wording, you could get around it with technically not GPUs (as you suggest), or subdividing companies so each subdivision can be technically under the limit, or cranking up the size of individual GPUs so 8 GPUs is a massive amount of compute. Of course, I really doubt it would get that far in the first place, look at how they killed California’s attempt at the most moderate AI legislation.
From what I’ve read (granted, from other reviews), the limit would be the equivalent performance of 8 4090s. Which means (assuming we believe Nvidia’s claims of 3352 AI TOPS for the 5090 vs. 1321 AI TOPS for the 4090) that you couldn’t possess more than the equivalent of 3 5090s. And that trend keeps going, so.
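A quick back-of-the-envelope of that arithmetic (taking Nvidia’s marketing TOPS at face value, so purely illustrative):

```python
# Back-of-the-envelope: how many RTX 5090s fit under an
# "8x RTX 4090 equivalent" cap, using Nvidia's claimed AI TOPS.
TOPS_4090 = 1321  # Nvidia's claimed AI TOPS for the RTX 4090
TOPS_5090 = 3352  # Nvidia's claimed AI TOPS for the RTX 5090

cap = 8 * TOPS_4090       # total allowance: 10568 TOPS
print(cap // TOPS_5090)   # -> 3 whole 5090s under the cap
```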
BREAKING: Police perform 5th anti-GPU raid in as many days. Found another illegal weed farm instead. Is the tactic of checking for unusual power usage failing? Our digital AI of John McAfee expresses concern this might hurt bitcoin miners.
I particularly agree with the point about the style being much more science-y than I’d expected, in a way that surely filters out large swathes of people. I’m assuming “people who are completely clueless about science and are unable to follow technical arguments” are just not the target audience. To crudely oversimplify, I think the target audience is 120+ IQ people, not 100 IQ people.
I haven’t read the damn book and I never will, but I have a hard time imagining there’s any modern science that can’t be explained to 100IQ smoothbrains, assuming the author is good enough.
I have a hard time imagining there’s any modern science that can’t be explained to 100IQ smoothbrains, assuming the author is good enough.
Same here. The main things stopping the LWers are that
(a) what they’re doing is utterly divorced from modern science
(b) they are godawful writers, to the point where it took years of billionaire funding and an all-consuming economic bubble to break them into the mainstream
Here are a few examples of scientifically-evidenced concepts that provoke Whorfian mind-lock, where people are so attached to existing semantics that they cannot learn new concepts. If not even 60% of folks get it, then getting it isn’t just a within-one-standard-deviation-of-average affair (see the sketch after this list).
- There are four temporal tenses in a relativistic setting, not three. “Whorfian mind-lock” was originally coined during a discussion where a logician begs an astrophysicist to understand relativity. Practically nobody accepts this at first, to the point where there aren’t English words for discussing or using the fourth tense.
- Physical reality is neither objective nor subjective, but contextual (WP, nLab) or participatory. For context, at most about 6-7% of philosophers believe this, per a 2020 survey. A friend-of-community physicist recently missed this one too, and it’s known to be a very subtle point despite its bluntness.
- Classical logic is not physically realizable (WP, nLab) and thus not the ultimate tool for all deductive work. This one does much better, at around 45% of philosophers at most, per the same 2020 survey.
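As a toy sanity check of that standard-deviation framing: assuming the usual (and dubious) IQ ~ N(100, 15) model, here’s the IQ cutoff implied by various “fraction of people who get it” numbers. The model and thresholds are my own illustration, not from the list above.

```python
# Toy model: if only the top fraction p of the population "gets" a
# concept, what IQ cutoff does that imply under IQ ~ N(100, 15)?
from statistics import NormalDist

iq = NormalDist(mu=100, sigma=15)

for p in (0.60, 0.45, 0.07):    # uptake rates mentioned above
    cutoff = iq.inv_cdf(1 - p)  # IQ above which the top fraction p sits
    print(f"top {p:.0%} of people -> IQ cutoff around {cutoff:.0f}")
# top 60% of people -> IQ cutoff around 96
# top 45% of people -> IQ cutoff around 102
# top 7% of people -> IQ cutoff around 122
```

On this toy model, only the single-digit uptake rates land clearly past one standard deviation above average.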
@gerikson@awful.systems Please reconsider the use of “100IQ smoothbrain” as a descriptor. 100IQ is average, assuming IQ is not bogus. (Also if IQ is not bogus then please y’all get the fuck off my 160+IQ ~~lawn~~ pollinator’s & kitchen garden.)
It’s a microcosm of lesswrong’s dysfunction: IQ veneration, elitism, and misunderstanding the problem in the first place. And even overlooking those problems, I think intellect only moderately correlates with an appreciation for science and an ability to understand science. Someone can think certain scientific subjects are really cool but only have a layman’s grasp of the technical details. Someone can do decently in introductory college level physics with just a willingness to work hard and being decent at math. And Eliezer could have avoided tangents about nuclear reactors or whatever to focus on stuff relevant to AI.
To be fair, you have to have a very high IQ to understand
~~Rick and Morty~~ If Anyone Builds It, Everyone Dies. The humor is extremely subtle, and without a solid grasp of theoretical physics most of the jokes will go over a typical viewer’s head. (I’m doing a variant of this meme)

There’s also Eliezer’s nihilistic outlook, which is deftly woven into his parables-- his personal philosophy draws heavily from Godel Escher Bach, for instance. The fans understand this stuff; they have the intellectual capacity to truly appreciate the depths of his parables, to realize that they’re not just entertaining- they say something deep about the nature of Intelligence. As a consequence people who dislike IABIED truly ARE idiots- of course they wouldn’t appreciate, for instance, the motivation in Eliezer’s existential catchphrase “Tsuyoku Naritai!”, which itself is a cryptic reference to Japanese culture. I’m smirking right now just imagining one of those addlepated simpletons scratching their heads in confusion as Nate Soares’ genius unfolds itself on their copy of IABIED. What fools… how I pity them. 😂 And yes, by the way, I DO have a rationalist tattoo. And no, you cannot see it. It’s for the math pet’s eyes only- and even they have to demonstrate that they’re within 5 IQ points of my own (preferably lower) beforehand.
Took 3d6 SAN damage.
translator’s note: IABIED means “plan”
there’s ied in this acronym so everyone using it gets on a watch list
yabaied
The Yude iabieds
Fuck, that’s good
I grew up near the hometown of one of the Rick and Morty creators, I think the one that got fired for excessive drunkenness and harassment. And when I found that out, my immediate reaction was, “yup, of course a guy from Manteca would make a cartoon about having an alcoholic grandpa”.