This is a classic sequence post: (mis)appropriated Japanese phrases and cultural concepts, references to the AI box experiment, and links to other sequence posts. It is also especially ironic given Eliezer’s recent turn to doomerism, with his new catchphrases of “shut it all down”, “AI alignment is too hard”, and “we’re all going to die”.
Indeed, with developments in NN interpretability and a use case of making LLMs not racist or otherwise horrible, it seems to me like there is finally actually tractable work to be done (that is at least vaguely related to AI alignment)… which is probably why Eliezer is declaring defeat and switching to the podcast circuit.
The linked post exemplifies the virtue of hazukashii, truly itai school hakushin-class yabai-style writing. In Japan they don’t say “Rationalism”, they say “クリンジ” (“cringe”), and I think that’s beautiful.
HPMOR aka ハリポッターとクリンジのじゅつ (“Harry Potter and the Art of Cringe”)
Such pointless orientalism, unbearable levels of cringe.
Jesus Christ, I clicked on this just to see the crimes against the Japanese language (since I’m always keeping an eye out for my own), thinking huh, maybe I can learn something here—
The virtue of tsuyoku naritai, “I want to become stronger”, is to always keep improving
get the fuck out yud
Doing real work is hard. Making things up is so much simpler.
Did I win? Why yes, I did.
(dead link)
archive got your back: https://web.archive.org/web/20091221032219/http://yudkowsky.net/singularity/aibox
It’s amazing that one can use so many words to describe such a simple concept. Astonishing lack of economy. I am not going to read or even skim that, I just thought it was funny that he was like ‘I won, see?’ and his proof is a dead link.
“baffle them with bullshit”