Hah, nice read.
It’s always been wild to me how the idea of “a future AI threatens incredible suffering if you don’t help it come into existence” sparks “oh, then I better help create the torture-AI” in people, rather than “what the fuck, that’s evil and cruel, let’s make sure this doesn’t come to pass”.
But of course, the Roko folks start from the notion that this AI is 100% inevitable, at which point… there’s no need to sadistically torture people.
In the same vein as “God can’t be all-powerful and all-good”:
The AGI either doesn’t exist, or has no need to torture people.

Oh, also: you should cross-post this to [email protected], they will absolutely love this.