Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?
This is your opportunity to ask questions. No question too simple or too silly.
Culture war topics are accepted, and proposals for a better intro post are appreciated.
I guess if all science fiction written after 2024 includes something suspiciously like a Butlerian Jihad, then be careful what you wish(ed) for?
Well, in my setting, I chose the route of having the initial singularity aborted by humanity panicking the fuck out, regardless of whether the AI was actually aligned.
And the justification for limiting the spread of generalized ASI was to prevent that from happening again, with the operational solution being either locking AGIs to the last human-level capability empirically proven safe, or allowing only narrowly superhuman AGI.
It's a world where Yudkowskian fighter jets dropping bombs on data centers isn't a joke, though they usually go for nukes and antimatter explosives instead.
I'll leave it to the reader to decide whether that's a bad thing, but at the very least I don't commit the sin of depicting technology worse than today's without any explanation. A Butlerian Jihad it isn't, though.