• 2 Posts
  • 19 Comments
Joined 1 year ago
Cake day: September 28th, 2023


  • Post from July, tweet from today:

    It's easy to forget that Scottstar Codex just makes shit up, but what the fuck "dynamic" is he talking about? He's describing this like a recurring pattern and not an addled fever dream

    There's a dynamic in gun control debates, where the anti-gun side says "YOU NEED TO BAN THE BAD ASSAULT GUNS, YOU KNOW, THE ONES THAT COMMIT ALL THE SCHOOL SHOOTINGS". Then Congress wants to look tough, so they ban some poorly-defined set of guns. Then the Supreme Court strikes it down, which Congress could easily have predicted but they were so fixated on looking tough that they didn't bother double-checking it was constitutional. Then they pass some much weaker bill, and a hobbyist discovers that if you add such-and-such a 3D printed part to a legal gun, it becomes exactly like whatever category of guns they banned. Then someone commits another school shooting, and the anti-gun people come back with "WHY DIDN'T YOU BAN THE BAD ASSAULT GUNS? I THOUGHT WE TOLD YOU TO BE TOUGH! WHY CAN'T ANYONE EVER BE TOUGH ON GUNS?"

    Embarrassing to be this uninformed about such a high-profile issue, no less one that you're choosing to write about derisively.

  • Short answer: "majority" is hyperbolic, sure. But it is an elite conviction espoused by leading lights like Nick Beckstead. You say the math is "basically always" based on flesh and blood humans, but when the exception is the ur-texts of the philosophy, counting statistics may be insufficient. You can't really get more inner sanctum than Beckstead.

    Hell, even 80000 hours (an org meant to be a legible and appealing gateway to EA) has openly grappled with whether global health should be deprioritized in favor of so-called suffering-risks, exemplified by that episode of Black Mirror where Don Draper indefinitely tortures a digital clone of a woman into subjugation. I can't find the original post, formerly linked to from their home page, but they do still link to this talk presenting that original scenario as a grave issue demanding present-day attention.


  • less than 1%…on other long-term…which presumably includes simulated humans.

    Oh it's way more than this. The linked stats are already way out of date, but even in 2019 you can see existential risk rapidly accelerating as a cause, and, as you admit, much more so with the hardcore EA set.

    As for what simulated humans have to do with existential risk, you have to look to their utility functions: they explicitly weigh the future pleasure of these now-hypothetical simulations as outweighing the suffering of any and all present or future flesh-and-blood humans.