
Evidence that could count includes:
1. detecting a star system full of paperclips or some other dumb thing being maximized
2. detecting radio transmissions of "help, help, the Cylons are trying to genocide us"
Take the Drake equation and tack on a few parameters: odds that human-level intelligent life eventually develops AGI = 0.5, odds that the AGI goes bad = 0.5, odds that a bad AGI does something detectable at a distance (by acquiring huge resources to maximize whatever) = 0.8.
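To make that arithmetic explicit, here's a minimal sketch of the extended product. The three tacked-on odds (0.5, 0.5, 0.8) are the ones from the comment above; the Drake-equation outputs fed into it are purely illustrative placeholders, not estimates.

```python
# Back-of-the-envelope sketch: tack the three extra factors onto a
# Drake-equation-style count of civilizations. Combined factor = 0.2.

def expected_detectable_rogue_agis(
    n_civilizations: float,        # assumed Drake-equation output (placeholder)
    p_develops_agi: float = 0.5,   # odds a human-level civilization eventually builds AGI
    p_agi_goes_bad: float = 0.5,   # odds that AGI is misaligned
    p_detectable: float = 0.8,     # odds a bad AGI is detectable at a distance
) -> float:
    """Multiply the tacked-on factors onto a Drake-style civilization count."""
    return n_civilizations * p_develops_agi * p_agi_goes_bad * p_detectable


if __name__ == "__main__":
    for n in (1, 10, 1000):  # hypothetical Drake-equation outputs
        print(f"N = {n:5d} civilizations -> "
              f"{expected_detectable_rogue_agis(n):.1f} detectable misaligned AGIs")
```

So whatever number the vanilla Drake equation gives you, this version knocks it down to about a fifth of that.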
This should be at least as high as the AI doom market, minus the probability that AI kills everyone before anyone can publish anything.
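Written as a rough inequality (the probabilities here are just symbols standing in for the respective market prices, not actual numbers):

$$P(\text{this market resolves YES}) \;\gtrsim\; P(\text{AI doom}) - P(\text{AI kills everyone before anything is published})$$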
@ThomasKwa A misaligned superintelligence on Earth would probably act too quickly for humans to notice and publish about it before they're dead. For this question I'm mainly thinking about extraterrestrial misaligned superintelligences.
@MartinRandall Yes, if there's still anyone around on Earth to observe/publish and resolve the market.