Will Superalignment succeed, according to Eliezer Yudkowsky?
108 traders · Ṁ54k volume
Resolved NO (May 20)

If @EliezerYudkowsky believes Superalignment has succeeded in the goal of "solv[ing] the core technical challenges of superintelligence alignment in four years [i.e. by July 5th, 2027]", this market will resolve YES. If the team dissolves, reorganizes, or pursues a separate research direction unlikely to lead to a solution to the alignment problem and Eliezer does not believe they have yet succeeded, this resolves NO.

Resolution will be based on Eliezer Yudkowsky's public communications (e.g. on the AI Alignment Forum or in the comments here). Resolution may be delayed after July 5th, 2027, until Eliezer's belief about this is clear to me. May resolve to % if Eliezer so decides.

🏅 Top traders (by total profit)

1. Ṁ501
2. Ṁ140
3. Ṁ101
4. Ṁ72
5. Ṁ35


Joshua (bought Ṁ10,000 NO) · 11mo

@jcb we can resolve this "no" now, right?

jcb · 11mo

@Joshua I had some hesitation, wanting to hear it directly from @EliezerYudkowsky, but I think I feel reasonably comfortable interpreting his retweet of this as cause for a NO resolution. (Eliezer, if this is wrong, feel free to correct us and I'll ask the mods to fix it.)

MartinRandall · 1y

How does this resolve if Yudkowsky is dead?

jcb · 1y

@MartinRandall N/A. (If he wants to delegate to a successor, I'll have to think about whether to accept that.)

MartinRandall · 1y

Is this the core technical challenges as Yudkowsky sees them, or as OpenAI see them?

E.g., taking a safe pivotal act might be viewed as a technical challenge by one and a political challenge by the other.

jcb · 1y

@MartinRandall The core technical challenges as Yudkowsky sees them. I think this is the most straightforward reading of the question, and it seems more meaningful and valuable than trying to grasp Yudkowsky's belief about whether Superalignment solved the core technical challenges as OpenAI sees them.

MartinRandall · 1y

@jcb Then maybe this already resolves NO, if they are pursuing a separate research direction, i.e. the challenges as they see them.

jcb · 1y

@MartinRandall I can imagine an argument that they are already pursuing a direction unlikely to lead to a solution to the alignment problem as Yudkowsky sees it. But I have enough uncertainty about what Superalignment will produce that I'd be very hesitant to resolve early on those grounds (even with direct input from Eliezer to that end). In spirit, this clause is about a pivot away from working directly on alignment.

1y

I cannot think of a non-certain market that should have a lower percentage.
