If there exists a super-intelligent AI, would the majority of AI researchers answer Yes to "Have we reached AGI?"
55% chance
Super-intelligent AI:
"Something along the lines of -> smarter than humans at most cognitive tasks, very very good at some key tasks, and can afford to be indifferent to anything it can't do." (@Duncn's comment)
"AI that is better than the majority of humans at most economically valuable tasks, but not necessarily better than the best humans at all of those tasks."
(I created this market to gauge opinion for @Primer's question)
This question is managed and resolved by Manifold.
@ShadowyZephyr This resolves whenever there is both such a survey and such a super-intelligent AI; until then, the market trades according to what that survey is expected to show.
Related questions
Will we have at least one more AI winter before AGI is realized?
33% chance
In which year will a majority of AI researchers concur that a superintelligent, fairly general AI has been realized?
Will Artificial General Intelligence (AGI) lead directly to the development of Artificial Superintelligence (ASI)?
78% chance
Will we have an AGI as smart as a "generally educated human" by the end of 2025?
26% chance
Will humans create AGI, either directly or indirectly, within the next 24 months?
16% chance
Will AI create the first AGI?
41% chance
The probability of extremely good AGI outcomes (e.g. rapid human flourishing) will be >24% in the next AI experts survey
59% chance
When artificial general intelligence (AGI) exists, what will be true?
When will manifold users think we have AGI? [Resolves to a majority yes in poll]
Conditional on there being no AI takeoff before 2050, will the majority of AI researchers believe that AI alignment is solved?
52% chance