WhiteBox Research will have at least four fellows spending ≥10 h/wk on an alignment-oriented project by EoY 2024.
Ṁ988 · Closes Dec 31 · 79% chance
WhiteBox Research is currently running an interpretability fellowship in Manila. Eleven participants are in the fellowship's guided research phase, 6 of whom will use Neel Nanda's 200 Concrete Open Problems to upskill rapidly in mechanistic interpretability. We are funded by the Long-Term Future Fund and Manifund.
Some of the fellows in our first cohort include:
- an IOI silver medalist
- an Iranian Geometry Olympiad bronze medalist
- an AI engineering technical manager and former trainer for the Philippine IOI team
Other fellows have also won first and third place in recent Apart Hackathons.
The project must last at least four weeks, and need not be funded by any external funding source or grantmaker.
This question is managed and resolved by Manifold.
Related questions
- WhiteBox Research will get at least one fellow accepted into a ≥0.5 FTE AI safety research fellowship by EoY 2024. — 55% chance
- Will any foundation models/LLMs be able to reliably come up with novel unparalleled misalignments before EOY 2024? — 49% chance
- Will I have a career as an alignment researcher by the end of 2024? — 38% chance
- Will I still work on alignment research at Redwood Research in 2 years? — 74% chance
- How much money will Tetra get for alignment research in 2024?
- Will I still work on alignment research at Redwood Research in 3 years? — 58% chance
- Will I still work on alignment research at Redwood Research in 5 years? — 30% chance
- Will a major AI alignment office (eg Constellation/Lightcone/HAIST) give out free piksters to alignment ppl by EOY 2027? — 43% chance
- Z Fellows partners with another university or accelerator program by May 31, 2025 — 57% chance
- Will any of the 2016 Pareto Fellows end up with a MacArthur Fellowship by EOY 2040? — 50% chance