If we find out in 2024, was o1's Transformer base trained on 10+x as much compute as GPT-4's?
Ṁ1080 · Jan 2
19% chance
This question is managed and resolved by Manifold.
Related questions
Will there be an OpenAI LLM known as GPT-4.5 by 2033?
70% chance
Will an open source model beat GPT-4 in 2024?
65% chance
Will xAI release an LLM with BIG-Bench score as good as GPT-4 Turbo before the end of 2024?
77% chance
Will China recreate GPT-4o by EOY 2024?
20% chance
Will there be an LLM (as good as GPT-4) that was trained with 1/100th the energy consumed to train GPT-4, by 2026?
53% chance
Will OpenAI release a model better than GPT-4o without test time compute before EOY 2024?
74% chance
Will Inflection AI have a model that is 10x the size of the original GPT-4 at the end of Q1 2025?
26% chance
Will any open source LLM with <20 billion parameters outperform GPT-4 on most language benchmarks by the end of 2024?
13% chance
How much compute will be used to train GPT-5?
In yottaFLOPs (10^24), how much compute will GPT-4 be trained with?
22
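To put these units in context: the last market asks for GPT-4's training compute in yottaFLOPs (1 yottaFLOP = 10^24 FLOPs), with 22 as the current market estimate, and the headline market asks whether o1's base model used 10x or more of GPT-4's compute. A minimal arithmetic sketch, assuming the market's 22-yottaFLOP figure (not an official OpenAI number):

```python
# Unit conversion and threshold arithmetic for the markets above.
YOTTA = 1e24  # 1 yottaFLOP = 10^24 FLOPs

# Market estimate for GPT-4 training compute (22 yottaFLOPs); hypothetical,
# taken from the market forecast, not from any official disclosure.
gpt4_compute_flops = 22 * YOTTA

# The headline question resolves YES only if o1's Transformer base used
# at least 10x this compute.
threshold_flops = 10 * gpt4_compute_flops

print(f"GPT-4 (market estimate): {gpt4_compute_flops:.1e} FLOPs")
print(f"10x threshold for o1's base: {threshold_flops:.1e} FLOPs")
```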