How much FLOP will be used to train the best language model with freely available weights on July 1, 2025?
Basic market · Ṁ1204 pool · Jul 2
<1e24: 2%
[1e24, 3e24): 2%
[3e24, 1e25): 12%
[1e25, 3e25): 15%
[3e25, 1e26): 33%
[3e26, 1e27): 25%
[1e27, 3e27): 4%
[3e27, 1e28): 1.9%
[1e28, 3e28): 1.6%
[3e28, 1e29): 1.2%
>=1e29: 1.2%
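For context on how a model gets placed in one of these buckets, the usual back-of-the-envelope is the dense-transformer approximation C ≈ 6·N·D, where N is parameter count and D is training tokens. Below is a minimal Python sketch under stated assumptions: the bucket edges assume a full 1-3-10 log ladder, and the Llama 3.1 405B figures (roughly 4.05e11 parameters, 1.56e13 tokens) are Meta's reported values, used purely as an illustration.

# Minimal sketch: estimate training compute with the common dense-transformer
# rule of thumb C ~ 6 * N * D, then map it onto this market's answer buckets.
# Assumption: bucket edges follow a full 1-3-10 log ladder from 1e24 to 1e29.

BUCKET_EDGES = [1e24, 3e24, 1e25, 3e25, 1e26, 3e26, 1e27, 3e27, 1e28, 3e28, 1e29]

def fmt(x: float) -> str:
    # Compact scientific notation: "3e+25" -> "3e25"
    return f"{x:.0e}".replace("e+", "e")

def training_flop(params: float, tokens: float) -> float:
    # ~6 FLOP per parameter per training token (forward + backward pass)
    return 6.0 * params * tokens

def bucket(flop: float) -> str:
    # Place a FLOP estimate into the market's half-open buckets
    if flop < BUCKET_EDGES[0]:
        return f"<{fmt(BUCKET_EDGES[0])}"
    for lo, hi in zip(BUCKET_EDGES, BUCKET_EDGES[1:]):
        if lo <= flop < hi:
            return f"[{fmt(lo)}, {fmt(hi)})"
    return f">={fmt(BUCKET_EDGES[-1])}"

# Illustration with Meta's reported Llama 3.1 405B figures
c = training_flop(4.05e11, 1.56e13)
print(f"{c:.2e} -> {bucket(c)}")   # 3.79e+25 -> [3e25, 1e26)

By this estimate Llama 3.1 405B sits at roughly 3.8e25 FLOP, i.e. in the [3e25, 1e26) bucket. Note the 6ND rule ignores mixture-of-experts sparsity and any fine-tuning or RL compute, so treat it as an order-of-magnitude estimate only.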
This question is managed and resolved by Manifold.
Related questions
End of pre-training era for language models: Will an LM fine-tune for more FLOPs than it is pre-trained for, before 2026? (42% chance)
Will a machine learning training run exceed 10^27 FLOP in China before 2028? (44% chance)
Will a machine learning training run exceed 10^26 FLOP in China before 2026? (57% chance)
How many FLOPs will go into training the first ASL-3 model?
Will any 10 trillion+ parameter language model that follows instructions be released to the public before 2026? (48% chance)
When will a US government AI run overtake private AI compute by FLOP?
Will a machine learning training run exceed 10^27 FLOP in China before 2030? (66% chance)
Will a flagship (>60T training bytes) open-weights LLM from Meta which doesn't use a tokenizer be released in 2025? (43% chance)
Will the largest machine learning training run (in FLOP) as of the end of 2030 be in the United States? (74% chance)
Will the largest machine learning training run (in FLOP) as of the end of 2040 be in the United States? (64% chance)