
Will I believe in 1 year that DeepSeek R1 was substantially trained via distillation of a US model?
64% chance
This question is managed and resolved by Manifold.
Related questions
Will DeepSeek's next reasoning model be called R3? (1% chance)
"Holy shit!" -> my reaction to deepseek r1. Will I feel the same about any AI developments in the next 5 months? (6% chance)
Will OpenAI’s claims that DeepSeek is a distillation of their models become the consensus view? (16% chance)
Will DeepSeek become a closed AI lab by EOY? (25% chance)
When will DeepSeek release R2?
Will DeepSeek's next reasoning model be open-sourced? (83% chance)
Did DeepSeek violate OpenAI's terms of service by using OpenAI model outputs for distillation in 2024 or January 2025? (22% chance)
Will DeepSeek R2 be open source? (83% chance)
Did DeepSeek lie about the GPU compute budget they used in the training of v3? (13% chance)
Will DeepSeek be banned in the US before 2027? (38% chance)