Will an LLM Built on a State Space Model Architecture Have Been SOTA at any Point before EOY 2027? [READ DESCRIPTION]
43% chance
I don't mean "achieves SOTA on one benchmark" or "is the best FOSS model"; I mean "is the equivalent of what GPT-4 is right now".
The SSM must be in contention for the title of most generally capable LLM. I will not trade in this market because the resolution criterion isn't entirely objective.
This question is managed and resolved by Manifold.
@HanchiSun I think he means something like Mamba: https://arxiv.org/pdf/2312.00752.pdf
They are vaguely related to RNNs, though.
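To see the RNN resemblance: a plain (non-selective) linear state-space layer updates a hidden state step by step, just like a recurrent net, only with a linear transition. A minimal sketch, assuming a generic discretized SSM with fixed matrices A, B, C (Mamba itself makes these input-dependent; the function name and shapes here are illustrative, not from the paper):

```python
import numpy as np

def ssm_scan(A, B, C, xs):
    """Run a discretized linear state-space recurrence over a 1-D input sequence."""
    h = np.zeros(A.shape[0])   # hidden state, analogous to an RNN's
    ys = []
    for x in xs:               # sequential scan: h_t = A h_{t-1} + B x_t
        h = A @ h + B * x
        ys.append(C @ h)       # linear readout: y_t = C h_t
    return np.array(ys)

# Toy usage with a stable 2-D state (values chosen for illustration only)
A = 0.5 * np.eye(2)
B = np.ones(2)
C = np.ones(2)
print(ssm_scan(A, B, C, [1.0, 0.0]))  # state decays by half each step
```

Because the transition is linear and time-invariant, the same recurrence can also be unrolled as a convolution and trained in parallel, which is the practical advantage over a classic nonlinear RNN.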
Related questions
Will any LLM released by EOY 2025 be dangerously ASL-3 as defined by Anthropic? (48% chance)
What will be true of Anthropic's best LLM by EOY 2025?
What will be true of OpenAI's best LLM by EOY 2025?
[Situational awareness] Will pre-2026 LLMs achieve token-output control? (30% chance)
Will the transformer architecture be replaced in SOTA LLMs by 2028? (66% chance)
Will the most interesting AI in 2027 be a LLM? (52% chance)
[Situational awareness] Will pre-2028 LLMs achieve token-output control? (38% chance)
Will an LLM be able to solve the Self-Referential Aptitude Test before 2027? (66% chance)
LLMs widely used in economics modeling by the end of 2026? (43% chance)
By 2026, will it be standard practice to sandbox SOTA LLMs? (29% chance)