Will the next major LLM by OpenAI use a new tokenizer?
Ṁ1273 · Dec 31
77% chance
The GPT-2 model used r50k_base: vocab size = 50k
The GPT-3 model used r50k_base: vocab size = 50k
The GPT-3.5 model used cl100k_base: vocab size = 100k
The GPT-4 model used cl100k_base: vocab size = 100k
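The vocab-size jump from r50k_base to cl100k_base matters because a larger vocabulary tends to encode the same text in fewer tokens. The toy sketch below (not OpenAI's actual BPE, and the vocabularies are made up for illustration) uses greedy longest-match tokenization to show the effect:

```python
# Toy illustration (not OpenAI's real BPE): a greedy longest-match
# tokenizer over a hypothetical small vs. large vocabulary, showing why
# a bigger vocab tends to yield fewer tokens for the same text.

def greedy_tokenize(text, vocab):
    """Greedily match the longest vocabulary entry at each position;
    fall back to single characters when nothing matches."""
    tokens = []
    i = 0
    while i < len(text):
        for length in range(len(text) - i, 0, -1):
            piece = text[i:i + length]
            if piece in vocab or length == 1:
                tokens.append(piece)
                i += length
                break
    return tokens

small_vocab = {"to", "ke", "ni", "zer"}        # hypothetical ~50k-style vocab
large_vocab = small_vocab | {"token", "izer"}  # hypothetical ~100k-style vocab

text = "tokenizer"
print(greedy_tokenize(text, small_vocab))  # ['to', 'ke', 'ni', 'zer']
print(greedy_tokenize(text, large_vocab))  # ['token', 'izer']
```

With the real encodings, the same comparison can be run via the `tiktoken` library (`tiktoken.get_encoding("cl100k_base")`), which also exposes each encoding's vocabulary size through `n_vocab`.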
This question is managed and resolved by Manifold.
@firstuserhere So YES if there's a GPT-4.5/5 that uses a tokeniser not on this list, and NO if there's a GPT-4.5/5 that uses a tokeniser that is on this list?
Related questions
When will OpenAI release a more capable LLM?
Will there be an OpenAI LLM known as GPT-4.5 by 2033?
72% chance
Will OpenAI's next major LLM (after GPT-4) solve more than 2 of the first 5 new Project Euler problems?
45% chance
Will OpenAI release a tokenizer with more than 210000 tokens before 2026?
24% chance
Will OpenAI reveal a textless LLM before 2025?
12% chance
Will OpenAI release a tokenizer with vocab size > 150k by end of 2024?
42% chance
What will be true of OpenAI's best LLM by EOY 2025?
Will a flagship (>60T training bytes) open-weights LLM from Meta which doesn't use a tokenizer be released in 2025?
43% chance
Will OpenAI give their new LLM an anthropomorphic name?
20% chance
Will the most interesting AI in 2027 be an LLM?
52% chance