Will second-order optimizers displace first-order optimizers for training LLMs by 2030?
2029 · 35% chance

Will something like Shampoo become the default instead of Adam?


Shampoo was proposed in 2018, but it was only implemented by Google in 2021 and by the pytorch-optimizer library in 2023. The most recent major proposed improvement is SOAP, which has since been implemented in pytorch-optimizer as well.

We can get a sense of how many people are using Shampoo by counting GitHub search results for imports of the Google code and of the pytorch-optimizer code. As of January 7th, I'm seeing 262 and 526 results respectively. Normalizing by the number of days each repository has been available:

- 262 files / 1407 days ≈ 0.186 files/day

- 526 files / 695 days ≈ 0.757 files/day
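
If you want to redo this arithmetic as the counts drift, here's a minimal sketch (the labels are mine; the hit counts and day spans are just the January 7th figures quoted above):

```python
# Crude adoption proxy: GitHub code-search hits for each implementation,
# normalized by how long that implementation has been publicly available.
# Counts and day spans are the January 7th figures quoted above.
implementations = {
    "Google Shampoo (2021)": (262, 1407),            # (hits, days available)
    "pytorch-optimizer Shampoo (2023)": (526, 695),
}

for name, (hits, days) in implementations.items():
    print(f"{name}: {hits}/{days} = {hits / days:.3f} files/day")
```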

Now let's look at Lion. You remember Lion, right? The Adam-killer from a little under three years ago?

lucidrains posted an implementation 683 days ago, one day after the paper's release. Searching for mentions on GitHub yields 616 results: 616/683 ≈ 0.902 files/day. But seriously, when was the last time you thought about Lion?

This market resolves in 2030, so maybe some amazing thing will happen in 2028, but the early evidence doesn't favor a probability any higher than 35%.

For fun, some stats on Adam: the paper was submitted in 2014, but the first PyTorch commit was in 2016. Searching for the PyTorch implementation yields vastly more hits, producing a files/day ratio of 38.16 as of today.
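
If you want to reproduce these hit counts yourself, GitHub's code-search API returns them directly. A hedged sketch (the query strings are my guesses at reasonable import searches, not necessarily the exact ones behind the numbers above):

```python
# Hedged sketch: pull code-search hit counts from GitHub's REST API.
# The /search/code endpoint and its total_count field are real; the
# queries below are illustrative guesses, and GitHub requires an auth
# token for code search.
import requests

TOKEN = "<your GitHub personal access token>"
QUERIES = {
    "Google Shampoo": '"scalable_shampoo" language:Python',
    "pytorch-optimizer": '"from pytorch_optimizer import" language:Python',
    "Lion (lucidrains)": '"from lion_pytorch import" language:Python',
}

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Accept": "application/vnd.github+json",
}
for label, query in QUERIES.items():
    resp = requests.get(
        "https://api.github.com/search/code",
        params={"q": query},
        headers=headers,
    )
    resp.raise_for_status()
    print(f"{label}: {resp.json()['total_count']} hits")
```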

Seems possible but <50%? Why do you think it's so high?
