A major ML paper demonstrates a symbolic-enhanced transformer successor outperforming standard transformers by March 2025
Ṁ1495 · Mar 31 · 21% chance

Will a published machine learning paper demonstrate a new architecture that combines transformers with symbolic methods (category theory, programming language theory, or logic theory) and achieves superior performance on standard benchmarks compared to traditional transformer-only architectures?

Resolution criteria: The paper must be published on arXiv (preferably also at a major ML conference or journal) and show statistically significant improvements over baseline transformers on multiple standard benchmark tasks.
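As an illustrative sketch (not part of the official resolution criteria), one way a resolver could check "statistically significant improvements on multiple standard tasks" is a paired permutation test over per-benchmark scores. All scores and the significance threshold below are hypothetical placeholders, not data from any actual paper.

```python
import random

def paired_permutation_test(baseline, candidate, n_resamples=10_000, seed=0):
    """Two-sided p-value for the mean paired score difference
    between two models evaluated on the same benchmarks."""
    rng = random.Random(seed)
    diffs = [c - b for b, c in zip(baseline, candidate)]
    observed = sum(diffs) / len(diffs)
    hits = 0
    for _ in range(n_resamples):
        # Under the null hypothesis the sign of each paired
        # difference is arbitrary, so flip each sign at random.
        resampled = [d if rng.random() < 0.5 else -d for d in diffs]
        if abs(sum(resampled) / len(resampled)) >= abs(observed):
            hits += 1
    return hits / n_resamples

# Hypothetical per-task accuracies on five standard benchmarks.
transformer_baseline = [71.2, 68.5, 80.1, 64.3, 75.0]
symbolic_hybrid      = [73.0, 70.1, 81.4, 66.0, 76.2]

p = paired_permutation_test(transformer_baseline, symbolic_hybrid)
print(f"p-value: {p:.4f}")
```

Note that with only five benchmark tasks the smallest achievable two-sided p-value for a sign-flip test is 2/32 ≈ 0.0625, which is one reason the criteria ask for improvements across multiple tasks rather than a single headline number.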

© Manifold Markets, Inc.