Will Yudkowsky and Soares' book get on the NYT bestseller list in 2025?
378 traders · Ṁ130k · closes Dec 31 · 66% chance

Will Eliezer Yudkowsky and Nate Soares' book "If Anyone Builds It, Everyone Dies" get on the NYT bestseller list this year?


Verification will be based on the official NYT Best Seller lists. Currently I understand that to mean this resolves YES if it makes the online list (top 35), but I intend it to mean whatever best maps to being able to write "New York Times Bestseller" on the book.

Number sold question: https://manifold.markets/NathanpmYoung/how-many-copies-of-yudkowsky-and-so?play=true

  • Update 2025-05-18 (PST) (AI summary of creator comment): The criteria for the book appearing on the NYT Bestseller list are:

    • List frequency: weekly

    • Required placement: top 35

    • Eligible lists: any category

bought Ṁ200 NO

Just hedging, don't mind me :3

bought Ṁ60 NO

I’d love to be wrong but I don’t think it’s a topic popular enough to become a bestseller.

@MartinZokov well, some books on the topic have become bestsellers, so the issue is not the topic.

I’m still puzzled. What makes people think this is so much likelier than base rates (yes, even accounting for topic and publisher)? Rough estimates suggest base rates in the low two digits. Of course, this one could be different! But why?

@MachiNi Familiarity bias amongst Manifold's userbase

@MachiNi coordinated effort by rationalists to manipulate the bestseller list

@nikki everybody knows Barnes and Noble customers are eager to hear what the rationalists recommend!

@MachiNi Both Superintelligence and What We Owe The Future were NYT bestsellers (others I'd put in this book's reference class that were not are Human Compatible and The Precipice).

Rats are cute! And smart! 🐀

@RobertCousineau I’m honestly struggling to construct a good reference class. AI risk nonfiction is too narrow to make meaningful inferences. AI broadly construed is too loose. For every Superintelligence and WWOTF there are probably at least twice as many comparable books that don’t become bestsellers.

Buying some "No" here seems like a good idea on the basis that, imo, Manifold tends to be overly 'bullish' on Yudkowsky. That being said, given the hook title and the topic being AI, if anyone sees it, they'll (probably) pick it up.

@vitamind

if anyone sees it, they'll (probably) pick it up

Why?

@vitamind I actually think AI will probably be a boring topic by September, since GPT-5 will likely be a boring release.

bought Ṁ400 NO

My feeling is what gets on the NYT bestseller list is less to do with how true or important the book is, and more to do with vibes. I have a (baseless) feeling that most normal people will see a book with "everyone dies" in the title and not buy it.

@pietrokc

that most normal people will see a book with "everyone dies" in the title and not buy it.

I strongly disagree. Tons of viral content and bestselling books are centered around doomsday / apocalyptic predictions (both fiction and nonfiction).

A few recent examples: Oppenheimer (a long, black-and-white biopic) made over a billion dollars; Black Mirror is super popular; etc.

  1. It confirms people's existing biases against AI & tech: normies are concerned about AI from decades of sci-fi, the NYT is not a big fan of "tech bros", there's concern about AI taking people's jobs, AI labs' IP theft, etc.

  2. Negativity bias: evolutionarily, we're biased toward information that helps us survive or warns of imminent danger.

@pietrokc The bestseller list is, as it turns out, less about being the best seller and more about good marketing and getting the attention of the NYT editorial board. This is going through a legit, traditional publisher who knows how to push for a place on the list, and the marketing copy so far is solid. 70% is probably pretty fair, leaning optimistic but certainly a reasonable probability.

I preordered the book.

filled a Ṁ150 NO at 60% order

@pietrokc I agree. Also, is it is far from clear to me that it is true or important.

@elf Fair points about Oppenheimer and Black Mirror. I'll harp again on this distinction though: neither of these have "everyone will die" in the title. Again I'm not saying it won't be a bestseller; just that the odds are <70%.

For some context, the base rate for a book getting on the NYT bestseller list is 0.5%. So people are positing a >100x multiplier on this book. That feels way too high, and the Manifold community is known to be way too obsessed with AI.

@CraigDemel I personally don't think EY's stronger claims about AI are anywhere close to true, but I wanted to keep that out of the equation since this market is at best indirectly about what's true.

@pietrokc The base rate is a lot higher for books by this specific publisher.

@Driftloom That's really interesting. Do you have concrete numbers?

@pietrokc I have napkin math. LBC is one of the six adult-trade imprints at Hachette Book Group. Making some assumptions about share of total output and share of bestseller output, their base rate to end up on the NYT list is ~8%.
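The napkin math in this thread can be sketched as a few lines of arithmetic. The imprint's title and bestseller counts below are hypothetical round numbers (the comment doesn't state its inputs), chosen only to reproduce the ~8% base rate quoted above; the 66% is the market price shown at the top of the page.

```python
# Napkin-math sketch of the thread's base-rate reasoning.
# Inputs marked "hypothetical" are illustrative assumptions, not real data.

titles_per_year = 150       # hypothetical: titles the imprint publishes per year
bestsellers_per_year = 12   # hypothetical: of those, titles that make the NYT list

# Imprint-level base rate for landing on the list
imprint_base_rate = bestsellers_per_year / titles_per_year  # 0.08, i.e. ~8%

# How large a multiplier over that base rate the market price implies
market_price = 0.66         # the 66% chance shown on the market
implied_multiplier = market_price / imprint_base_rate       # ~8x

print(f"imprint base rate: {imprint_base_rate:.0%}")
print(f"implied multiplier: {implied_multiplier:.1f}x")
```

With these illustrative inputs the market price implies roughly an 8x deviation from the imprint base rate, in line with the ~10x figure discussed later in the thread.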

@CraigDemel Graham Hancock sells a lot of books. The truth and importance of a nonfiction topic is… fairly orthogonal to its sales.

@Driftloom GPT gave me a 10-13% base rate for them. Still much, much lower than the market price.

@MachiNi Yeah, it’s a base rate. You gotta consider whether and if so to what degree the specifics of this particular instance produce deviation from that rate. That’s kinda the point of the question, right?

@Driftloom Thanks, that's super helpful. With some cursory googling and wikipediaing I had come up with a 5-10% number so it's reassuring to see it corroborated. So yeah when I first saw it this market was putting a ~10x multiplier on this base rate.

@MachiNi I wouldn't trust GPT with something like this.

@pietrokc Concur on GPT, unless you check its sources yourself.

To put all the cards on the table, I think the current odds are optimistic but not unreasonably so. It would require a well-coordinated and well-executed marketing and distribution strategy from the publisher, which probably includes putting Eliezer and Nate out there on press tours and podcasts. If they choose to make that investment, I suspect they have a reasonable chance of success. If they can't break out of the rat-adjacent niche into normie space, this probably goes to zero. If they don't actively choose to try, it definitely goes to zero. Signals to watch for would be podcast appearances (Fridman, even Rogan), public comments (even negative ones, from Gary Marcus or Yann LeCun, say), and even a hint of buzz in more mainstream publications. Physical presence and distribution in bookstores will matter. It's a real hill! And a harder one to climb with a complex work of nonfiction than with the latest in an established fiction series.

And, the NYT can always exercise editorial discretion and exclude it for any reason they want. So it needs some buzz. There’s a reason the base rate is low.

But, if it were gonna hit the bestseller list, the path so far looks like the one it would need to be on to get there.

@Driftloom you guys do what you want, but the analysis was tailored to this specific case and convincing to me. Of course it's just a base rate. But you have to justify wild deviations from base rates, and I'm not seeing it for now.


@MachiNi The problem is that the internet is awash with examples of people asking the AIs questions exactly like this, involving specific researchable numbers, and the AI simply makes up the numbers. For all we know all these %s it produced are entirely fictional. Whether it's convincing to us is irrelevant -- the whole reason we're asking is we don't know the answer!

[Added: here's just one example I saw yesterday]

@pietrokc I fear you may be right. Still, it returned a figure that's perfectly in line with the back-of-the-envelope calculations in the thread. Maybe it's garbage, but based on what I've seen these are not unreasonable estimates.

No need to keep saying the reason we're asking is that we don't know. There are things we know and they inform our predictions. There are more and less informed bets.
