Q: How would the way I want to spend my life change if I estimated that Pr(humans are extinct before 2030) is large?

(I’m specifically thinking about this because I’ve recently become concerned about the x-risk posed by failure to align superhuman AI agents, although x-risk is not a novel problem.)

For simplicity, let’s just suppose you were certain that humanity would be extinct by 2030. For the record, I don’t think this is certain, but it’s an interesting extreme case to think about.

  • Well, this would mean that a lot of long-term goals cease to matter.
  • For instance, curing aging or cancer would no longer be such an important thing to do.
  • It makes “accumulate money” a pretty obviously bad strategy, which can help cure an aversion to allocating money wherever you think it should go in order to maximize utility.
  • One other update is that you should try to do something to lower x-risk. Should you drop everything and work solely on reducing x-risk?
    • I mean, kind of, but don’t be a naive optimizer.
    • For instance, if you don’t have a realistic way to pivot into a career directly reducing x-risk, then donating to charities that fund people working to reduce x-risk is probably a smarter allocation of your time and money.
  • I think it makes it more obvious that you shouldn’t engage in trade-offs of “pain now” for nebulous future benefits.

So, believing that humanity is near the end should result in some shuffling of priorities. But if you look at my distilled, more general priorities, which made no reference to an end time for humanity, I think they all still hold up pretty well.

If we have one decade, or one week, or even just one day left, then I’d advocate for filling that decade/week/day with awesome conscious experiences.

“The proof is left to the reader as an exercise. The battle is worth it, Johann.” - Nathan