Q: How would the way I want to spend my life change if I estimated that Pr(humans are extinct before 2030) is large?
(I'm specifically thinking about this because I've recently become concerned about x-risk posed by failure to align superhuman AI agents, although x-risk is not a novel problem).
For simplicity, let's just suppose you were certain that humanity would be extinct by 2030. For the record, I don't think this is certain, but I think it's an interesting extreme case to think about.
- Well, this would mean that a lot of long-term goals cease to matter (there's a toy expected-value sketch after this list).
- For instance, curing aging or cancer would no longer be such an important thing to do.
- It seems to make "accumulate money" a pretty obviously bad strategy. This can help cure an aversion to actually allocating money wherever you think it will maximize utility.
- One other update is that you should do some things to lower x-risk. But should you drop everything and work on reducing x-risk full-time?
- I mean, kind of, but don't be a naive optimizer.
- For instance, if you don't have a realistic way to pivot into a career directly reducing x-risk, then maybe donating to charities that fund people working to reduce x-risk is a smarter allocation of your time.
- I think it makes it more obvious that you shouldn't engage in trade-offs of "pain now" for nebulous future benefits.
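To make the first bullet concrete, here's a toy expected-value sketch in Python. Everything in it (the survival probability and the payoff numbers) is a made-up illustration, not an estimate: the point is just that payoffs realized only on survival get discounted by Pr(survival), while payoffs realized now don't.

```python
# Toy sketch of why long-term projects lose value under high extinction
# risk. All numbers below are hypothetical illustrations, not estimates.

def expected_value(p_survive: float, payoff_if_survive: float,
                   payoff_regardless: float) -> float:
    """Payoff realized only if humanity survives, plus payoff realized either way."""
    return p_survive * payoff_if_survive + payoff_regardless

p_survive = 0.1  # assumed Pr(humanity survives past 2030), purely illustrative

# A long-term project (e.g., curing aging) only pays off if we survive.
ev_long_term = expected_value(p_survive, payoff_if_survive=300, payoff_regardless=0)

# A short-term project (e.g., great experiences now) pays off regardless.
ev_short_term = expected_value(p_survive, payoff_if_survive=0, payoff_regardless=50)

print(ev_long_term)   # 30.0
print(ev_short_term)  # 50.0 -- the short-term option wins when p_survive is small
```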
So, believing that humanity is near the end should result in some shuffling of priorities. But if you look at my distilled, more general priorities, which made no reference to an end-time for humanity, I think they all still hold up pretty well.
If we have one decade, one week, or even just one day left, then I'd advocate for filling that decade/week/day with awesome conscious experiences.
"The proof is left to the reader as an exercise. The battle is worth it, Johann." - Nathan