Bunkobons


The Dark Forest

by Cixin Liu



"SA: In the first book of the trilogy, science basically stops working on Earth, and there's a big puzzle as to why. Particle accelerators start giving random results, and a bunch of scientists commit suicide. It is then revealed that an alien civilisation is at the origin of those events. These aliens themselves are going through a systemic collapse, and they create an AI that they send across space to take control of another civilisation.

LS: So interestingly, in this book, the alien civilisation is experiencing an existential risk, and by using its technology to prevent it, it creates an existential risk for humanity.

SA: Exactly. And all of this is the setup of The Dark Forest, in which humanity realises the scale of the potential risk and learns that it has 400 years to prepare before the alien fleet arrives. They know that the adversary is more technologically advanced than them, but not by how much. And they can't do fundamental scientific research anymore. So the story says a lot about the massive advantage that comes with higher technological advancement.

LS: It goes back to what Simon was saying earlier, about research and the scientific method being our most precious resource.

SB: And humanity doesn't even know what the alien side's technology is, or what it should be catching up on.

SA: Without revealing too much, the novel makes the broad point that, as a species, it's hard science, cold game theory, and consequentialist reasoning that will keep you alive. The 'fluffy stuff' like human love, morality and ethics won't save you at all.

LS: But you need the cold, hard reasoning to preserve the fluffy stuff.

SA: Interestingly, this book is among the most popular science fiction coming from China right now. It showcases the Cultural Revolution at the beginning of the first novel, most characters in the book are Chinese, and there is an uncharitable yet very accurate portrayal of Western democracy and the inefficiency of the United Nations. But it doesn't glorify the planned economy either. It simply makes the point that humanity is fairly weak, and that if we're ever faced with a really big threat, we most likely won't survive it.

SB: Eliezer Yudkowsky, who runs the non-profit Machine Intelligence Research Institute, says it's really important to realise that there's no natural law saying our civilisation can't collapse and our species can't go extinct. It is a real, live option on the table. It really could happen. Just as you as an individual could suddenly die this afternoon, humanity could suddenly disappear, and all the ingredients necessary for that to happen already exist. Being able to accept that fact without looking away from it, and then to do something about it — that is the message one would hope people might take away from these books. That's one of the reasons why it's worth exploring existential risks through science fiction and novels rather than just through non-fiction books: all of the people in these stories have to engage with these problems, realise the mess they're in, and decide how to respond. We need more people who are willing to do that: taking these issues seriously but not just getting depressed or angry, and instead actually doing the cold, hard thinking about what can be done."
Existential Risks · fivebooks.com