Climate Shock: The Economic Consequences of a Hotter Planet
by Gernot Wagner & Martin L. Weitzman
Recommended by:
SB: The reason we dedicated our only non-fiction choice to climate change is that, at the end of the day, if you're interested in existential risks and wondering what you can do to prevent them, then probably the one direct threat you can do something about is climate change. People often ask us what the most dangerous existential risks are. If you want something that could definitely leave us all dead, that would be AI: if an AI becomes superintelligent and decides that humans are a waste of space, it's over. If you're interested in something that could happen tomorrow, that would be nuclear weapons: it only takes one person to press the wrong button. But if you want to work on the thing we're currently running fastest towards, it is catastrophic climate change.

There is a lot of complacency about the likelihood of extreme climate change. The uncertainty in current climate models means that we could very easily aim for a global increase of 2°C or lower, but actually get 6°C instead. People are always going to push for the most optimistic predictions, but there are many tipping points and feedback loops that could drive climate change to levels that would be truly catastrophic for us. So we're driving very fast, in the dark, and we need to do something about that.

For most people out there, this is the risk they can do most about, in particular by changing how they vote and what they expect from policy-makers. If people demand that politicians talk about climate change, stuff will happen. If people don't care, it will get taken off the agenda again and stuff will not happen. Ordinary individuals can have a big say on that, if they want to. There are also issues about individual lifestyle and consumption, and our perception of what is 'normal'. Unfortunately, not everyone has such a big role to play in something like AI safety.

SA: If you think of a map of the world, you can identify the geographical centres that contribute most to each risk. For nuclear weapons, it's mainly military bases where warheads are stored, and the chain of command that would lead to their use. If you look at AI, you can look at some key research labs and data centres in California, London, China, etc. For bioengineering, again, the geographical sources of risk are fairly circumscribed. But if you look at climate change, then suddenly almost the entire world, both developed and developing, is part of a very distributed source of risk. This means that any individual anywhere on the planet can act regarding climate change, and take partial ownership of the problem.

LS: For a non-fiction book on a quite heavy topic, Climate Shock is very readable.

SB: Yes, the fact that it's about climate change makes it much more accessible. And actually, one of its chapters is a very gentle introduction to existential risks.

SA: There are some fairly good and popular books about AI, such as Life 3.0 by Max Tegmark, or The Technological Singularity by Murray Shanahan. But then what? Unless you go on to do a PhD on AI safety, those books wouldn't really make you behave any differently after you've read them.

SA: Maybe, but if you look at the situation with nuclear weapons, there was once mass literacy about the potential effects and dangers, and very large mobilisations by people worried about it. It certainly did play a role, as did Hiroshima and Nagasaki. But ultimately, if the decision power is centralised in the hands of a few people, there isn't that much that the public can really do.
Existential Risks · fivebooks.com