Superforecasting: The Art and Science of Prediction
by Dan Gardner & Philip E. Tetlock
Recommended by
"Yes, this is very relevant to the one in six question. We talked about the importance of the scout mindset, of thinking clearly, of trying to have beliefs that reflect reality. Once we start thinking about issues that are not just happening now but over the coming years or even decades, that gets particularly challenging. There’s a long track record of people making predictions about the future that are hilariously wrong, both in too optimistic a direction—where they say, ‘In the year 2000, we’ll be walking in spacesuits on the moon’—and in too pessimistic a direction. J. B. S. Haldane, one of the early futurists at the beginning of the 20th century, made many great predictions. But he also said it would be 8 million years before there was a return trip to the moon. That was only a few decades before there was one.

How should we reason in the face of uncertainty? Forecasting is the discipline, the art and skill, of getting better and better at making predictions about the future. The key thing is starting to reason in terms of precise probabilities. You might ask, ‘What’s the chance of x happening in our lifetime?’ and people will say, ‘It’s unlikely.’ That’s very vague: I don’t really know what ‘unlikely’ means. Is there a 40% chance x might happen? 10%? 1%? These are really big differences. The approach of forecasting is to make very precise predictions and then, over time, to see which were correct and which were incorrect. Forecasters have developed a whole set of skills for improving the way we reason about probabilistic matters.
The book is very helpful for thinking about things that are intrinsically uncertain, questions like, ‘When will we develop human-level artificial intelligence?’ or ‘Will there be a third world war in our lifetime?’ Yes, you could think of it as a nascent field of forecasting studies, within economics and psychology in particular. Exactly, but that’s the wrong way to think about it, because it’s always a matter of doing better or worse.

Here’s one example. Metaculus is a forecasting platform that overlaps a lot with the effective altruism community. In 2015, it ran a forecast on the probability of a global pandemic between the years 2016 and 2026 that would kill at least 10 million people. Metaculus put the odds at one in three. Now, if the world’s decision-makers and political leaders had internalized a one in three chance of a global pandemic of that magnitude within the coming decades, we would have prepared much better for the pandemic that did occur. We weren’t thinking probabilistically. We thought, ‘Nah, it won’t happen.’ One in three is not that high a probability, but it’s certainly enough to prepare for."
Longtermism · fivebooks.com
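The interviewee's point about replacing vague words like 'unlikely' with precise probabilities, and then checking them over time, is what the Brier score formalizes; it is the accuracy measure used in Tetlock's forecasting tournaments. A minimal sketch in Python — the forecasts below are invented for illustration, not taken from the book:

```python
# Brier score: mean squared error between stated probabilities and
# outcomes (1 if the event happened, 0 if not). Lower is better:
# 0.0 is perfect, and always saying 50% scores 0.25.

def brier_score(forecasts):
    """forecasts: list of (probability, outcome) pairs."""
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Invented examples: an overconfident pundit vs. a calibrated forecaster.
pundit = [(0.95, 0), (0.90, 1), (0.99, 0)]
forecaster = [(0.40, 0), (0.70, 1), (0.20, 0)]

print(round(brier_score(pundit), 2))      # 0.63
print(round(brier_score(forecaster), 2))  # 0.1
```

This is why logging predictions matters: a vague 'unlikely' can never be scored, but a logged '10%' can, and over many questions the scores separate calibrated forecasters from confident-sounding pundits.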
"Superforecasting is interesting because it’s more about the research side of the topic. It’s written by the journalist Dan Gardner and Philip Tetlock, a Canadian-American political scientist who conducted a lot of experiments to try to understand what makes people better at making predictions. Much of his work was conducted with IARPA, the research agency of the American intelligence community. Tetlock’s project was called the Good Judgment Project. He recruited several dozen people and tried to see, first, whether some people are consistently better than others at making predictions, and second, what happens if you train these people in good probability-estimation techniques. In other words, he tried to get them to think about the world in a way that is not as fancy or impressive as possible—he wasn’t trying to turn them into typical pundits—but as accurate as possible. What he found is that across a whole range of topics, those he called ‘superforecasters’ tend to be, on average, much better at predicting events than the topic experts he compared them with.

When we listen to the radio or watch TV, we often see pundits who seem to have extremely strong views about something: for example, they claim with absolute certainty what’s going to happen in the war in Ukraine, or swear that a given politician is going to be elected in the next cycle. And because these people have a very assertive and confident way of telling us these things, we tend to believe them. Tetlock went to the effort of logging the predictions these pundits made, and he found that they were no better than chance. A lot of them said essentially random things, but because no one ever checked back and asked them to explain themselves, they just kept being invited back to give those opinions.
A lot of the book tells the story of this project: how he recruited those people, realized that there were some ‘superforecasters’ among them, and grouped them together to see what that would do. It turns out that groups of superforecasters did even better than superforecasters working alone. He also goes into the details of what these people do to make themselves better—things like breaking down very large questions into smaller questions, thinking about base rates, weighing all the information at your disposal rather than just the most recent piece of evidence, thinking in probabilities rather than in terms of something simply being true or false, and so on.

It goes back to a lot of the research behind Nate Silver’s book. Nate Silver’s book is more pop-science: much more accessible, and about practical applications in different domains. Phil Tetlock’s book is certainly not dry, but it’s more about the actual origins of this research. And the research is very much still running: the Good Judgment Project continues, and there is also a commercial spin-off called Good Judgment Inc. They run forecasting tournaments for clients and maintain an online forecasting platform. We’ve been working with them a little at Our World in Data: they’ve picked 10 specific charts on our website and asked their superforecasters to predict what’s going to happen to these metrics over the very long run—so 1, 3, 10, 30, and 100 years in the future—and they’re working on a report."
Using Data to Understand the World · fivebooks.com