Bunkobons


The Fourth Revolution: How the Infosphere is Reshaping Human Reality

by Luciano Floridi


Recommended by

"I love Luciano Floridi—who is a philosopher of technology—because I think he has a breadth of vision, a genuinely systematic approach to the ethics of technology, but also he is deeply literate in history, and deeply interested in human nature. He’s not a consequentialist, in the sense of being interested in maximising some kind of uber-beneficial long-term outcome for humanity. A lot of tech philosophy naturally leans towards consequentialism in terms of mega payoffs and outputs. This can be a great tool for engaging with the outputs of particular systems, but it’s not a systematic philosophy of human nature or thriving.

In this particular book, Floridi starts by referencing and updating Freud’s account of three historical revolutions in human consciousness. With Copernicus and the birth of heliocentric models of the universe, we gradually learned as a species that we weren’t the centre of the universe. It’s not all about us, in a cosmological sense. Then in the Darwinian revolution of the 19th century, we found evidence suggesting that we are not the unique pinnacle of creation—that there was no moment when humans were ‘made,’ the best and most intelligent at the top of a pyramid. In fact, we’re connected with and emergent from the rest of nature; and nature turns out to be far vaster and stranger than we previously imagined. The timespan within which we exist is immense, and, incidentally, you no longer need a deity to explain our existence. Next, Freud argued that his own psychoanalytic revolution was another de-centring of human consciousness, because rather than the sublimely lucid self-knowledge of Descartes—I think therefore I am, cogito ergo sum—you can’t be at all sure of what’s going on when you introspect. You’re grasping at straws.
Floridi adds to this account what he calls a ‘fourth revolution’: a similar de-centring of human consciousness where suddenly, through artificial means, we’re creating entities that are capable of incredible feats of information processing. And by doing so we’ve been forced to re-conceive ourselves as informational organisms – and take on board the fact that even our intellectual capabilities may not be beyond replication.

“Our ability to be good and do good is bound up with the systems through which we’re connected and interconnected”

The subtle point he makes—which I think is a Kantian one—is that human dignity and thriving become more rather than less important in this informational context. For Floridi, the informational webs we weave across the world are themselves sites of ethical activity. Building on this, he discusses what it means for the technologies we use to be ‘pro-ethical’ by design, in the sense that they enhance or uplift our capacity for accountable, autonomous action. To do so, they need – for example – to give us correct, actionable information. They need to help us arrive at decisions that are appropriate and genuinely linked to our needs and concerns, rather than manipulating or disempowering us. You can contrast this with what have elsewhere been described as ‘dark patterns,’ where you have systems that are opaque and exploitative: where the interface is more like a casino, and it doesn’t want you to make a good decision, or really to have any choice at all. Some forms of social media might be one example of that—where the incentives are to do with emotional arousal, getting people to act fast and without consideration, with little ability to apply filters of quality to what they are doing.
I didn’t list it among my recommendations, but I love Neal Stephenson’s novel Seveneves, about a hypothetical future in which social media so deranged people’s collective judgement that it nearly led to the human species being wiped out. In his story, social media is up there with gene editing or bioweapons as a forbidden technology, because the danger is so great when its seductions meet human fallibilities and vulnerabilities.

I think intentions are important. But it’s very dangerous to ascribe too much foresight and brilliance to the people creating tools. I’ve spent quite a lot of time in the headquarters of tech companies dealing with very smart people, and it’s important to remember that even very smart people often have quite narrow knowledge and incentives. So, yes, intentions are important, but what you really need to be interested in is the blind spots, the lack of foresight, the capacity of people to pursue profit at the expense of other issues. The big thing for me is what I call the ‘virtuous processes’ of technology’s development and deployment. What I mean by that is forms of regulation and governance where you don’t get to ignore certain consequences, to move fast and break things and not worry about the result: where you are obligated to weigh up the wider impacts of a technology. You need to assume there are a lot of blind spots, lots of stuff that will emerge over time that you can’t anticipate. It’s one of the great lessons of systems thinking. I quote the French philosopher Paul Virilio in the book: “When you invent the ship, you invent the shipwreck.” This refers to those accidental revelations that will always come with technology. They’re inevitable; but this makes it all the more important to have feedback mechanisms whereby unintended or undesirable results—damage, injustice, unfairness—can be filtered back into the system, and checks and balances and mitigations created.
There’s been a lot of good stuff written recently about the Luddites. Brian Merchant has a great book recasting the Luddites as a sophisticated labour movement: not just people who didn’t like technology, who wasted everyone’s time by busting up factories and resisting the inevitable. He’s partly saying that, actually, all forms of automation potentially bring a great deal of injustice and disempowerment and exploitation. And much of the story of industrialisation has entailed, very gradually, societies working out collectively how industrial processes can be made compatible with respect for human bodies and minds. Of course, there are urgent conversations we still need to have about how models of production and modern lifestyles can be made compatible with sustainability. It’s not about moving back to some mythical Eden before technology; I don’t think that’s possible or meaningful. But I do think that keeping faith with our ability to change and adapt rapidly is important. We have the tools and the compassion, the awareness and the empathy, to come up with solutions for ways to live together. And, ironically, best serving these values tends to mean focusing on the local and the practical and the incremental, not the grand and the hand-waving."
The Ethics of Technology · fivebooks.com