Bunkobons


Mindf*ck: Inside Cambridge Analytica’s Plot to Break the World

by Christopher Wylie



"I really liked this book. I wanted to choose a book about privacy, which I think is possibly the most concerning challenge we face with digital technologies. Privacy can be tricky because it can feel very abstract: it doesn't feel like anything to have your data collected. It seems innocent and painless, and the consequences are not always tangible, or they lie far off in the future.

This book is great because it's written by Christopher Wylie, the whistleblower who exposed Cambridge Analytica. He tells the story of exactly how Cambridge Analytica got the idea to use personal data to try to sway elections, and how he became part of it. He was the data analyst who made it happen, and he writes about how they built the tool and what exactly that tool could do. The book makes something very abstract and difficult to understand very tangible.

It's also a book with a lot of narrative interest. Carole Cadwalladr, the Guardian journalist, persuaded him to become a whistleblower; it wasn't his idea. There is an interesting ethical story to be told about how someone becomes a whistleblower: how someone switches from thinking, 'This is my job, and it's okay to do it,' to realizing, 'Maybe I've done something really, really wrong and I have to make amends and try to change this.' He did leak documents, but what he revealed was mainly about the tool itself and how it was developed. Daniel Ellsberg, who leaked the Pentagon Papers, had to spend hours photocopying them because that was the only way he could take them out; for someone like Snowden, it was very different.

In general, privacy protects us from possible abuses of power. If you share things like your heartbeat or what you drank last night, that could be used against you by insurance companies. Likewise your genetics: if you do a genetic test that reveals hereditary conditions, you could pay a higher premium, even though it's through no fault of your own that you have those genes. Furthermore, there's a Nature study showing that about 40% of the results of these commercial DNA tests are false positives, yet many insurance companies still take them at face value. And even setting accuracy aside, the fundamental principle is unfair: even if they got it right about your genes, you're still not to blame for them. There's an argument to be made that you shouldn't pay more than other people for genes you didn't choose and couldn't change.

The idea that insurers could instead reward healthy behaviour, such as exercise, sounds good, but it's a very sterilized and clean image of a reality that just doesn't pan out that way. Typically, the people who exercise are wealthier people, who have more time for it than somebody working two jobs to survive. There are all kinds of assumptions involved, too. You may do a kind of exercise that doesn't get tracked as easily with a watch, which might nudge you to run instead, and running can actually be harder on the body than, say, yoga. Whenever we track things and categorize people, there is a risk of tracking the wrong thing, of nudging people into doing things that are not as good for them as they seem at a superficial level, and of being unfair in different ways, either because we miscategorized them or because the categories don't respect social realities that should be taken into account. When personal data gets used to treat people differently, it often ends in unfair discrimination, because it takes into account things that shouldn't be taken into account and ignores things that should. In the end, we are no longer treated as equal citizens but on the basis of our data, and that's an affront to democracy.

Another reason I like this book is that we haven't changed anything to make sure this doesn't happen again, so it serves as a warning. What this book reveals is still relevant. Cambridge Analytica no longer exists, but there are more than 300 firms that do pretty much the same thing. We haven't fixed it. I'm worried that we are building a surveillance structure that could be co-opted by anyone, whether an authoritarian regime or even a company.

Something I've been thinking about recently is how digitization is analogous to colonialism. It's a kind of colonizing of reality, a colonizing of the world to make trackable what wasn't trackable before, to turn the analog into digital. When we look back at colonialism in India, the default image that comes to mind, or at least it did for me, is that it was the British government who colonized India. But actually it was the East India Company, which at some point had more soldiers than the UK government. So the rogue player could be an authoritarian government, but companies could also become oppressive enough to jeopardize our freedom. When I see something like Amazon's Ring cameras becoming more and more popular, and having a very close connection to the police, that's definitely a worry. And when we have rivals like China, which is not very democratic, keen on collecting data and becoming a leader in AI, that's a geopolitical risk we're taking."
Digital Ethics · fivebooks.com