Bunkobons


A Hacker's Mind: How the Powerful Bend Society's Rules, and How to Bend Them Back

by Bruce Schneier


Recommended by

"Schneier is a famous cryptographer: he wrote the basic textbook, Applied Cryptography; he's now at Harvard's Kennedy School of Government; and after a couple of decades of writing about security thinking, he's now thinking about security in the context of geopolitics and economics. (He's also an old friend of mine: he designed my wedding ring, which is a cipher wheel, and he wrote the afterword to my novel Little Brother.)

We hear a lot about design thinking. After 9/11 and 7/7, a lot of ill-considered measures were taken in the name of security. Schneier argues that to make things secure, we have to take a rigorous approach. So, if you're making everybody take off their shoes, you have to think about where else they might conceal something. If you remove all the rubbish bins from Central London, you have to imagine where else someone from the IRA might plant a bomb. After all, the IRA's campaign was not a litter-oriented campaign. Nor was Osama bin Laden an opponent of civil aviation. You need to approach this in a systematic way.

He says that when we think about the tax code, real estate regulation, financial rules, and security problems, we need to apply security thinking. If we plug this loophole, what remaining loopholes might our adversaries switch to? If we design a system of this degree of complexity, what lurking, unsuspected bugs might it contain that we can never address, but which might be eliminable if we reduced the system's complexity through fundamental refactoring?

Schneier is adding power to the story of security, moving beyond the technical dimension, and even the social dimension, into the political economy. Security experts have long understood that there's a social dimension to security. One of the easiest ways to break into a computer system is to call the owner, pretend to be a police officer, and say, "I really need you to give me this information from your system." That works surprisingly well.
What we're learning from things like the Twitter Files is that it isn't always someone pretending to be a police officer. Sometimes someone calls up and says, truthfully, "I am with the FBI—or the Home Office, or a very large corporation that you're very dependent on—and I require you to do these things that violate your policies and put other people at risk." When that happens, no policy and no technology can defend against it. The only things that defend against it are accountability, pluralism, and deconcentrating power.

Chokepoint Capitalism is a book I co-authored with the copyright scholar Rebecca Giblin from the University of Melbourne. It's about concentration in the labor market for creative work and what we can do about it. The first half of the book is made up of case studies that dig into the complex accounting scams that allow the entertainment industry to steal from its workforce. The second half presents detailed, shovel-ready systemic solutions.

Anything that can't go on forever will eventually stop. As the creative labor markets have lurched from crisis to crisis, the absence of well-developed alternatives has led us to keep doing the same thing and hoping for a different outcome. Our theory of change is that if we introduce these ready proposals now, then when crisis strikes again, we can do something different.

I mention this because the first half of the book is so enraging that readers have told us: "I got to chapter six and I heard this high-pitched whining that I realized was an incipient rage aneurysm. For my own well-being, I had to put the book down." I'm here to tell you that you've got to finish the book. The second half provides relief by telling you what we can do about it. I promise you don't need to only buy fair-trade music. It's about how we can systematically alter the arts market so that workers get a larger share of the outcome and capital gets a smaller share.
Workers within the creative industries (including editors, people who work at labels, and so on) and everyone in our audiences are class allies against the forces that want to drain all of our pocketbooks and treat us as badly as they can. Chokepoint Capitalism is about what's wrong and about what to do about it. All of our answers are systemic solutions presented as alternatives to what we've done for the last 40 years whenever copyright has been in crisis, which is to add more copyright. In a world where there are only five major publishers that can bring your book to market, any copyright given to authors is transferred to those publishers. Just as if your kid gets beaten up for his lunch money every day and you give him extra lunch money, he's going to get beaten up for the extra lunch money, too. We have to think about power as an important mechanism here.

When sampling arose, at first no one knew how to manage it within copyright. It wasn't considered part of the copyright question. If you go to New Orleans and hear a jazz band, the horn player will drop a couple of bars of something familiar in the middle of a solo, and it's not a copyright violation. It's not even fair dealing or fair use; it's just cultural practice. That was how sampling was treated. Over time, through precedent and through business practice, we created an exclusive right to control samples. Immediately, the Big Three labels amended their standard terms so that you couldn't license a sample without signing a label contract, and every label contract then required you to sign away the rights to profit from your own samples. That meant everyone who wanted to use samples had to pay the shareholders of the record labels. Musicians didn't get any more money. As a class, musicians got less money.

Taylor Swift is the most powerful bargainer in music today, the most popular recording artist and touring artist in the Western world, and she couldn't get control of her masters back.
A private equity weirdo acquired those masters and refused to sell them back to her at any price. He sold them to a private equity fund owned by the Disney family, making sure that he would get a long-term revenue stream from the deal, not because he needed it but because he vindictively wanted her to know that every time her music gave someone pleasure, he would make money from it. However, everyone in the world has the right to record a Taylor Swift song under this thing called the compulsory license. It's a collective right owned by everyone who cares about music… including Taylor Swift. So Swift recorded cover albums of her own albums. These are available for sale and for streaming everywhere you can get her albums in their original form. The cover albums are better, they're blessed by her, and this creep doesn't get a penny.

As we contemplate the rise of AI as a means of displacing creative income, a lot of creators are saying, "We will create the right to decide who can train a model with my work." However, we already know how that's going to work. Let's say you're a voice actor. The game studios that employ voice actors now begin every session with this kind of ritual recitation: "My name is Cory Doctorow. I freely and permanently assign the right to train a machine-learning system with my voice."

The game studios, and all the entities who have chokeholds over our creative labor markets, have the motive, means, and opportunity to transfer any new right to train to themselves. They are the only ones who want to reduce creative labor's share of the income. Kids making memes on social media don't want to stop artists from getting paid, nor do they pay artists. They're orthogonal to the question of how artists get paid. If, in the name of stopping those people, we create a new right to train, that right will land with the giant media conglomerates.
Schneier suggests that we approach this as a security problem. What can we do as a countermeasure? We could say, as the US Copyright Office has so far been very consistent in saying, that there is no copyright on machine-learning-generated works, because copyright attaches at the moment a human fixes a creative work. Disney can still amend its animators' contracts to require them to sign away the training rights, and it can create models that, given a prompt, produce Pixar cartoons while the executive is out at lunch. However, that model will not attract a copyright, and nothing in the model's output will attract a copyright either. Disney can release that movie, but everyone else can take it and sell it. At that point, Disney may say, "We will just pay the animators." That's how you think through a security problem in a way that factors in power."
Chokepoint Capitalism · fivebooks.com