George Dyson's Reading List
George Dyson is a historian of technology whose publications broadly cover the evolution of technology in relation to the physical environment and the direction of society. He has written on a wide range of topics, including the history of computing, the development of algorithms and intelligence, communication systems, space exploration, and the design of watercraft.
The Origins of Computing (2012)
Scraped from fivebooks.com (2012-03-14).
Martin Davis
"Martin Davis is a brilliant mathematical logician who was working with von Neumann at the beginning of the 1950s and came up with some of our best interpretations of Turing’s work. This is a brilliant, accessible and not overlong book looking at the whole evolution of the idea of computing, going back to Gottfried Leibniz – whom I would credit as the grandfather of it all. Davis explains how Leibniz’s ideas became real and important, and he’s fair about the genealogy of whose ideas led to other ideas. It’s a very good explanation of what computation really is. Because Martin Davis is a mathematician, it’s exactly correct – whereas with a lot of other books, including my own, there are sacrifices of technical accuracy for the sake of a story. Davis is rigorous about what Turing did and didn’t prove, and why it’s important today. Computation is essentially a mapping between sequences of code and memory. It’s a back-and-forth between memory and code – a profoundly simple thing, yet one with powerful consequences. Everything we do, from talking on Skype to watching digital movies, is at heart this extremely simple process. And computers are the engines of that logic. Leibniz envisaged that the entire universe could be represented as digital code. That sounded absolutely crazy in the 17th century, but it’s the world we live in today. There’s almost nothing left that isn’t being digitised. Leibniz also built computers. He even designed – in an unpublished 1679 manuscript – a digital computer using black and white marbles running down tracks, behaving in exactly the same way our computers work today, running electrons through wires. So there’s no doubt in my mind that Leibniz was the original prophet, Turing was the later prophet, and then you can argue over who actually built what. The philosophical implications of all this are very interesting."
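Dyson describes Leibniz’s 1679 design as binary marbles running down tracks. As a sketch of the idea only – not a reconstruction of Leibniz’s actual mechanism – ripple-carry binary addition with 0/1 "marbles" can be written in a few lines of Python:

```python
def add_binary(a, b):
    """Add two binary numbers given as lists of bits,
    most significant bit first (e.g. [1, 0, 1] == 5)."""
    # Pad the shorter number with leading zeros so the columns line up.
    n = max(len(a), len(b))
    a = [0] * (n - len(a)) + a
    b = [0] * (n - len(b)) + b

    result, carry = [], 0
    # Work from the least significant column, like marbles falling down tracks.
    for x, y in zip(reversed(a), reversed(b)):
        total = x + y + carry
        result.append(total % 2)   # the bit that stays in this column
        carry = total // 2         # the "marble" that rolls to the next column
    if carry:
        result.append(carry)
    return list(reversed(result))

print(add_binary([1, 0, 1], [0, 1, 1]))  # 5 + 3 = 8 -> [1, 0, 0, 0]
```

Whether the digits are marbles, pencil marks or voltages, the carry rule is the same – which is the sense in which Leibniz’s design and modern hardware "work the same way."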
Andrew Hodges
"Alan Turing was born exactly 100 years ago [editor’s note: this interview was done in 2012] and died aged 41. In those 41 years he led an amazing life that is covered with extraordinary grace, complexity and completeness by Andrew Hodges in this biography. It was first published in 1983 and remains in print. No one could do a better job than Andrew Hodges, who is himself a mathematician – it is truly a masterpiece. Although it’s not a book about the history of computing per se, it’s a must-read if you want to understand how we got to our modern world of computing, and it gives you a great picture of the life and times of Alan Turing at this critical period. Turing was convicted of gross indecency in 1952 [for homosexual relations, then illegal in Britain] and sentenced to either imprisonment or oestrogen injections [to reduce libido]. He chose the oestrogen injections, which have an effect on personality and were a brutal treatment. He died in 1954 in what people assume was a suicide, although we don’t know for sure. This book is also very good on the details of what Alan Turing did, and what happened more generally, at Bletchley Park – although when Hodges wrote the book, in 1983, much of the Bletchley Park material was still secret. During the Second World War, the Germans were using a machine called Enigma. The British, thanks to work by Turing and his innumerable colleagues, broke the Enigma cipher at Bletchley Park through a series of clever mathematical and human tricks. It’s unbelievable how much they did with so few computational resources, largely with human ingenuity. The Germans broke their own rule of never repeating themselves, and occasionally began messages with the same string of text. The other amazing thing is why this story was kept secret for so long, and not released after the war.
It was one of the great achievements of the 20th century, and is finally out in the open. It was all happening during the war, but in secret. The group at Bletchley Park was moving very far ahead in code breaking. And in America [in the mid-1940s] we had a computer called the ENIAC which made huge progress, but also in secret. After the war, the secrecy was lifted to some extent, so suddenly these developments that had been incubating came out into the open. I would be the first to say that Great Britain was ahead. In a way, the British invented digital computing and the Americans took the credit. There’s a very direct connection. The push that allowed von Neumann to build his universal machine was the need to solve hydrodynamic questions, to decide whether a hydrogen bomb was possible or not. So to an extent it was a story of cryptography on the British side and nuclear weapons design on the American side, with of course some overlap."
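The weakness Dyson mentions – operators beginning messages with the same string – is what cryptanalysts call a crib. Enigma was a rotor machine and its cryptanalysis was far more involved, but the general principle that a guessable repeated opening leaks key material can be illustrated with a toy XOR stream cipher (the cipher, messages and keystream here are invented purely for illustration):

```python
def xor_bytes(data, keystream):
    """Toy stream cipher: XOR each byte with the keystream (NOT Enigma)."""
    return bytes(d ^ k for d, k in zip(data, keystream))

# An invented keystream, reused (a fatal mistake) across two messages.
keystream = bytes([37, 201, 14, 88, 163, 90, 120, 7, 55, 240, 19, 66])

# Operators habitually start every message the same way:
crib = b"WETTER"  # a guessable standard opening ("weather" in German)
ct1 = xor_bytes(b"WETTERBERICH", keystream)
ct2 = xor_bytes(b"WETTERNORDSE", keystream)

# Knowing the crib, an analyst XORs it against one ciphertext to
# recover the start of the keystream...
recovered = xor_bytes(ct1[:len(crib)], crib)

# ...and can immediately read the same positions of every other message.
print(xor_bytes(ct2[:len(crib)], recovered))  # b'WETTER'
```

The Bletchley Park attacks did not work by XOR, but the logic is the same: known or repeated plaintext turns an unknown key into a solvable constraint.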
Jack Copeland
"After Enigma, the Germans developed more powerful encryption with higher-speed digital equipment that was much harder to break. The British side, led by [the engineer] Tommy Flowers, built a vacuum-tube machine called Colossus – an extremely powerful and sophisticated digital computer which helped the British to break these even stronger codes. By the end of the war there were at least 10 of these Colossus machines. Essentially, it was the birth of the computing industry, at a time when no one else was building 10 copies of the same machine. But it was all kept under wraps at the end of the war. My suspicion is that it just didn’t fit the heroic history of the war to publicise that it was won partly by breaking codes and not purely by heroism in battle. It may also be a valid argument that we still depend on breaking codes and don’t want a new enemy to know how we are doing it. That’s certainly true in the US today – the NSA is a huge organisation that still keeps its secrets. But in the case of Bletchley Park and Colossus, I don’t think it’s done any harm to finally publicise what really happened. Copeland’s book is another masterpiece. I wish I had been able to read it 20 years ago, when I first became interested in this. It reveals in a very technically correct way how the German codes got broken. It’s a marvellous collection of first-person documents, memories and editorial glue to hold it all together. America sent people to Britain during and after the war to learn what had been done, and Alan Turing came over to America to debrief the people who became the NSA. There is still a cat-and-mouse arms race: computers make it both easier to write stronger codes and easier to break them."
David Alan Grier
"This book is about the period in which Turing worked, the 1930s. It’s important, and often ignored, that the world of electronic and mechanical computing didn’t come out of thin air. It came out of a world in which we were already doing a large amount of computation – but with people. In America, during the Great Depression, we had something called the Works Progress Administration. One of the things it did to create jobs was to set up vast engines of human computation to make mathematical tables and suchlike. Much of what later became electronic was first done by these people, before it was mechanised one step at a time. David Alan Grier’s grandmother was one of these human computers, so he had a very strong interest in digging up their stories. It’s a marvellous book, rich in the detail of what those times were like. These workers were, literally, called computers. Grier himself is an electronic engineer, chairman of the IEEE [Institute of Electrical and Electronics Engineers], and he delved into how today’s algorithms came from what was worked out by human beings. It’s a very rich subject. Exactly the same process was going on in England and elsewhere, with accounting laboratories and scientific computing centres where problems were fed to large groups of people. In computation, we are moving numbers around. Whether you move them between people with pencil and paper, or on silicon chips between gates at the speed of light, it’s the same process."
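A standard trick the table-making offices used was the method of finite differences, which reduces tabulating a polynomial to pure addition – exactly the kind of work that could be divided among rooms full of human computers, each handling one column. A minimal sketch for a table of squares (a generic illustration of the method, not a specific WPA worksheet):

```python
def table_of_squares(count):
    """Tabulate n^2 for n = 0, 1, 2, ... using only additions.

    For a quadratic, the second difference is constant (here, 2),
    so once the first row is seeded, every later value is a sum:
    one worker adds first differences, another updates them.
    """
    value, first_diff = 0, 1   # 0^2, and the first difference (1^2 - 0^2)
    second_diff = 2            # constant second difference for n^2
    table = []
    for _ in range(count):
        table.append(value)
        value += first_diff        # next square: add the first difference
        first_diff += second_diff  # next first difference: add the constant
    return table

print(table_of_squares(6))  # [0, 1, 4, 9, 16, 25]
```

No multiplication ever occurs, which is the point: the algorithm is the same whether the additions are done by clerks with pencils or by gates on silicon.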
John Alderman (author) and Mark Richards (photographer)
"Core Memory is completely different from the others, and in no way attempts to be historically complete or chronologically correct. It’s an art book of absolutely stunning, high-resolution photographs of computers in the Computer History Museum, whose collection moved from Boston to Silicon Valley. The photos are utterly gorgeous, and give you a visceral sense of the hand craftsmanship that went into these machines. Hand-soldered wires, massive disc drives five feet in diameter – things like that. Colossus was pretty big, and ENIAC was so huge you could go inside it. This book gets into the innards of some of these old machines and turns them into works of art. It’s like a children’s book of dinosaurs – if you’re interested you will go through page after page, and if not you will look at three pages and put it aside. It goes into the early age of personal computing. It has the first Macintosh, and the first computers you could buy for under a thousand dollars. There are a few of them in there, as a nod to the world we’re in today. But it’s primarily about the age of what we call “big iron” – the huge mainframe computers that had to be moved around with forklifts and trucks. I use an absolutely modern Mac laptop. In fact, I just got a new one; my last one was six years old. My new laptop, quite miraculously, has a solid-state drive – it no longer has a magnetic disc spinning around and waiting to crash. That’s a fabulous step ahead. But my boatbuilding business at home still runs on an ancient Mac – not the earliest generation, but a completely extinct operating system. And it’s amazing how many companies’ accounting systems still run on punch cards. I think Moore’s law is going to keep going. The question is, what are we going to do with it?
Every episode of every bad show that’s ever been on television is on YouTube for free, and we still can’t use all our bits [of processing power]. So what is going to happen next is a very interesting question. Turing is not at all a dead prophet of historical interest only. Almost every word he wrote can be read today and speaks to the future. He believed in true artificial intelligence, and I think he was right. Things like Google are the fruition of his vision, and we’re going to have to wait to see where that goes. The way Google is doing it is to keep everybody happy, make sure everything is free and keep everyone on their side. I don’t subscribe to the Terminator scenario [of computers becoming self-aware and enslaving mankind]. Human beings are a part of this, and are not going to be extinguished by it. I think we need to worry less about whether machines are becoming more intelligent, and more about whether humans are becoming less intelligent. The jury is out on that. You could make the argument that because of smartphones we are losing the ability to visualise maps in our brains. That frees up part of our brain – but what do we use it for? I think such concerns are partly right, and they are definitely worth worrying about. It’s not clear which way it’s going to go. We have to be very cautious and watch what’s happening with the next generation very carefully – because it is entirely possible that we could start losing some of the intelligence that has evolved over such a long period of time. We are losing a lot of our craftsmanship, our ability to do things with our hands. That’s sad and a mistake, but it’s happening and we have to make the best of it. I think we should try to preserve human knowledge that will be very hard to reconstruct, like how you rebuild a carburetor. Things that we take for granted are being lost left and right.
You don’t want to preserve them as artifacts, you want to preserve a working knowledge of them as much as we can, while leaving space for new skills to develop. One thing is for certain – we’re in a very transitional period."